Dangers and Opportunities of Technology: Perspectives from the Humanities
This program provides funding for research projects that explore the societal impacts of technology, with a particular focus on the ethical, legal, and cultural implications of artificial intelligence. It is aimed at institutions and collaborative teams.
Description
This program supports research that examines technology and its relationship to society through the lens of the humanities, with a focus on the dangers and/or opportunities presented by technology, broadly defined. NEH is particularly interested in projects that examine the role of technology in shaping current social and cultural issues. The DOT program is part of the NEH’s American Tapestry initiative.
The program supports projects led by individual researchers (up to $75K) and by collaborative teams (up to $150K).
Note about Humanities Perspectives on Artificial Intelligence
This grant program is one of ten NEH programs that are part of NEH’s Humanities Perspectives on Artificial Intelligence initiative, which encourages research on the ethical, legal, and societal implications of AI. To learn more, please see our page about the AI initiative.
What’s new for 2023-24:
A special encouragement for research projects that seek to understand and address the ethical, legal, and societal implications of artificial intelligence (AI).
Improved guidance for applicants proposing to conduct surveys, interviews, or digital ethnographic studies.
Reminders:
Applicants to this program are institutions.
Applications will be declared ineligible for review if they do not include all required sections and components (e.g., the Budget form with a Budget Justification, and Biographies rather than CVs).
Applications will be declared ineligible for review if they do not comply with all requirements indicated with a “must” in the NOFO, including page limits.
Two or more applications for federal funding and/or approved federal award budgets are not permitted to include overlapping project costs.