Democratizing social scientists’ impact on federal policy: Using the Evidence Act to help government and ourselves
Edited by Margaret Levi, Stanford University, Stanford, CA; received October 4, 2023; accepted January 24, 2024
Abstract
It is common for social scientists to discuss the implications of our research for policy. However, what actions can we take to inform policy in more immediate and impactful ways, regardless of our existing institutional affiliations or personal connections? Focusing on federal policy, I suggest that the answer requires understanding a basic coordination problem. On the government side, the Foundations for Evidence-Based Policymaking Act of 2018 requires that large federal agencies pose, communicate, and answer research questions related to their effects on people and communities. This advancement has opened the black box of federal agency policy priorities, but it has not addressed capacity challenges: These agencies often do not have the financial resources or staff to answer the research questions they pose. On the higher education side, we have more than 150,000 academic social scientists who are knowledge producers and educators by training and vocation. However, especially among those in disciplinary departments, or those without existing institutional or personal connections to federal agencies, we often feel locked out of federal policymaking processes. In this article, I define the coordination problem and offer concrete actions that the academic and federal government communities can take to address it. I also offer leading examples of how academics and universities are making public policy impact possible in multiple governmental spheres. I conclude by arguing that both higher education institutions and all levels of government can do more to help academic social scientists put our knowledge to work in service of the public good.
A perennial question for US colleges and universities is the extent to which academic research can impact public policy. This question feels especially pressing when applied to social science research, since it often seeks to engage, diagnose, and address major social and economic challenges. However, the answers currently supplied (1) rarely specify what actions academic social scientists actually can take, regardless of our existing institutional or personal connections, to make policy impact possible.
After serving as Assistant Director of Evidence and Policy in the Biden-Harris Administration’s White House Office of Science and Technology Policy, where I helped to ensure that decisions and policies would be guided by the best-available science and data (2), I’ve developed a set of very concrete ideas. To be clear about the scope of these ideas, I mainly write regarding connections between academic social scientists and federal policymaking efforts—particularly those emerging from federal agencies (e.g., the Departments of Education (ED), Health and Human Services (HHS), or Labor (DOL), among others). I also focus on connections made possible by the Foundations for Evidence-Based Policymaking Act of 2018 (3) (i.e., the “Evidence Act”), which I describe in greater detail below. While ties between academic social scientists, especially economists, and executive branch policy already occur to some extent (consider, for example, economists’ participation in the Council of Economic Advisers), they often are limited, exclusive, and elusive. Meanwhile, the Evidence Act has opened more democratic and attainable opportunities for social scientists from a variety of disciplines to connect directly with federal agencies, making it a critically important evolution and the focus of this piece.
Keeping these parameters in mind, I argue that there is a basic coordination problem between federal executive agencies and the higher education sector. We, as academics, can help to understand, characterize, and solve it.
I’ll start by providing some context for this argument, then will define the coordination problem and offer several concrete actions that academic social scientists can take to engage the brass tacks of federal policy impact. I will conclude by sharing several leading examples of how scholars and universities are making this kind of impact possible—and will argue that both higher education institutions and the federal government can do more (much more) to help us put our valuable knowledge to work in service of the public good.
The Policy Context: Understanding the Government Side of the Coordination Problem
While not well known outside of policy circles, the Evidence Act marked a paradigm shift in the way that federal agencies develop, communicate, and incorporate policy-relevant research into their operations. Co-sponsored by then-US House Speaker Paul Ryan (R-WI) and US Senator Patty Murray (D-WA), and signed into law in January 2019, the Evidence Act required that the 24 largest federal agencies create the structural architecture necessary to pose and answer policy questions related to their impacts. It additionally mandated that these agencies—organizations like ED, DOL, HHS, and others—make their research plans and priorities public. In addition, it encouraged federal agencies to pass evidence requirements down to state and local governments through mechanisms like data and evidence requirements attached to the receipt of federal funds.
These requirements may seem rudimentary—why hadn’t such steps been taken before? But in fact, they have been deeply consequential for the way the federal government thinks about research. They have meant that federal law requires agencies to integrate formal research agendas into their strategic plans, budget proposals, and yearly reporting activities; share those research agendas openly and transparently with the public; and report on progress to Congress and the White House Office of Management and Budget (OMB). While some agencies had pursued practices like these prior to the Evidence Act, most had not. The Evidence Act accordingly has provided a major opportunity to advance evidence-based decision-making and to elevate social science research throughout the executive branch of government.
That said, the opportunities opened by the Evidence Act have not always come to full fruition: the Evidence Act did not come with any new funds, known as appropriations. This limitation has meant that federal agencies have had to reallocate existing resources and scramble to mobilize already-taxed employees to meet diverse needs, from reporting to Congress, to managing teams by taking on additional titles like “Evaluation Officer” or “Chief Data Officer”—roles mandated by the Evidence Act—to performing research and evaluation. It’s a lot, and many federal agencies are doing the very best that they can.
At some point, however, the resource juggling and the constant need to be creative within constraints get old. In addition, though stretched federal agency staff have managed to comply with Evidence Act requirements (e.g., producing agency-wide strategic plans with commitments to generate and use evidence (4)), the shift from “box checking” to “integrated evidence use” is a much more difficult task and requires greater support. Indeed, many agencies lack the capacity to perform the research they have planned for or to translate research insights into concrete implications for their policies, programs, and practices, despite a willingness and desire to do so (5).
The Academic Context: Understanding Our Side of the Coordination Problem
On the side of academia, we have over 150,000 academic social scientists who are knowledge producers and educators by training and vocation (6)*. However, many of us feel locked out of the black box of government (1, 7). We often are working without the skills, knowledge, time, professional incentive structures, or perhaps even desire to feed our knowledge into the federal government abyss.
Two important caveats apply to this statement.
First: the sentiment of being “locked out” may be less prevalent in certain social science disciplines and university components than in others, as briefly mentioned above. Some social sciences like economics, as well as disciplines like education and public health, engage federal policy more readily. The same can be said for certain components of universities, such as public policy schools. Yet, even when academic social scientists in these more connected disciplines and schools do engage federal policymakers, it often is at least in part because of existing elite institutional and/or personal connections. As just one example, the institutional affiliations of the Chairs of the Council of Economic Advisers over the past decade include Columbia, Harvard, and Princeton Universities, as well as the University of Chicago (I say this with full awareness of my own institutional affiliation). Only one of these leaders identified as a woman and/or person of color.
This kind of exclusivity is problematic if we wish to address the coordination problem, which requires scaling and diversifying social scientists’ connectedness to federal government agencies. It also limits the scope of academic social scientists’ policy impact, since greater institutional, disciplinary, and individual diversity is likely to increase the variety of thoughtful perspectives informing solutions to thorny social problems prioritized by federal policy leaders. Speaking of my own discipline, sociology, those who pursue interview-based, community-engaged research (8–10) are likely to bring different knowledge and ideas to a social problem than those who primarily work with and analyze survey data. Yet, both groups are trained to make sense of the social world in systematic ways, and many of us are driven by a desire to understand and pursue solutions for social and economic inequality. Bringing diverse intellectual and disciplinary perspectives to the federal policy table can bear real fruit, especially in administrations like the current one where commitments to equity have been made early and often (11).
Second: engaging with research topics that are priorities for the federal government is not risk free, especially in the current age of political polarization. This statement is especially true for topics that interest the legislative branch of government, where polarization is all-encompassing. During the summer of 2023, for example, the Republican leadership of the House Judiciary Committee accused universities and researchers of colluding with the executive branch of government to suppress conservative online speech (12). The Committee did so by claiming that academic studies of disinformation—a key priority of the Biden-Harris Administration (13)—interfered with principles of free speech. Their deliberations are ongoing.
Given this second caveat, it is even more imperative for academic social scientists to understand the potential opportunities and challenges that may accompany greater connectedness to the federal government, especially if that connectedness implicates the legislative branch in addition to the executive branch. Yet, without direct engagement with colleagues in the federal government, and without strong efforts to navigate the sometimes-tricky relationships that span the governmental and academic sectors, solutions to the coordination problem will remain obscured.
Framing a Solution to the Coordination Problem
Solving the coordination problem requires proactive efforts on the part of academics, increased support from our higher education institutions, and further guidance and outreach from the federal government.
As scholars, we need to spend more time formulating and activating our research in service of the public good. To be clear, high-quality, rigorous, and robust research must remain central to our work. However, too many hard questions and systematic ways of knowing are aired only in our lecture halls, and too many smart, research-informed solutions to pressing social challenges are sitting in the pages of our journals (little-known fact: many, if not most, federal agency employees do not have free access to academic journals at all).
To remedy these issues, academics should ask ourselves three key questions: What information do policymakers need from us? How does our existing research align with these needs? And what would it require—personally, organizationally, and institutionally—to become more aligned in the future?
One clear strategy is to invest more intensively in understanding what “impacting policy” means and entails from a tactical perspective. While I will focus on the federal case, this kind of meta-research endeavor can take place at any level of policy: Extra-local (our own universities, schools, community centers), local (our towns and cities), state, Tribal, or territorial, in addition to federal. Pursuing such an endeavor can provide helpful insights regarding the conditions that might make engagement in policy viable for non-government social scientists.
For example: knowing about the intensive capacity constraints of federal agency employees (14) has informed my argument that we, the academics, should act as first movers. We can assess, proactively, where our research fits into the patchwork of government endeavors. We can identify where, and with whom, it is possible to share broad agenda-setting insights and more specific research findings; we also can consider developing new research projects with the specific goal of helping to refine or address policy-related topics that government partners have made clear are their priorities. We then can connect with federal agency policymakers and their staff directly to contribute what we know, to translate our knowledge into recommended actions, and to describe why such actions may improve conditions for people across the country in their daily lives.
Concrete Actions to Jumpstart Coordination
While these tasks may seem both enormous and amorphous, there in fact are action steps that we can take right now, regardless of our institutional affiliations or personal connections, because of the Evidence Act. Here is one very concrete example of policy impact in federal agencies, though there are likely equivalents at other levels of government.
During the spring of 2022, federal agencies made public their Learning Agendas (15), the documents required by the Evidence Act to identify and share their research priorities. These documents will remain public as they are updated over time. So, as a first action step, we can read these research agendas using the Learning Agenda dashboard (16), found under the “Explore” tab on the primary government website for issues related to evidence-informed policy, evaluation.gov.
Second, we can assess whether a) we have broad insights or specific research findings that are relevant to these and related questions and/or b) we want to perform new research that is applicable or related to these questions. Do we envision benefits to aligning our research agendas with the pressing questions that federal agencies say they want answered? Or to helping government colleagues formulate better questions based on the knowledge we have gathered and generated?
Third, we can email [email protected], which government colleagues in OMB’s Evidence Team check every day, with five or six sentences, highlighting the research that we have or the research that we want to do in support of the federal agency. For context, though government email addresses like this may seem generic, they are one of the primary tools that federal offices, departments, and agencies are able to use—legally—to connect with members of the public (17, 18). So, for example, the purpose of [email protected] is to capture thoughts, ideas, and questions from members of the public that subsequently can be routed to appropriate federal agency research leaders (despite the fact that there is, surprisingly to my mind, almost no email traffic). Scholars with expertise in the social and behavioral sciences have a specialized mailbox available to us, [email protected], which was established alongside the reconstitution of a government-wide Social and Behavioral Sciences Subcommittee (19) of the National Science and Technology Council.
Fourth, we can and should be persistent, sending reminder emails as necessary. Often, it takes time to answer messages like this among people in government not because of lack of desire but because of lack of capacity; these are people who often care deeply about the things we care about.
To make this action sequence even more concrete, I will use myself as a case study.
Given that much of my scholarship interrogates race- and class-based inequalities in higher education and recommends solutions for addressing them, I first will use the Learning Agenda Dashboard to filter for “Education.” Here, I see that the Department of Education is asking: “What resources, supports, and services do students, including those at the greatest disadvantage, receive to support their successful completion of a postsecondary credential? Additionally, which policies, programs, services, and practices are effective in achieving that goal?”
Do I have ideas or research to support this inquiry? Yes.
So, I next would formulate an email to send to [email protected]. I might write something like this:
Hello, I am a scholar of inequality who focuses on lessening racial and socioeconomic gaps in bachelor’s degree completion in the United States. I see from their Learning Agenda that the Department of Education would like to focus on this issue. A recent article that I’ve written, published in the American Journal of Sociology (20), shows that providing additional financial resources to underrepresented students is not, by itself, sufficient to increase their likelihood of graduation. Instead, financial resources need to be combined with evidence-based, high-impact support systems, such as regular, tailored advising and specific guidance on major declaration. I would be pleased to brief members of the Department of Education on these findings and to support the Learning Agenda in any way that could be useful. Could you please connect me to the right person or people at the Department of Education?
Hearing nothing for a week, I would re-send this message with a kind reminder note.
Additional Coordination Solutions
Beyond emails, there are other concrete options for academics to pursue to help solve the coordination problem. For example, there are new, pilot initiatives coming directly from federal agencies to recruit academics into the fold of research-based problem-solving and policymaking. One, the Analytics for Equity Initiative (21), led by the National Science Foundation (NSF), is awarding contracts to research teams that deliver proposals for answering specific, policy-forward questions regarding equity that are priorities for the pilot agencies. The nine awardees were announced in December 2023 and included projects like “Job Satisfaction, Stress, and Burnout: Impact of Postdoctoral Experience and the Moderating Effect of Institutional- and Individual-Level Factors” (Ganta and Ynalvez, Texas A&M International University) (22) and “Building Health Equity in the Navajo Nation through Integrating Indigenous Knowledge, Community-Generated Data, and Federal Statistics (BEINGs Project)” (Wang, New Mexico Tech, and Illafe, Navajo Technical University) (23). This effort will mark the first time that academic teams are working directly with federal agencies to support equity-forward Learning Agenda questions.
In addition to efforts like Analytics for Equity, some federal agencies also have new and longstanding initiatives that incorporate academics, like the Department of Labor’s Summer Fellowship Program (24), rotator programs at NSF (25), or fellowships through the General Services Administration’s Office of Evaluation Sciences (26). Various connector organizations, such as the Day One Project (27) and the Partnership for Public Service (28), often sponsor researchers to take tours of service inside the federal government through the Intergovernmental Personnel Act (29). All of these programs are open and available for our participation.
As another action, we can pursue an idea discussed often in the growing body of literature on evidence use: Direct relationships are key for mobilizing research to inform policy (30–32). Following this thread, it is also the case that think tanks and other, similar organizations (33) are among the most effective nongovernmental bodies contributing research-based ideas to federal government decision-making efforts (34). By investing concerted time and energy in building relationships with decision-makers and their teams, and by sharing our research directly with translational organizations like think tanks, we make it far more likely that the research will find its way into policy deliberations.
Professional associations, such as the American Sociological Association, the Population Association of America, the American Educational Research Association, and the Consortium of Social Science Associations, among others, also might serve as important bridges in this work. They are positioned to mobilize the members of academic disciplines in service of federal government policy priorities and to make relationship-building between academics and policymakers more tractable. While some of this work is ongoing, professional associations can do more by organizing outreach events linking academics and federal agency staff, arranging briefings for Congressional staffers, or sharing key research findings with related federal agencies in pithy policy papers.
Bringing Higher Education Institutions along with Us
In light of the coordination problem, a question emerges: What changes are required in US colleges and universities to support faculty in helping to solve that problem?
Once again, scholars have a role to play in making advances against often-strong headwinds, providing our ideas directly. For example, what if time spent building relationships with think tanks or translating journal articles into legible reports for policymakers could count not as “extra” but as central to evaluation metrics for faculty? What if our graduate programs, across all manner of academic disciplines, incorporated training on the ethics of engaging in policy debates as scholars, as well as primers on the structure of state and federal government? What if we compelled our college and university leaders to mobilize higher education associations, places like the American Council on Education, the Association of American Universities, and the Association of Public and Land-grant Universities, to use their Washington, DC connections to advocate not only for our institutions but also for stronger links between the academic research community and federal policy leaders? Or—though this idea would require navigating substantial political and other obstacles—what if well-resourced universities joined together to form a common fund to support scholars committed to answering government policy questions, situated across all different types of colleges and universities?
These are just a few of many ideas; and I am certainly not the first social scientist to think about this issue (35, 36). Consider, for example, places like Stanford and Brown Universities, among others, which have developed various “policy lab” models with the explicit purpose of cultivating and activating holistic social science research focused on social problem solving (37, 38). Or Pennsylvania State University, whose Evidence-to-Impact Collaborative is training young scholars and regularly engaging with state-level and Congressional officials to connect research to policy discussions, building important relationships in the process (39). Or the various universities working with the William T. Grant Foundation through the Foundation’s “Institutional Challenge Grants” program, which connects universities with public agencies and nonprofit organizations in service of reducing youth inequality (40). These are great exemplars.
However, there is more to do, building on the fundamental idea that scholars can serve as creative problem-solvers for the coordination problem. Academics can and should advocate for our colleges and universities to provide the incentive structures to make it possible.
Academics Can’t Do It Alone
The coordination problem has two sides. Scholars and our colleges and universities represent one, but governments (federal, state, local, Tribal, and territorial) represent the other.
At the level of federal agencies, despite progress made since the Evidence Act, and even though capacity challenges can prove stultifying, the US government can do more. It can actively welcome research insights, and the researchers who produce them, into its often-closed system, through more routinized briefings and convenings, through sustained support for comprehensive evidence synthesis and translation across federal agencies, and through large-scale commitments, just like those made over the past year (41), to opening public access to research findings. It can expand on and develop more programs like Analytics for Equity, creating formal structures to knit together the academic and government sectors. It can double down on commitments to protecting the academic freedom and personal safety of researchers who pursue politically fraught topics, regardless of their area of academic inquiry. It can communicate more clearly and adamantly—through channels that academics frequent, such as professional associations—about what the research needs are and what extramural researchers can do to help. In addition, it certainly can continue to press for more funding for evidence-based efforts on the Hill. The list is longer for sure, but efforts such as these would prove a strong start.
To Conclude
Given that a wide variety of concrete actions are available to us, as academic social scientists, to impact federal policy, the reasons not to mobilize our skills, knowledge, and expertise in service of this goal are few. We are, at heart, a problem-solving bunch. It is well worth our effort to harness that instinct in service of the public good.
Data, Materials, and Software Availability
There are no data underlying this work.
Acknowledgments
Author contributions
C.C.E. wrote the paper.
Competing interests
The author declares no competing interest.
References
1
K. Oliver, P. Cairney, The dos and don’ts of influencing policy: A systematic review of advice to academics. Palgrave Commun. 5, 21 (2019).
2
J. R. Biden, Memorandum on restoring trust in government through scientific integrity and evidence-based policymaking. https://www.whitehouse.gov/briefing-room/presidential-actions/2021/01/27/memorandum-on-restoring-trust-in-government-through-scientific-integrity-and-evidence-based-policymaking/ (2021). Accessed 22 February 2024.
3
H.R. 4174, 115th Congress (2017–2018), Foundations for Evidence-Based Policymaking Act of 2018. https://www.congress.gov/bill/115th-congress/house-bill/4174/text. 14 January 2019.
4
U.S. Performance.gov Team, A commitment to results: Federal agency strategic plans now available. https://www.performance.gov/blog/agency-strategic-plans-available/ (2022). Accessed 22 February 2024.
5
C. Ciocca Eller, “The power of evidence to drive America’s progress: A decade of results and potential for the future” (Results for America, Washington, D.C., 2024). https://results4america.org/wp-content/uploads/2024/02/The-Power-of-Evidence-to-Drive-Americas-Progress-Results-for-America.pdf. Accessed 22 February 2024.
6
American Academy of Arts and Sciences, Number of faculty members in humanities and other fields. https://www.amacad.org/humanities-indicators/workforce/number-faculty-members-humanities-and-other-fields#31692 (2019). Accessed 22 February 2024.
7
M. C. Evans, C. Cvitanovic, An introduction to achieving policy impact for early career researchers. Palgrave Commun. 4, 88 (2018).
8
L. M. Vaughn, J. R. Jones, E. Booth, J. G. Burke, Concept mapping methodology and community-engaged research: A perfect pairing. Eval. Program Plann. 60, 229–237 (2017).
9
N. Wallerstein et al., Engage for equity: A long-term study of community-based participatory research and community-engaged research practices and outcomes. Health Educ. Behav. 47, 380–390 (2020).
10
G. Wong-Parodi, Community-engaged research is stronger and more impactful. Nat. Hum. Behav. 6, 1601–1602 (2022).
11
J. R. Biden, Executive order on advancing racial equity and support for underserved communities through the federal government. https://www.whitehouse.gov/briefing-room/presidential-actions/2021/01/20/executive-order-advancing-racial-equity-and-support-for-underserved-communities-through-the-federal-government/ (2021). Accessed 22 February 2024.
12
S. L. Myers, S. Frenkel, G.O.P. Targets researchers who study disinformation ahead of the 2024 election, New York Times, 20 June 2023, Section A, p. 1. https://www.nytimes.com/2023/06/19/technology/gop-disinformation-researchers-2024-election.html.
13
Information Integrity Research & Development Interagency Working Group of the National Science and Technology Council, Roadmap for researchers on priorities related to information integrity research and development. https://www.whitehouse.gov/wp-content/uploads/2022/12/Roadmap-Information-Integrity-RD-2022.pdf (2022). Accessed 22 February 2024.
14
T. Temin, Is federal employee burnout a thing?, Federal News Network, 28 April 2022. https://federalnewsnetwork.com/federal-report/2022/04/is-federal-employee-burnout-a-thing/. Accessed 22 February 2024.
15
D. S. Nightingale, K. Fudge, W. Schupmann, Evidence Toolkit: Learning Agendas (The Urban Institute, Washington, DC, 2018).
16
U.S. Office of Management and Budget Evidence Team, New tool to explore agency learning agenda questions!. https://www.evaluation.gov/2022-11-10-learning-agenda-questions/ (2022). Accessed 22 February 2024.
17
Digital.gov, A guide to the Paperwork Reduction Act. https://pra.digital.gov/. Accessed 22 February 2024.
18
U.S. General Services Administration, The federal advisory committee act (FACA) brochure. https://www.gsa.gov/policy-regulations/policy/federal-advisory-committee-management/advice-and-guidance/the-federal-advisory-committee-act-faca-brochure. Accessed 22 February 2024.
19
National Science and Technology Council, Policy development in the social and behavioral sciences subcommittee. https://www.whitehouse.gov/wp-content/uploads/2022/09/09-2022-Policy-Development-in-the-Social-and-Behavioral-Sciences-Subcommittee.pdf (2022). Accessed 22 February 2024.
20
C. Ciocca Eller, What makes a quality college? Re-examining the equalizing potential of higher education in the United States. Am. J. Sociol. 129, 637–714 (2023).
21
National Science Foundation, The analytics for equity initiative. https://beta.nsf.gov/od/oia/eac/analytics-equity-initiative. Accessed 22 February 2024.
22
D. Ganta, M. Ynalvez, Job satisfaction, stress, and burnout: Impact of postdoctoral experience and the moderating effect of institutional- and individual-level factors. https://nsf-gov-resources.nsf.gov/2023-11/AE-T1.pdf?VersionId=Hbg6ubTeI.lTkZbpsXcOaL1PyqeC_tbo (2023). Accessed 22 February 2024.
23
H. Wang, M. Illafe, Building health equity in the navajo nation through integrating indigenous knowledge, community-generated data, and federal statistics (BEINGs project). https://nsf-gov-resources.nsf.gov/2023-11/AE-T4-I.pdf?VersionId=jGclyOM7gioV1.N77RTmkZDShBIwE5B3 (2023). Accessed 22 February 2024.
24
U.S. Department of Labor, Office of the Assistant Secretary for Policy, CEO summer fellowship program. https://www.dol.gov/agencies/oasp/evaluation/CEO-summer-fellowship-program. Accessed 22 February 2024.
25
National Science Foundation, Rotator programs. https://beta.nsf.gov/careers/rotator-programs#:~:text=Take%20advantage%20of%20a%20rare,temporary%20program%20didirecto%20%2D%20called%20rotators. Accessed 22 February 2024.
26
U.S. General Services Administration, Office of evaluation sciences. https://oes.gsa.gov/. Accessed 22 February 2024.
27
The Day One Project. https://fas.org/day-one-project/. Accessed 22 February 2024.
28
The Partnership for Public Service. https://ourpublicservice.org/. Accessed 22 February 2024.
29
U.S. Office of Personnel Management, Intergovernment personnel act. https://www.opm.gov/policy-data-oversight/hiring-information/intergovernment-personnel-act/. Accessed 22 February 2024.
30
A. Gamoran, Evidence-based policy in the real world: A cautionary view. ANNALS Am. Acad. Polit. Soc. Sci. 678, 180–191 (2018).
31
J. Parkhurst, The Politics of Evidence: From Evidence-Based Policy to the Good Governance of Evidence (Taylor & Francis, New York, 2017).
32
V. Tseng, C. Coburn, “Using evidence in the US” in What Works Now? Evidence-Informed Policy and Practice, A. Boaz, H. Davies, A. Fraser, S. Nutley, Eds. (Policy Press, Bristol, 2019), pp. 351–368.
33
D. E. Abelson, Do Think Tanks Matter? Assessing the Impact of Public Policy Institutes (McGill-Queen’s University Press, Montreal & Kingston, ed. 3, 2018).
34
J. G. McGann, The Fifth Estate: Think Tanks, Public Policy, and Governance (Brookings Institution Press, Washington, DC, 2019).
35
T. A. DiPrete, B. N. Fox-Williams, The relevance of inequality research in sociology for inequality reduction. Socius 7, 1–30 (2021).
36
M. Jackson, How is it to be done? Building a social science of radical reform. Socius 8, 1–6 (2021).
37
Stanford Impact Labs. https://impact.stanford.edu/. Accessed 22 February 2024.
38
The Brown Policy Lab. https://thepolicylab.brown.edu/. Accessed 22 February 2024.
39
The Penn State Evidence-to-Impact Collaborative. https://evidence2impact.psu.edu. Accessed 22 February 2024.
40
The William T. Grant Foundation, Institutional challenge grants. https://wtgrantfoundation.org/grants/institutional-challenge-grant (2023). Accessed 22 February 2024.
41
A. R. Nelson, Ensuring Free, Immediate, and Equitable Access to Federally Funded Research (White House Office of Science and Technology Policy, 2022).
Information & Authors
Information
Published in
Classifications
Copyright
Copyright © 2024 the Author(s). Published by PNAS. This article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).
Submission history
Published online: March 8, 2024
Published in issue: March 26, 2024
Notes
This article is a PNAS Direct Submission.
*
The cited figure of “more than 150,000” incorporates the categories “social sciences,” “education,” and “business” from the American Academy of Arts and Sciences’ calculations, which draw on data produced by the Bureau of Labor Statistics.