From Macro to Micro: Deconstructing Stakeholder Engagement for Decision-Making

By Christopher Murrell 

Stakeholder engagement improves the quality and responsiveness of government processes and resources, including activities like formulating and fulfilling federal evaluation plans and learning agendas. The White House Office of Management and Budget (OMB) emphasizes the role of engagement with key stakeholders in evidence-building activities (see Circular A-11, M-19-23, M-21-27), yet engagement is still limited in practice for many agencies. 

During a joint virtual event hosted by the Data Foundation and Results for America, practitioners shared concrete examples highlighting the importance of effective engagement, along with resources available to improve engagement moving forward.

What does it mean to engage stakeholders for evidence-building activities?

The Data Foundation’s Stakeholder Engagement Toolkit describes engagement across various contexts, taking into consideration stakeholders’ relevance to programs, policies, and activities. The toolkit serves as a practical guide to understanding and implementing stakeholder engagement.

In the Data Foundation’s toolkit, stakeholders are first identified along two spectrums – specialist-generalist and internal-external – which recognize that expertise or specialization can include lived experience.

To help consider types of engagement, the Data Foundation’s Stakeholder Engagement Toolkit identifies five categories:

  • Informing: Stakeholders are provided information without being asked for comment in return.

  • Consulting: Similar to informing, but stakeholders have a mechanism to provide feedback or commentary.

  • Involving: This level of engagement requires two-way communication and that the voice of the stakeholders appears in the final findings.

  • Collaborating: Collaboration requires that stakeholders be part of the engagement process itself, whether by helping design the project or by taking a more active role in the decision-making that occurs.

  • Empowering: Empowerment means stakeholders have control over the final decisions of the project.

What is an example of effective community engagement in evaluation that highlights the benefits?

During the virtual event, Dr. Aleta Meyer from the US Department of Health and Human Services’ Administration for Children and Families (ACF) provided an overview of the Multisite Implementation Evaluation of the Tribal Home Visiting Program. The evaluation plan included community engagement and was co-created by the evaluators and the community. Creating the plan with stakeholders led to modifications of the evaluation, including a model that better incorporated the family as part of the community in measuring program impacts, an important part of the theory of change. Stakeholders helped shape research questions, data collection plans, and ethical plans, and participated in a “data party” to provide direct feedback on the evaluation results.

Drawing on this example, Dr. Meyer emphasized the need for community engagement to be built into program and evaluation design from the beginning, with appropriate resources available for that work. She described how, when done intentionally and meaningfully, engagement can:

  • increase the rigor of evaluation activities

  • improve the understanding, acceptance, and use of findings

  • promote equity

  • support communities in expressing their priorities and needs

  • shape how evaluation is conducted and how programs are improved

What is an example of effective engagement in learning agenda formulation that highlights key benefits?

In 2023, the U.S. Department of the Treasury’s Office of Recovery Programs (ORP) created a multi-year learning agenda. Unlike the Treasury Department as a whole, ORP was not required by law (the Evidence Act) to formulate a learning agenda – a strategic plan for research and evidence. Laura McDaniels, Policy Outreach Lead for the State and Local Fiscal Recovery Fund at the Treasury Department, described at the virtual event how ORP’s agenda was nested within the two relevant strategic objectives from the Treasury Department’s learning agenda:

  • Strategic Objective 1.3: Economically Resilient Communities:

    • To what extent are American Rescue Plan (ARP) programs being implemented equitably?

    • What are the impacts and/or outcomes of ARP programs on households, businesses, and governments?

  • Strategic Objective 1.4: Resilient Housing Market:

    • What strategies deployed in the recovery from COVID-19 best prevented evictions and foreclosures?

    • How can we track evictions nationwide?

In developing the learning agenda, ORP engaged stakeholders by sharing drafts for feedback and conducting dialogues specifically to gather input. ORP staff held virtual and in-person sessions with over a hundred organizations and individuals, and ORP incorporated the feedback into the final document, which led to a better-structured learning agenda and refinement of the initial research questions. Collectively, according to ORP, the engagement helped ensure that research priorities were informed by, and considered actionable by, interested parties outside the agency – while remaining aligned with the strategic priorities of the Treasury Department.

How can engagement be successfully integrated in program design?

While laws like the Evidence Act can establish expectations and requirements for engagement, and may initiate the stakeholder engagement process, embedding engagement in processes, procedures, and program culture is the strategy that realizes long-term benefits. With this strategy, program and project managers achieve success and uptake, along with the improved perception and trust that can come with meaningfully incorporating stakeholder input in government activities. Senior leaders in agencies can facilitate these activities and effective engagement by encouraging program managers to identify stakeholders and consider the range of evidence that can benefit from participatory processes.

CHRISTOPHER MURRELL is a manager of evidence and evaluation capacity at the Data Foundation. 

View the full Data Foundation-Results for America Event from December 2023: