Why and How Family Foundations Can Advance Equity Through Participatory Measurement
The recent investment by MacKenzie Scott and Dan Jewett in the funder coalitions Fund for Shared Insight and Equitable Evaluation Initiative—both dedicated to increasing participant feedback in evaluation—is the latest vote of support for gathering evidence of program impact in ways that engage program participants as experts in assessing their own experience.
Participatory evaluation (PE), which uses tools such as feedback surveys, focus groups, testimonials, diaries, and participant councils, gives program participants, staff, and other stakeholders ownership in designing and managing the evaluation process itself. PE has radically shifted how evaluators gauge social programs’ effects on participants and on their sense of power. Unlike traditional experimental methods, which treat participants as the subjects of a study, PE always asks, “What answers are we seeking? Why? By whom? For whom?”
This year, Pace Center for Girls, a Florida-based multiservice nonprofit serving middle- and high-school-age girls with histories of trauma, and MilwayPLUS, a social impact advisory firm, undertook research to better understand participatory evaluation methods. We sought to uncover how these methods contribute to 1) stronger outcomes and 2) organizational changes that foster equity and inclusion. The research surfaced causal links between participatory methods and participants’ life changes, as well as positive ripple effects that reshaped nonprofits’ internal processes and empowered participants to influence systems reform.
Through a survey, Pace Center for Girls and MilwayPLUS collected information on organizational attributes and internal processes, such as HR, strategy development, and IT, before and after the adoption of participatory methods, and gathered stories of impact and challenges via interviews and focus groups with 15 nonprofits embracing participatory measurement. The findings surfaced connections between participatory methods and program outcomes, and made a case for blending empirical measures with participant feedback.
For example, participant feedback helped Pace Center for Girls find a strong link between teacher retention and girls both feeling more respected and staying longer in the program. This is significant because longer tenure in the program has a statistically demonstrated positive influence on girls’ grades and graduation rates.
Another nonprofit surveyed was the Center for Employment Opportunities, which helps formerly incarcerated people get jobs and resume their lives. Its staff told us that participant feedback revealed which components of a program worked together to help participants succeed. “We learned it doesn’t help to have a mentor without a core program, but we wouldn’t have learned that through purely quantitative outcome evaluation,” says Ahmed Whitt, director of learning and evaluation.
Our survey found that participatory methods produced positive ripples across the organizations that employed them. Sixty percent of respondents said the use of participatory approaches in evaluation had led them to hire differently: specifically, to prioritize candidates who had experienced the issues their programs aimed to solve, were good listeners, respected the community, and were savvy with participatory tools. All respondents reported increasing communication with program participants, often by text message. These shifts in capacity bolstered organizational adaptability and spurred the development of workplace tools that help staff take stock, and implement change, quickly.
Participatory methods also led to advocacy wins that changed systems and society. Think of Us, a service-delivery and advocacy group for foster youth, hires foster care alumni as members of its research teams. When Think of Us recently reported direct testimony from 78 foster youth about abusive experiences in group homes, the insights prompted child welfare leaders in several states to support policy changes to eliminate institutional placements. Pace used direct input from its girls to identify local, state, and federal policies that needed reform, and the community members who would have to be involved to implement those reforms. One of Pace’s successful community campaigns lobbied for misdemeanor and civil-citation legislation that lets law enforcement sanction girls for petty offenses without arresting them. Florida has also funded other prevention measures to keep girls out of the juvenile justice system. As a result, over the past decade, the number of girls arrested annually in Florida has dropped by about 65 percent.
None of these insights obviates the need for experimental tests to prove specific effects (such as how well an activity boosted participants’ health). But the organizations in our study found that blending participant-informed and empirical measures bolstered their ongoing insight and adaptability, qualities that were tested by the tangle of health, racial, and economic crises in 2020.
For now, as funders and their grantees work to better engage participants in assessing the effectiveness of their interventions, they should ask to what extent they can affirm the following:
- Do we have a process for connecting with those most affected by our interventions and bringing their voices into our approach to evaluating impact?
- Are we disaggregating the data we gather by race, age, and any other relevant criteria to understand the disparate experiences of groups of participants within the overall results?
- Are we approaching our evidence building through a racial equity and inclusion lens by identifying and testing questions with appropriate focus groups or panels of participants?
- Are we calibrating outcomes to ensure they are equitable and not determined by or predictable from innate factors (e.g., gender, disability, or race)?
- Are we using the data to inform programs and strategies that are themselves in the service of equity?
Those making intentional and ongoing progress toward answering all five questions in the affirmative are well on their way to building more equitable evidence.
Mary Marx is CEO of Pace Center for Girls, and Teddy Thompson is Pace’s chief advancement officer. This article summarizes and excerpts findings from Building Equitable Evidence of Social Impact by Marx; Lymari Benitez, senior director of program information and impact at Pace Center for Girls; Yessica Cancel, Pace’s chief operating officer; and Katie Smith Milway, principal of MilwayPLUS and a senior advisor at The Bridgespan Group. The authors thank Fund for Shared Insight, Project Evident, and Feedback Labs for their help curating focus groups.
The views and opinions expressed in individual blog posts are those of the author(s) and do not necessarily reflect the official policy or position of the National Center for Family Philanthropy.