Now is Better Than Later

Democratizing Evaluation: Lessons Learned in the Youth Development Sector

Praxis Institute
Feb 10, 2021 · 14 min read
Image from Chicago Wears Condoms Campaign (2015)

After every run-through there is a meeting. And it doesn’t matter how many months we’ve done it, how many times we’ve done it. There are notes every time, because there is always something we can improve upon.

— Beyoncé (Homecoming)

We are excited to compile years of lessons learned in the youth development sector about how we can democratize evaluation. We know not everyone has had the best experiences with evaluation, but it can be an incredible tool for continuous improvement and a way to increase equity and participation in your program.

This article covers: what we’ve done right; what we *didn’t* get so right; and what we could be doing better.

TWO BIG LEARNINGS:
1) If we are not asking questions in real time, we’re missing the chance to improve as we go. 2) We are not leaving enough records behind to replicate our successes along the way.

So we must: Ask. Evaluate. Fix. Improve… All in real time!

Introduction

We demand quality from our youth workers, educators, and healing practitioners. Organizationally, they are tasked with managing complicated and demanding situations. Personally, they are pressured to adapt to the varying emotional and social demands of their youth and community partners. This delicate balance is consistently challenged and has been put to the test as individuals and organizations, confronted with the daily realities of responding to the COVID-19 pandemic, continue the critical work of engaging young people as leaders and problem solvers.

To meet the evolving needs of youth and communities, organizations around the country are embracing dynamic strategies to differentiate supports, connect resources, track outcomes, and monitor progress. We frame this process as Continuous Improvement, a widely used term adopted across sectors to capture the proactive nature necessary to implement change.

Doing continuous improvement well requires a mindset shift, and being good at it doesn’t happen overnight. But most good things in life don’t happen instantly. Ongoing feedback loops help programs and products get better. This means prioritizing ongoing documentation of the design and implementation process through process evaluation strategies. Even more importantly, it requires a more inclusive and participatory approach that involves all stakeholders of a program in the evaluation process: evaluation design, data collection, and analysis. When you do it right, the process has a transformative impact on the quality of your work. And it’s lots of fun; it brings new focus and purpose to the work. It’s a beautiful thing to collectively commit to getting better. Through this process, we can better understand things like:

  • The engagement and satisfaction of youth participants
  • Which implementation approaches work best and which do not
  • Areas for improvement, training, or adaptation
  • Which parts of a program to expand, modify, or cut

Prioritizing this approach gives practitioners and youth participants ownership of the process and a way to gauge their engagement and satisfaction on an ongoing basis. Unfortunately, philanthropy, organizations, and practitioners rarely prioritize the funding and human resources (e.g. coaching) needed to fully actualize the benefits of a process evaluation. When structured properly, a process evaluation enables practitioners to make data-driven decisions and stay attuned to the diverse needs of program participants. At its core, this process is about accountability to key stakeholders, including youth, who shape the experience by analyzing and improving the quality of program components. This article attempts to demonstrate the importance of continuous improvement in three parts:

Part 1: Some of the defined challenges we must overcome

Part 2: Chicago Wears Condoms - What we could have done better

Part 3: Two case studies where we got it right

Part I: Challenges We Must Overcome

Evaluation: Members Only

Evaluation can be perceived as an exclusive club or a specialized field with esoteric language and methods that require years of academic training and credentials to understand. Evaluative processes can be intimidating and are generally associated with assessing job performance.

Whether the evaluation is done by an internal or external person, it is often not integrated into the program implementation process. If the evaluation is done internally, it is often the research or fundraising department that leads it. External evaluators are a great resource, but “you get what you pay for.” Unless a significant amount of your budget is dedicated to evaluation, external evaluators will not spend much time guiding practitioners through the evaluation process.

Practitioner & Evaluator: Out of Sync

Traditionally, the relationship between program staff and evaluators is transactional at best. Restrictions on time and priorities within a program’s culture limit the interaction and sharing of expertise. Many researchers and evaluators lack first-hand experience working directly with young people at the program level. At the same time, practitioners lack methods and background in research and evaluation. Unfortunately, this way of doing things does not fully support ongoing program improvements that engage programs in organizing, prioritizing, and taking action based on relevant program data.

Inevitably, the two sides are often on different wavelengths when discussing an evaluation plan. They are not designing the plan together. Often, evaluation components are presented to practitioners with little explanation of the purpose of the data being collected. The practitioners come to the table feeling slightly intimidated by evaluation terminology and don’t feel comfortable giving feedback. This lack of engagement and participation in the evaluation design cuts off insights from practitioners about the context of the program that could inform the overall evaluation plan.

A Mandate Without Purpose

What follows is a familiar dance we have all experienced in the youth development sector. The designated evaluator is assigned and does all the heavy lifting. The evaluation tools, timeline, and guidelines are identified prior to the program launch. Weeks into the program, it is survey time. Prioritizing program implementation, the practitioners put the survey on their to-do list. Not really knowing why the survey data is important, they de-prioritize the process. In this case the evaluation is only a box to check for compliance, a menial, annoying task: just one more thing on a list of hundreds of things to do while implementing the program. The lack of understanding and enthusiasm for the survey rubs off on the youth participants. Given a few minutes, the participants fill out yet another survey, doubting that any changes will come from their feedback. Constrained by time, the data is reviewed only after the program cycle ends, with no opportunity to improve the current plan.

The two sides don’t understand each other’s work, don’t communicate, and operate in different worlds with different end goals. Sadly, evaluation and program implementation become separate items rather than one informing the other. That dissonance makes it far harder to improve the quality of the program.

Process Evaluation Requires Funding

Most commonly, funders require an outcome evaluation. Outcome evaluations assess the effectiveness of a program in producing change. Often, they come in the form of a report produced once the program cycle has ended. These expensive, glossy reports do a great job of showcasing outputs: action shots of youth participants, quotes about how the program changed lives, participant data (e.g. demographics, number of participants served, events produced), and in some cases a page dedicated to how the program has positively impacted youth outcomes. What’s missing are the core details of how the work was done. Historically, funders have used evaluation activities to track compliance with deliverables and to monitor expenditures (e.g. allowable vs. unallowable expenses). Unfortunately, this narrow focus over-emphasizes outcome evaluations at the cost of documenting the implementation process, and it does a disservice to non-profits that aim to expand or sustain their work in communities.

The strategies associated with a well-structured process evaluation take time, training, consistency, and commitment from key stakeholders. At the end of the day, a quality process evaluation requires funding. If funders do not require, and provide technical support for, a process evaluation, there is little incentive for programs to prioritize data collection that measures progress in real time, engages stakeholders as key drivers of quality improvement, and captures key insights on how to alter programs to maximize impact.

Fund it!

An obvious solution is for philanthropy to fund and support implementation of a process evaluation. Process evaluations are beneficial to funders for many reasons.

When a program fails, funders usually want to know why it failed. If a program works, funders often want to understand why it works — particularly if there is interest in expanding the program or attempting to implement a similar program in other locations. A process evaluation provides that important information.

A structured process evaluation, especially when integrated across the stages of program implementation, requires an organizational culture shift. Research, tools, systems, and vendors must exist to help organizations understand and implement this approach to program improvement. Funding would have to be prioritized for training, coaching, planning, and implementation.

Part II: Chicago Wears Condoms

What gets measured gets done.

If you don’t measure results, you can’t tell success from failure.

If you don’t recognize failure, you can’t correct it.

If you can’t see success, you can’t learn from it.

— Evaluation Proverb

In late 2015 a group of high school students launched the Chicago Wears Condoms campaign to address sexually transmitted infections and teen pregnancy throughout the region. The campaign was developed by a Youth Health Council (YHC) in response to Chicago having one of the highest rates of gonorrhea and chlamydia in the United States, and a teen pregnancy rate 1.5 times that of the rest of the country.

“How might we create exciting and inspirational health marketing campaigns to educate our schools and communities?”

The YHC conducted an action research project to address the issues head-on. Over the course of a summer, youth learned about social marketing and were trained in health communications. The YHC created different ad campaigns covering various areas of public health. The Health Council successfully pitched their ideas to the Commissioner of Public Health and other key decision makers and received a $300,000 commitment from the Department of Public Health to run the advertisements across the city.

“Stay safe when you go downtown”

A fantastic marketing agency donated their services to the YHC and the campaign was launched. The YHC unveiled a dynamic and clever campaign on billboards, trains, and bus stops across Chicago. “Stay safe when you go downtown,” “Get in Get off safely,” “Protect your John Hancock,” and “It doesn’t take balls to wear a condom” were everywhere. Tens of thousands of condoms with the #ChicagoWearsCondoms logo were distributed across CPS and clinics. The campaign reached millions across the city, went through two iterations, and lasted nearly four years, all credit to the authentic voice and perspective of the YHC. Chicago noticed, the media noticed, and eventually the philanthropic world took note: the YHC received hundreds of thousands of dollars of funding over the next five years. Fast forward to May 2017, and the Chicago Department of Public Health announces that Chicago’s teen pregnancy rate has hit an all-time low. The Chicago Wears Condoms campaign is listed as one of the key programs that led to the public health milestone.

Final Version of the Ad Campaign for Chicago Wears Condoms

Beneath the Surface…

On the surface, Chicago Wears Condoms (CWC) was a huge success, yet very few lessons learned were documented from the experience. CWC is a terrific case study of an engaging social marketing campaign designed by young people and a product of a robust public-private partnership between a non-profit, a government agency, and a large marketing agency. Unfortunately, there is no documented record of lessons learned across the funding, planning, or implementation of the campaign. Lacking the funding for a process evaluation, CWC lives on only in artifacts and stakeholder anecdotes.

If we were asked to replicate the campaign in a new city, with new staff and a new group of youth, we couldn’t share which inputs and practices worked best or provide other documented lessons learned to make the next campaign even better.

The Chicago Wears Condoms campaign is a common story: a youth program leveraging energy, sweat, and passion to implement a critical initiative with limited funding, prioritizing funds to maximize the program, without documenting the key strategies or data-driven improvements that could help replicate, improve, sustain, or scale a successful youth-driven public health campaign.

Part III: Two Case Studies Where We Got It Right

321 Education Foundation: The Big Impacts of Practitioner-Led Evaluation

The 321 Education Foundation is a not-for-profit organization in India. Its aim is to improve primary students’ learning in reading and math through high teacher performance. The organization works with 2,000 teachers and 20,000 students across three cities in India.

Primary School Teacher with Her 2nd Grade Classroom in Hyderabad, India

In just one year, 321 saw a 40% increase in student mastery of reading and a 50% increase in math. This is practically unheard of, and three times India’s national average.

321 creates the curriculum, then provides individualized training and support to all its teachers, usually by observing and coaching teachers 2–3 times per month. The process evaluation was led by the program team and was integrated into how the program was implemented. During each support visit, the practitioners would collect key data to assess the progress of the students and teachers toward their goals. The practitioners would then take the data and work with the teacher to prioritize 1–2 small fixes they could make to improve. Through quantitative data, observation, and conversations with the teacher, frequent improvements were made in the classroom. 321 was also able to share best practices from other classrooms with teachers in the program. A great deal of intentionality went into having the teacher understand, value, and use the data. This took some time, and a lot of trust needed to be built, but in the end the teacher was a full participant in the process. All data collected was immediately communicated to the teachers in order to put decision-making power and control in the hands of the teacher.

Collaboration with Key Funders

The evaluation was fully funded, supported, and… required by the Godrej Foundation, a well-endowed Indian foundation. Every 2–3 months, 321 and Godrej would have a scheduled “low stakes” conversation about lessons learned from the data and improvements made in program implementation. 321 was honest about what was going well and what was challenging, and in turn Godrej better understood the work it was funding and the key inputs and activities needed to have an impact. In some cases this led to Godrej giving 321 more funding. Due to the process evaluation, 321 was able to maintain or improve program quality as well as demonstrate accountability to key stakeholders.

Truth, Racial Healing & Transformation Youth Institute: A Participatory Approach

“While evaluation has in the past worked, albeit unintentionally, to silence the voices of racial/ethnic and gender minorities, more recently evaluation approaches have been developed that elevate these voices.”

— Dr. Rodney Hopson

The Truth, Racial Healing & Transformation (TRHT) Youth Institute was designed as a virtual program and worksite to train youth, ages 18–24, as Racial Healing Practitioners and Community Organizers. The Youth Institute’s mission was to proliferate healing and equity within individuals, neighborhoods, and communities to change the prevailing race narrative, fuel transformation, erase the belief in racial hierarchy, and drive towards racial equity.

The program was a pilot, and the Directors of the Institute knew there would be lots of room for improvement as well as lessons learned that could improve future programs. In addition, this was the summer of COVID, and the Institute was implemented remotely. Some of the observational cues and understanding you get from an in-person program would not exist, making it even more important to get ongoing youth feedback.

We also knew the importance of developing clear evaluation guidelines in sync with the mission of the Youth Institute:

  1. Build buy-in to the evaluation process, and anticipation of the results, with all practitioners and youth participants.
  2. Foster trust in the data and information used, and ask for feedback early in the process on the approach to collecting data.
  3. Be very conscious and aware, at every step in the process, of the implicit biases at play in the interpretation of any information or data.
  4. Keep all survey questions, tools, and systems for analyzing the data open to feedback and change if a change would improve the program and the experience of youth participants.

Implementing the guidelines meant that:

  • The assessment system needed to be simple and easy to implement
  • There was lots of dialogue about WHY and HOW the evaluation would be mutually beneficial for both practitioners and youth participants
  • After each survey, results and next steps were communicated to youth participants within one week

Each week, through the evaluation, the Directors and Practitioners were able to understand the engagement and satisfaction of youth participants, the types of unique support they needed in a remote learning environment, their knowledge and understanding of the content material, staff engagement and satisfaction, and the confidence of youth to lead their own racial healing circles.

Halfway through the program, 40% of youth participants were comfortable leading a Healing Circle. By the end of the program that increased to 95%. In addition, the share of participants showing mastery of key TRHT concepts increased from 32% to 83%. What’s most important is that all Practitioners involved in implementation could tell you WHY this growth happened.

“This process truly served our staff and our youth fellows well. It provided us the data that we needed to make real-time programmatic decisions concerning curriculum, program policies and practices, based on the student feedback. This supported the fidelity of our implementation and created a strong sense of community and ownership with our youth fellows. Fellows knew that they were being heard and their voices mattered.”

— TRHT Co-Director

Each week the evaluators and program team analyzed the data together. Based on participant feedback and other information, the program team prioritized 1–2 things to improve. The things that were going well were continued, and some weeks new things were introduced to enhance the youth participant experience. The idea was to keep striving to make things better. Most importantly, each week facilitators dedicated time to present the results to youth participants and explain how they were using the data to make program improvements. In some cases improvements could not be made, but it was always explained why those changes could not occur. This led to both youth and staff valuing the data and using the feedback loop to improve.

Conclusion

Practitioners are required to be nimble, adapt, and make quick decisions based on intuition or informal feedback. They want to become better at their jobs and improve the wellbeing of the youth they work with. CEOs and senior managers must constantly understand both the effectiveness of the inputs and the true costs of the program. Funders want to see that their money is spent wisely and leads to large-scale social impact. We must place the same focus on the process of implementation and learning as we have on impact and accountability. A concerted focus on process evaluation will help us achieve faster, more meaningful, and replicable results that lead to transformative impact. At every level we must fund, build, support, and maintain a culture of continuous improvement.

Stay Tuned…

For future posts on practical how-to strategies for implementing continuous improvement.

Written by: Joshua Prudowsky

Special Thanks to: Fausto Lopez, Terrence Pruitt, B Loewe, Steven Rosado, Mia Henry, Michelle Morales & Jackie Rosa

Written by Praxis Institute

Praxis integrates theory and practice to engage, empower, and embolden communities to be agents in the creation of their individual and collective wellbeing.
