Now is Better Than Later

Democratizing Evaluation: Lessons Learned in the Youth Development Sector

Image from Chicago Wears Condoms Campaign (2015)

Introduction

We demand quality from our youth workers, educators, and healing practitioners. Organizationally, they are tasked with managing complicated and demanding situations. Personally, they are pressured to adapt to the varying emotional and social demands of their youth and community partners. This delicate balance is consistently challenged, and it has been put to the test as individuals and organizations, confronted with the daily realities of the COVID-19 pandemic, continue the critical work of engaging young people as leaders and problem solvers.

Done well, evaluation helps programs determine:

  • Which implementation approaches work best and which do not
  • Which areas need improvement, training, or adaptation
  • Which parts of a program to expand, modify, or cut

Part I: Challenges We Must Overcome

Evaluation: Members Only

Evaluation can be perceived as an exclusive club or a specialized field with esoteric language and methods that require years of academic training and credentials to understand. Evaluative processes can be intimidating and are generally associated with assessing job performance.

Practitioner & Evaluator: Out of Sync

Traditionally, the relationship between practitioners and evaluators is transactional at best. Restrictions on time and competing priorities within a program's culture limit the interaction and sharing of expertise. Many researchers and evaluators lack firsthand experience working directly with young people at the program level. At the same time, practitioners lack the methods and background in research and evaluation. Unfortunately, this way of doing things does not fully support ongoing program improvements that engage programs in organizing, prioritizing, and taking action based on relevant program data.

A Mandate Without Purpose

What follows is a familiar dance we have all experienced in the youth development sector. The designated evaluator is assigned and does all the heavy lifting. The evaluation tools, timeline, and guidelines are identified prior to the program launch. Weeks into the program, it is survey time. Prioritizing program implementation, the practitioners put the survey on their “to do list.” Not really knowing why the survey data is important, they de-prioritize the process. In this case the evaluation is only a box to check for compliance, a menial, annoying task, just one more item on a list of hundreds of things to do while implementing the program. The lack of understanding and enthusiasm for the survey rubs off on the youth participants. Given a few minutes, the participants fill out yet another survey, doubting that any changes will come from their feedback. Constrained by time, the team reviews the data only after the program cycle has ended, with no opportunity to improve the current plan.

Process Evaluation Requires Funding

Most commonly, funders require an outcome evaluation. Outcome evaluations assess the effectiveness of a program in producing change, and the results often come in the form of a report produced once the program cycle has ended. These expensive, glossy reports do a great job of showcasing outputs through action shots of youth participants, quotes about how the program changed lives, participant data (e.g., demographics, number of participants served, events produced), and in some cases a page dedicated to how the program has positively affected youth outcomes. Lacking are the core details of how the work was done. Historically, funders have used evaluation activities to track compliance with deliverables and to monitor expenditures (e.g., allowable vs. unallowable expenses). Unfortunately, this narrow focus over-emphasizes outcome evaluations at the expense of documenting the implementation process, and it does a disservice to non-profits that aim to expand or sustain their work in communities.

Part II: Chicago Wears Condoms

“Stay safe when you go downtown”

A fantastic marketing agency donated their services to the YHC and the campaign was launched. The YHC unveiled a dynamic and clever campaign on billboards, trains, and bus stops across Chicago. “Stay safe when you go downtown,” “Get in Get off safely,” “Protect your John Hancock,” and “It doesn’t take balls to wear a condom” were everywhere. Tens of thousands of condoms bearing the #ChicagoWearsCondoms logo were distributed across CPS and clinics. The campaign reached millions across the city, went through two iterations, and lasted nearly four years, all credit to the authentic voice and perspective of the YHC. Chicago noticed, the media noticed, and eventually the philanthropic world took note; the YHC received hundreds of thousands of dollars in funding over the next five years. Fast forward to May 2017: the Chicago Department of Public Health announced that Chicago’s teen pregnancy rate had hit an all-time low, and the Chicago Wears Condoms campaign was listed as one of the key programs that led to the public health milestone.

Final Version of the Ad Campaign for Chicago Wears Condoms

Beneath the Surface…

On the surface, Chicago Wears Condoms (CWC) was a huge success. It is a terrific case study of an engaging social marketing campaign designed by young people and the product of a robust public-private partnership between a non-profit, a government agency, and a large marketing agency. Unfortunately, few lessons learned from the funding, planning, or implementation of the campaign were ever documented. Lacking the funding for a process evaluation, CWC lives on only in artifacts and stakeholder anecdotes.

If we were asked to replicate the campaign in a new city, with new staff and a new group of youth, we could not share which inputs and practices worked best or provide other documented lessons learned to make the next campaign even better.

The Chicago Wears Condoms campaign is a common story. Youth programs leverage energy, sweat, and passion to implement critical initiatives with limited funding. They prioritize their funds to maximize the program itself, without documenting the key strategies or data-driven improvements that could help replicate, improve, sustain, or scale a successful youth-driven public health campaign.

Part III: Two Case Studies Where We Got It Right

321 Education Foundation: The Big Impacts of Practitioner-Led Evaluation

The 321 Education Foundation is a not-for-profit organization in India. Its aim is to improve primary students' learning in reading and math through high teacher performance. The organization works with 2,000 teachers and 20,000 students across three cities in India.

Primary School Teacher with Her 2nd Grade Classroom in Hyderabad, India

Truth, Racial Healing & Transformation Youth Institute: A Participatory Approach

The Youth Institute's participatory approach rested on three commitments:

  1. Foster trust in the data and information used, but ask for feedback early in the process on the approach to collecting data.
  2. Be conscious and aware, at every step in the process, of the implicit biases at play in the interpretation of any information or data.
  3. Keep all survey questions, tools, and systems for analyzing the data open to feedback and change if doing so improves the program and the experience of youth participants.

In practice, this meant:

  • Lots of dialogue about WHY & HOW the evaluation would be mutually beneficial for both practitioners and youth participants.
  • After each survey, within one week, results and next steps were communicated to youth participants.

Conclusion

Practitioners are required to be nimble, adapt, and make quick decisions based on intuition or informal feedback. They want to become better at their jobs and improve the wellbeing of the youth they work with. CEOs and senior managers must constantly understand both the effectiveness of the inputs and the true costs of the program. Funders want to see that their money is spent wisely and leads to large-scale social impact. We must place the same focus on the process of implementation and learning as we have on impact and accountability. A concerted focus on process evaluation will help us achieve faster, more meaningful, and replicable results that lead to transformative impact. At every level we must fund, build, support, and maintain a culture of continuous improvement.

Stay Tuned…

For future posts on practical HOW TO strategies for implementing continuous improvement.

Praxis integrates theory and practice to engage, empower, and embolden communities to be agents in the creation of their individual and collective wellbeing.