VPP Staff Spotlight: Isaac Castillo, Director of Outcomes, Assessment and Learning

April 30, 2020

Isaac Castillo is VPP’s Director of Outcomes, Assessment and Learning, leading the organization’s work in measurement and evaluation. As we celebrate VPP’s 20th anniversary, we asked Isaac to reflect on how his career led him to VPP and how the organization has grown over time – and continues to evolve as we look ahead.

How did your upbringing shape you?

I grew up in a lower middle-class neighborhood in El Paso, TX, right on the border with Mexico. My grandparents were born in Mexico and immigrated to the United States; my parents and I were all born here. When I was young, there was a lot of travel back and forth across the border, so I felt both American and Mexican.

A big part of my life in high school was competitive debate and speech, which gave me the opportunity to travel a lot. In addition to building my critical thinking and communication skills, it exposed me to communities in different areas – high schools and colleges – and it made me want my own college experience to take me beyond the area where I grew up. I ended up choosing one of the furthest places from the deserts of El Paso: the snows of upstate New York at Syracuse University. It was both a physical and a cultural shock. Beyond adjusting to the weather, I had to adapt to very different demographics, moving from a community where most people were minorities to one that was primarily white.

How did you get into the evaluation field? And working in the nonprofit sector?

After graduating from Syracuse, I went to the University of Rochester for a Master’s degree in public policy, where I took several qualitative and quantitative methods courses along with statistics classes. I had an economics and econometrics professor who really pushed me to think about pursuing a career in evaluation.

What drew you to working on youth-related issues?

Based on my experience coaching young debaters, I had an interest in working with young people in some way. When I finished my graduate degree, I received a job offer from a research and evaluation organization that wanted me to work on youth and young adult evaluation issues, and it seemed like a good fit. While there, I worked on a project evaluating the effectiveness of gang intervention programs and came across the Latin American Youth Center (LAYC). I was really interested in their mission and work. In a real coincidence, they happened to be searching for a Director of Learning and Evaluation at the time – it turned out to be the position that VPP had funded as part of its investment in LAYC (from 2003 to 2008).

What was your time at LAYC like?

I worked at LAYC for seven years. When I started, I was the only full-time in-house evaluation staffer there – one of the few full-time internal evaluators at any nonprofit in the country. Over my time there, I built up a team of five full-time staff. I felt a strong sense of accomplishment in having developed the organization’s internal evaluation infrastructure, and after seven years I was confident the team could sustain the work without me. I was ready for a new challenge.

What was the next step in your professional journey?

While I was at LAYC, we shared a lot of knowledge and best practices with other nonprofits that wanted to learn from our success. When I was ready to take my next career step, I took a position at Child Trends, where I helped nonprofits across the country strengthen their measurement and evaluation capacity. While at Child Trends, I served on a few federal advisory groups, including one for the U.S. Department of Education’s Promise Neighborhood Program during the Obama Administration. At the time, a lot of people were thinking about how best to create change at the neighborhood level, but not many were thinking about how to measure and evaluate that work in terms of attribution and causation – how you determine whether a project’s actions are making a real difference for communities and their residents.

The Promise Neighborhood grants were awarded, and DC received one through the DC Promise Neighborhood Initiative (DCPNI). DCPNI’s executive director then recruited me to lead their evaluation work, and it felt like an exciting challenge for my career. Evaluating place-based and collective impact work was a new and developing field at the time, and it was really exciting to be a part of shaping it. During my time at DCPNI, I eventually became the Deputy Director, with oversight of the education-related programming and the evaluation department.

What was your journey to VPP?

In 2016, VPP reached out to me for advice on filling an open evaluation and learning position. As we discussed what the organization was looking for, it became clear to both of us that I might be a great fit for the role and that the position would be another exciting challenge for me. I’ve been here at VPP now for four years – but I’ve been connected to VPP for much longer, since my job at LAYC.

How does VPP approach measurement and evaluation as part of its mission?

At VPP, we think about evaluation in a few ways. First, we are trying to understand whether our investments in an organization or network made a difference and, when possible, determine how they made a difference. For many of our investments, we are trying to move the needle in very complex organizations and systems. Part of what I have to do is figure out the best way to measure effectiveness in a real world of limited and messy data and information. Because of that, our measurement and evaluation work operates at multiple levels. Sometimes the population we are evaluating is an entire school or a group of schools; sometimes it is the participants of an individual nonprofit organization. We use different approaches, tactics, and tools depending on the populations and outcomes we are measuring.

For example, the work we have been doing at Suitland High School to measure the effectiveness of our youthCONNECT network embedded in the school involves interrelated evaluation approaches. We are analyzing and merging three different sets of data: data from the school (grades, attendance, standardized test scores, etc.), data collected through voluntary surveys (students’ impressions of their school environment, their confidence in their own abilities, their academic self-efficacy, and behavioral questions covering sexual health, substance abuse, dating violence, feelings of depression, and suicidal ideation), and data provided by our nonprofit partners that serve students at Suitland.

What’s unique about the evaluation work at Suitland is our ability to connect the data from these three sources at the individual student level, allowing us to tell a complete story of the programming students receive as well as the resulting outcomes. For example, we may see that a student has a very low attendance rate and then see from their survey responses that they don’t feel safe in school – this gives us a much more nuanced picture of the needs and challenges of the young people that we and our nonprofit partners are trying to help. Very few groups have been able to access and use data from multiple academic and non-academic sources at the student level.
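To make that kind of linkage concrete, here is a minimal sketch in Python of how records from three sources might be joined at the student level. Everything in it is an assumption made for illustration – the file names, column names (student_id, attendance_rate, feels_safe), and the attendance threshold are hypothetical, not VPP’s actual systems or schema, which the interview does not describe.

```python
import pandas as pd

# A minimal sketch of student-level record linkage. All file names and
# column names are hypothetical, invented for illustration only.
school = pd.read_csv("school_records.csv")      # grades, attendance, test scores
survey = pd.read_csv("student_survey.csv")      # voluntary self-report survey
partners = pd.read_csv("partner_services.csv")  # services from nonprofit partners

# Join all three sources on a shared student identifier. Left joins keep
# every student in the school records, even those who skipped the
# voluntary survey or received no partner services.
linked = (
    school
    .merge(survey, on="student_id", how="left")
    .merge(partners, on="student_id", how="left")
)

# The kind of nuance linkage enables: students with low attendance
# who also report not feeling safe at school.
flagged = linked[(linked["attendance_rate"] < 0.80) & (linked["feels_safe"] == False)]
print(flagged[["student_id", "attendance_rate", "feels_safe"]])
```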

Having access to data collected from our nonprofit partners is a critical part of the evaluation work at Suitland High School. Our nonprofit partners track the services provided to students as well as non-academic outcomes that students achieve. They collect and share things we could never obtain through traditional school administrative data sources. For example, a partner may have helped a student get into a more stable housing situation – and then we can see what that student’s impressions of the school and their own behaviors are and match that against the data on their academic outcomes. This allows us to describe a more holistic and realistic picture of what is going on with a student. And that is really rare in the social sector evaluation field right now.

The evaluation of this work is a test case of what is possible if you have different kinds of data sets and you think strategically about how to use them. However, this type of effort takes a lot of work – we had to get everyone on board with sharing this level of information and reach agreement on everything from addressing privacy concerns to how the data would be analyzed and shared.

There is an increasing realization that you cannot improve academic outcomes for many students who face serious challenges without addressing their non-academic needs. Given what many students are experiencing (and will continue to experience) in a COVID-19 world, the need for non-academic supportive services will grow, and we will need to evolve our data and evaluation thinking to identify, address, and measure non-academic growth (or stagnation) among students. We hope that others will see that an evolved and holistic view of data and evaluation is possible, and that it might be one of the best ways to drive large-scale systems change for youth facing increasingly difficult challenges.

What has VPP contributed to the field of nonprofit measurement and evaluation?

During our first portfolio, VPP’s biggest contribution to the field was recognizing the need for nonprofit organizations to build their own internal evaluation capacity, and funding it. Funding that work twenty years ago was groundbreaking. VPP then realized that if you get a network of organizations with strong internal evaluation capacity working together, you can do evaluation work that is bigger and more meaningful than what any of those nonprofits could do alone. Most recently, we have been collecting, combining, and analyzing data from multiple sources to tell a more holistic story of program effectiveness and systems change. In many ways, VPP’s journey of supporting and developing the region’s evaluation capacity has been evolutionary – it’s getting to a very sophisticated place.

As we mark VPP’s 20th anniversary, what is most important for people to know about VPP and how VPP values and approaches evaluation and learning?

Twenty years ago, no one thought it was even possible for a nonprofit to have a sophisticated internal evaluation department. VPP’s work helped demonstrate that nonprofits, given enough support and time, can develop evaluation staff who are as skilled and qualified as external evaluators.

VPP has evolved a lot over the past twenty years, and we look forward to continuing to evolve and grow over the next twenty. In the evaluation space, I am excited to keep finding ways to increase our sophistication in measuring the effectiveness of collective impact and place-based work. It is incredibly complicated work, requiring a level of skill and sophistication that I think VPP will be able to lead the way on.

What are we doing right now that’s exciting as we look to the organization’s future?

VPP’s role has always been regional, and outcomes and data have always been part of its core philosophy. That’s why VPP was well positioned to release the Capital Kids Report earlier this year. VPP’s focus, stance, and credibility allowed the organization to take a truly regional lens and look at over a hundred outcomes for young people ages 0 to 24. That’s what makes the Capital Kids Report unique. We intend to keep doing sophisticated data collection to identify important trends and to mobilize the nonprofit and funding communities to get ahead of those trends and help young people as much as possible.