

In the Community

Public Sociology Alumna Utilizes Methods Training

By Ariana Stone

Photo credit: Ariana Stone

Since the beginning of the economic downturn, nonprofit organizations have had to find new ways to stay afloat.

Leah Galvin, CAS/MA ’08, knows this first-hand. As a learning evaluations specialist at the Latin American Youth Center (LAYC) in Columbia Heights, Galvin is responsible for tracking the organization’s success. Most importantly, that means ensuring that LAYC is having a positive impact on the young people it works with and that participants are protected from potential harm. Evaluation also means explicitly showing—not just telling—how effectively the organization is reaching its goals.

“Given the financial constraints of today’s economic situation, funders are looking to put their money into programs that can prove they’re actually making a difference,” says the public sociology MA alumna. “In the past, many nonprofits have relied on good intentions—this idea that their work feels good, and therefore must be making a difference. Unless an organization is measuring its performance, its members can’t know the impact they’re having.”

LAYC is a multi-service organization with over 50 programs dedicated to helping low-income youth successfully transition into young adulthood by providing education, social services, workforce skills training, arts and media exposure, and advocacy. With so many programs, it takes four full-time staff members in the organization’s Learning and Evaluations Division to map LAYC’s progress.

All programs at LAYC collect demographic data at a minimum. From there, data and data collection methods differ based on the goals and objectives of each program. At the most basic level, programs collect attendance data. Other programs collect information such as results from rubrics and tests measuring participants’ knowledge acquisition. The Learning and Evaluations team then analyzes the data to determine whether the programs are reaching desired outcomes—a process Galvin compares to a dieter evaluating their own progress. “If your goal was to be healthier and lose 20 pounds, then you’d want to have ways to help you know if you’re reaching your goal,” she says. “Did you lose weight this week? Was it five pounds? Ten pounds? How do you know? What did you do? Did it happen immediately? All of this information lets the dieter know if they are making progress in reaching that goal of being healthier.”

After analyzing the data to determine a program’s impact, Galvin and the other members of the Evaluations Division present their findings in a variety of formats that resonate with program staff and funders alike—a skill she says she honed while earning her degree at AU. “What has really been helpful is knowing how to write for different kinds of audiences,” says Galvin. “Instead of writing scientifically and using statistical jargon that not everybody is familiar with, I write my reports and findings so a variety of people can understand them in a way that keeps them interested and engaged in the process. Ultimately, they will use the findings to help inform program and organization decisions.”