Topic: Evaluation

Nonprofit Data: The Case of YMCA of Metro Chicago

January 7, 2013
The YMCA of Metro Chicago is a large organization, with a $110 million budget and a physical infrastructure that includes 23 gyms, five camps and 14 community schools, all supported by nearly 600 full-time staff and thousands of part-time fitness instructors, afterschool staff, and other human services personnel. In addition to serving gym visitors, the nonprofit provides services for hundreds of Chicago public school students and as many as 700 seniors each week.

Nonprofit Data: The Case of the Alliance for Climate Education

January 7, 2013
When the Alliance for Climate Education wanted to use data to measure impact and engagement, among other uses, it turned to consultants for help—first Seattle-based Groundwire Consulting, and then Percolator Consulting, a Groundwire spinoff. The award-winning national nonprofit is dedicated to educating America's high school students about the science behind climate change in fun, engaging ways, and to inspiring them to do something about it. The organization is based in Oakland, Calif., with teams in New York, Los Angeles, Chicago, Washington, DC, Atlanta, New England, Colorado, Nevada, North Carolina and Wisconsin.

Nonprofit Data: The Case of York County Library System

January 7, 2013
Data is dynamic, not static, and so are organizations—which means that over time, the data they need to track changes as they change. Identifying those evolving data points takes effort and thoughtfulness, especially for an organization whose staff is already burdened with tracking data for funder-mandated reporting requirements.

Listen. Act. Learn. Repeat. How Feedback Can Drive Social Impact

January 7, 2013
There’s a dictum in software development, attributed to Microsoft, that any organization developing a product or service needs to eat its own dog food. What’s remarkable about this guidance is that it actually describes a pretty unnatural act: a human being eating something expressly not designed for their palate. But the recommendation is made because humans buy dog food, not dogs. So, what it's really saying is that it's worth taking extraordinary measures to close that feedback loop—because if the purchaser remains in the dark about the quality of what they are buying, better dog food will never be made. There is a related problem in philanthropy.

What is the Funder’s Role in Supporting Good Measurement?

October 19, 2012
"Data Scientist" is a newly defined job that requires being part data geek, analyst, communicator, visualizer, storyteller, and interpreter. This individual works with program experts to apply what is learned from the data. But the organization has also made data literacy an essential competency for everyone. The result is a shift from ad hoc data analysis and simple (though detailed) record-keeping to a systematic approach to improving programs and campaigns by using data.

Predictive Program Evaluation: Don’t React, Predict

October 2, 2012
Most program evaluation tends to be reactive: something went right, so we keep on doing it, or something went wrong, so we try to fix it. Reacting is better than not acting at all, but relying solely on the after-the-fact "what happened?"—often gathered ad hoc in spreadsheets and anecdotal stories—fails to take into account the hidden patterns that lie within the data, as well as the dynamic factors that accompany reality. By leveraging all of our available data and analyzing it, we can move from reaction to prediction, uncovering program inhibitors and drivers to predict how our programs will perform and what to do now to make them better. We call this predictive program evaluation, and it is fueled by predictive analytics.

Learn From Each Other, Not Just the Data

September 19, 2012

A simple rule of thumb is that you should spend more time learning from your data than gathering it. When you get your data, make sure you understand why things happen. This doesn't mean that you need to prove causality; most of the time a simple correlation will do. The important thing is that you think through the results, analyze them for insights and learning, and, most of all, look for failures.

Make sure you educate through examples. Show how adding a data-informed approach to your social media, other media, or programs can help you avoid ineffective campaigns and increase audience satisfaction.

What Are We Accomplishing and How Do We Know?

June 5, 2012
"What does your organization do?" That is a question most nonprofit folks can answer pretty easily. We have our elevator pitches ready to go. "What is your organization accomplishing?" Now that's a different question, one that's not as simple to answer for most of us. It's also one that is increasingly being asked by potential donors big and small.