Tag: metrics

  • Temple, TX
  • 165 Staff
  • Administers 16 Head Start centers in four cities

In four neighboring cities, Central Texas 4C administers Head Start, the federal program to promote school readiness for children five and younger from low-income families. The population it serves includes many transient families, a result of a nearby military base, which means there’s a lot of client turnover. To help keep track of that changing population, the organization relies on data, said Executive Director Janell Frazier.

“I think as far as data tracking is concerned, we track everything,” she said, “all the family demographics, what the child’s needs are, his learning goals or objectives for the week, and whether he is making progress on them.”

The nonprofit is also required to track and report on services to meet federal funding requirements, Janell said, which ends up taking a good amount of staff time and energy.

“All our money is from the federal government, and it better be spent on what it’s supposed to be spent on or they take it away,” she said. “Taxpayer dollars are precious. We have to be able to prove that we’re using that money wisely. They want details about every kind of data we have, and we couldn’t do it without the data, or, we couldn’t prove to anybody that we could do it. And if we’re going to go after funding, we have to do that.”

To meet those requirements, as well as to improve the efficiency and effectiveness of the organization’s programs and services, staff track all sorts of information about each participating child. Data about each child’s progress is evaluated and fed back to teachers on an ongoing basis to help them better support that child’s development. That data is tracked for every student: in all, more than 700 a year.

“That’s a challenge,” Janell said. “We measure outcomes for children, that’s primary, and what I believe has led to our excellence. We’ve gotten international attention for our work with babies up to age three just based on the data we gather and analyze. We developed a database for outcomes on infants and toddlers. We have people who work in the homes with the families and the teachers and the special needs people, and it’s all tracked. Have we served the children well? How many have been diagnosed? Are they getting services? Do they need dental care? Can they eat? Are they healthy? We need all that data.”

Almost as important as the data itself is the process of collecting it, she said.

“Every year (Head Start) has a program informational report that captures untold data on our children, families and staff, and we have to send that information in every year,” she said. “You can’t get that information at the end of the year, it’s too big; you have to track it all along.”

None of the organization’s 165 staff members is exempt from participating in the data effort, she said; everyone is involved, regardless of title or role.

“Everybody gathers data,” she said. “Everybody. Even the cooks; we cook for all our centers and transport the meals, so how many ounces of food each child is getting, temperature logs, all that stuff, inventory control. We have vehicles we have to keep up with, and our drivers have to understand each vehicle’s driver’s log; it doesn’t matter who it is, they have to capture the data for that trip.

“Even my maintenance guys keep inventory control, which centers need fire marshal inspection within 30 days, is everything in the First Aid kit appropriate and not past the expiration date, are the medications in the fridges logged properly and signed off on, is there one person in charge of giving them.”

To track data, the organization uses a mixture of tools and methods that range from a proprietary Head Start database to Microsoft Excel. Janell said some data is even recorded the old-fashioned way – by hand. In her experience (she’s been with Head Start for 20 years, 13 of them at Central Texas 4C), low-tech methods aren’t necessarily a bad thing.

“I’m sorry, but my spreadsheet does a really fine job,” she said. “I love technology, but as far as making it easier? No sir. When you’re training people to use a computer, a new system, there’s also an untold amount of paperwork and data you’ve got to do at the end of the day.”

Overall, she said, technology has made it possible to track incredible amounts of data. Head Start in Washington oversees programs in 50 states and Puerto Rico, plus migrant-related programs, and that kind of oversight wouldn’t be manageable without technology, she said.

But in most cases it leads to information overload, she said. “We’re always getting hit with new frameworks to look at. How do you know you’re meeting your goals? Every time there’s a new goal to look at, there’s new data to collect for it, and you’ve got to retrain your people midstream, and there’s something else they have to learn.

“I know how data can be used to help the big picture,” she said. “As far as tracking it myself, I have to physically work my body into some weird shape just to track data. I think one of the problems with data is that it’s so rare to see valid data, reliable data. If the data says something is so, if it says a kid can do something, when I come in I better be able to see it myself, or that’s not valid data.”

Despite her staff’s best efforts, she said her organization is always at peak capacity for the types and amounts of data it can reasonably manage. Part of that is the burden it puts on staff time, and part of it is funding-related, she said.

“Until education is funded like a business, we’re not going to be able to do much more than we’re doing,” she said. “Funding is a big problem. It’s expensive to get data. We need the funding to get more data, but we need the data to get more funding. The real catch is, if we had the extra funding, I wouldn’t spend it on data.”

This case study is part of a 2012 research project conducted by NTEN with the help of Idealware. See the State of Nonprofit Data report for more information about how nonprofits are, and aren’t, making data part of their decision-making processes, and the key challenges that affect an organization’s ability to be more effectively “data-driven.”

  • New York/Kalamazoo, MI/Cambridge, UK
  • 25 Staff

The Arcus Foundation focuses its grantmaking on two areas: social justice and conservation. The Foundation has 25 staff members spread across offices in New York City; Kalamazoo, Michigan; and Cambridge, UK. In social justice, the foundation is particularly interested in global lesbian, gay, bisexual and transgender (LGBT) rights; in conservation, Arcus’ focus is on the protection of the great apes and their habitats.

Arcus uses data in a number of ways to advance both missions, according to Grants Management Associate Kerry Ashforth, and looks to other organizations in its field—including the grantees that rely on the foundation for funding—to help guide it through the process of establishing and refining its data practices.

Kerry said the Foundation has been participating “to an extraordinary degree” in efforts to improve the knowledge it gathers through data collection, and the process involves adapting to what works and tweaking what proves not to.

“There are a lot of questions,” he said. “We haven’t cracked a lot of these nuts yet—we’re still struggling with it. We’re in a transitional time where management of data is starting to be the core skill for doing grant making work effectively. There’s a big gap between skills needed and skills owned, and a bit of a learning curve, but I think we’re in good company—we might have a lot to learn from our nonprofit partners that are doing this well.”

One way Arcus uses data is to assess how well a grant under consideration might perform in terms of “moving the needle,” both in terms of the immediate grant and the bigger picture of the overall mission.

“We try to assess not only whether that project will succeed, but we are also interested in the health and capability of the organization itself and how it fits into a field of organizations,” Kerry said. “It’s been an incredibly iterative process.”

“Ideally what we want to have in place, and what we have in place, is an overarching strategy for each of our impact areas,” he said. “We use data differently on each side of the house, and each program is in a different stage of evolution. The great apes program is an environmental, science-driven focus that incorporates economic development and sustainable industry—things that are measurable in many ways.”

“The great apes program is currently at the midpoint of a five-year strategy arc. Along the way, the team has developed some different understandings about data and how to gather it,” he said. The conservation program has what he called an “incredibly evolved evaluative framework” that will lead to success conserving the great apes, with a longer-term strategy that includes benchmarks and indicators of success.

“There are many different types of measures for what we’re trying to achieve,” he said. “You can track protected range and the number of great apes, concrete things, but you can also look at social change and economic development. We’re gathering a lot of data on the organizations and the field itself. We’ve got indicators that reflect progress along different axes, and we place that information into the context of our strategic framework. Along the way, and at the end of this five-year stage, we’ll look again and see if we need to adapt what we are measuring.”

The social justice team is still developing a methodology for identifying the data it will need to collect and the indicators that must be tracked to make sure Arcus is meeting its mission.

“With social justice work, it can seem a little more amorphous, and can feel incredibly broad and difficult to narrow down,” Kerry said. “We’ve been making many different kinds of grants through that program, doing movement-building work, policy change work, capacity-building grants, etc. Our task is to figure out which bits of data are the bellwether data for each type of grant.”

“As part of the overall data strategy, the foundation involves grantees in the process to build consensus around our goals, and to confirm that we are moving in the right direction, and looking at the best indicators of forward movement,” he said. “We’re in our 12th year of operation. When we started out, we asked for more basic information. As we’ve continued to make grants and learn from them, and to build strategies and evaluate their success, new questions have emerged. It’s definitely changed the kinds of data we’re looking for.”

By involving grantees, Arcus not only improves the quality of data that’s tracked and analyzed—it also frees the grantees from unnecessarily onerous reporting and tracking requirements, which allows them to devote more time to advancing their own missions.

“There’s value in having streamlined processes and trying to not be overly burdensome,” Kerry said. “There’s value to all involved in prioritizing the data we require so that we’re not being needlessly demanding.”

There’s another good reason to involve grantees: in many cases, they’re the boots-on-the-ground people doing the actual work toward the overall mission.

“The partners we’re working with know an incredible amount about this work,” he said. “They’ve been able to not only answer our questions, but also to help us learn what we need to know. We’re learning to ask better, more productive questions. I’m hoping we’ll reach a point where we’re able to share this grant-related data in a way that helps the fields in which we work: great apes conservation, social justice, and the philanthropic sector.”

Arcus is doing a lot with data, but Kerry said there’s a lot of room for growth and improvement. “Grants management falls under organizational learning and evaluation at Arcus,” he said. “We are currently working toward building an organization-wide knowledge-management system, and we’re reviewing the landscape of tools and products out there. The Arcus Foundation is going through a process that I can see a lot of our grantees struggling with—we’re struggling with it ourselves.”

“Arcus is a small organization. We want to make sure our operating costs are manageable so we’re not eating into grant money. It takes a long time. We’ve been working on our data systems for quite a number of months. The process moves forward, but not at a rate we’d like, because so many questions come up, and answering them is important. Ultimately we want to achieve our mission, and we need to know if we’re doing that.”

The Foundation’s board of directors monitors evidence of progress toward that change, Kerry said. “We actually believe that the investment in our new system will be more than offset by the gain in results.” With limited resources, Arcus wants to make smart decisions about how to use them not just to make grants, but to make grants with the most impact.

This case study is part of a 2012 research project conducted by NTEN with the help of Idealware. See the State of Nonprofit Data report for more information about how nonprofits are, and aren’t, making data part of their decision-making processes, and the key challenges that affect an organization’s ability to be more effectively “data-driven.”

Archers don’t aim at the point halfway between the arrow tip and the target. If they did, they would never hit the bullseye.

In the same way, many nonprofits still focus too much on counting Facebook likes and Twitter followers as if these metrics are the end goal, and then feel frustrated when they’re not getting the results they expected.

Going beyond counting likes and followers means asking a number of quantitative and qualitative questions about:

Reach: 10,000 Facebook fans doesn’t mean you’re reaching 10,000 people. In fact, a Page with 10,000 fans reaches only about 1,700 of its fans with updates.

Surprise, surprise! Additionally, you want to be asking:

  • Who are you reaching?
  • How are you reaching them?
  • How frequently do you reach them per week or month?

Reaction: 10,000 Facebook fans means nothing if they aren’t talking about your nonprofit. And that’s the whole point of your using social media, right?

People who are talking about you are usually a subset of people you’re reaching. Some questions you want to ask about people talking about you are:

  • Who is reacting?
  • Where are they reacting?
  • What are they saying?
  • What are we saying that gets them talking?

Action: No number of followers and fans has any value unless you’re converting people: new members, subscribers, donors, etc.

Some questions you want to ask about people you’re converting:

  • Where did they come from?
  • What actions did they take previously?
  • Who is taking these actions?
  • What actions are most important?
  • Are they also talking about us?
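The reach, reaction and action questions above describe a funnel: each stage is a subset of the one before it. As a rough sketch, the stages can be expressed as simple rates. The numbers below are hypothetical, chosen only to echo the ballpark reach figure cited earlier, not drawn from any real Facebook Page.

```python
# A minimal sketch of the reach -> reaction -> action funnel described above.
# All figures are hypothetical illustrations, not real analytics data.

def funnel_rates(fans, reached, reacting, converted):
    """Return each funnel stage as a share of the previous stage."""
    return {
        "reach_rate": reached / fans,             # fans who actually see updates
        "reaction_rate": reacting / reached,      # reached people talking about you
        "conversion_rate": converted / reacting,  # reactors who take a real action
    }

# Example: 10,000 fans, ~1,700 reached (the ballpark cited earlier),
# 200 people talking about the page, 25 becoming members or donors.
rates = funnel_rates(10_000, 1_700, 200, 25)
for stage, rate in rates.items():
    print(f"{stage}: {rate:.1%}")
```

Framing the metrics this way makes the point concrete: growing the fan count improves nothing downstream unless the rates hold up, which is why the goals-before-metrics advice below matters.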

Always put goals before metrics

Defining and measuring reach, reaction and actions becomes much easier if your goals are crystal clear. Interestingly, lack of clarity is a big reason why many organizations don’t go beyond measuring fans and followers.

Instead of focusing on metrics first, ask yourself what you ultimately want people to do. Then the metrics come into view.

This article was originally published at http://www.socialbrite.org/2012/05/29/beyond-counting-likes-and-followers/ and is reprinted with permission.

Strategic, mission-critical decisions require a clear understanding of program performance, but for most nonprofits, making decisions on facts rather than gut feelings is easier said than done. As a way to better understand program performance and become more responsive to changes, organizations are increasingly turning to dashboards—custom utilities that gather, organize and present information in an accessible way—that let them more effectively measure, monitor and manage the way they meet their missions.

There are a number of ways to create dashboards, including common tools such as Excel or Access, or proprietary systems embedded in databases. But these approaches can lead to dashboards that are neither user-friendly nor easy to update.

Business Intelligence tools, or “BI” tools, take the dashboard idea to the next level. More than simply graphically displaying static data, they offer trend analysis, forecasting and drill-down capabilities that can dramatically expand your insight into program performance. With a good BI tool, you can combine data from multiple sources, view it from different perspectives and distribute it more easily. Beyond simple reporting, BI tools allow a more comprehensive analysis of your organization’s data.1

Not every organization needs a BI tool, but if you have a solid strategy for program evaluation and monitoring, are already capturing the raw data you need, and are struggling to analyze that data and make use of it, one of these tools might be a good fit. We’ll look at a few of the better-known options, but first, let’s take a closer look at what they can offer and how they work.

The Benefits of Business Intelligence Tools

Business Intelligence tools have grown in popularity with nonprofits for a number of reasons. The most prominent are better data integration, more flexibility, easier distribution of data and better visuals.

Many organizations have data that is compartmentalized in separate information silos—you might have donor information in one database, client information in a second database, and financial information in a third. BI tools let you integrate data from numerous sources, and many can even pull data from Excel spreadsheets and Access databases, or any database that has an Application Programming Interface, essentially a way for programmers to access and export your data.

Most BI tools move the data into a cache—a virtual storage space—or a separate data warehouse, effectively creating a separate database. This allows the data to be manipulated for analysis without affecting the data in the original databases. Data can be loaded manually, or set to load automatically at predetermined times or when certain actions occur.
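The cache-or-warehouse idea above can be sketched in a few lines: copy rows out of the source systems into a separate local store, then run all analysis against the copy so the original databases are never touched. This is a simplified illustration, assuming a hypothetical donors table; real BI tools handle the extraction, scheduling and modeling for you.

```python
# A minimal sketch of loading source data into a separate "warehouse"
# (here, a SQLite database) so analysis never touches the original system.
# The donors table and its columns are hypothetical examples.
import sqlite3

def load_into_warehouse(source_rows, warehouse_path="warehouse.db"):
    """Replace the warehouse copy of the donors table with fresh source data."""
    conn = sqlite3.connect(warehouse_path)
    conn.execute("DROP TABLE IF EXISTS donors")
    conn.execute("CREATE TABLE donors (name TEXT, total_given REAL)")
    conn.executemany("INSERT INTO donors VALUES (?, ?)", source_rows)
    conn.commit()
    return conn

# In practice source_rows would come from your donor database's API or an
# Excel export; it is hard-coded here for illustration, and the in-memory
# database stands in for a warehouse file.
conn = load_into_warehouse([("Ada", 250.0), ("Grace", 400.0)], ":memory:")
total = conn.execute("SELECT SUM(total_given) FROM donors").fetchone()[0]
print(total)  # 650.0
```

A scheduler (or the BI tool's own refresh settings) would rerun a load like this at predetermined times, which is what keeps the warehouse copy close to, but not exactly, real time.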

If you want more flexible reporting than you can easily get from your existing systems, BI tools can offer even more flexibility than something like Excel or Access, and can display data dynamically from a variety of perspectives and in near real-time. (Because the data is loaded from the original source into the warehouse, it’s not fully “real time,” but it’s close.) For example, it’s possible to see a report on the number of people admitted to your program, view the success rate at different stages, then see a trend analysis of the success rates over time.
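The admissions example above, seeing stage-by-stage success rates and then the trend over time, can be sketched with plain calculations. The quarterly figures and stage names below are hypothetical, invented only to illustrate the kind of drill-down a BI tool automates.

```python
# A sketch of the drill-down example in the text: admissions per quarter,
# success rate at each program stage, then the trend over time.
# All numbers and stage names are hypothetical.
quarterly = [
    # (quarter, admitted, completed_stage_1, completed_program)
    ("2012-Q1", 120, 90, 60),
    ("2012-Q2", 130, 104, 78),
    ("2012-Q3", 125, 105, 80),
]

for quarter, admitted, stage1, finished in quarterly:
    stage1_rate = stage1 / admitted
    completion_rate = finished / admitted
    print(f"{quarter}: stage 1 {stage1_rate:.0%}, completed {completion_rate:.0%}")

# A simple trend question: is the completion rate rising quarter over quarter?
completion_rates = [finished / admitted for _, admitted, _, finished in quarterly]
trend_up = all(a < b for a, b in zip(completion_rates, completion_rates[1:]))
print("Completion rate trending up:", trend_up)
```

A BI tool produces the same kind of answer interactively, letting you click from the overall count down to a stage, a quarter, or an individual program without writing code.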

They also offer more diverse ways of accessing and distributing reports and dashboards. You can email staff up-to-the-minute program data on scheduled days and times, and easily view and manipulate dashboards from a smartphone or tablet or anywhere you have an internet connection. Many also let you set up custom portals for certain audiences (such as your board of directors) that grant unique access to view predefined performance measures and reports.

All the data in the world won’t help if it cannot be displayed in a way that is easily interpreted. BI tools go far beyond dials and bar graphs, and can produce sophisticated graphics like scatter plots that move with time, spark-lines that show thousands of data points, and forecasts that assume different user-inputted scenarios. Not only can these graphics display more information, but they can also be customized with colors and themes that match your organization’s brand.

The Considerations 

Though BI tools vary in cost, most organizations can find a system within their price range that satisfies their specific requirements. However, there are other things to consider before you invest any time or money, including your organization’s evaluation strategy, data collection, and personnel.

Your organization needs to have a solid strategy for evaluation in place. Do you already know how your organization wants to analyze its programs, and what metrics to track? No BI tool will give you these answers, and spending time outlining the requirements early in the process will save a lot of time and money later. These metrics need to be very specific. Outline the data fields you want to use, and how you want to analyze them. However you choose to define them, make sure you know the basics of what you want to analyze and how you plan to do it before you begin shopping for a BI tool—even if it means consulting with someone who can help define an effective and appropriate plan for measuring your data.

After implementing a BI tool, many organizations struggle with “operationalizing” the data to build a strong performance culture—in other words, finding ways to improve performance based on what the data show. Your organization may want to change processes once it begins analyzing program data from different angles, but if the data in the BI tool is not organized the same way as in your primary system, these changes might be difficult to enact. Data warehousing creates multiple data models, and can result in making key data points difficult to operationalize—if this is one of your goals for purchasing a BI tool, you may want to first consider reorganizing your original data model. The greater the difference in the data models, the more difficult or complex the project will be.

You also need to weigh the human side of the equation. What skills do your organization’s staff already have? Every new tool will require learning some new skills, and training is a consideration. A consultant can help get your organization up and running, but be ready to invest in IT staff training for configuration and ongoing maintenance—doing so during the implementation process can engage them and help them learn from the consultants.

A Few Good BI Tools

There are a number of BI tools on the market, with new ones being developed all the time. More and more of these systems are adopting web-based platforms or using in-memory technology, which means they offer increased accessibility and responsiveness over installed systems, but can suffer when scaling to very large datasets, and require a fast, consistent internet connection. These tools tend to be less expensive than other types, but they are all complex systems that will require someone with data expertise to set up and maintain.

GoodData’s cloud-based platform offers great visuals, a very fast data engine, and data pivoting—a quick way of summarizing data based on different variables—from multiple datasets updated in near real-time. The tool offers canned reports, which might not apply to all nonprofits, and adds a reports-and-dashboards sharing feature for easy collaboration among colleagues. GoodData’s tool is very quick to deploy, but setting up and administering the data model can require more technical expertise, and accessing the data using a cloud-based user interface requires a solid and consistent internet connection. If you’re looking for a cloud-based solution that allows quick ad-hoc analysis and have some more technical people on staff, this is a great option.

iDashboards is a very slick data visualization tool that allows organizations to monitor and analyze their programs while also creating “what if” scenarios for strategic planning. The user interface does more than just look good; it allows staff to personalize individual dashboards with very little training. A built-in wizard helps facilitate its connection to multiple data sources and automatic refreshes every minute. If your organization wants staff to create personalized dashboards with great visualizations for analysis and planning, iDashboards might be a good fit. (The vendor offers discounted licensing and training for nonprofits.)

Birst is an extremely powerful and flexible tool with a responsive interface that stores data in-memory or on-disk, which allows it to scale quickly while easily performing sophisticated calculations with very large datasets. Birst comes pre-packaged with standard reports; these might not meet the needs of most nonprofits, but the tool does allow end-users to make quick Excel-like calculations on their own. Setup is comparatively easy due to Birst’s new graphical interface for logical modeling and “data warehouse automation technology,” which automatically creates a data schema. Birst is a great option for mid-sized to large nonprofits with dedicated IT staff who are working with a lot of data and seeking scalability and flexibility in deployments.

QlikView is a tool that is quick to deploy and relatively easy to administer. The tool is designed for exploration of data, leveraging its “associative model,” which allows users to click on a piece of data and see all relevant data associated with it. This type of visualization is perfect for analytical end users who might not have a pre-defined question. QlikView is also very fast; the tool uploads and compresses your data in its memory to allow for very quick analysis and drill-down performance. Organizations should expect to send their IT staff to training, as this complex tool uses proprietary scripting, a unique interface-design tool and a management console. If you’re looking for a tool that is quick to deploy, looks great, is cost-effective and very fast, and have a strong IT staff prepared to attend a week of training, QlikView is an excellent option. QlikView provides the software free to nonprofits, including training and consulting. (http://www.qlikview.com/us/company/community-service)

Tableau can run straight from your database with no additional modeling required, assuming that all the data you want to report on is in a single database. This allows for very quick deployments—just five clicks—if your current data structure is sufficient. Organizations can also choose to re-build their data models in-memory. Because everyone knows that “a picture is worth 1,000 words,” Tableau used that concept as the basis for its product; the tool is extremely visual, allowing organizations to quickly see their data when answering questions. The tool’s proprietary language, VizQL, quickly translates your data into graphics that can be manipulated in the drag-and-drop interface. Tableau provides all of its training materials online for free, and hosts regular webinars to keep everyone up to date. Tableau is an excellent fit for organizations that do not have large IT departments but are looking for a tool that offers a strong visualization suite and quick deployment. Tableau offers reduced-cost licensing for nonprofits.

Wrapping up

Business intelligence tools can relieve a lot of the headaches associated with monitoring and evaluating programs and outcomes. Your organization’s need will depend on what (and how many) systems it’s currently using and how sophisticated you want your analytical capabilities to be. Not every organization needs a BI tool, but if you have a solid strategy for program evaluation and monitoring, are already capturing the raw data you need, and are struggling to analyze that data and make use of it in your organization, these tools might be a good fit.

Every tool offers connectivity and visuals, but the devil is always in the details. Setup, customization and scalability can greatly affect ease of use and cause even more headaches. Make sure your organization thinks through its data strategy, evaluates each tool completely, and purchases the one that fits both its future data-analysis needs and the current capacity of its IT department.

This article was originally published at http://www.idealware.org/articles/beyond-dashboards-business-intelligence-tools-program-analysis-and-reporting and is reprinted with permission.


1 For more on APIs, see NTEN’s report: “How Open APIs Can Change How Nonprofits Manage Data”. NTEN, 2007.

Thanks to Danielle D’Antuono at the Center for Employment Opportunities for providing recommendations, advice and other help.