Tag: data visualization

Many people may not make the connection between technology and food/agriculture, but when we merge the two, we can transform the way we connect with, produce, and consume the foods that end up on our plates. My team at the Bainum Family Foundation’s Food Security Initiative had a small taste of this potential when we created an online platform called the Food Learning Locator, which we just relaunched in March.

The Food Learning Locator originally went live in September 2017, shortly after I joined the team. The intention of the tool was to address a large information gap. No one (our foundation and our partners included) seemed to understand the full scope and types of food-related education and job training opportunities available in the Washington, D.C., metropolitan area. As we began to collect the data, we quickly realized a greater need. We weren’t the only ones unaware of so many opportunities around us. Our community was missing out on them as well. This spurred the idea to create an easy online tool to help people find and participate in food-education and job-training programs, many of which are offered by nonprofits and other community organizations that have limited resources and time to market their programs.

After the site’s beta launch, we learned from our partners that we could add even greater value by improving the Food Learning Locator’s UX and UI. I then took the lead on gathering feedback on the site, and we used that feedback as the backbone of an eight-month web development process to improve and expand the data structure, add an administrative portal for organizations on the map, and find ways to measure the use of the tool.

But what was involved in each of those steps? Let me share our experience with you.

User research

Since this tool had the potential to benefit many audiences (i.e., community members, advocates, funders, and program providers), we encountered a major challenge — how to create a tool that addresses numerous needs and interests. For such a wide-reaching site, we realized we had to meet people where they were to get the necessary feedback.

We sought feedback from as many people as possible, hosting a focus group with community members and one-on-one feedback sessions with healthcare providers and program providers to accommodate everyone’s schedules. We also attended community events and conferences to market the site and ask for thoughts and suggestions. With this invaluable feedback, we were able to both identify and carry out the next steps for improving the Food Learning Locator. For example, we completely changed the UI from having food-education and job-training programs on the same map to splitting those categories into two maps, and we added new data points to further improve the user experience.

Updates to data structure

Our challenge was to create a tool that encompassed the key program offerings we knew about in 2017 but also to structure them in a way that allowed new data points and content areas to be added over time. The food-education and job-training space is constantly evolving and adapting, so developing an agile tool was key to its long-term success.

The web developers we worked with were incredibly helpful in structuring the back end of the site to allow us to easily add new data filters or content areas requested by community members (e.g., program cost and registration information). We also made sure to structure the front-end map and back-end Program Manager Portal so that they could easily be used as a template for other cities or regions that may eventually be interested in bringing this tool to their communities.

Development of the Program Manager Portal

With the data structure refined and the UI designed, we developed a Program Manager Portal for program providers to dynamically update, add, or remove organizational and programmatic data. The addition of the portal was a huge improvement over the beta version of the site, where update requests had to be submitted via SurveyMonkey and then manually changed by the Food Learning Locator team on a semiannual basis. This portal simplified the data-maintenance process, which was particularly helpful given how quickly the site has been expanding since its launch. Since February, we’ve added more than 10 new organizations to the Food Learning Locator across Washington, D.C., Maryland, and Virginia. Across these organizations, there are nearly 90 food-education programs and more than 30 job-training programs — all displayed and searchable on our now easy-to-use site.

The portal still has room for improvement, but having it has helped both our team and program managers maintain accurate information, consequently creating a more reliable source of information for community members interested in the food space.

Data collection

Like most websites, we’re able to easily track site traffic and other metrics using Google Analytics. However, this tool doesn’t help us measure in-person user outcomes of the Food Learning Locator (e.g., the number of users attending programs they found via the map, or the number of new collaborations or partnerships among program providers).

We’re still in the early stages of determining the best method of tracking these connections, as they play a large role in evaluating the success of the tool. For now, we’re relying on qualitative feedback and testimonials from site users — whether gathered directly (online or in person) or through program providers and community advocates — to understand budding connections from the tool. We’re also looking into how we can track referrals among program providers, but that’s likely something we’ll have to implement down the road.

Key Lesson

So, technology and food/agriculture — not so separate after all, right? While I’m new to merging program needs and technological tools (at least on this scale), I see endless possibilities for so many fields to leverage online platforms to create real, lasting offline connections and impact. My hope is that the Food Learning Locator will cultivate and strengthen our local food-interested community and eventually expand beyond it. With tech, the sky is truly the limit.

A special thank you to the web development team at Alley Interactive and the Food Learning Locator team — Andrew Curtis, Laura Hepp, Katie Jones, Ann Egan, and Morgan Maloney — for helping to develop this tool.

More than just a digital analogue of our road maps of old, digital maps can transcend serving as interface devices and become vehicles for data visualization and powerful visual storytelling. I’ve worked with teams to develop rich, multi-dimensional data-driven maps that engage viewers in historical events and citizen science observations across both time and geography.

In one project, I helped a Digital Humanities team bring historical literature to life, stepping through a timeline-based tour of London with the characters in Daniel Defoe’s A Journal of the Plague Year. In another project, I supported the nonprofit Journey North, leveraging their 25-year (and growing) database of observations and sightings, allowing viewers to compare species’ migration patterns or arrival, departure, and emergence dates.

The majority of digital maps you encounter depend upon one of those map providers listed above. There’s simply too much overhead in hosting your own map tiles on a performant server that can respond to all those requests as users pan and zoom. It can be done, but unless you can allocate resources to maintain a responsive map server, it’s better left to the professionals and platform providers. Which one to choose?

If you’re working on a nonprofit project trying to make the most of your budget, I’d like to share the toolkit I’ve put together, which leverages robust, developer-friendly APIs, open source libraries, and free-for-non-commercial-use service plans.

Choosing a direction

Before looking at which combination of platforms and services provides the most mileage, I established an important distinction in the types of maps we were producing and the support we would need.

As you navigate a digital map at a particular zoom level, your web browser is continually loading square-shaped images—map tiles. If you change your zoom level, in some cases you receive a whole new set of map tiles. (This may not be true in many cases today, where map images are vector rather than bitmap images, but it is typically true for those described here.) Some of the more detailed maps you see merge your visual data into those sets of map tiles. For large datasets and one-time projects, this can result in a much more performant map, as the browser is not rendering a multitude of markers using JavaScript, and it gives you greater control over your map design, or integrated map and data visualization styles. Those benefits come with a price, though. If you use a service like Mapbox Studio or ArcGIS to upload data and design and generate map tiles, you’re dependent on that service to host those map tiles on a server that can respond to requests to zoom and pan those tiles. That service necessarily costs money.
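To make the tile mechanics concrete, here is the standard Web Mercator (XYZ) tile math that most slippy maps share; the URL template at the end is the common convention, not any particular provider’s endpoint:

```javascript
// Convert a lat/lng to standard Web Mercator (XYZ) tile coordinates.
// At each zoom level z the world is a 2^z x 2^z grid of 256px tiles,
// which is why zooming in fetches a whole new set of tiles.
function latLngToTile(lat, lng, zoom) {
  const n = 2 ** zoom;
  const x = Math.floor(((lng + 180) / 360) * n);
  const latRad = (lat * Math.PI) / 180;
  const y = Math.floor(
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) * n
  );
  return { x, y, zoom };
}

// A typical tile URL template (a convention, not a specific provider's):
// https://tiles.example.com/{zoom}/{x}/{y}.png
console.log(latLngToTile(0, 0, 1)); // { x: 1, y: 1, zoom: 1 }
```

As the user pans and zooms, the browser computes which tile coordinates cover the viewport and requests those images from the tile server.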

In contrast, the maps I’ve worked on plot data on a standard map as markers (or pins, or shapes, or paths). In the Digital Humanities project (“Itinerary”), this is all that was needed. We also went this route with the Journey North data for a number of reasons, including but not limited to pricing considerations. We do have maps that feature more than 3,000 pins, which is something of a performance threshold: things start to lag when you exceed that volume of markers, and flattening your data points into map tiles mitigates that overhead. But our maps feature real-time data every season, with the latest approved sightings appearing on maps as those sightings are posted. Periodically uploading data by hand to generate map tiles for snapshot maps was not logistically realistic.

Thus I make a distinction between two categories of maps: custom maps whose map tiles must be hosted on a paid-for platform, which have their place and offer many benefits, and “live” maps, where we’re plotting points, shapes, or paths on standard maps, writing JavaScript code to leverage a provider’s API. Here, I focus on the latter.

Bring your own data

Your map-making journey begins with data: data residing in a database you have access to; data that you’ve compiled from other sources; or data that you’re recording to display in the future or in real time. The only real requirement to put that data on a map is that each data point has latitude and longitude values, or a corresponding set of latitude and longitude coordinates to define a path or shape. Data points in the maps I’ve worked on also include dates, which we use to render those points on a time scale. Because these data points represent a location at a specific time, I call our data points events.

In both the Itinerary and Journey North projects, I’ve processed data to output it in the GeoJSON format. Because our data points—events—have additional properties, they’re formatted as feature objects in GeoJSON. We can store any additional data attributes as properties for each feature object, allowing us to pass image URLs and comments for the popup; date and time values for our timeline; and design cues to our map layer.
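As a sketch of what one of these event features can look like in practice (the field names here are illustrative, not the actual schema from either project):

```javascript
// Build a GeoJSON Feature for a single sighting "event".
// Field names are illustrative, not the actual project schema.
function eventToFeature(event) {
  return {
    type: "Feature",
    geometry: {
      type: "Point",
      // GeoJSON coordinate order is [longitude, latitude]
      coordinates: [event.longitude, event.latitude],
    },
    properties: {
      date: event.date,               // drives the timeline
      comment: event.comment,         // shown in the popup
      imageUrl: event.imageUrl,       // popup image
      markerColor: event.markerColor, // design cue for the map layer
    },
  };
}

const feature = eventToFeature({
  latitude: 44.97,
  longitude: -93.26,
  date: "2019-05-01",
  comment: "First monarch of the season",
  imageUrl: "https://example.org/monarch.jpg",
  markerColor: "orange",
});
console.log(feature.geometry.coordinates); // [-93.26, 44.97]
```

A collection of these features wrapped in a `FeatureCollection` object is a complete GeoJSON document, ready for a mapping library to consume.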

Our use of GeoJSON on the Itinerary project was limited to reference files we manually created to define the boundaries of parishes or paths taken, which a corresponding event would light up. We used Jekyll to generate the shell of the map application, looping through CSV files to generate JSON data files that our JavaScript then iterated over to create markers on the map or to highlight a corresponding parish.

Journey North maintains a fairly simple sightings database, containing latitude and longitude, sighting date and time, comments, and image URL data for each sighting event. When we fetch that data, we format it for transmission as GeoJSON in PHP before handing it off to our application’s JavaScript for rendering on our maps. Our JavaScript code does not leverage any automatic GeoJSON functions in our mapping library of choice, but using a recognized, widely-embraced format for our data means it’s ready for sharing via API with others in the future, and that the tools we’re building have adopted an existing, common language.

Because your application will use JavaScript to interact with a mapping API and put events on a map, your data needs to be consumable by JavaScript. JSON is a logical choice, and GeoJSON is a lightweight standard for mapping applications.

Making your map

Your map application consists of JavaScript leveraging a library that can interact with one of the map providers’ APIs.

If the Google Maps API were completely free and not subject to pay-as-you-go pricing thresholds, I might be compelled to go that route despite my reservations about placing all our eggs in Google’s basket. But I think there are better choices out there, from providers dedicated to developing and supporting mapping platforms, and I’ve been rather determined to make those work. Ultimately that mission paid off for Journey North from a pricing perspective as well. (Note: In 2018, we replaced a number of Journey North’s legacy Google Maps API dependencies for geolocation and location-picker tools with the solutions described here to reduce pay-as-you-go costs. Journey North receives many map views, and API hits for associated services, during peak-season updates.)

Esri‘s offerings were suggested for Journey North, as they’re well-established in the scientific and GIS communities. I’ve always been a big fan of Mapbox and recommend reading their values statement.

Which came first: Choosing a mapping platform compatible with the library you like, or choosing a library compatible with the mapping platform you’ll use? In our case, it was a little bit of both, and we ended up with something of a combination of Mapbox and Esri offerings.

To put Journey North’s data on the map, we use the open source Leaflet.js library. I’d previously worked with Leaflet.js using a Mapbox base layer on the Itinerary project. I consider Leaflet.js to be most closely associated with Mapbox, as that is where its developer is based, but you can specify base layer maps from other providers. We use Esri’s Esri Leaflet plugin for Leaflet.js to instantiate our Esri basemap layer; the Esri Leaflet project’s Terms of Use are what identified it as a developer-friendly free option. The combination of Leaflet.js—a robust, well-documented, “open-source JavaScript library for mobile-friendly interactive maps”—and Esri’s free base layer maps gives us a full-featured, free package to support Journey North’s mapping application.

This is the “live” map route I outlined above, where data is plotted directly on a standard map instance using JavaScript. This enables Journey North to feature the latest approved data in real time. We’re also able to slice that data into date intervals that we load in layer groups on the map so that we can scrub a timeline back and forth to rewind and advance the story of a species’ migration or the advance of a season. The Itinerary project similarly revolved around a timeline that one can step through or play to see the story unfold.
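The date-interval slicing can be sketched in a few lines. This simplified version buckets event features by week, where each bucket would then become its own toggleable map layer; the real application’s grouping logic is more involved:

```javascript
// Group event features into weekly buckets so each bucket can become
// a toggleable map layer; scrubbing the timeline shows/hides buckets.
// Weekly bucketing is a simplification for illustration.
function bucketByWeek(features) {
  const buckets = new Map();
  for (const f of features) {
    const d = new Date(f.properties.date + "T00:00:00Z");
    // Key each feature by the Monday of its week (UTC).
    const monday = new Date(d);
    monday.setUTCDate(d.getUTCDate() - ((d.getUTCDay() + 6) % 7));
    const key = monday.toISOString().slice(0, 10);
    if (!buckets.has(key)) buckets.set(key, []);
    buckets.get(key).push(f);
  }
  // Return buckets sorted chronologically for the timeline.
  return [...buckets.keys()].sort().map((key) => ({
    key,
    features: buckets.get(key),
  }));
}
```

In a Leaflet.js application, each bucket would typically be rendered as its own layer group, so advancing the timeline is just a matter of adding and removing layers rather than re-plotting individual markers.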

Charting a course

Exploratory projects like Itinerary, accessed by a smaller audience, can fly under the radar when it comes to pricing, with any of the mentioned platforms at your disposal as long as one is willing to put a credit card on file. As is often the case in web and application development, though, the choices we make tend to become our go-to toolkit—we grow well-versed in the platforms and libraries we choose for our first project, and they facilitate our efforts in the future, bringing efficiencies that help us work effectively amidst budget and time constraints.

The toolkit I’ve outlined here has served Journey North well, supporting a few hundred thousand views a month during peak sightings season. As any nonprofit can relate to, Journey North is under continual pressure to do a lot with a little, and while maps are a cornerstone of the website and community, there are other priorities and objectives. Saving money on recurring expenses and third-party services affords more time to work on enhancements and optimization.

Where there are limitations, as with performance on high-volume maps, there are options. If custom map-tile maps are not viable, many in the mapping community recommend clustering pins to reduce the number of elements managed by JavaScript and reduce lag when panning and zooming. Clustering pins is supported by a handful of Leaflet.js plugins, one of “hundreds” of plugins available and listed on the Leaflet.js site.
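To illustrate the idea behind clustering, here is a minimal grid-based sketch; the actual Leaflet.js plugins (such as Leaflet.markercluster) are far more sophisticated, recomputing clusters in screen space at each zoom level:

```javascript
// Illustrative grid-based clustering: snap each point to a cell of
// `cellSize` degrees and merge points sharing a cell into one cluster,
// so the map renders one marker per cluster instead of per point.
function clusterPoints(points, cellSize) {
  const cells = new Map();
  for (const p of points) {
    const key =
      Math.floor(p.lat / cellSize) + ":" + Math.floor(p.lng / cellSize);
    if (!cells.has(key)) cells.set(key, { count: 0, latSum: 0, lngSum: 0 });
    const c = cells.get(key);
    c.count += 1;
    c.latSum += p.lat;
    c.lngSum += p.lng;
  }
  // Place one marker per cell, at the centroid of its points.
  return [...cells.values()].map((c) => ({
    lat: c.latSum / c.count,
    lng: c.lngSum / c.count,
    count: c.count,
  }));
}
```

Even this crude version shows the payoff: thousands of nearby points collapse into a handful of markers, which is exactly the overhead reduction the plugins provide.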

A great thing about developing digital maps is that all of the platforms behave pretty similarly, and you can carry your experience with one platform over to another. I encourage you to experiment with all of them—and with different types of maps—as you build your own toolkit. What’s worked well in my case may not suit your needs, and you may not be subject to the same constraints as my projects. At the same time, I can attest that the approach outlined here has empowered us with an affordable, stable, robust platform for rich, data-driven interactive maps that we’ve come to rely on to engage a community of users and tell their stories.

I hope your map-making journey is as rewarding.

So, you have a new project that requires a lot of research and a deadline that doesn’t fit. Scrambling, you turn first to a search engine. You wait half a second and then your confidence falters: 137,000 results.

We live in an information-rich era, but how can you filter the research that’s available and judge its merits for your nonprofit project?

Trust the information experts

Wondering where to find good information resources? Consider reviewing the online research guides provided by a college or university library. Curated by information professionals, plentiful and free, these guides can help you figure out where to look to find international, national, state and sometimes local information.

Consider academic sources. Several universities maintain digital repositories that include images, video, reports, or datasets available for users to download for free. Research from students, faculty, and research-focused organizations associated with the university is often included. Examples include the University of Michigan’s Deep Blue, Harvard’s DASH and more.

Investigate government sources

Depending on what you need, there may be a government-produced report or dataset that can help you. USA.gov, Data.gov, Govinfo.gov and the US Government Publishing Office are several options to consider. Take a look at whether your city provides an open data portal and if they do, take advantage of it.

For example, if you’re an advocate for responsible policing in Lansing, MI, you should know that the city has an Open Data Portal that provides detailed traffic stop information for the area.

Know what to search for

Your front-door search might not give you what you need, but consider other options your search engine provides – like Google’s Dataset Search. Researchers can filter by date, download format and a host of other attributes.

No matter the source, you’ll still have to vet what you find, but these tips should help you get on the right path and find the data you’re looking for. Good luck!


Main image: Creative Commons, Eden, Janine and Jim.

Data is an incredibly valuable tool. It allows us to observe, understand and assess our work. Importantly, data can also be used to tell our work’s story. Through metrics, measures, charts, and graphs, we can effectively highlight the impact of our work in ways that make current and potential donors take note.

As with any tool, we have an ethical responsibility to use it correctly. When it comes to using data for storytelling, we must find a way to be persuasive without being deceptive.

Do most organizations intend to deceive their audiences with data? Absolutely not! That said, it is easy to fall into some common traps when it comes to interpreting results or displaying information for our audience.

Fortunately, with just a few easy steps, we can ensure that we are crafting our stories responsibly and effectively.

Actively combat confirmation bias.

Confirmation bias is the tendency to favor information that agrees with what you already believe, and to discount opinions and data that disprove your beliefs. Whenever you embark on an analytical journey, keep an open mind and understand that the data may or may not confirm your hypothesis, and it may not always tell the story for which you are hoping.

To avoid bias, you can:

  • Enlist colleagues to help you do a bias check. Have them challenge your assumptions and the data you use to support them.

Choose the right way to display your data.

Using data to tell the story of your work relies heavily upon your ability to visualize it. There are many ways to do this, and even the same data can be visualized differently depending on the point you are trying to make. This is the most helpful tool I’ve found for deciding what type of visualization will work best for your goal.

Once you choose how you will display your data, the first thing to consider is scale.

Take a look at the chart below. Which pie segment is the largest?

[Image: a pie chart split into three equal segments, with the bottom orange segment appearing largest because the chart is rendered in 3D]

Did you say the orange segment? In fact, each segment is exactly one-third of the pie.

[Image: a 2-dimensional pie chart split into three equal parts, so that all parts appear equal to the eye]

This 3D visual distorts your perception and encourages you to mistakenly interpret the data.

Let’s look at another example. In the column charts below, it appears that there is a significant difference between the columns on the left, but a small difference between the columns on the right.

[Image: two 2-column charts side by side, with the one on the left appearing to show a greater difference]

In fact, both of these charts use the same data but a different scale. The chart on the left overemphasizes the difference between the two columns by using a smaller range of numbers for the vertical axis.

[Image: the same two 2-column charts, this time with numbers showing the percentage difference]

If someone were to use the scale on the left without explaining the actual small difference between the two columns, the discrepancy could be misinterpreted by the audience as much more substantial than it actually is.
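The distortion is easy to quantify: the visual ratio between two bars depends on where the vertical axis starts. A small sketch, using hypothetical values:

```javascript
// With a zero baseline, bar heights are proportional to the values
// themselves; with a truncated axis, heights are proportional to
// (value - axisMin), which exaggerates small differences.
function visualRatio(a, b, axisMin) {
  return (b - axisMin) / (a - axisMin);
}

// Hypothetical values: 50 vs 52.5, a true difference of 5%.
console.log(visualRatio(50, 52.5, 0));  // 1.05 -> bars look nearly equal
console.log(visualRatio(50, 52.5, 49)); // 3.5  -> second bar looks 3.5x taller
```

The same 5% difference renders as a bar three and a half times taller when the axis starts at 49 instead of 0, which is exactly the effect in the charts above.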

Provide the appropriate context.

Are you being irresponsible if you use a chart that emphasizes a difference? Absolutely not! But you would need to provide the appropriate context so the audience knows how to interpret what they are seeing. That’s the job of the data visualizer: to clearly communicate what the audience should walk away knowing.

Here is an example of how to provide appropriate context for your story:

[Image: a column chart depicting a 5% difference, with accompanying text: “This program was a success! We increased our donor engagement by 5% without spending a dime!”]

Data has tremendous potential to help purpose-driven organizations make smarter decisions, gain new insights, and effectively tell a story that is both persuasive and ethically sound. For more tips on how to effectively tell your story with data, check out the Data Playbook, and specifically the section on Communicating Results.

Are you using data to share the story of your work? Share your ideas with the Charles and Lynn Schusterman Family Foundation to help expand this community resource.


When we think about using data, we often think of it in terms of external audiences:

  • How can we use data to help prospective funders see why our work is important?
  • How can we use data to show donors what we have done with their money and how their gifts have made an impact?
  • How can we use data to build the case for investing in this program or initiative?

How much time and effort do we spend on doing the same for our internal audiences?

I don’t mean benchmarking, or monitoring and evaluation towards data-driven decision-making. Nor do I mean annual goals and reviews. They’re important, but they are not what I’m talking about.

Somewhere along the way, we seem to assume that, because our team members are on board with the mission, we don’t need to inspire them. They already know the impact they’re making, right? They’re here, and they compiled the data for the grant report, or they saw that e-mail that went out to our supporters, etc. They understand that it’s the outcomes that are important.

Raise your hand (here’s mine!) if you get satisfaction out of crossing something off a list, out of seeing a freshly painted wall, or out of the increasing number of signatures on a petition.

When working towards changing the world, we often have projects and programs that are also ambitious. They take a long time before you see any effects. Or maybe they involve slogging through lots of mud as you try to build something new, or figuring out what exactly to build as you’re building it. We all get tired when we’re in the middle. We question why we are doing what we’re doing, and whether what we’re doing is even making a dent, or ever going to lead to that desired outcome.

Those are all important questions. But we should ask them to ensure we’re on track, not in reaction to being exhausted from hacking through all the weeds. And it’s easy to forget that sometimes the connections that are clear to us are not to everyone else. Maybe you can see the path up that mountain, while everybody else sees underbrush and thinks the group lost the trail a long time ago.

Think back to when you were a kid, trying new things. Think about an activity that came easily and quickly to you. Now think about something that was much more difficult. Which activity did you stick with?

In change management, this concept is called “small wins.” But you don’t necessarily have to be working on a big change; this concept can be helpful to any long-term or large-scale effort. It can also be helpful even in shorter term projects when there are a lot of little pieces and most people only get to see their own section.

So how do we build this into projects without creating a lot more work?

Once upon a time, I was in charge of sending out direct mail appeals. Which meant asking everyone in the office to come help stuff envelopes. People joined as their schedules allowed—a couple hours here, a half hour in between conference calls there, etc. To most, it seemed an endless stream of letters, envelopes, and mailing labels. (Okay, it did to me, too!)

A week or two after the mailing had dropped and donations began rolling in, I e-mailed everyone who had helped, letting them know the total number of envelopes they had stuffed and the donations received so far as a result of this appeal. People were excited to know what they’d done together as a group and the preliminary results.

Doing this did not involve data that wasn’t already being collected or that I didn’t already have access to—but it was data that not everyone else working on the project had access to see. It doesn’t need to be some sophisticated dashboard. It can be if that’s your thing, but don’t allow perfecting that to be another excuse to procrastinate on sharing this information. And that’s my point: you rarely need to collect extra data. You merely need to help others see the rest of the picture, then move on.

But what if I look at this data and we are off track? Then course correct. Use the data to show others where the gap is, create that sense of urgency, and get their help in fixing it.

Picture the week before summer camp. Schedules had conspired such that most of the experienced staff was out of the office for other events. One of the biggest pieces was getting registration completed for the 200+ campers and chaperones we had invited from our program sites across the country. People were plugging away, but nobody had a clear idea of how much stood between us and the finish line or what to focus on, in large part because a lot of the data from paperwork had not been entered yet.

I looked in the camp roster (an Excel spreadsheet) and quickly tallied up the number of complete registrations, incomplete registrations, and spots to be determined (i.e. held spots with no data entered). We held a team meeting in the morning. I shared these numbers, explaining that our goal was to close the gap, but that the priority for the day was entering the rest of the paperwork so we could tell how big of a gap it really was. Then I sent out an e-mail update including everyone on the team in and out of the office. I also posted the numbers on a piece of paper on the conference room door.

Throughout the day, the interns crossed out and updated the numbers on the door. They were excited to see the number of completed registrations go up while the other categories went down. At the end of the day, I e-mailed an update to the team comparing our morning numbers with our end of day numbers so that we could see 1) how far we’d come, and 2) what we needed to accomplish the next day. We closed that gap by the end of the week.

I can’t claim credit for getting all this paperwork in—everyone did their part to close the gaps once they could see clearly where the biggest gaps were and where to focus their efforts. All I did was filter and count up rows in Excel, then put that information in front of the people who needed to see it. End result? We got what we needed for our kids to come to camp and have a great time!
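For illustration, the filter-and-count from the spreadsheet could be a few lines of code (the status values and row shape here are made up, not the actual camp roster’s columns):

```javascript
// Count registrations by status -- the same tally done by filtering
// rows in the camp roster spreadsheet. Status values are illustrative.
function tallyByStatus(rows) {
  const counts = { complete: 0, incomplete: 0, tbd: 0 };
  for (const row of rows) {
    counts[row.status] = (counts[row.status] || 0) + 1;
  }
  return counts;
}

// e.g. four hypothetical roster rows:
console.log(
  tallyByStatus([
    { status: "complete" },
    { status: "tbd" },
    { status: "complete" },
    { status: "incomplete" },
  ])
); // { complete: 2, incomplete: 1, tbd: 1 }
```

The point stands either way: the value isn’t in the computation, it’s in putting the numbers where the team can see them.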

It doesn’t have to be complicated. Many times, it is not that these small wins are not happening so much as that only a few can see them. Data is simply another way to help our teammates see both the forest and the trees.

Think about the data you have and how you can use it for internal audiences:

  • How can we use data to demonstrate to team members why the program or project is important?
  • How can we use data to show our colleagues how their contributions have made an impact?
  • How can we use data to build the case for investing staff time in this program or initiative?

Just as we hope to make our supporters feel like they are a part of something greater by sharing data on what they’ve helped us achieve, let us also remember the people by our side who want the same thing.

Photo credit: Carlos Muza

This article was originally published on the Evergreen Data blog. It is republished here with permission.

Annual reports are where nonprofits and foundations pull out their designer big guns. This is where they show off their muscles. The annual report is the place where an organization oils its accomplishments ’til they shine. The annual report is so important, most organizations still put in the cash to publish it in hard copy. Graphic designers load the report with beautiful pictures of success stories. But the data visualization sucks. Every time. And by “sucks,” I mean it’s nonexistent.

Take, for example, my past client Global Communities. Gorgeous report. Dozens of pages of their work around the world with big, compelling pictures. Then you turn to the last page, The Financials. And it is the exact opposite of every other page in the report.


Why not visualize this? (Well, I suspect the short answer is because most graphic designers are not trained to handle data.) Here’s a first draft:


One of these is going to grab a lot more eyeballs, don’t you think? Tables do not engage because they are simultaneously really boring and really overwhelming—it’s too much information for a brain to process and interpret. But visualization digests the information and pulls out key points, like the fact that net assets are on the rise and administrative costs comprise 10.15% of expenses.

Some nonprofits and foundations throw in even more data, comparing each line item to the year before. Here’s one from the Bill & Melinda Gates Foundation, developed by guys who do the same for many large foundations.


Sure, a super accounting nerd will be fine with this display. But the point of producing an annual report is to speak to the public, which includes a lot more folks than accounting nerds. So how about a display more like this:


Now it’s much easier to see at a glance where Gates increased and decreased. One makes me stop and look and the other makes me think they are trying to bore me so much that I don’t notice how performance is actually changing.
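For readers who want to try this, the deltas behind such a display are simple to compute. The line items and figures below are invented for illustration; they are not the Gates Foundation’s actual numbers.

```python
# Computing year-over-year change per line item, so increases and
# decreases can be charted instead of buried in a comparison table.
# All figures are hypothetical ($ millions).
current = {"Grants": 4200.0, "Program services": 900.0, "Administration": 380.0}
previous = {"Grants": 3900.0, "Program services": 950.0, "Administration": 365.0}

changes = {item: current[item] - previous[item] for item in current}
pct = {item: 100 * (current[item] - previous[item]) / previous[item]
       for item in current}

for item in current:
    print(f"{item}: {changes[item]:+.0f} ({pct[item]:+.1f}%)")
```

Feeding the signed changes into a diverging bar chart makes the increases and decreases pop out immediately, which is exactly the “at a glance” effect described above.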

Beyond being way more accessible to the public—your potential donor base!—a visualized financials page also goes in your board book. Think of how your board of directors will kiss your face when you show them a financials page that makes them drool just as much as the glossy photo case study pages.

“Show me the impact.” Have you ever uttered these words? Do they stir in you motivation, excitement, perhaps a little nerdy thrill? Spin that excitement into 800-1,200 words in Connect next month! March’s theme is “Measuring Impact.”

Do you adore data visualization? Are you carrying some serious how-to action on better grant reporting in your pocket? Do you have some burning dashboarding successes to share? This is a chance to shine!

Connect articles generally include the following (you can check out our guidelines and monthly themes here):

  • 800-1,200 words
  • At least one evocative image (700px wide or greater) that you either own rights to or are under Creative Commons license
  • 2-3 sentence bio
  • Profile photo

Kindly note that Connect articles are about ideas, not products or services.

If you have an idea for a guest article, please email me and let me know! The deadline for March Connect articles is February 21.

And thanks to all you wonderful impact nerds!


Infographics are all the rage these days, and not just because they’re pretty.

The popularity of infographics reflects some core truths that nonprofit communicators know well:

  • Your supporters want information that’s easy to understand
  • People are bombarded by messages, and there’s limited time to grab and keep their attention
  • Humans process visuals much more easily and quickly than text
  • Images and graphics are more memorable than words

Cognitive scientist Alan F. Blackwell has shown that pictures are worth 84.1 words. That may not be 1,000, but it’s pretty significant if you want to communicate effectively to your supporters.

On one hand, creating an infographic can be easier than writing a bunch of text if you know what you want to say, you have compelling statistics or visuals, and you understand your audience and what they need. On the other hand, it can be easier to write pages and pages of text, with smart-sounding words, analysis, statistics, and stories. The truth is, neither will be effective without a compelling message that’s relevant and interesting to your audience.

Nonprofit professionals are focusing more on visual communications to convey results, impact, and need. But I recently learned that infographics can also be incredibly valuable in strategic planning.

After spending months developing a strategic communications plan for my organization, I was quite pleased with my 76-page, wire-bound document. It was filled with benchmarking, analysis, and (I think) smart recommendations. I presented the plan to my bosses over the course of two hours. We talked so much about the analysis that we barely got to the recommendations, which was the whole point.

Their feedback? The benchmarking, analysis, and general recommendations were fine, but there wasn’t enough about what we were actually going to do. And they didn’t have what they needed to be able to explain our strategy to other executives and board members.

With the help of an outside perspective—from Carrie Fox at C.Fox Communications, whose team has done excellent strategic and creative work for us—I realized that the substance was good, but I needed a different way to package it. Working together, we created a single infographic to summarize the key concepts of the 76-page document.

Communications strategic plan


In the end, we were able to communicate our team’s overall purpose, four strategic pillars, and our annual priorities. When my boss said, “Our board chair will really like this,” I realized it was what she had been looking for all along.

Because the analysis and planning had been done, boiling it down to its core wasn’t too complicated. What we ended up with was straightforward and simple – so simple that, to me, it’s almost common sense. Yet it reflects a bigger plan and has substance to back it up – and now others can understand where we’re going.

This approach worked so well that we’re now using our one-pager as a guide to develop communications strategies for our internal clients. Working to create a one-page plan doesn’t replace the need for thoughtful, comprehensive strategies and tactics – but this approach is helping us ask the right questions, stay focused, and create practical roadmaps that can be shared with others.

At a gathering in Bellevue late last year, documentary filmmaker Tim Matsui’s acclaimed “Leaving the Life” project about sex trafficking in the Northwest drew police detectives, trafficking survivors, social workers, concerned citizens, and even an ex-pimp.

Also in the house: our digital sidekick Harvis.

Harvis is an interactive web app we developed at A Fourth Act to help storytellers integrate authentic community engagement into their creative productions. It works like this: First, audience members point their mobile devices to our mobile web app and self-identify by category (e.g. police officer, survivor, social worker, etc.). Then, once the film begins, participants swipe up when they feel motivated to act and swipe down when they feel helpless, generating digital data that helps guide an analog discussion following the screening.
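As a rough illustration (not Harvis’s actual implementation), swipe events could be bucketed by audience category and time into the film, so that spikes in sentiment are easy to chart after the screening.

```python
# A toy sketch of aggregating swipe data by (category, time bucket).
# Event format and values are assumptions for illustration:
# (seconds_into_film, audience_category, +1 for swipe up / -1 for swipe down)
from collections import defaultdict

events = [
    (95, "police officer", 1),
    (97, "social worker", -1),
    (98, "social worker", -1),
    (102, "survivor", 1),
    (103, "social worker", -1),
]

def bucket_events(events, bucket_seconds=30):
    """Sum swipe values per (category, time bucket) to reveal spikes."""
    timeline = defaultdict(int)
    for t, category, value in events:
        timeline[(category, t // bucket_seconds)] += value
    return dict(timeline)

print(bucket_events(events))
# {('police officer', 3): 1, ('social worker', 3): -3, ('survivor', 3): 1}
```

Grouping by category is what makes the divergent reactions visible: the same moment in the film can register as a flat line for one group and a sharp negative spike for another.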

Get more tips! Subscribe to Connect Monthly to receive more content like this in your inbox.


We’ve now taken Harvis on the road about a dozen times with several different projects, but Bellevue remains the site of one of our favorite ‘huzzah’ moments, a stirring example of what’s possible when all the perspectives and expertise and emotions in a room are given a vehicle for expression. It started with a chilling scene in the 17-minute video chapter: Lisa, 19 years old and high on heroin, arrives at a Seattle shelter for sexually exploited women, picked up by police only minutes earlier while giving a client oral sex in his car. Viewers learn that she’s been working the streets since she was 13, stuck in a cycle of sexual exploitation to support her drug addiction.

If the scene offers any silver lining, it’s that Lisa appears to be in good hands. A police officer at the center wraps a fresh robe around her shoulders for warmth and lets her know that she can drop by anytime. “We care about you,” he says. “If you ever had a dream, we’re going to work night and day to help you achieve that dream.”

As all this unfolded on the big screen in Bellevue, the mobile web app captured the audience’s emotional responses in real time. The police officers, we found, weren’t particularly moved. It appeared that for them, the film clip depicted business as usual—a rather ordinary example of officers doing their best to comfort a victim. Not much emotional tug. So not many swipes.

But the mobile web app also revealed another insight, captured in colorful spikes on our interactive data visualization shared with the audience immediately after the screening: The social workers were swiping en masse. In their eyes, the police officers (both of them male) were towering ominously over the female survivor, perpetuating a power imbalance that hinders trust. Using Harvis’s comment submission feature, several of the social workers explained their reactions.

“This whole film snippet just feels so paternalistic,” one wrote. “It is patriarchy that created this whole problem in the first place, where girls and women are seen as helpless victims who need to be saved… These police have so much power and privilege that is not acknowledged.”

Added another: “The police officers need to ASK FOR CONSENT before they touch someone! That scene with the bathrobe was very disturbing.”

At a traditional film screening, these are the types of insights that exist only as passing thoughts—or, at best, as isolated comments in an open mic discussion dominated by the most outspoken voices. Our app creates a space where both introverts and extroverts (and everyone in between) are empowered to share an opinion.

It also creates compelling points of entry for the post-screening dialogue. In Bellevue, the social workers’ critical comments sparked a lively conversation about how police can more effectively support survivors of sexual exploitation. The police officers and social workers in attendance both offered their perspectives, allowing mutual understanding to emerge between two communities united by their shared purpose—but often separated by rigid silos in their professional practice.

Beyond Bellevue, one of our long-term goals in our collaboration with Tim Matsui is for the app to act as a training tool for police departments and other community organizations that serve vulnerable populations. The feedback we’ve received so far is heartening. The professional facilitator in Bellevue — initially a tech skeptic—raved about the app’s real-time data visualization and mobile-app feedback. And Tim gave the app a ringing endorsement: “It exposes the differences in opinion and helps create starting points for facilitated discussion,” he told us. “I’m integrating Harvis into my model for audience engagement, and [I’m] turning my movie into a movement.”

We love putting our app in the hands of creative minds who share our vision for connecting storytelling with social change. When the defining metric of success shifts from “awards won” to “ideas generated” and “change inspired,” exciting new opportunities emerge for storytellers.

Imagine, for example, what would be possible if city planners and urban residents came together to watch a stirring documentary about gentrification—and then discussed the possible solutions? Or if a film about criminal justice policy became the seed for an inclusive community conversation featuring the voices of police, prosecutors, community members, advocates, victims, and inmates?

What’s possible is collaborative social action—and not just for communities with a resident documentary filmmaker. We see our app as a tool for book writers, university professors, professional mediators, community organizers, and nonprofit activists—basically anyone aiming to convene conversations that are focused, inclusive and productive.

You can check out our video walk-through of Harvis or drop us a line.

Why would mapping communities be important for communications efforts?

Mapping is important because it can help us see where we are (current boundaries), where we’ve been (past reach), and where we’re going to go (gaps and opportunities).

It can share insights that might be buried in a narrative write-up that’s too dense or a data set that’s complex and hard to decipher.

A 1 1 0
B 1 1 1
C 1 1 0
D 0 1 0

Figure 1 Example of a relationship matrix.
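To show how a matrix like Figure 1 becomes mappable, here is a small sketch that converts its rows into graph edges. The column labels P1–P3 are placeholders I’ve assumed, since the original figure does not name its columns.

```python
# Reading Figure 1's matrix as a bipartite graph: a 1 means the actor (row)
# has a relationship in that column. Column names P1-P3 are assumed
# placeholders, not from the original figure.
matrix = {
    "A": [1, 1, 0],
    "B": [1, 1, 1],
    "C": [1, 1, 0],
    "D": [0, 1, 0],
}
columns = ["P1", "P2", "P3"]

edges = [(actor, col)
         for actor, row in matrix.items()
         for col, flag in zip(columns, row) if flag]
print(edges)
# [('A', 'P1'), ('A', 'P2'), ('B', 'P1'), ('B', 'P2'), ('B', 'P3'),
#  ('C', 'P1'), ('C', 'P2'), ('D', 'P2')]
```

An edge list like this is exactly what mapping tools (by hand, desktop software, or more robust tools) consume to draw the network.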

Mapping can help us explore new ways of approaching our work by showing where boundaries lie or patterns exist.

Figure 2 Image Source: Wikimedia Commons

You can map anything that involves two or more actors—organizations, donors, employees, machines, etc.—that have (or lack) relationships.

It can be done by hand, using desktop software, or more robust tools.

At the Leading Change Summit (LCS) in September, we will be exploring several different tool options and frameworks. The connection among these different approaches is creating a visual representation to look at data in a new way.

One powerful framework for doing that is Social Network Analysis, also called network analysis.

Math and Maps

Social Network Analysis (SNA) is a methodological approach that measures and maps network relationships. It is a conceptual framework that applies graph theory to sociological studies.


For organizations, individuals, groups, and so much more, SNA can be a powerful approach to understanding your various relationships. The possibilities for applying network analysis are almost endless—it can examine audience connections, innovation diffusion, disease outbreaks, sales, anything where there is an interaction between two or more network players.

You can create maps to see:

  • information flows (where does information get collected or pass through?)
  • learning or knowledge-sharing (who do people seek out to ask questions? where are cross-discipline connections sharing within an organization?)
  • attendees of events (who’s been to multiple events?)
  • donations to fundraising campaigns (which donor has contributed to several campaigns and by how much?)
  • key nodes within a network (what’s an actor’s Kevin Bacon number?)
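The “Kevin Bacon number” in that last bullet is just a shortest-path distance in the network. A toy breadth-first-search sketch, with made-up actors:

```python
# Toy "Kevin Bacon number": shortest hop count from a hub to every other
# node in an undirected network, via breadth-first search.
from collections import deque

network = {
    "Kevin": ["Ana", "Ben"],
    "Ana":   ["Kevin", "Cara"],
    "Ben":   ["Kevin"],
    "Cara":  ["Ana", "Dev"],
    "Dev":   ["Cara"],
}

def bacon_numbers(network, hub):
    """Shortest hop count from hub to every reachable node."""
    dist = {hub: 0}
    queue = deque([hub])
    while queue:
        node = queue.popleft()
        for neighbor in network.get(node, ()):
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

print(bacon_numbers(network, "Kevin"))
# {'Kevin': 0, 'Ana': 1, 'Ben': 1, 'Cara': 2, 'Dev': 3}
```

Actors with low distances to everyone else are the well-connected nodes; tools like NodeXL compute the same kind of measure (along with richer centrality metrics) for you.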

There are many ways you can get data to explore social networks. Digital channels like Facebook and Twitter can be especially rich sources of information.

Curious about which individuals and organizations are using your hashtag?

Map it!

Figure 3 Snapshot-in-time map of 100 tweets on July 17, 2015, of Twitter accounts using the hashtag #endpoverty. Nodes labeled by Twitter image and sized based on number of followers. Tweets limited to 100. Map created with NodeXL.

Want to see who is connected to you on Twitter?

Map it!

Figure 4 Snapshot-in-time map of NTENorg Twitter account on July 17, 2015. Limited to follows, mentions, and replies on that date. Boundary of 200 recent tweets per user. Map created with NodeXL.

Looking to see who donated to your various campaigns and the size of their donations?

Map it!

Figure 5 Map of donation network comprising four campaigns. Each node represents a donor and is scaled by aggregated donation size. Red indicates donors who donated to multiple campaigns; other colors indicate different campaigns. Map created with NodeXL.

Note: these visualizations build on data sets created in NodeXL (a free, open-source tool) that contain even more robust information on these individual actors’ attributes and relationships. These images serve just as the “tip of the iceberg” of the relationship data.

Is Mapping Everything?

Maps are visualizations of data to help see patterns and explore new ways of understanding information.

Connections are key to communities for social capital, identity construction, and information flows. SNA visualizes these network flows through nodes (individuals, organizations, events, etc.) and links (connections/interactions). These links can vary depending on context, even with the same actors.

It is important to note that network actors are complex. For instance, people can have a multiplicity of expressions—dynamic, evolving roles that change over time—within the same network.

Think of your family network. Sometimes you may be the “leader” of your family network, say, if you are organizing a family reunion; but other times, you may be the invitee when someone else takes on that organizer role for a birthday party or a hike.

Also, networks are not built around one actor, of course; they may well have multiple leaders as well as other roles. People are complex. Organizations are as well. Naturally, networks ought to reflect this complexity.

This network richness is an important element for organizations and individuals alike to grasp as they examine networks and relationship structures.

Limitations to Mapping

As with mapping unknown terrain, maps are only as good as the data put into them. Limitations of mapping can include a small sample size, respondents who aren’t representative of the entire audience, confusing questionnaires that yield inaccurate results, and so on.

Also, mapping is not a strategy unto itself—it is a tool to better see patterns, boundaries, and gaps in relationships. It helps inform our strategies; it does not create them.

Mapping allows us to see the landscape of relationships over time. As an NGO’s online community grows, it reaches new audiences by tapping into its members’ networks, which in turn open doors to still more networks.

Relationships, of course, are dynamic and can be actively shaped. This gives us real power, and it also requires us to repeat the mapping cycle: issuing a new survey, collecting new data, and analyzing the findings.

What have you mapped? What would you like to map? What tools and techniques have you found useful? Would love to hear your thoughts in the comments below.