Tag: Collaboration

Coming to the NTC? Good stuff. Do you have a laptop, tablet, or smartphone? Great. Then you should attend the “Hour of Code” workshop to learn how stuff works, like our smartphones and the software that runs them.

Through the Hour of Code, a global movement reaching 190,000,000 people so far, students from the age of 4 to 104 have learned about code using hands-on tutorials offered in over 40 languages. The movement is so popular that 200,000 events have been held! Even President Obama has participated in this one-hour introduction to computer science, which aims to demystify code and teach the basics.


So what will you learn? The focus will be on problem-solving skills, logic, and creativity: skills we all use in our nonprofit work every day! While you won’t walk away with the ability to code a website or app from scratch, you’ll know more about the code behind almost everything we use or touch in our modern world, and some of the coding concepts that will make you successful in your career, whether you choose to become a developer or not. Most importantly, you’ll learn that anyone can code! If you have any doubts about your abilities (as I did), or even questions about what code is, what it looks like, and what it means, this is a great place to begin exploring those questions.

What does this workshop represent at the NTC? A few things: an opportunity to try something new, or something you haven’t done in a while. It also means, particularly in 2016, that there’s no longer a need for a divide between a ‘program’ staff person and a ‘technical’ staff person: you can be both! At the very least, program, fundraising, and technical/operations staff should all be able to speak with each other in the same language, and the Hour of Code can help with that. For example, let’s say you’re working on migrating your constituent data into a new database. Once your donor, volunteer, and funder data is in the database, you’d like to be able to send emails to each of those groups, but you don’t want any one person to receive more than one email. The developer comes to you and says this will take many hours to accomplish. If you’re the Communications Director working on this project with a limited budget and a tight timeline, being able to ask questions such as, “Will the code be documented?”, “What coding language will you be using?”, and “Will I be able to update the code myself?” can help your organization decide whether to change your requirements or proceed with custom code. And we all want to know exactly what we’re getting into with expensive custom projects, right?
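To make that scenario concrete, here is a purely illustrative Python sketch of the kind of custom segmentation logic a developer might write so that no contact receives more than one email. The contact lists, addresses, and function names are made up for illustration and aren’t tied to any particular database or email platform.

    # Illustrative only: deduplicate email segments so nobody gets multiple sends.
    donors = {"ana@example.org", "ben@example.org", "cam@example.org"}
    volunteers = {"ben@example.org", "dee@example.org"}
    funders = {"cam@example.org", "eve@example.org"}

    already_emailed = set()

    def next_segment(contacts):
        # Return only the contacts who haven't been emailed yet, then mark them.
        fresh = contacts - already_emailed
        already_emailed.update(fresh)
        return sorted(fresh)

    # The order of this list decides which segment "wins" for people on several lists.
    for name, group in [("donors", donors), ("volunteers", volunteers), ("funders", funders)]:
        print(name, "->", next_segment(group))

Even a dozen lines like these raise the questions above: who documents them, who can update them, and what happens when a fourth list is added?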

Photo Credit: #WOCinTech Chat, cc v 2.0

What do you need to know? No experience needed. If you don’t have a laptop or browser-enabled device, come anyway! We will pair you with someone who has a device–it’s the best way to learn!

Take me to your leader: This workshop will be led by a coder and a non-coder. Kieren Jameson teaches women to code in the Salesforce language of Apex through ‘RAD Women,’ the course she co-founded and runs (along with many other amazing volunteers). RAD stands for Radical Apex Developers. Missy Longshore is a graduate of RAD Women and also learned the basics of her first coding language, JavaScript, through Girl Develop It. Kieren’s blog, Women Code Heroes, has also helped hundreds of people of all backgrounds understand core coding concepts in bite-size, easy-to-understand articles with helpful illustrations and graphics. To top it off, Kieren works at ETR, a respected nonprofit, where she’s been for over 15 years and is currently their Digital Solutions Manager. Missy has been working as a nonprofit staff member since 1999 and founded Longshore Consulting in 2012 to demystify the Salesforce process for as many nonprofits as possible by collaborating with non-technical staff. She does not have a computer science background but has loved the informal training and skills she has gained through nonprofit and volunteer-led coding courses and peer support.

Want more info? Want to start your own Hour of Code? The Hour of Code website has everything you need to create your own event at your local nonprofit, community center, conference, synagogue, meet-up, etc! You do not need to be a developer, and there are even ways you can host an activity with poor wireless access or even no Internet access at all!

A word on Diversity in Tech: Code.org, the great nonprofit behind Hour of Code, is dedicated to expanding access to computer science, especially among women and underrepresented people of color. Come learn how you can further this effort to increase diversity in computer science, but most importantly, come learn!

“If you build it, they will come” is a frequently quoted but often misguided philosophy for the design and development of technology and software. This approach puts the developer or organization creating the product ahead of the needs and desires of the community and intended users. As organizations look to increase engagement with communities, they often think technology is the answer, and many move quickly to choosing a platform. In the nonprofit sector, the result is often wasted resources in an already resource-scarce environment and, worse, the creation of a product that few use.

How do you avoid creating a tool that ends up in the Internet graveyard of good intentions? My participatory research on the process of technology development by and for communities revealed a principle that I want to shout from the mountaintops: Voice matters. The community of intended users, the developers, and the leadership are all stakeholders who contribute to the success of the tool, and a listening strategy that accounts for these voices has to be incorporated from the beginning. We must listen first and ask: what matters most to the community we hope will use this tool? In doing so, we account for their values and needs, clearing the way to make informed technology and design choices. A tool is more likely to be adopted if it reflects the values and meets the needs of its users. Listening to and accounting for different stakeholder voices early on makes all the difference.

To illustrate why voice matters and what putting a community listening strategy into practice looks like, I’ll use my most recent engagement with the Agora Journalism Center and Journalism that Matters. The two organizations put on a conference, Experience Engagement, in October to bring together journalists, civic leaders, and citizens who share interests and a commitment to community engagement. The leadership behind the conference wanted to build a tool to support and sustain the community beyond the conference.

In advance of the conference, I spent time with the leadership and listened for their intentions and goals for the technology project. Initially, we discussed a simple resource tool that centered on the value of information exchange and creating a repository of resources that anyone could contribute to or search.

With the support of leadership, I used the conference to apply my listening methodology to explore community voices and surface their desires. The leadership understood that listening now would save time and energy, and perhaps resources, later. If the needs of the community jibed with the originally envisioned project, wonderful! Full steam ahead! But if potential users wanted something else (or maybe nothing at all), the leadership team wanted to know before devoting funds, time, and energy to the project.

During the conference, I actively sought out insights that would inform our understanding of the potential tool. I listened as community members talked to one another, and I sat down with several of them throughout the conference. I discovered that the value other stakeholders saw in a potential tool went far beyond the idea of information exchange. This community wanted a tool that, yes, supported the exchange of information, but they also wanted a place to collaborate and network with one another. The civic leaders and non-journalists especially voiced the desire for a “safe” place where they could connect and share stories with one another and with journalists in order to spread knowledge and share lessons learned. The community clearly wanted something more than the conference leadership’s initial idea of a resource repository. In fact, the repository idea was challenged by several attendees. “It’s going to take a lot for me to regularly check in on a website… and if it’s just a bunch of links, I’m not going to bother,” one told me.

Often the only “listening” strategy in technology development is put into practice during the “user design” or usability stage—after resources have already been spent. As we saw with this project, incorporating the listening strategy much earlier in the process led to important insights that are continuing to shape and inform the development process. Just two weeks out from the conference, we are now actively examining what types of platforms already exist that can support the different values and functionalities sought by various stakeholders.

And listening doesn’t stop here. As with this project, whenever I work with an organization to develop its technology and communication strategy, I make sure we incorporate elements of listening first and listening frequently throughout the process. If your organization is planning to build or implement a community tool, start with the community, not the tool. Time and again, it’s the best way to make sure the organization and the community it serves both reap the rewards later.

Update

Since the last post, we’ve had a chance to dig through all the data and apply design methodology to surface lots of useful information to inform the next steps in building the platform.
First, we identified several values that are shared across the different stakeholders in the project that will drive the types of features we incorporate in this tool. For example, collaboration, networking, and information exchange are important to this community, and so we are working on identifying a platform that supports features that reflect these community values. This is much more specific than our initial ideas of creating a resource repository—we are learning that what matters to this community is much richer than simply sharing links.
Additionally, we realized that a particular segment of the community—non-journalists, including civic leaders, community organizers, and artists—was not well represented in our initial data gathering. To make sure we incorporate their needs and desires, we are currently conducting a series of stakeholder interviews with this group and inviting their voices into the process of developing the platform, so that what we launch represents the community’s needs and values at large. After accounting for more voices and incorporating them into our design thinking, we hope to launch the community platform within the first months of the New Year.


Whether we’re co-located or remote, every company wants its teams to be aligned toward achieving its vision, mission, and goals. And just like in the office, remote workers have to be deliberate about creating an inspiring culture. We just use different techniques.

So how do we create closeness even though we’re far apart? One way to start is by creating a team agreement. This is a living document that sets expectations and outlines acceptable behaviors about how a remote team wants to work together. How will we communicate with each other? What are the expected response times? How will we know what’s being worked on?

Each team has its own personality and its own style of communicating. When we establish a basic set of guidelines for working together, we can remove costly misunderstandings. And remember, it’s a living document. It’s important to re-evaluate the team agreement on a regular basis, especially when new members join or leave the team.

To enhance our virtual collaborations, there are various tools and techniques we can use to simulate the office. What we actually want to simulate is the humanness that we have when we’re together in an office. Here are some best practices from successful remote teams.

Use video when communicating

Adding video to the conversation significantly enhances engagement. We’ve all been on conference calls where we (or others) haven’t been paying attention (I’m as guilty as anyone). That’s because we’re programmed to respond to each other visually, so when we can’t see each other, we are easily disengaged. That can be OK for occasional calls, but when we work remotely, the ability to see body language, sighs, and smiles can greatly enhance communication.

Many teams use programs such as Google Hangouts or Skype to have work sessions with each other (“pair collaboration”). If you’re an organization with offices in several locations, try placing a webcam in each office so that teams can see each other. If you really want to go futuristic, try telepresence: there are some great, low-cost telepresence devices, like the KUBI, that can help make it feel like you’re in the same room together.

Set up a group instant messaging system

In an office, we often pick up on what’s happening by overhearing conversations or running into people at the coffee machine. Virtual teams can get this same kind of random interaction with group instant messaging systems like Slack or HipChat. These tools give a way to convey information to a group quickly. They also allow for more frequent opportunities for casual conversation, without the heavy burden of email.

Find ways to work out loud

Working out loud means narrating your work and making it observable to others. This is a technique for keeping everyone on the same page when we’re out of sight. It can be something as simple as sending a daily message to your team about what you got done that day. It could also mean keeping your Skype or instant message status accurate so everyone knows when you’re available.

There are many ways, low tech and high tech, to enhance the bandwidth of our communication. The important thing is to try things out and see what works for your team.

In order to feel more connected, it’s important to acknowledge contributions and accomplishments. Showing appreciation and saying thank you can boost morale. But more importantly, not showing appreciation can decrease morale. An easy way to share your thanks on remote teams is by sending an online kudo card.

Feedback is just as important as praise. Many teams are setting up 360-degree feedback systems so that colleagues and managers can more frequently measure each other’s performance. The Happy Melly team uses Jurgen Appelo’s Merit Money system. Each month, the team is given a certain number of merit points that must be distributed to other team members, with an explanation for why. All the information is transparent to everyone.

In addition to personal feedback, it’s good to get the team together regularly (virtually, of course) to reflect on how things are going. Some sample questions include:

  • What do you like about the way we work together?
  • What should we stop doing?
  • What are we doing well?
  • How do you feel about the tools we are using?

Retrospectives help identify what’s going well and what can be improved. Ultimately, retrospectives are there to inspire continuous improvement.

Trust on remote teams is based on reliability, consistency, and responsiveness. When we build these things into our everyday interactions, we can create and maintain thriving organizational cultures in a remote staff environment.

You can hear more from Lisette and fellow co-presenter Adriana Vela later this month in the Ask the Expert webinar, “Working Effectively with Remote Teams.” The Ask the Expert series is free and exclusive to Members.

 

It happens to us all… we get excited about a new technology solution and forget what it may take for our organization to make the most of it. Unfortunately, without careful attention to system adoption, your solution can be doomed to sit unused, even if its functionality is wonderful!

The good news: there are tried-and-true methods to plan for successful user adoption and, ultimately, a culture that embraces technology at your organization.

This March, I teamed up with Austin Buchan of College Forward, Kevin Peralta of Amigos de las Américas (AMIGOS), and Norman Reiss of Center for Court Innovation to present best practices for technology adoption at the 2015 Nonprofit Technology Conference. We also had a lively conversation with attendees; you can view the session slides here.

The following are some of the key points we discussed and the experiences we shared:

1. With Your Executive Team On Board, You’re In Business. When engaged leadership supports and promotes your system, it gains visibility and priority within your organization. If your executive director and leadership are actively promoting the new system to your organization, you’re probably in good shape! If not, have you defined the benefits concretely enough?

Your executive sponsor should be clearly communicating the tangible benefits to the organization, and willing to commit organizational resources (time and money) to the system over the long haul. (Having trouble getting buy-in? Here are seven tips.)

2. Align The Organization With Clear Communication and Involvement. During our session, we debunked the myth that people hate change. They don’t hate change so much as they hate disruption to their work. If you can help them understand the benefits of the change for them and the organization, you can remove resistance.

You likely have many key stakeholders to include in the project. Make sure your objectives are documented and clearly communicated to all of them (internal and external). Norman Reiss recommended having representatives for different groups in your organization if it is too large to have everyone at the table (literally) at once.

Kevin Peralta discussed a related and critical topic in the panel: resistant stakeholders. It is very common to have a skeptic or group of skeptics. Engage them early on so their voice is heard, and make them part of the process. Ignore them at your peril as others may agree but not speak up. If those stakeholders’ concerns are addressed, they can become your biggest champions, and strengthen your system as well.

Assigning roles also helps align your organization. Consider delegating the following roles: 1) your system administrator, who owns support, troubleshooting, and maintenance for the system, and who should be involved in your project from day one; 2) a steering committee who makes major strategic decisions about the system (such as the budget); and 3) influencers who help you encourage users to adopt the system (this could be your executive director, or just someone whom the team respects as technology-savvy). Make sure that these additional responsibilities take some precedence in your staff’s schedule, rather than being shoehorned into their existing workload.

3. Prepare Wisely, Setting Appropriate Timelines and Expectations. Once you have your stakeholders in alignment, you can begin to lay out your roadmap. Our top tips include:

  • Using a realistic estimate from your implementation partner, think about when you will be devoting the most time to the project, and whether this will overlap with other organizational priorities. Technology projects can often fail just because they are implemented during an organization’s busiest time of the year, when no one has time to focus on learning to use the new system.
  • Consider budget and funding cycles for your system.
  • Set expectations and timeframes for training and launch.
  • Ensure appropriate staffing for soliciting feedback before, during, and after launch, additional training/“office hours,” and engaging sluggish users.

4. Put Your Plan Into Action With the Help of Your Toolkit. The tools in your arsenal are your communications plan, documentation/recording of feedback, and training:

Thoughtful communication is an easily overlooked but essential component of system adoption. You want to set expectations before, during, and after the transition. The content of your communication is important; however, frequency and variety are also key. Don’t just send one email: repeat the message in different mediums to make sure everyone’s clear. Communication should also be a two-way street, where stakeholders feel free to ask questions and give feedback. Make sure you have dedicated roles for offering support initially and in the future.

We strongly recommend capturing/recording feedback during user testing. Ideally, you can incorporate this feedback into your business requirements roadmap for future prioritization and development, as your situation and needs will continue to evolve and change. This information may also be useful to document lessons and how-tos.

Austin Buchan shared training lessons learned at College Forward. Their system is used by fellows, or volunteers, who are only with them for 11 months. So it’s critical that they make it easy to learn the system in order to manage these ongoing waves of new users. Initially, they trained on the full array of functionality. However, they later realized that users were overwhelmed and forgetting what they had learned. They also noticed that certain functionality in their system was needed for three months, and only after that did fellows need to know some more complex functionality. The takeaway: break out your training so it focuses only on tasks users need to manage right away. Make sure it’s necessary and relevant. Less is more when it comes to retaining information. Training should be ongoing to keep up with organizational need.

5. It’s Not Over When You Deploy. It can be tempting to assume that if you’ve done this initial setup, everything will hum along smoothly like a well-oiled machine and you won’t need to think about it anymore. We’re here to tell you that you’ll always need to dedicate some time and resources to sustain your system and its users. For starters, set up a post-launch evaluation, with 30-, 60-, and 90-day reviews. Decide what metrics are important for you to measure success, both in terms of users and processes you are using the system for, and track these regularly. Continue to train, document, and communicate. Expect to iterate and improve for many years to come, and reap the rewards of going further in your mission!

New technology often brings with it a sense of optimism. Perhaps information will be centralized, communication made easier, time and money saved, or mundane tasks automated. But a few months later, data quality hasn’t improved, processes remain inefficient, and only half your staff is using the product.

The truth is, no major technology investment is a turnkey solution. Planning, communication, and commitment are essential to organization-wide adoption.

Fortunately, a few thoughtful, proactive steps will go far. Below are five steps to follow when creating your technology rollout plan.

Step 1: Identify Key Stakeholders and Users

While a handful of decision-makers select new tech, there are often many others who will be affected by its implementation. For them, the platform might bring new and possibly unwelcome changes to their daily work—especially if they are comfortable with current processes.

To mitigate these feelings, have a clearly defined communication plan in place. It’s important to educate those who will be affected, and to explain when and how changes will occur. Communicating early and often will help dispel questions, doubts, and uncertainties. Don’t forget about part-time workers, volunteers, contractors, consultants, or board members who might be affected.

Encourage open communication among affected parties. This will expose you to uncertainties early on and allow you to alleviate concerns.

Step 2: Choose the Implementation Team

The other group to involve in new technology rollout is the implementation team. Every major project needs internal champions to move the project forward and rally support.

Clearly defined roles create shared ownership and spur success. At a minimum, make sure the following are filled:

  • Executive sponsor: Internal advocate to help overcome roadblocks. Support from the C-suite will go far in setting the organization’s long-term vision, and in sustaining adoption beyond initial implementation.
  • Organizational administrator: The team lead and main point of contact for support, troubleshooting and product updates.
  • Implementation support: Team members who will work together on the technology rollout plan, internal communications, software adoption, and product evangelism.

The implementation team should include those who will work most directly with the software and be responsible for its success. Make sure all internal champions have the resources they need to succeed.

Step 3: Document Key Information

New technology can be introduced in a phased approach, which allows users to adjust to the new interface without feeling overwhelmed. Phases to consider include admin onboarding and training, setup and configuration, user onboarding and training, internal rollout, and ongoing training.

Document the messages, tasks, milestones, and deadlines that need to be communicated at each stage of the process. Consider including:

  • Initial announcement
  • Designation of roles and responsibilities
  • Major milestones and deadlines
  • Training session schedule and reminders
  • Requests for feedback

Anticipate questions that might be asked and formulate talking points so you’re prepared to address concerns.

Step 4: Map Out a Timeline

With a proposed rollout process and key information in hand, map communications to specific dates. Time communications around key milestones and deadlines, such as training sessions, when data will be migrated, and when a full switch will occur.

To avoid internal frustration, give recipients plenty of advance notice before these dates. For example, give users advance notice before training sessions so they can block off time on their calendars, and allow enough time to get organized before data migration. Be patient with the rollout and give your organization time to transition.

Also, note the best channel(s) to communicate with users. Email should be included, but other means — intranet, internal social network, paper documents, signage, or in-person meetings — might also be appropriate.

Finally, be strategic with messaging. The implementation team should consider when, how, and from whom communications will be relayed.

Step 5: Incentivize Change

Before your team can get on board with new technology, users must understand why their processes are changing.

Frame the transition in terms of how it will solve current challenges or prepare the organization for future success. Communicate the importance of achieving these goals and how technology will make it possible.

Weave the “why” into all communications. Also consider creating a one-page FAQ document for the internal champions to reference, including responses to key objections.

Don’t expect technology to solve all your problems. The transition has to start and be driven from within. Change management is challenging, but it’s never impossible. With the right team and plan in place, you have the power to transform your organization.

Consumer software products have started getting into the art of storytelling. When you use Airbnb, you’re not just booking a place to stay; you’re booking a personal experience. When you use Lyft, you’re not just hailing a ride; you’re getting a ride with a friend. Both products are focused on their community of users and on weaving their story into the very core of their products. The cloud provides the opportunity to hone a design’s story with core stakeholders easily and gracefully.

As an enterprise product designer, I weave the art of storytelling into the design process. Design decisions are often made so that the product or feature being designed can tell its own story.

Storytelling weaves the functionality and design of a product together with an emotional connection to the user. Nonprofit organizations aren’t any different from consumer or enterprise software products in this respect. Good design (defined as having both functionality and form) is no longer enough to build trust with people on the internet. People want to be told a story. With nonprofits specifically, it’s perhaps even more important that a story is at the core of the digital experience. A deep emotional connection to a story creates a stronger connection to the organization and a higher chance of being interested in an organization’s activities, which results in a desire to contribute.

Every user interaction should be thought out with the story in mind, and designs should constantly be shared for feedback to ensure good design as well as good incorporation of the story into the site or product. Once the overarching message that a nonprofit wants to convey to donors is figured out, the spirit of that message should be experienced throughout the site. Every screen and interaction of a site should be mocked up in low fidelity, including where users will click. At this stage, designs should not be pixel perfect: pictures of screens drawn on a whiteboard, or even paper sketches, are perfectly acceptable.

To send lo-fi wireframes, I’ll often draw on a whiteboard or sketch on paper, take a picture with my phone, text it to myself, then email it to other team members or share it on Dropbox or another cloud-based sharing solution for feedback. Both Dropbox and Box are fantastic solutions for saving and sharing designs, since everything is hosted in the cloud and shared folders automatically sync whenever anything is saved to them. For those who are fond of digital low-fidelity wireframes, Balsamiq Mockups is one of the simplest tools to use. Its drag-and-drop assets let you communicate concepts without a designer’s aptitude, and Balsamiq offers a significant discount for nonprofits.

Sharing an interactive prototype with either remote team members or users for feedback can be done by saving it into a Dropbox folder, then sharing the link. When users or team members are giving feedback, test to see if the story is both compelling and evident throughout each screen and interaction. For mobile products, sketches of screens can be used, then pieced together using POP: Prototyping on Paper to test out the interactions directly on a mobile phone.

After a site or product has launched, tracking metrics behind its success will also give you a good bar for how successful the storytelling is. For a simple site, Google Analytics will suffice. For sites with increased complexity, and especially for products, Mixpanel is invaluable in tracking behavior and doing cohort analysis. To track high-level snapshots of where users are clicking for quick iterations or A/B testing of screens, CrazyEgg does a phenomenal job at providing a heat map of user behavior.
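As a rough illustration of what that kind of tracking involves, here is a minimal sketch using the Mixpanel Python client; the project token, visitor ID, event name, and properties are placeholders, and your own instrumentation would live wherever the interaction actually happens (often in the site’s JavaScript rather than in Python).

    # Minimal sketch: record a story-related interaction for later cohort analysis.
    # Requires the Mixpanel client library (pip install mixpanel); values are placeholders.
    from mixpanel import Mixpanel

    mp = Mixpanel("YOUR_PROJECT_TOKEN")

    # Track an event so you can compare visitors who engaged with the story
    # against those who did not.
    mp.track("visitor-1234", "Story Video Played", {
        "page": "/our-impact",
        "campaign": "year-end-appeal",
    })

    # Store a profile property to segment supporters in cohort reports.
    mp.people_set("visitor-1234", {"Supporter Type": "donor"})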

Good design isn’t just about visuals or pixels; it’s about functionality and emotionally connecting with your audience or users. The stronger the emotional connection, the stronger the story.

[Editorial note: the following article first appeared in the December 2012 issue of NTEN:Change, a free quarterly journal for nonprofit leaders.]

When is a meal more than a meal? When it’s served in a soup kitchen. Most soup kitchens serve double, triple, even quadruple duty as a community center, health center, temporary shelter, and a place to get job services. That’s the complicated part of social change – complicated problems call for complicated solutions, even when our business models force us to focus on one simple arena.

Fortunately, soup kitchens don’t work alone in addressing hunger. The soup kitchen operates in a field of health clinics, shelters, and job training centers. Collectively, these organizations begin to address the complex issue of hunger. But more often than not, these individual organizations operate alone, never knowing if the individual receiving the meal is getting the job training that might help her find employment and become food secure.

Imagine if the soup kitchen knew what other services each client was receiving, and what holes in service it could recommend filling. If these organizations shared data, they could improve services and have a much clearer understanding of their impact.

The field of Nonprofit Technology Assistance Providers (NTAPs) began developing in the early 1990s. Since that time, the field has matured into several key players that contribute distinct value to the nonprofit sector by providing services, software, training, research, and community-building. Each organization plays a unique role in the sector, contributing its own solutions to the complex problem of helping nonprofits adopt and use technology to create more social change.

We believe that, like the soup kitchen, our overall impact could be strengthened through a better understanding of who we serve individually and collectively, and what that may imply about how we can better collaborate to meet the needs of those we serve. In January 2011, at a gathering hosted by TechSoup Global, we began talking about how to model our own ideals.

In summer 2011, we received funding from Microsoft to bring our groups together to answer at least one question: which nonprofits do our organizations serve, separately and collectively? Seven organizations participated: 501cTECH, Idealware, Network for Good, NTEN, NPower, NPower PA, and TechSoup Global. Together, we designed a data analysis project that answered this key question and provided tremendous insight into the nature of our relationships to each other and to the nonprofit sector as a whole.

Of course, collaboration is tough work. Building a collaboration in name only is easy, but one gust of disarray and it topples like a house of cards. We aimed to build a better collaboration, with a strong foundation and a sturdy frame: something that would stand the test of time while our relationships and collaborations evolved. Did we succeed? By most measures, we did. Here are some of the lessons we learned:

1. Build a Foundation of Trust: Define Parameters

While all seven organizations had agreed to participate in principle, sharing is never an easy task. In our initial conversations, we each had plenty of questions about where, and with whom, the data and the results of our analysis would be shared. In essence, what and how much data any org was willing to contribute was directly tied to how secure the process would be and how much control they had over what got shared. It quickly became clear that creating mutually agreeable security and transparency protocols would be our single most important piece of work for the project. The project would fail if participants did not trust in each other and the process.

To that end, we convened in November 2011 to answer those questions. Working with a facilitator, we discussed what each organization wanted to learn from the project (what was “in it” for us), what data would be required (what we had to give), and the conditions under which we could share that data.

We spent the first part of the day simply sharing information about our organizations, and finding common ground between participants personally. This was a great launching pad for our next step, identifying some common goals for the project:

  • Understanding demographics of the audiences for each of our organizations
  • Understanding where our organizations served unique, or overlapping audiences
  • Understanding what part of our audiences were deeply engaged in the nonprofit technology sector (and do they have some common traits?)
  • Understanding the demographics of nonprofits receiving paid and unpaid NTAP services
  • Identifying potential collaboration/partnership opportunities with other NTAPs

We also identified many more goals that we could potentially explore, but were deemed beyond the scope of this initial project because we were either unsure if we could share the data required, or sharing the data was too onerous.

This initial meeting was also used to outline a data submission process that would ensure that data could be transmitted securely to the data analyst and not be shared with any other project participant. Additionally, we identified the key elements of a non-disclosure agreement for all participants.

2. Strengthen Trust Infrastructure: Formalize Sharing Principles

With a clear foundation of trust—in each other and in the process—we left the meeting with direction and purpose. Getting the infrastructure we agreed on at the meeting in place seemed like it would be an easy task. Yet we were reminded once more that agreement in principle is not the same as agreement. No one wants to see how the plumbing works in their house, but knowing where the pipes are is an essential part of homeownership.

Setting up systems to securely collect and deliver participant data was actually relatively simple. Getting agreement from all parties on a Non-Disclosure Agreement (NDA) was not, however. The NDA was an essential part of the process, legally requiring that all participants in the project keep project information to themselves. In other words, what happened in this project stayed in this project, unless all parties agreed to share. Without this base, no one could be comfortable sharing their data.

Though all participants had fantastic intentions, anything involving multiple lawyers is sure to grind to a halt. In fact, it took us over two months to get an agreement in place. We had assumed we would get it wrapped up in a couple of weeks, and the delay set us back quite a bit. However, once we got it in place, things progressed rapidly.

3. Move Beyond Trust: Specify Expectations

With the agreement in place, we securely shared anonymized data and began getting findings back from our data analyst. During this time period, we met regularly on the phone, as well as at one in-person meeting. As the findings began to come in, it became clear that what one person meant by “map the sector” was not always what everyone else meant when they said the same words.

While we had agreed to pursue data in support of some specific goals, seeing the results come back made us acutely aware that participants often had differing ideas about what that goal meant (at worst) or how the data would be presented (at best). And sometimes our analyst had different ideas about those definitions than any of us. This is where we learned the hard way that you can never be too specific.

We had, metaphorically, clearly defined that we wanted a house with three bedrooms. We had failed to mention how big that house should be, or what style it should be built in. Fortunately, none of these discrepancies between reality and expectation were show-stoppers, but it did mean that we had to take a lot more time to walk through the findings and get as specific as possible.

Conclusion

Our organizations succeeded in answering our key research questions (and you can find our public report on what we found by downloading this pdf). We learned that, collectively, we reach over a third of the 1 million nonprofit organizations filing 990s. We also dove more deeply into how the distribution of that reach compares when examining characteristics like issue area, size, and location. Additionally, we took a nuanced look at the landscape of gaps and overlaps among the lists of those served, examining not only nonprofits reached, but those for which unpaid and paid transactions—as well as multiple types of paid transactions—had been recorded.
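For readers curious what a gaps-and-overlaps analysis of this kind looks like mechanically, here is a deliberately simplified, hypothetical Python sketch. The organization names and identifiers are invented; the actual project worked with far larger, anonymized datasets from all seven participants.

    # Hypothetical sketch of a gaps-and-overlaps analysis across NTAP service lists.
    from collections import Counter

    # Each set stands in for one NTAP's anonymized list of nonprofits served
    # (identified here by made-up EINs).
    served_by = {
        "NTAP A": {"11-0000001", "11-0000002", "11-0000003"},
        "NTAP B": {"11-0000002", "11-0000004"},
        "NTAP C": {"11-0000003", "11-0000004", "11-0000005"},
    }

    all_reached = set().union(*served_by.values())
    print("Nonprofits reached collectively:", len(all_reached))

    # Overlap: organizations that appear on more than one provider's list.
    counts = Counter(ein for orgs in served_by.values() for ein in orgs)
    overlap = [ein for ein, n in counts.items() if n > 1]
    print("Served by more than one NTAP:", len(overlap))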

Most importantly, we learned that there are tremendous opportunities for our organizations to work together to support each other, build the field of nonprofit technology, and increase the number of nonprofits that are really using technology effectively. Thanks to this project, we also learned a lot about how to make that collaboration happen.

Sincere thanks to each of the NTAPs that joined NTEN in this experience: 501cTECH, Idealware, Network for Good, NPower, NPower PA, and TechSoup Global; to the project team: Christine Egger, facilitator; Henry Quinn, analyst; Katie Guernsey, analyst support; and Amy Luckey, advisor; and of course our funder: Microsoft, without whom this assessment would not have been possible.

> Download the NTAP Sector Research Assessment Report here.

Holly Ross is former executive director of NTEN, and currently executive director of the Drupal Association. Holly was recognized as one of the “NonProfit Times Power and Influence Top 50” three times, in 2009, 2010 and 2011.

Has this happened to your nonprofit? One staff person or select members of the communications team are assigned to manage all of your organization’s social media. While this might make sense, it can be a lonely and complicated job speaking for an entire organization to the outside world.

Instead of siloing social media into “that stuff that so-and-so does,” what would a more integrated approach look like?

Since social media relies on social networks, let’s examine junctures in various organizational relationships where social media could be useful.

1. Relationships between staff and colleagues

Email is probably a big part of your relationships with colleagues. Sure it’s useful, but many of us are feeling overwhelmed by it these days. Social media can help lighten the burden and actually make email more useful. I’m thinking in particular about those “have you seen this?” emails. Imagine if staff shared interesting articles or videos with hashtagged tweets or even a collaborative Pinterest board or Storify?

Use social share buttons on websites or in your browser to easily categorize and save information that you want to share with colleagues. Curating this newsfeed-type delivery system might seem daunting, but the most important items have a way of rising to the top. Strive to make email an efficient tool primarily used for dealing with work-related tasks. If you’re actually looking for more information on the issues you work on, use a combination of listening tools like RSS readers (for subscribing to key blogs, YouTube/Vimeo channels, and Google Alerts), TweetDeck or Hootsuite (to follow hashtagged conversations on Twitter, etc.), or even your Facebook newsfeed.
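As one example of what a lightweight “listening” routine can look like, here is a small Python sketch that pulls recent posts from a couple of key blogs and flags the ones matching issues you follow. It assumes the third-party feedparser library, and the feed URLs and keywords are placeholders for your own.

    # Sketch of a simple RSS "listening" routine (pip install feedparser).
    # Feed URLs and keywords are placeholders for the blogs and issues you follow.
    import feedparser

    feeds = [
        "https://example.org/blog/feed",
        "https://example.com/news/rss",
    ]
    keywords = ["housing", "food security", "digital inclusion"]

    for url in feeds:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            if any(keyword in text for keyword in keywords):
                # Share the match with colleagues instead of forwarding
                # another "have you seen this?" email.
                print(entry.get("title"), "-", entry.get("link"))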

2. Relationships with program participants

We’ve all struggled with whether or not our website really reflects all the great work that’s happening in our organization. This is why blogs are important. Over time, short concise updates build a more complete and current picture of your nonprofit’s work. An editorial calendar can help you think through upcoming news hooks or events to tie your work to. But who’s going to do this, you ask? Think about easy ways to digitally “capture” your work as it’s happening. While professionally created photos or videos can be an important part of your communication strategy, it’s also possible to document the work of your organization easily and cheaply.

If your nonprofit focuses on social services, be sure to think through issues related to possible confidentiality of participants, especially when they’re minors. Print out a release form for interview subjects to sign (click here for a PDF sample). For public events, encourage all of your staff to document events and activities with fun mobile-based apps like Instagram for photos, SoundCloud for audio, or Vine for video. Rather than just assigning written blog posts to staff, ask staff to record short participant interview clips with audio or video that don’t require lots of editing. You can ask program participants to partner with you in capturing your work. For example, design activities or trainings that actually result in media content that you can use for your organization. Instead of activities using paper and markers, try mobile phones or video/audio recorders.

3. Relationship with volunteers and supporters

While your staff curate information online (see #1 above), your social media followers and supporters can “eavesdrop” on these conversations and look to your staff as thought leaders on your issue. But to move beyond a “broadcast” mindset, you can also involve volunteers and supporters directly in your communications strategy. Like staff, invite them to help capture events with video, audio, or photos. To cultivate your own mediamakers, recognize and thank them regularly. Also pay attention to the measurement tools provided by social media to see what your supporters like best. That way, you’ll have a better sense of what to make more of.

We often hear that “social media is not for broadcasting but for conversing.” While this is true, I think it’s a scary proposition for many nonprofits. Many wonder who has the time or energy to converse with all the people across the multiple relationships we’re detailing in this post. I agree that social media should be “conversational” in tone. Drop the advocacy-speak and wonky terms no matter what tool you use. This does not mean “dumbing down” your communication: just make it sound like the way someone really speaks.

Tip: It’s difficult to avoid a more formal style when you’re managing the organization-branded Twitter account or Facebook page. If you need to have direct conversations with your organization’s social media fans or followers, respond as an individual staff person. If this is uncomfortable for you, try setting up a specially created account that you use for work-related communication.

4. Relationships with broadcast journalists

If we want journalists to cover the important issues we work on, we should help distribute their reporting. News organizations are watching their web traffic to make editorial and advertising decisions. The more readers, viewers, or listeners that we send their way, the more likely they will report on our issues in the future. In social media terms, this means sharing links to articles and not copying/pasting text into emails. When tweeting, google the journalist’s Twitter handle and cite them when you share their media. And publicly thank them for great work!

5. Relationships with funders and donors

All nonprofits struggle with quantifying their work for funders and donors. To illustrate your work, you can repurpose much of the digital content that your staff, program participants, and volunteers have made. At the same time, analytics on your website, YouTube videos, and Facebook page reveal lots of information about the value of your work. But look beyond the number of Twitter followers, video views, or Facebook fans. Examine how others are sharing or talking about your content, how extensive your reach is, and how this has changed over time. Use these numbers to explain what funder and donor support makes possible.

P.S. I could have added a sixth relationship: with consultants like me. But I’m hoping this post will get you started with several ideas that can also be applied to other relationships.

Let me know what you think of the advice above. Which ideas are useful for your situation? What did I miss?

Photo credit: Marta Evry http://veniceforchange.blogspot.com/


[Editor’s note: The following is from the December 2012 issue of NTEN:Change, NTEN’s quarterly journal for nonprofit leaders. Read the complete issue on “Collaboration” when you subscribe to the journal for free!]

By Rachel Weidinger, Upwell

Developing a High Touch, Human Platform for Collaboration

At Upwell, we’re inventing a new way to work together. We’re a nonprofit, data-driven social media PR agency with one client, the ocean, and one goal: more people talking about the ocean. It’s our mission to make the ocean more famous online. Competition is real in the marine conservation space. “Blue” orgs get a small fraction of environmental funding. With just a year and about $1 million to prove our model, this pilot project (incubated by Ocean Conservancy) had no choice but to invent a new way to collaborate. Taking on this project felt like jumping off a cliff.

In my past work life, incentives made it too easy to focus on my own small slice of the pie. I struggled frequently, though I was often successful at bridging boundaries. At Upwell I wanted to hack the competitive dynamic, to disrupt the nonprofit communications “institutionitis” that keeps our heads down and focused on building our own email list, optimizing clicky action, and thinking of the people on our lists as statistics. Our philosophy at Upwell is that we’re part of a big ocean team that includes marine conservation organizations, marine scientists, and ocean activists. Our competition is Justin Bieber, not each other.

Social Mentions: single items of online content, such as a tweet or a blog post, that contain one of our mission keywords.

Attention on ocean issues is the currency of our collaboration. “Sharability” is the mechanics of how we make attention abundant. We monitor keywords in social media, determining a baseline of attention for the ocean and seven marine conservation topics: sharks, whales, overfishing, sustainable seafood, ocean acidification, marine protected areas, and the Gulf of Mexico. We work to spike the level of online attention focused on each of these issues. Instead of awareness, list-building, advocacy or fundraising campaigns, we run attention campaigns. We measure the success of the campaigns based on the number of social mentions they generate, and how much they elevate the baseline of attention for each issue.
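To make the baseline-versus-spike idea concrete, here is a small illustrative Python sketch. The daily mention counts are invented; in practice the numbers come from a social media monitoring tool (Upwell used Radian6).

    # Illustrative only: invented daily mention counts for one topic.
    from statistics import mean

    daily_mentions = [410, 395, 430, 420, 405, 415, 400,  # a typical week (baseline)
                      2200, 2600, 3100, 2900, 2750]       # days during an attention campaign

    baseline = mean(daily_mentions[:7])
    for day, count in enumerate(daily_mentions[7:], start=1):
        print(f"Campaign day {day}: {count} mentions ({count / baseline:.1f}x baseline)")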

As Upwell has evolved, so have our models for collaboration. For example, leading up to World Oceans Day (June 8), we strategized all sorts of bridging campaigns to amplify the messages of our informal network of marine conservation professionals. Unfortunately, when it came time to roll out the campaigns, many organizations couldn’t participate. Their organizational list-building goals for that day were a higher priority. Our social mentions from World Oceans Day were unremarkable.

So we tried a different tack. Instead of asking marine conservation organizations to collaborate on messaging, we facilitated their sharing campaigns and messaging ideas with each other, and provided tips and data to help them with their organizational campaigns. And instead of trying to create a new event, we decided to piggyback on one that was already getting a lot of attention.

Discovery Channel’s annual Shark Week has a reputation among conservationists as being sensationalist. It has also brought the single biggest spike in attention, three years running, to ALL eight of the topics we monitor. If you’re campaigning on ocean issues, Shark Week is the best chance all year to expand your audience. Plus, we discovered that the bulk of the online conversation is fans talking about how they love sharks and think they’re awesome. It turns out Shark Week is a massive online shark fan convention, and a huge opportunity for conservationists.

We decided to host a series of “Sharkinars” for organizations. During each brief webinar, we shared our analysis in quick, memorable bites, and asked organizations (if they felt comfortable) to share their Shark Week campaign plans and challenges. Rather than asking the Sharkinar participants to add their plans to a wiki (which we knew they would be too busy to do), we took notes about participants’ plans, dug up working links ourselves, and curated them in a toolkit, which we shared after the call. We’re big believers in high touch sharing, not just asking people to drop stuff in.

The kind of collaboration that works for us involves surfacing shared interests, disparate resources, and the elbow grease of getting it all organized.

Once Shark Week was over, we tracked our aggregate impact with our social mention tracking system in Radian6, and shared the impact with the Sharkinar participants. The result: we blew away our own expectations of what was possible in terms of raising the baseline conversation, increasing social mentions, and facilitating collaboration in the marine conservation sector. We learned that the kind of collaboration that works for us involves surfacing shared interests, disparate resources, and the elbow grease of getting it all organized. When we frame our work as competitive, we miss the chance to gently collaborate for a much bigger impact, like we facilitated during Shark Week.

I suspect that as do-gooders, we can do better by each other. Part of the framework of Upwell is a reaction to all of the institutional processes that keep us from working together, and from initiating, creating, and making real collaborations with our peers. I want to abandon the frame that keeps us from being our most expansive, network-building, big listening, and hopeful selves at our desks. We don’t have to work alone.

Rachel Weidinger, Director of Upwell, can be found on Twitter at @rachelannyes and @upwell_us