
When you first look at your AdWords account, it may seem like information overload: there are so many metrics! The columns that exist in your dashboard by default may seem like more than enough.

But I’m here to tell you that the columns that are already in there when you open up your shiny new account are not nearly enough. This is particularly true for nonprofits who use the Google Grants version of AdWords because they need to make quick checks for compliance with the new 2018 rules.

The following are my recommendations for which columns to include in the keyword view, but many of these options are also available when you are looking at a campaign, ad group, or advertisement.

Quality Score

There’s an old adage in journalism: “Don’t bury the lede.” In AdWords, you should see your most important columns first, without having to scroll.

With the new compliance rules, the most important keyword attribute is the Quality Score. If you have a Quality Score below three for any one of your keywords, your account is in violation. When you first create a keyword, you will see a dash where there should be a number. Those are called null scores and mean that the system has yet to assign a rating. These are allowed, but scores of one and two have to be paused. You can create a rule in your account to automatically pause these.
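The pause logic described above can be sketched in Python. This is only an illustration of the rule, not the AdWords interface or API; the keywords and scores are invented:

```python
# Decide which keywords must be paused under the 2018 Ad Grants rules:
# Quality Scores of 1 or 2 are violations; null scores (shown as a dash,
# meaning the system has not yet assigned a rating) are allowed.
def keywords_to_pause(keywords):
    """keywords: dict mapping keyword text -> Quality Score (1-10) or None for null."""
    return [kw for kw, score in keywords.items()
            if score is not None and score < 3]

keywords = {
    "donate to charity": 7,   # compliant
    "help animals now": 2,    # must be paused
    "new keyword": None,      # null score: allowed, no action needed
}
```

An automated rule in the account can do the same filtering for you; this sketch just makes the threshold explicit.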

Click-Through Rate (CTR)

The next most important column is CTR, and many would argue it is the most important. Though I agree it is the most difficult rule to adhere to, I rank it slightly below Quality Score: a single keyword with a failing Quality Score puts your entire account in violation, whereas the CTR rule is only violated when the account-wide CTR stays below the 5% threshold for two calendar months in a row.

Put another way, your total clicks divided by your total impressions must stay below 0.05 for back-to-back calendar months before you are at risk of suspension under this rule. Keeping the account above that threshold is still a difficult endeavor, but one in which each individual keyword’s performance has a diluted impact.
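The account-wide check can be written as a short Python sketch; the monthly figures below are made up for illustration:

```python
# Account-wide CTR compliance: the account is at risk only if CTR falls
# below 5% (0.05) for two consecutive calendar months.
def monthly_ctr(clicks, impressions):
    return clicks / impressions

def at_risk(monthly_stats):
    """monthly_stats: list of (clicks, impressions) tuples in calendar order."""
    below = [monthly_ctr(c, i) < 0.05 for c, i in monthly_stats]
    # True only if some pair of back-to-back months are both below the threshold
    return any(a and b for a, b in zip(below, below[1:]))

stats = [(600, 10_000),   # 6.0% -- fine
         (450, 10_000),   # 4.5% -- one bad month is not a violation
         (480, 10_000)]   # 4.8% -- second consecutive bad month: at risk
```

A single bad month resets nothing by itself; only the consecutive pair triggers the risk.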

You may want to set up a rule in your account to automatically pause low-CTR keywords as well, but be careful not to overdo it. Pausing every keyword below 5% may not be necessary to lift your account above the account-wide CTR threshold, and it could drastically reduce how often your ads are seen.

Clicks, Impressions, Average Cost Per Click, and Cost

These four come standard with the account, and they’re good to keep. They let you know whether your advertisement is being seen by people, how many of those who see it actually click on it, and what you have to pay for the click.

Among these four, total cost often gets overlooked simply because all the Ad Grants money is free. It shouldn’t be: even though you don’t have to pay Google for it, the traffic that comes from this AdWords spend is very valuable, and the budget is use-it-or-lose-it. You get about $329 to spend every day ($10,000 a month), and unspent money doesn’t roll over. Don’t leave that money on the table.

Conversion Tracking Information

If you haven’t set up conversion tracking for your account yet, make that your number one priority. Maximize Conversions bidding is literally the only good news that came out of the 2018 rules update.

There are many ways to track conversions. The Ad Grants team recommends using Google Analytics to set up E-commerce tracking for financial transactions and Goals for non-financial conversions. I agree, mainly because this method is one of the easiest to implement and gives you the most thorough information.

They also warned against setting up the view of a key page as a Goal in Analytics. They see it as an attempt to game the system, since you could make the landing page for your ad the key page. The Ad Grants team announced in a recent video that they will be checking for excessively high conversion rates. Though they did not specify the threshold at which a rate violates the policy, their compliance guide states that a normal conversion rate is 1% to 15%. I recommend revising your conversion tracking if your rate falls outside that range.

There are currently 16 different conversion-related metrics that you can add as columns. The two I recommend above all others are total Conversions, so you can see whether you have enough data in the system to start Maximize Conversions bidding, and Conversion Rate, so you can see whether your rate is too low or too high.
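A conversion-rate sanity check against the 1%–15% band mentioned above can be sketched like this; the click and conversion counts are invented:

```python
# Flag conversion rates outside the 1%-15% range the Ad Grants compliance
# guide describes as normal. Rates are fractions (0.04 == 4%).
def conversion_rate(conversions, clicks):
    return conversions / clicks

def needs_review(rate, low=0.01, high=0.15):
    # Outside the normal band: revisit how conversions are being tracked,
    # whether the rate is suspiciously high or uselessly low.
    return rate < low or rate > high

rate = conversion_rate(conversions=40, clicks=1_000)  # 4%
```

A rate above the band suggests over-counting (for example, counting a landing-page view as a conversion); a rate below it suggests tracking gaps or irrelevant traffic.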

Expected CTR, Ad Relevance, and Landing Page Experience

Expected CTR, Ad Relevance, and Landing Page Experience are recently added columns.

  • Expected CTR estimates how likely someone is to click on your ad, based on past performance.
  • Ad Relevance measures how closely a keyword matches the content of your ad.
  • Landing Page Experience measures how useful your landing page is for the search term, based on how well the content matches the search and how easy the page is to navigate.

These columns are rated as below average, average, or above average. All three factors feed into the Quality Score the system assigns your keyword, so if a keyword has a very low Quality Score, these columns are the place to look for the reason.

Additional Options

I said at the start that your keyword columns will mostly match those in other views. A notable exception: in the search terms view, you can add a column for the keyword that triggered the advertisement. I find this especially helpful because when my advertisements are showing for searches that are not particularly relevant, I can quickly find the keyword I need to alter, most commonly by switching it from broad match to phrase match.

Another column you may want to add for a variety of views is landing page. I don’t personally have this because each campaign I design is organized around one landing page, so I know the landing page by looking at the campaign title. But if you organize your account in a different way, it may be helpful to see where exactly your advertisements are sending your users.

Trust, humanity, equity, and privacy are the four pillars of the responsible use of data. These themes emerged spontaneously from presentations given at Data on Purpose: The Promises and Pitfalls of the Connected World, hosted by the Stanford Social Innovation Review (SSIR) in February 2018. The conference covered the current state of data for nonprofits and offered a glimpse into what the future holds.

Several best practices for designing big data systems, measuring impact and trends, and making decisions about what’s next were also revealed. Taken as a whole, they form a guide to how we should move forward into a data-driven landscape.

The current state of data: messy

SSIR Managing Editor Eric Nee said less than 0.5% of all data has ever been analyzed and used. While surprisingly low, that makes sense, given that the rate at which we’re collecting data is exploding. When a field is growing so fast, it’s a challenge to build streamlined, polished systems that can keep up. Often, there are no common standards for data formatting, which makes it challenging to aggregate data and obtain more informed insights.

People measure the wrong thing. Erika Salomon, a data scientist at the University of Chicago who worked on the Data Maturity Framework, said the desired social change outcomes are often not captured within data sets. For example, a homeless organization might count the number of people in beds at a shelter every night, but that doesn’t tell us whether people’s lives are actually improving.

Nonprofits may lack internal data culture. Data and tech exist in silos, and may not be included in decision-making, reporting, and planning processes. Or when data is collected, people don’t know that proper formatting and consistency are essential to making sense of the inputs.

The future state of data: exciting and unknown

Moving forward, our ability to use data to make good decisions will become dramatically more powerful. We’ll have sharper insights, more quickly and more reliably than ever before. We will harvest vast amounts of ever more fine-grained data, and it will become more readily available as the world moves to a more data-transparent, query-able environment. This will happen alongside the increasing adoption of the internet of things and wearable devices, and amid the current lack of regulation on access to and use of data.

We’ll be able to make accurate predictions. The rise of AI means much faster collection and automation of analysis processes. With enough fine-grained data, we can build predictive data systems. For example, migrating birds may need pop-up water spots in drought-ridden areas at certain times, and the ability to forecast that means we have the opportunity to provide a solution.

The four pillars

1. Trust is key

It’s critical that people trust organizations using data, the data itself, the algorithms used to parse it, and the assumptions and biases built into the methods of collection and analysis. Those who control data must be good stewards of that power.

Organizations must ensure how and why they are collecting data are mission-driven practices. One method is to “champion the customer,” according to Erika Salomon. Organizations should listen, and employ participatory design practices to achieve positive outcomes.

Communities and individuals must have input. Big data must be given local context in the form of “ground truth”: information coming directly from the people, places, or things surveyed. One example is the Streetwyze platform, which uses community insights about neighborhood health to empower local residents. Without that local context, big data risks producing inaccurate insights, which can lead to detrimental and discriminatory policy-making and power structures.

Trusting algorithms really means trusting those who wrote them. Data practitioners should beware of bias built into the algorithms that collect and analyze data, and should examine the assumptions that may be embedded in the methods of data collection and use.

2. Remember your humanity

"The key is remembering: Ultimately, it's not about the data. It's about people." - Marcy Rye

There is fear in the nonprofit sector that data turns human beings into cold, hard numbers. Finding ways to humanize data and even how we talk about data is critical to widespread acceptance of data-driven culture. It’s a basic human need to want to connect with others. We must keep this in mind, and ensure data systems advance humanity.

Data is simply information. Technology is simply a set of tools to manage information. Both information (data) and tools (technology) can be used for social good. This kind of straightforward language can help open doors for conversations about the benefits of data, and change perceptions of its value.

Pursuit of efficiency can result in dehumanizing processes. At Laboratoria, a web development and job skills service for Latin American women, they initially posted just the technical bios of women seeking work in coding jobs. But when they included personal stories about the women, the rate of hire increased dramatically. Hiring managers said it improved the interview process because they felt like they already knew the candidates.

3. Community involvement drives equity

Equity must drive the construction of new data ecosystems, methods, and tools. The concept of “ground truth” is part of this equation—ensuring that local communities have power over their own data. But there’s more to consider.

There are several obstacles to data equity that must be considered in development of any data practice, which include:

  • Availability: Can you get to it?
  • Access: Can you actually use it?
  • Awareness: Do you know that it exists?
  • Affordability: Can you afford it?
  • Agency: Are you able to use it the way you want?
  • Ability: Do you have the know-how to get it done?

Inclusiveness and participatory design are two ways to move towards equity. Organizations should involve the surveyed communities, or better, let them lead. People and communities should own their own data and systems. Practitioners should learn to balance their power, and scale back their role—leaving room to empower others.

4. Privacy is everyone’s responsibility

It’s important that people and communities are in charge of their own data. They should be able to have a say in, if not outright control of, what happens with data collected from them.

There is risk of abuse when an organization captures large amounts of detailed data. Planet, Inc. has thousands of cameras and satellites orbiting the earth, snapping photos of every square meter of the globe. Andrew Zolli, VP, promised us that the company would never give the data to anyone with nefarious intent. He was convincing. But there’s really nobody to stop them. We could never really know the level of detail the company is actually collecting. We must take Zolli at his word. The recent Facebook/Cambridge Analytica scandal underscores this problem.

People should control their data the same as they control their money. Equitable legislation (and enforcement thereof) that dictates how corporations may use an individual’s personal data would help. Mark Zuckerberg’s testimony about the Facebook platform’s use of data is highlighting the importance of this. The EU’s recent release of the General Data Protection Regulation (GDPR) leads the way on equitable legislation, and serves as a model for other parts of the world.

Nonprofits need a data culture

Seventy-five percent of nonprofits collect data, but only 6% think they use it well, says Stanford Program on Social Entrepreneurship lecturer Kathleen Janus. By growing a strong data culture, nonprofits can gain real insights about the realities, scale, and impact of their work, and have the information to adapt when necessary.

Data and technology should flow through all aspects of operations and programming. Staff should be trained on why quality data collection matters, and how to do it. The key is remembering the humanization factor. Ultimately, it’s not about the data. It’s about people.

Even small organizations can and should build a data culture. They should focus on what they can realistically measure. It doesn’t need to be sophisticated. It can be as simple as tally marks on a whiteboard or a basic spreadsheet. They could pick 3 to 5 key metrics or even measure proxies if direct measurement is out of reach.

Sharing how and why organizations are collecting data is becoming a donor requirement. To meet this requirement means organizations must first build a strong data foundation. It also requires funders that want the data to help pay for building the culture, and collection, analysis, and reporting.

Connect data sets for stronger insights

The lack of data standards across most sectors makes connecting data sets difficult. But it won’t be difficult forever. By connecting data sets, we gain new insights, identify new patterns or trends, and see systems from a different vantage point. All of this helps us build tools that benefit the sector as a whole, and helps everyone make more informed, better decisions.

In 2012, the US Internal Revenue Service made Form 990 data openly available. It’s an underutilized and incomplete data set, but it’s still possible to build a data engine that could parse it in various ways to help us better understand, for example, the ecosystem of all environmental nonprofits. Charity Navigator has, in fact, created a repository for anyone to use to explore the 990 data.
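As a rough illustration of what such parsing could look like, here is a Python sketch that filters a 990-derived data set down to environmental nonprofits. The records and field names are invented, not the actual IRS or Charity Navigator schema; the one real detail is that NTEE major group “C” codes classify environmental organizations:

```python
# Filter a list of 990-derived records down to environmental nonprofits.
# NTEE codes beginning with "C" cover environment-focused organizations.
# Field names ("name", "ntee_code", "revenue") are illustrative only.
def environmental_orgs(records):
    return [r for r in records if r.get("ntee_code", "").startswith("C")]

records = [
    {"name": "River Cleanup Fund", "ntee_code": "C32", "revenue": 250_000},
    {"name": "Food Pantry Network", "ntee_code": "K31", "revenue": 480_000},
    {"name": "Forest Trust", "ntee_code": "C36", "revenue": 1_200_000},
]
```

The same filter-and-aggregate pattern scales from a three-record list to the full open data set once the formats are standardized.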

A focus on building tools that can help entire communities of practice, systems, or sectors can drive the process of connecting data sets. And if we keep in mind the need for standard formatting across individual data sets, the connection process can become easier as we go along. This can empower people to solve problems faster and more effectively.

Measure (and share) your impact

Once you’ve built a humanized, trustworthy, equitable data system that’s collecting and aggregating data, it’s critical to know if your efforts are working. Measuring impact lets you tell and prove your story of impact to funders. And it provides internal information necessary to make improvements to operations and processes.

There’s more to proving impact than having data. Weaving your data into a good story is a powerful way to capture emotion while providing evidence of your success. Choosing the right data and story is essential.

In choosing what to measure, distinguish between outputs and outcomes. Outputs are vanity metrics, and are often easy to measure. Outcomes are indicators of actual behavioral change, but can be harder to measure.

Dr JaNay Nazaire, Managing Director for Performance and Results for social change organization Living Cities, provided a framework for data-driven decision making. It can help organizations identify the right metrics. These are four of the five steps.

  1. Define the problem: What are the root causes? What are measures of success?
  2. Take a data inventory: What do you have, need, want? Is it accessible? Does it need work?
  3. Be smart about data collection: What can you start with? How can you fill gaps? Who can help you?
  4. Communicate clearly and powerfully: What can everyone understand? What will galvanize people? What visuals will get attention?

In answering these questions, organizations can discover the metrics that prove they are actually solving problems. Your critical metrics must address root problems and measures of success, be something you can actually measure, be realistic in terms of your ability to measure it, and support your powerful story of impact.

Better decisions with data

Data-driven decision-making is a way to systematize and validate the critical decisions nonprofits must make. Start by identifying key metrics. Then set up a system to collect data about them, and frame a powerful story of impact supported by them. Now you’re ready for the fifth step of Dr Nazaire’s data-driven decision-making framework. This is where you make the actual decisions.

Business intelligence (BI) tools can offer powerful insights. Jaclyn Roshan, from myAgro, discussed their use of BI. There’s a data warehouse, data mining, analytics, a dashboard, and data visualization of outcomes and processes. This helps them make smart, timely decisions about the farmers they serve. There was a case where many crops had become infected with a disease early in the harvesting cycle. But through their business intelligence they were able to tell farmers the best time to harvest to get the maximum crop yield before the disease destroyed the entire crop.

Cross-referencing different data sets yields insights. Impact View Philadelphia uses the 990 data and data from the US Census Bureau to display information about local nonprofits juxtaposed with information about the people living there. You can see where organizations are on the map, what type of organization it is, and add demographic layers like median income or poverty rate to understand proximity of organizations to those in need.
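The cross-referencing described above boils down to joining two data sets on a shared geography key. A minimal Python sketch, with all fields invented for illustration rather than taken from the 990 or Census schemas:

```python
# Join nonprofit records to census-style demographics on a shared ZIP code,
# the way a tool like Impact View Philadelphia layers the two data sets.
def join_on_zip(orgs, demographics):
    demo_by_zip = {d["zip"]: d for d in demographics}
    # Merge each org with the demographic record for its ZIP, if one exists
    return [{**org, **demo_by_zip.get(org["zip"], {})} for org in orgs]

orgs = [{"name": "Youth Arts Center", "zip": "19121"}]
demographics = [{"zip": "19121", "median_income": 28_500, "poverty_rate": 0.38}]
joined = join_on_zip(orgs, demographics)
```

Once joined, each record carries both the organization’s identity and its neighborhood context, which is what makes proximity-to-need questions answerable.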

Final thoughts

Access to data sets is expanding all the time. By learning more about ourselves and our communities, and by connecting data sets in smart ways, we can start to build data-driven decision-making tools that lead to improvements in people’s lives and communities.

Trust, humanity, equity, and privacy should be at the heart of our data-driven future. What we build should be in service to the communities they are for. Data systems should be protected from potential abuse through legislative changes, changes in business practices, and education of individuals.

Organizations should begin building a data culture and start sharing data. This will allow them to measure and prove their impact for fundraising and self-improvement. It will also help them make informed, data-driven decisions.

If we handle our emerging ability to gather, analyze, and share massive amounts of data well, we will be poised to make truly dramatic improvements to how we handle all the systemic problems nonprofits exist to solve. Ultimately, it could result in a world that is more equitable, supportive, and safe for all.

Your constituent relationship management (CRM) system can be one of the most impactful pieces of technology that your nonprofit uses. It helps you create an infrastructure to centralize your constituent data, and also allows you to track the goals and successes of your fundraising, volunteer, marketing, and program efforts.

But the truth is, technology is constantly changing, and your organization’s needs are constantly changing as well. At some point, your organization will need to optimize or change CRMs. If a CRM change is on the horizon for your organization, there are some critical things to consider before shopping for your next system.

The questions below can be your guide in creating the vision and strategy you’ll need in order to implement a new CRM. While it may seem laborious and time-consuming, doing your pre-work will pay off in leaps and bounds when your new CRM is ready to go live.

1. Who will use the system?

First and foremost, you need to understand the CRM users and their needs. This will drive everything from licensing to system administration and maintenance to user adoption, training, and change management. Understanding the users will also inform the features and functionality the system must have.

For example, if the users will be field staff, then your CRM will likely need a mobile access option. Taking a user-first approach will set you up for success.

2. What does your data look like?

In addition to users, you also need to consider your data. Where does the data live now? What governance and processes do you need to put into place to ensure data quality? Are there any security considerations for any of your data? What decisions will the data inform? And, to make those decisions, what types of analytics do you need? These key questions will guide the type of CRM you need, providing insights into the features and architecture required. Thinking early about your data will also give you plenty of lead time to clean up and archive data that is not critical for your new CRM.

3. Does your CRM need to talk to other systems?

Increasingly, users demand a seamless data landscape. Bringing together disparate systems is one of the top motivators for moving to a new CRM or enhancing an existing one. Pushing and pulling data to and from other systems is a requirement that may determine which CRM you select. Do your homework so you know whether there are out-of-the-box solutions for the integrations you need, or whether you’ll need to build something custom (and then budget accordingly).

4. What is your budget?

For most nonprofit organizations, this might be the very first question they tackle—or the first roadblock they experience. Budget is almost always a driving factor in technology decisions. When reviewing costs, consider both immediate and ongoing costs. Keep in mind integrations, ongoing licensing and system maintenance, training, staff needed to run the system, and the actual cost of moving systems, whether this cost is internal staff time or a contract with a partner. Carefully plan your total budget so that when you start shopping for a new CRM, you make decisions with this in mind.

5. Who will own the system?

Implementing a new CRM is a good opportunity to define the rules of the road that will govern that system going forward. It is easy to fall into the trap of thinking that the technology will run itself.

At the organization level, you should determine a governance structure. I often recommend a governance committee that represents each user stakeholder group (accounting, fundraising, etc.). This committee reviews issues, requests, and opportunities (such as newly released features or upgrades) and develops the strategy for the system. It will also identify and define best practices and escalate any critical risks before they can cause an operational standstill.

Your CRM and how you use it will constantly evolve, so you need to put together your crew for steering the ship before you set sail. This governance committee can also play a critical role in reviewing and selecting your new CRM, and in the launch process.

6. Do you have leadership buy-in?

To secure buy-in, write up a plan or create a PowerPoint presentation showing that you have thought through all of these key questions. Outline your vision, strategy, and roadmap. Engage leadership in decision-making and keep them informed as plans for the new CRM evolve. Coach leaders on how to talk about the change and tee them up to do so. Visible leadership buy-in is invaluable for user adoption, and long-term success relies on leadership involvement.

7. Do you need to re-engineer or design any business processes?

Before you decide to change CRMs, be sure that you have evaluated the key processes the CRM will handle. Your processes should be tech-agnostic: work to remove platform-specific language from them, define them in your own terms, and then find the technology that meets those needs. Ensure that processes are defined before introducing a new technology that could influence how things work.

Your business process should help define your technology needs—not the other way around. Working on business process improvement prior to launching your CRM migration will help keep your search clearly defined and launch timeline on track, so your team will not have to pause to review and make decisions on processes.

8. How change-resistant is your organization?

Take a look at your organization as a whole. How do folks handle change? What has been successful in the past and how can you learn from that? Are there any critical stakeholders who may attempt to disrupt the change? Has your organization undergone any other major shifts recently?

Think specifically about those folks who are responsible for key components or processes in your existing system. Ensure that these individuals are brought on board early and are engaged in the process. Provide them with training on the new CRM so that they can quickly become the experts in the new system and feel secure. Making a change without these folks on board could leave them feeling vulnerable about their jobs, putting the new CRM launch at risk.

9. What type of learners do you have?

You’ll need to engage users early and often in the change. Identify the learning style of your organization, of specific teams, and of individual users. Develop and offer training materials that meet users where they are, and appeal to different learning styles by providing materials in various venues. And do not wait until the new CRM is ready for go-live to provide training: early training sessions can build excitement and engagement. Your reward for this hard work will be user engagement, user adoption, and system success.

10. Who will implement the change?

Do you have the skills in house to change CRMs, or will you rely on a partner to work with you? Think about what specific skills and traits are important for your selected partner. Do you need someone who knows nonprofit speak, or someone who has handled fundraising implementations? You may also want to consider issuing an RFP or interviewing potential partners more informally to make sure you select the partner with the most knowledge and the best cultural fit for your organization. Change is not easy, but if you build out your team, it will be easier.


You’ll notice that this list of considerations does not include timeline. Timeline is definitely important; however, prioritizing a timeline could put unnecessary pressure on your organization to move too quickly through the important strategy and planning stages. For higher adoption and overall success, your CRM strategy and vision should come first, using the key questions outlined above. Then, using this strategy to decide which CRM you will use and how you will implement the change, you will define timeline, launch phases, and other important implementation and change management–related details.

Contrary to what most database vendors will tell you, picking a CRM is the least important part of being successful at tracking how your participants engage with your organization. The three key aspects of CRM success have more to do with organizational planning and staffing than they do with technology. All of them take investment of time and energy at all levels of your organization.

1. Culture of database use

What this looks like:

  • Staff across the organization understand the role of data in their work and in the organization overall.
  • Staff across the organization know how to use the database appropriately for their role.
  • Agreed-upon information is collected consistently and entered promptly, including changes to contact information and resolution of duplicate records.
  • When someone needs contact information or a report on the latest event attendance numbers, they go to the CRM—not to their Outlook or to their team member’s spreadsheet.
  • Problems are reported as they arise and resolved on a reasonable timeline.

2. Strong data strategy

What this looks like:

  • Desired outcomes drive data collection and analysis, not the other way around.
  • Clear communication on what is being collected and why.
  • There is a reason behind every piece of data collection.

3. Solid implementation

What this looks like:

  • Fields and other data structures exist to capture everything called for by the strategy.
  • The searches and reports called for by the strategy are available.
  • Processes for data entry, reporting, and updating are not experienced as frustrating or overly onerous by staff.

How to cultivate all three

Here are the most important practices that I have seen lead to success with CRMs; together, they address all three key facets in an interlocking way.

Assess your needs early and often

As a first step in any change of data tools or features, ask the right questions:
  • What do we need to know about the people and organizations who participate in our programs in order to do this work?
  • What are the activities that our CRM needs to support?
  • What information is currently missing that would transform our work if we had it?
  • What information do we need to support daily tasks or to provide to our partners, members, donors, funders, and other stakeholders?

Make the time

  • Make sure that all data management duties (tracking problems and new feature requests, training staff and maintaining documentation, designing clear processes) are assigned to someone in the organization. This work can be shared among more than one person, though if the work is spread out, it is helpful to have a single point person who is in charge of coordination. Make it official: put it in job descriptions and workplans, and ensure that the time is truly available.
  • Include data entry and other related responsibilities appropriately in all job descriptions and workplans of all those who are expected to use the CRM.
  • Set clear and consistent expectations for data entry (backed up by workplanning).
  • Promote accountability with regular attention to database practices and outcomes in supervisory check-ins.
  • Make time at staff or team meetings to discuss database challenges, wish list items, and ideas. This is a great venue to discuss how changes to program work should be reflected in the database or should prompt changes to it.

Support the database

  • Have a clear feedback loop for staff to report bugs and problems, and a plan in place for how they will be fixed; this can be as simple as a spreadsheet or as complex as a help desk staffed by a vendor or consultant.
  • Have a clear feedback loop to ensure that the database stays up to date as organizational work changes. Make sure that all staff know to talk to the database manager about changes to work that is tracked in the database.
  • Have a budget for support for and changes to the database, and a decision-making structure for how the money gets allocated if there are competing priorities.

Support your people

  • Train on both strategy and implementation—in the right order for your organization. Consider current organizational needs and challenges, along with what skills and expertise already exist on your staff. Some organizations need strategy first, and some need implementation first. There is no one-size-fits-all training plan.
  • Establish clear protocols for what needs to be entered, and document them, so that everyone knows what to collect and where to put it.
  • Include role-specific task training in all new staff orientations and make this training available to existing staff as a refresher.
  • Identify coworkers who are power users or both interested in and enthusiastic about the database as peer champions: enlist them to cheerlead for the importance of the database and to offer their support to others by answering questions and providing quick task-based demos on the fly as needed. (And be sure to recognize and plan for the impact this will have on their time by including it in their workplan!)
  • Provide training on any new features and practices.

Document wisely

  • Rely on standard user manual resources produced by the software vendor or community for documentation of basic operations; don’t spend time documenting simple things that are already in a user guide that someone else has made.
  • Document the workflows, processes, custom fields, and other elements of database use that are specific to your organization.
  • End-user documentation should be brief, task-based, and visually driven.

Watch out for these common pitfalls

When people revert to using spreadsheets or other tools to store information that belongs in the CRM, or put off database tasks in favor of things that may be more urgent but are less important, it undermines the viability of your CRM as a complete picture of organizational activity. Be proactive about troubleshooting and intervening as necessary. The solution will depend on the problem, but questions to ask include: Is training adequate? Does the CRM process need to be redesigned for greater efficiency? Does more time for database activities need to be built into workplans?

  • Be realistic about your organizational capacity. If a data collection or reporting process sounds like it would be too time consuming for staff, it probably is.
  • Don’t expect immediate results. CRMs can make your organization’s work easier, more efficient, and more effective—but making changes to the way you work is hard, and can make everyday tasks take longer until everyone is adjusted.
  • Software marketing materials often carry the message “[name of software] will do [work you want done].” Remember that this is not exactly true: software is a tool for humans to use while they do the work.

Always remember these basic truths

  • Database use is not about technology; it is about work practices and human habits.
  • No software is perfect. Any choice of tool requires tradeoffs. Even the best database is frustrating sometimes. Understanding this will make the inevitable struggles less painful.
  • Databases take time, energy, and money to use and manage well; the more complex the system, the more time, energy, and money it takes. Be realistic about what your organization can take on.
  • Not every organization needs or can handle a CRM. If you don’t have the capacity to maintain a CRM, a few well-chosen, carefully designed, and consistently updated spreadsheets can be a better choice.

Do you know how well your nonprofit is doing at meeting its mission? Is your team able to reflect on ways to improve at meeting your mission? Nonprofits that make their missions measurable can build the tools to improve their effectiveness.

Imagine the difference for our communities if every nonprofit could improve its performance at meeting its mission. Based on nonprofit Theory of Change, here are five steps nonprofits can take to prepare to improve mission performance, starting with the two foundational mission commitments:

1. Define who the nonprofit commits to engage.

Who do we live and die for? Who is our target population? This can be:

  • Individuals, for example: parents of color with school-age children in Boston
  • Organizations, for example: environmental nonprofits in Vermont
  • A community, for example: the Logan Square neighborhood of Chicago

Then you can define key measurable, observable characteristics of your target population:

  • Demographics: For individuals, where are they? How old are they/their children? For organizations, how big are they or what kinds of missions do they have? For communities, what are the boundary lines?
  • Assets or strengths: What are they good at? What assets do they bring to the table?
  • Challenges: What challenges do they bring to the process? What help do they need from you?

2. Agree to what end the nonprofit commits to engage them.

There are three kinds of nonprofit missions, each a different way to answer the question “To what end?”

Some nonprofits are accountable to provide basic needs: food, shelter, safety, or clothing; for example a food pantry or the Red Cross. Basic needs are those without which one cannot live, the base of Maslow’s hierarchy of needs.

Some nonprofits are accountable to deliver a specific quality service, for example a liberal arts education, health care, ballet performance, or museum experience.

Some nonprofits commit to help participants climb an outcomes ladder. Outcomes are meaningful, intentional, sustained changes in people’s lives. Here is an example showing how a child learns to read:

  • Initial outcomes include inside changes: New knowledge and skills, like learning the alphabet and understanding phonics and sight words, and changed values and attitudes like taking an interest in reading
  • Intermediate outcomes include behaviors and milestones, such as reading an early reader out loud and reading at a third grade level with comprehension. Intermediate outcomes can only grow if the necessary initial outcomes have first developed
  • Long-term outcomes are the changes in life status that result from sustaining intermediate outcomes over time, in this case reading for knowledge and pleasure.

All three kinds of missions are uniquely valuable to deliver public benefit. Each kind of mission implies a different definition of success. Each kind of nonprofit mission can be made measurable by selecting indicators that tell if the accountability has been met.

All nonprofits can be clear, focused, and agreed on their definition of “Who?” and “To what end?” These two measures are the bookends of nonprofit missions and nonprofit performance management.

3. Codify the nonprofit’s program strategy and activities.

What strategies and activities are necessary to help your target population reach your committed destination above in #2? What quantity, quality, and duration of program activities, requirements, and relationships are logically necessary to engage your participants in their unique context and help them progress along the path you believe is necessary to reach the final destination in your mission?

Once your nonprofit has mapped out its theory of change, it can measure participant attendance and progress at each key step along the journey.

4. Define indicators and select measurement tools.

Indicators are measurable data that tell whether individual participants have received the basic needs or quality service commitment, or have achieved success on a priority outcome. After indicators are defined, a nonprofit can select or develop questions and a realistic plan to gather that data. For example, who will gather it, how, and when? Where will the data be stored? How will the data be compiled and presented?

By taking these steps, you design a clear pathway for participants to reach the intended mission destination. Depending on your nonprofit’s specific mission commitments and pathway, these are the kinds of data sets nonprofits need to manage and improve their performance on mission. They tell whether a participant:

  • meets criteria for target population.
  • has completed sufficient program dosage needed to receive basic needs or quality services or achieve outcomes.
  • has completed all the program requirements.
  • has met basic needs or quality standards.
  • has achieved initial outcomes, intermediate outcomes, and long-term outcomes.

The goal of data analysis is to answer the question, “How many of our target population reached our final mission destination?” And for the ones who did not reach the desired destination, “Where did we lose them? How can we intervene sooner and do better next time?”
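The analysis described here, counting how many participants reached each step and seeing where drop-off happens, can be sketched in a few lines. This is a minimal illustration, not a prescribed tool: the stage names and the `Participant` structure are hypothetical stand-ins for whatever your CRM actually tracks.

```python
from dataclasses import dataclass, field

# Hypothetical pipeline stages, in order; real stages come from your theory of change.
STAGES = [
    "meets_target_criteria",
    "completed_dosage",
    "completed_requirements",
    "achieved_initial_outcomes",
    "achieved_intermediate_outcomes",
    "achieved_long_term_outcomes",
]

@dataclass
class Participant:
    name: str
    stages_reached: set = field(default_factory=set)

def funnel_report(participants):
    """Count how many participants reached each stage, in pipeline order."""
    return {stage: sum(stage in p.stages_reached for p in participants)
            for stage in STAGES}

participants = [
    Participant("A", {"meets_target_criteria", "completed_dosage"}),
    Participant("B", {"meets_target_criteria"}),
    Participant("C", set(STAGES)),
]

report = funnel_report(participants)
# The drop-off between consecutive stages shows "where we lost them."
```

Comparing consecutive counts in the report answers the two questions above: each gap between stages is a place where participants were lost and where earlier intervention might help.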

5. Design a plan to learn from the data and use it to inform improvement.

You want to gather data about each individual participant’s progress down your nonprofit’s pathway. This empowers front-line staff to use real-time participant data to make optimal daily decisions to help participants. This process is called tactical data use.

At the same time, you want to compile data about groups of participants’ progress, so that program leadership can identify patterns of success or lagging behind, to make needed program adjustments and provide support to staff.

Finally, your organization’s management and board of directors can use compiled data to make better decisions about raising and allocating resources, and about staffing, partnerships and program strategy. This process is called strategic data use. Together they become a powerful engine for organization-wide learning and improvement, in the hands of leadership and staff committed to mission effectiveness.

 

This approach is an all-hands-on-deck, relentless pursuit to improve participant outcomes, which requires that a nonprofit make changes in the way it manages its operations. Any nonprofit can clarify its mission commitments and theory of change and ultimately make it measurable.

In 2016-2017, a Washington, DC–based nonprofit with a staff of about 40 and a $3.5 million budget undertook a redesign process to convert a ColdFusion website into a content management system with a custom mobile-responsive theme.

To make sure the finished results worked, the website team made strong efforts in:

  1. understanding the overall needs of the website,
  2. involving staff in specifying their own needs,
  3. determining content types,
  4. thinking in terms of lists,
  5. testing against assumptions,
  6. creating reporting mechanisms, and
  7. wireframing/building/testing/refining.

Feedback loops were built into the ongoing process in order to course correct and gain early constructive criticism from internal stakeholders.

Tip #1: Understand overall needs

The team tasked with pre-planning the redesign process undertook a review of existing web pages and reached consensus that content belonged under different lenses, programs, campaigns, and actions.

The team identified the organization’s theory of change, current audience, new website objectives, comparables, desired functions, specified revenue models, and desired budget and timeline, and circulated this information via an RFP.

Example A. Website RFP Top 10 List

Better storytelling -> Optimized content -> Engages more people -> More social change -> Greater financial support -> Fulfills our mission.

Top 10 must-have list (goals for the site):

  1. Increased email sign ups, social engagement, and activists
  2. New donors: the website needs to encourage people to sign up as donors with attractive donation pages
  3. Stay on budget. We are open to creative work share solutions
  4. Clarity and Simplicity: needs to give visitors a clear sense of Green America’s work
  5. Attractiveness: A clean look, beautiful storytelling, and responsive design
  6. Flexibility: we do lots of stuff. Our ability to adapt has always been our secret weapon. Our website needs to be flexible to handle our diverse content and programs
  7. Ease of Use on the backend: we will have many editors with various technical expertise (the ability to update the website frequently is essential)
  8. Integration of all our channels and platforms: Daughter Sites, Blogs, Social, Digital Publications, Apps, SALSA (action CRM), Raiser’s Edge (fundraising CRM), Charity Engine (donation pages)
  9. Authentic product/sponsorship placement
  10. Visual Story: Telling our stories in a visually compelling manner to better engage audiences and increase shares of our materials

Takeaway #1: What to do

During pre-planning, convene individuals across different teams to construct a shared model for content. Aim for transparency around budget, timeline, and requirements for the website.

Tip #2: Involve staff in identifying their own needs

By mapping out content across major areas, staff better clarified their understanding of how content fit into lenses, programs, and/or campaigns. A pilot content management system allowed staff to test their assumptions against real data.

Example B. Early model of content hierarchy and structure

Team members continuously articulated how content fit into the proposed data architecture. For example:

  • Lens: Food
  • Program: GMO Inside
  • Campaign: Good Food for People and Planet; Say No to GMOs
  • Focus Area: Starbucks; No GE Wheat; Organic
  • Issue: Climate; Factory Farms; Pesticides
  • Action: Tell Congress to stop trading our food systems; Tell Congress to reject TPP; Tell Kraft to remove GMOs from Miracle Whip; Tell Starbucks to go organic; Let Mars know you say no to GMOs; Tell the American Wheat Growers Association no GE wheat
  • Topic: Soil Not Oil; GMOs: A Case for Precaution; Don’t Have a Cow; 21 Foods to Always Buy Organic

An external design firm worked with staff to sketch out user personas, delve into content relationships, formalize roles and permissions, and determine the initial menu.

Example C. Sample site map content

  • Food: Fight GMOs; Beyond Organic; Fair Labor; Take Action: ____
  • Climate: Fight Dirty Energy; Invest in Clean Energy; Better Paper; Take Action: ____
  • Labor: Ending Child Labor; Ending Smartphone Sweatshops; Ending Sweatshops in Supply Chains; Finding Fair Alternatives; Take Action: ____
  • Finance: Save for Yourself and a Better World (banking); Divest from Fossil Fuels, Invest in Clean Energy; Green Your Money/Finances (investing); Take Action: ____

Takeaway #2: What to do

Allow multiple opportunities for individuals to voice concerns, update assumptions, and validate the model against live data.

Tip #3: Determine content types

Content types evolved whenever staff identified a long bulleted list of the same type of content. For example, blog posts, media mentions, events, staff listings, job descriptions, magazines, press releases, and business listings all converted to “content types.”

Required fields emerged from discussions about content types. For example:

  • Media Mention = Title, Website link, Image, Byline, Body text
  • Business listing = Organization Name, Categories, Website link, Image, Address, City, State, Zip, Body text

Example D. Sample content type for a blog post

This is an example of blog fields:

  • Blog Post Type: multiple choice, multiple answer, chosen from categories
  • Body: long formatted text
  • Display Image: image upload allowing for PNGs, JPGs, or GIFs
  • Business Network Recommendations: references a list of all available businesses in a related directory
  • Relevant Lens: multiple choice, multiple answer, chosen from a list of available lenses
  • Relevant Program: multiple choice, multiple answer, chosen from a list of available programs
  • Relevant Campaign: multiple choice, multiple answer, chosen from a list of available campaigns
  • Tags: free tags in keyword style
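One way to picture content types is as named collections of fields, with shared fields defined once and reused. The sketch below is hypothetical, with illustrative field names loosely based on the examples in this section; a real CMS would declare these through its own configuration system.

```python
# Shared fields are defined once and merged into any content type that needs them.
SHARED_FIELDS = {
    "display_image": "image upload (png, jpg, gif)",
    "relevant_lens": "reference: list of available lenses",
}

# Hypothetical content-type definitions; field names are illustrative.
CONTENT_TYPES = {
    "media_mention": {
        "title": "short text",
        "website_link": "url",
        "byline": "short text",
        "body": "long formatted text",
        **SHARED_FIELDS,
    },
    "blog_post": {
        "blog_post_type": "multiple choice from categories",
        "body": "long formatted text",
        "relevant_program": "reference: list of available programs",
        "relevant_campaign": "reference: list of available campaigns",
        "tags": "free keyword tags",
        **SHARED_FIELDS,
    },
}
```

The merge (`**SHARED_FIELDS`) mirrors the practice of repurposing fields such as “Relevant Lens” across content types rather than redefining them each time.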

Certain fields existed across content types. For example, the “Relevant Lens” field attached to campaigns, programs, actions, victories, and press releases.

Takeaway #3: What to do

Create fields for each type of content. Identify fields to repurpose across content types.

Tip #4: Think in terms of lists: referencing entities and normalizing data

Certain fields became standardized and used across multiple content types. For example, almost all content types require an image field, so content types used a “Display Image” field.

As another example, blog posts, media mentions, programs, campaigns, and actions all used the same “Relevant Lens” field to reference available lenses.

As a final example, blog posts, articles, and green living pieces used the same “Relevant Program” and “Relevant Campaign” fields as reference fields. The list of all available programs or campaigns continuously updates upon the addition of new programs or new campaigns.

The idea of “entity referencing” lets users continually grow the site and easily make changes, because any list of referenced content is always up to date.

Normalizing means an edit to a specific piece of content propagates through all instances where that piece displays. By using normalization, categorization of items, and entity referencing, it became easier and easier for any privileged user to make changes sitewide.

Example E. Data normalization samples

5 Most Recent Blog Posts: On a blog post, a list of the 5 most recent blog posts displays on the bottom of every page, in descending chronological order (most recent first). Any new blog post auto-adds to the list. Any edit to the title updates in all instances.
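The “derive it at render time” idea behind this example can be sketched as follows. The data store and field names are hypothetical; the point is that the list is computed from the source content rather than copied, so additions and title edits propagate automatically.

```python
from datetime import date

# Hypothetical in-memory store; a real CMS would query its database instead.
blog_posts = [
    {"title": "Post A", "published": date(2017, 1, 3)},
    {"title": "Post B", "published": date(2017, 1, 7)},
    {"title": "Post C", "published": date(2017, 1, 5)},
]

def five_most_recent(posts):
    """Derive the list at render time, newest first, so it is never stale."""
    return [p["title"]
            for p in sorted(posts, key=lambda p: p["published"], reverse=True)[:5]]

five_most_recent(blog_posts)  # ["Post B", "Post C", "Post A"]
```

Because nothing is copied, adding a new post or renaming an old one changes every page that renders this list, which is exactly the behavior described above.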

Fruit List: A fruit list begins with apples, oranges, blueberries, and bananas. Additions like blackberries, peaches, plums, nectarines, mangoes, strawberries, and papayas automatically display on the “Fruit List.”

Fruit List Categories: Categorizations on fruit include “Citrus” or “Berries.” Additions such as “Stonefruit” automatically update, such that a categorized list might read:

  • Berries: blackberries, blueberries, strawberries
  • Citrus: lemons, limes, oranges
  • Stonefruit: nectarines, peaches, plums
  • Tropical: mangoes, papayas
  • Not Yet Categorized: apples, bananas, starfruit
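A categorized list like the one above is straightforward to derive, with uncategorized items falling through to a default group. This is an illustrative sketch only; a CMS would do the same grouping through its taxonomy system.

```python
from collections import defaultdict

# Hypothetical category assignments; anything missing falls through.
categories = {
    "blackberries": "Berries", "blueberries": "Berries", "strawberries": "Berries",
    "lemons": "Citrus", "limes": "Citrus", "oranges": "Citrus",
    "nectarines": "Stonefruit", "peaches": "Stonefruit", "plums": "Stonefruit",
    "mangoes": "Tropical", "papayas": "Tropical",
}

def categorized_list(fruits):
    """Group fruits by category, defaulting to 'Not Yet Categorized'."""
    groups = defaultdict(list)
    for fruit in sorted(fruits):
        groups[categories.get(fruit, "Not Yet Categorized")].append(fruit)
    return dict(groups)

fruits = ["apples", "oranges", "blueberries", "bananas", "peaches", "starfruit"]
categorized_list(fruits)
```

Adding a new category (say, “Stonefruit”) is a data change, not a code change: the grouped display updates automatically the next time the list is derived.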

Takeaway #4: What to Do

If an “edit” button makes sense next to every item in a list, convert that list to a content type: most useful for items such as blog posts, press releases, events, staff listings, directory listings, and similar content.

Tip #5: Test against assumptions

During the buildout, question if the articulated data structure matches staff needs. By taking time to find and correct incomplete/faulty assumptions about content relationships, all stakeholders better understand the final product.

As an example, a “Magazine Issue” offers the ability to choose from a list of available “magazine articles” in order to display “featured articles.” A “Lens” offers a display of “relevant pieces.” In one case, our team mistakenly focused on “parent” relationships for content and, based on feedback, shifted to focusing on “child” relationships.

Example F. Choosing relevant pieces on a lens

A “Climate” lens shows a green living piece “Cut Your Carbon at Home” and a blog post “Add Socially Responsible Investments to Your Workplace’s Retirement Plan.” On any lens, the “entity reference” field helps specify relevant pieces, in their desired display order.

Takeaway #5: What to Do

Course correction takes time. Identify, test, review, and go back to the drawing board based on feedback from editorial and program staff. Large projects require flexibility to address initial incorrect assumptions.

Tip #6: Create reporting mechanisms

Reports help staff better understand the website content. Report-building works best when customized to the specific type of user requesting the report. Early beta versions help identify gaps and let users continuously access, understand, and download available data in order to make suggestions.

In an iterative buildout, the technology team benefits from early feedback. Conversely, an administrator or executive reviewing a prototype report better understands what is available to them and makes more informed requests about new fields and filters.

Technology teams who engage with end users by requesting, correcting, and fine-tuning build more relevant and useful reports.

Example G. Sample administrative reports

  • Recently Updated: a list of all recently created or updated content
  • All Green Living Pieces: a list of tips on green living
  • All Press Releases: a list of all generated press releases
  • All Blog Posts: a list of all blog posts
  • All Lenses: a list of all major areas of work
  • All Programs: a list of all available programs, sorted by lens
  • All Campaigns: a list of all available campaigns, sorted by program and lens
  • All Victories: a list of success stories
  • All Staff: a list of all people who are staff members, consultants, and interns

Takeaway #6: What to Do

Reports help users understand the existing information. Create a new report for each content type and fine-tune as needed.

Tip #7: Wireframe, build, test, and refine

Prepare to be exhilarated, challenged, rewarded, and exhausted by the minimum viable product process. Technologists build digital tools twice: once in the mind, and a second time in reality. Prototypes help with the process of getting feedback from internal stakeholders. Drawings, mockups, and paper versions all assist teammates in understanding the proposed redesign architecture.

Build a refinement period into the website redesign schedule so there is time to clarify and address details that weren’t addressed the first time around.

Example H. Mindmap about Homepage

Mindmap about Homepage includes a box called Enter, with arrows coming out that display Lenses: Food Lens, Finance Lens, Climate Lens, and Labor Lens. Other arrows go to six other sections. 1: Current Program Highlights, which leads to Relevant Programs and Relevant Campaigns. 2: Current Campaign Highlights, which leads to Relevant Campaigns and Relevant Actions. Relevant Actions continues to Salsa Action (third party). 3: Current Action Highlights, which leads to Relevant Actions and All Pieces. 4: Piece Highlights, which leads to All Pieces and Focus Areas. 5: Sign up for E-news. 6: Donate (third party embed).

Image: Member Landing Mockup

This is a Balsamiq-generated mockup image to help the team understand different pieces of the member landing page. It includes a main block with three tabs called My Biz Listing, My Coupons, and My Ads. There is a second block underneath called Members-Only Documents. On the left sidebar is a list of Announcements. Inside the My Biz Listing tab are five lines with bolded field labels and text as follows: 1st line - Title: My Green Business Listing. 2nd line - Date Updated: 2017 January 7. 3rd line - Description: lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. 4th line - Categories: Children, Pets, Clothing. 5th line: (edit this listing).

Takeaway #7: What to Do

Tools such as Balsamiq, LucidChart, and Invision assist stakeholders in gaining clarity around mental models: use them liberally. Your platform never really reaches completion, so build in time post-launch to continuously improve.

Conclusion

Technology professionals create effective tools to champion positive social, environmental, economic, and political change. By integrating feedback loops early and often, these tools spread the message, educate the populace, gain support for the cause, and make a positive difference.

Three years ago, Habitat for Humanity decided to test the power of technology to increase resident engagement in neighborhoods we serve. With a grant from the Fund for Shared Insight, we launched a pilot project with 12 Habitat organizations across the country to determine whether using feedback loops increases participation by community residents in choosing strategies and projects to promote their “community voice” and aspirational goals.

At Habitat, adapting a feedback loop methodology from Feedback Labs—a systematized approach to collecting and analyzing data and sharing back findings with community residents—has produced clear outcomes and made the case for scaling up this initiative.

To start, we interviewed Habitat staff to see if the residents they partner with prefer online, mobile, paper, or some other medium of engagement. Then we created a multi-medium method for data collection and the ability to share real-time feedback to hear directly from the community, so local Habitat organizations could discuss and strategize on their next steps.

The end result has been a widening of communication channels for our nonprofit headquarters to hear directly from feet-on-the-street community activists.

What our feedback loops told us about engaging our community

 

The less burden on people taking surveys, the better.

One Habitat staff member noticed many people in her neighborhood had challenges with the online survey. When she realized this, she stopped asking them to take the survey online and instead took it to them face-to-face. While academics might note the ways this shift in approach could affect the data, from a community organizer’s stance, the move created a better sense of trust with neighbors.

Meet people where they’re meeting.

Some local organizations didn’t know why their survey response rate was low. But when they decided to go out in the community, response rates went up. Also, once the community members got involved with Habitat, they stepped up their civic engagement in general. In Central Berkshire, Massachusetts, for example, residents who became active with Habitat and housing issues later took leadership roles in local transportation initiatives.

Make it fun.

In DuPage, Illinois, the local Habitat used the community’s BBQ and Resource Fair as an occasion to share results and hear feedback from local residents.

Make it personal.

Surveys can come across as dry. With feedback loops, we used our strength in housing and community outreach to connect neighbors, sometimes resulting in life-changing experiences.

Take Demita in Springfield, Missouri. Demita and other community leaders agreed to host a neighborhood event to clean alleys and to inspire homeowners to spruce up their yards. A few days before their “Rally in the Alley” day, Demita decided to introduce herself to residents along her alley and generate face-to-face enthusiasm for the event.

At one door, she met Kathy, a homeowner whose tailored front yard held potted plants and lawn art but whose backyard was overrun with waist-high grass. Kathy explained that she had been feeling hopeless since her lawn mower broke, and caring for her disabled husband left her no time to do anything about it. When Demita told others of Kathy’s plight, the next door neighbor came over and cut the lawn right away. It was Kathy’s first time meeting him, and in expressing thanks, she said, “This is the nicest thing anybody has ever done for me.”

For Demita, this new relationship alone made the community building effort a success.

Don’t let the medium control the conversation.

It’s easier to pick a sustainable technology that already supports human behavior rather than forcing human behavior to adapt to technology. Feedback from the pilot sites showed that some people found the online survey technology to be too prescriptive while others preferred it.

Also, residents in some neighborhoods do not use smartphones and have no Wi-Fi at all. Low-tech methods should be considered legitimate for immediate-response feedback.


In each of the pilot communities, Habitat saw improvement in community involvement and resident engagement. Sometimes this manifested as a statistical increase in attendance at meetings and participation in projects. For example, in Greater Lowell, Massachusetts, only 29 resident leaders had partnered in neighborhood efforts before the pilot. Since October 2016, participation has grown to 56 residents in the first community conversation and 62 in the second, with 39 residents participating in both: an impressive increase of more than 70%.

Over the past 40 years, Habitat for Humanity has worked with people around the globe to help families achieve the strength, stability and self-reliance they need to build better lives for themselves. The most important element of our mission is the partnership between Habitat and the homeowner, and we continuously seek to keep homeowners and their input at the center of what we do.

Each of the 12 feedback loop pilot projects shows a positive reflection of outreach in Habitat organizations nationwide, where our mission is guided by the aspirations of the communities we serve. Using feedback loops helps us energize communities and chart our progress in sustaining and advancing Habitat’s mission in partnership with donors, volunteers, and homeowners.

Collecting shared metrics has strengthened the evaluation of community engagement and helped us to continuously refine programs, projects, or systems. Our next step is to explore the growth and sustainability of feedback loops, which can change the dynamic between community residents and the people and agencies that partner with them.

Most people will say security is important, but if pressed, chances are they don’t really know what that means. What is IT security, exactly, and what’s the worst that can happen? Most pressingly: How can often cash-strapped nonprofit organizations keep their information—and their clients’ or donors’ information—safe and sound?

Leon Wilson, Chief for Digital Innovation & Chief Information Officer for the Cleveland Foundation and past NTEN Lifetime Achievement Award winner, is leading an online NTEN course on security basics for nonprofits: Intro to IT Security, in May. He was kind enough to answer a few questions about IT security and the special considerations for nonprofit organizations.

Why are nonprofits at greater risk of information breaches and other hacks?

Because hackers know that they’re easy prey; that is, they presume that nonprofits not only lack as sophisticated and secure an environment as, say, a bank or hospital, but that they aren’t even performing the basics well enough. Also, nonprofits hold a trove of donor and client information that can be pilfered for identity theft and social media trolling.

What are the potential consequences to nonprofits and their clients?

A loss of trust between the nonprofit and its clients, which can lead to a loss of donors and donations, and a loss of business from clients who want to work with the nonprofit.

What are a few things that nonprofits can do to assess their risk?

1) Hire a credible IT consultant to perform a comprehensive IT security and risk assessment; 2) identify any compliance regulations they must conform to (e.g., HIPAA, PCI-DSS, or rules for personally identifiable information (PII) pertaining to kids).

Why is having an IT security strategy important?

Most, if not all, IT security experts will tell you that these days, it’s not a matter of if you’ll be hacked, but when. It’s nearly inevitable in this day and age. Therefore, keeping your IT security strategy constantly current is akin to being a fiscally responsible organization.

What’s the first step that at-risk nonprofits should take to improve their practices?

I can’t say it enough: you don’t know how bad a situation you have until you assess it. Thus, the first step is for nonprofit leadership to take IT security seriously and have an IT security assessment performed. A good IT security assessment should not only identify your vulnerabilities but rank them by severity. Tackle the severe ones first.

What is the number one pitfall or roadblock for nonprofits implementing an IT security policy?

Unfortunately, it’s a four-way tie: a) lack of awareness; b) not knowing whom to turn to for help, that is, finding a good IT security consultant who will help them identify and plug any holes without going overboard; c) lack of finances to perform a good IT security assessment; and d) funding to implement those changes that warrant additional technology solutions and consulting work.

View our courses page to find the next Intro to IT Security course with Leon.

The digital landscape is changing at a dizzying rate and sometimes it feels like the plans you made yesterday are obsolete by morning. But help is at hand!

For the third year, NTEN is proud to partner with Care2, hjc and Resource Alliance on a report that sets the standard for nonprofit digital planning. But we need your help. The 2017 Digital Outlook Report is powered by responses from nonprofit professionals just like you. The survey takes about 10-15 minutes to complete, and you’ll be entered into a drawing for some great prizes.

Take the survey today and be the first to know when the findings are published later this year.

 

While good leadership can help employees understand the need for security measures and encourage compliance, bad leadership can foster employee discontent, conflict with the IT department, and the failure of even the best of plans. Executives must have a good understanding of what computer security risks are out there for nonprofits so they can guide the organization in evaluating how much risk the organization can afford. The IT department can educate and give advice, but decisions and support must come from the highest level.

Here are the seven most common security weaknesses that nonprofits have:

  1. Lack of organizational understanding or commitment to security
  2. Ineffective or unenforced policy
  3. No regular user education on risk
  4. Weak passwords
  5. No anti-malware software
  6. No email filtering
  7. No website filtering

Beyond the tips below, organizations should probably seek outside professional security incident management services, which can provide a level of monitoring and responsiveness to threats that most organizations couldn’t afford on their own. Such services usually provide monitoring of logs and other indicators of network activity, using a combination of automated and human evaluations, providing almost real-time responses to threats.

How To Make Your Nonprofit More Secure

The SANS website has a number of sample policies for almost everything related to computer security. These can be modified to meet your organization’s needs. The biggest trick with policies is getting people to follow them.

Providing procedures, such as a quick checklist or flowchart for evaluating suspicious emails or websites, can help users make better decisions. For IT, such procedures would be more complex and in-depth, but for end users, a quick “if you see something like this, do this” will be helpful.

Finally, users need constant training and re-training on the importance of organizational security. Such training doesn’t have to be a massive all-day affair; frequent reminders are probably more effective. Train new employees thoroughly and all employees on new threats as they arise. After that, a quick mention at meetings, posters, or other reminders should suffice. If you do have periodic trainings for employees, make sure you cover why it’s important as well as what they should do differently.

There’s No Excuse for “Password1”

Passwords are the first, and in many organizations the only, method of protecting computers, so let’s talk about the reality of passwords in a day of massive computing power at the hands of almost anyone who wants it.

A good password should be at least 12 characters long, with a combination of upper- and lowercase letters, numbers, and symbols: very complex by most current standards. If such a password were truly random, it would take over 100,000 years to crack by brute force.

The problem is that even moderately complex passwords are hard to remember, and, unless you use a password generator, they are never really random.

Most organizations require an 8-character password with letters, numbers, and symbols. A random 8-character password can be cracked by brute force in about a year. This may seem like more than enough, since you probably (hopefully) change passwords more often than that, but remember: people don’t create random passwords. They usually use a familiar word (such as the name of a loved one) with some numbers (like important dates) and symbols added on or mixed in, or they post their “random” password on a sticky note on their monitor or, worse, in a plain text file on their computer.
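The arithmetic behind these estimates is easy to check for yourself. The sketch below assumes a 95-character alphabet (printable ASCII: upper and lower case, digits, symbols) and an attacker trying 100 million guesses per second; both numbers are illustrative assumptions, since real attack speeds vary enormously with hardware and with how the password is stored.

```python
def crack_time_years(length, alphabet_size=95, guesses_per_second=1e8):
    """Worst-case brute-force time for a truly random password.

    alphabet_size=95 covers printable ASCII; the guess rate is an
    illustrative assumption, not a benchmark.
    """
    keyspace = alphabet_size ** length       # every possible password
    seconds = keyspace / guesses_per_second  # time to try them all
    return seconds / (60 * 60 * 24 * 365)

print(f"8 random chars:  ~{crack_time_years(8):.1f} years")
print(f"12 random chars: ~{crack_time_years(12):,.0f} years")
```

Note how steeply the numbers climb: each extra character multiplies the keyspace by 95, which is why length matters more than any single clever substitution.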

Passwords sometimes aren’t enough. Some organizations have implemented multi-factor authentication. Using two factors (for example, a password as well as a fob scan) gives an order of magnitude improvement in security. What is practical (and least expensive in most cases) for organizations is to use the cell phone as the second factor by setting up systems to ask for a code that is texted to the phone. Google and Office 365 both have good multi-factor authentication options.
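Texted codes are one option; authenticator apps (which both Google and Office 365 support) instead generate codes on the phone itself using the standard TOTP algorithm from RFC 6238. As a sketch of how little magic is involved, the whole scheme fits in a few lines of standard-library Python; the secret below is a made-up demo value, never hard-code a real one.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 time-based one-time password. The server and the phone
    share one secret and independently derive the same short-lived code,
    so nothing secret ever travels over the network at login time."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Demo secret only -- a real deployment provisions this via QR code.
print(totp("JBSWY3DPEHPK3PXP"))
```

Because the code changes every 30 seconds, a stolen password alone is no longer enough to log in, which is where that order-of-magnitude improvement comes from.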

The biggest problem with implementing solutions like these is never the complexity or cost: it’s your users. Many users will see the additional requirement as a burden, and some will even seek ways of circumventing it, like saving your confidential files in their personal cloud storage accounts, violating the principles of confidentiality and integrity as well as authentication. Once again, the key to solving this problem is leadership, policy, procedure, and training.

Protecting Your Users from Themselves

A few basics that all organizations should have are: anti-malware software (commonly called antivirus), email filtering (spam filtering), and website filtering (content filtering). These three solutions are a good way to help protect users from themselves, if they are used effectively and kept up to date.

Security is everyone’s business. Let’s say a user gets a phishing email that got past your filters, but because they got training, they realized what it was and notified the IT department. IT staff could then update the anti-malware software for the new threat and update the email filtering rules to block the sender. They can also update the website filtering rules to block the bad URL where the virus is disseminated. These changes would help less careful users who might click on the link in the email, since their access would then be blocked.

Have a Good Backup Plan

A good backup isn’t a single copy somewhere else on the network—you may not know exactly when the attack happened and your backup might be a backup of encrypted or infected information. Best security practices dictate that you have multiple backups, covering several weeks or even months, held in an isolated location. If your files are compromised or held ransom, you can clean up your systems and restore from the last good backup.
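One common way to keep backups “covering several weeks or even months” without storing every nightly copy forever is a grandfather-father-son rotation: all recent dailies, one copy per week, one per month. The function below is a hypothetical retention picker written to illustrate the idea, not the policy of any particular backup product.

```python
from datetime import date, timedelta

def backups_to_keep(backup_dates, today, daily=7, weekly=4, monthly=12):
    """Grandfather-father-son retention: keep every backup from the last
    `daily` days, the newest backup in each of the most recent `weekly`
    ISO weeks, and the newest backup in each of the most recent
    `monthly` months."""
    keep = set()
    newest_first = sorted(set(backup_dates), reverse=True)
    keep.update(d for d in newest_first if (today - d).days < daily)
    seen_weeks, seen_months = set(), set()
    for d in newest_first:
        week = d.isocalendar()[:2]  # (ISO year, ISO week)
        if week not in seen_weeks and len(seen_weeks) < weekly:
            seen_weeks.add(week)
            keep.add(d)
        month = (d.year, d.month)
        if month not in seen_months and len(seen_months) < monthly:
            seen_months.add(month)
            keep.add(d)
    return sorted(keep)

# 100 nightly backups collapse to about a dozen retained copies.
today = date(2018, 6, 30)
nightly = [today - timedelta(days=i) for i in range(100)]
print(len(backups_to_keep(nightly, today)))
```

With a scheme like this, even if ransomware silently encrypted your files weeks ago, a clean month-end copy is still on hand to restore from.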

Is Your Organization Compliant?

If your organization is required to follow one of the many government or industry regulations and rules, such as HIPAA, FERPA, PCI-DSS, and the rest of the alphabet soup, you should definitely have professional help in implementing and certifying compliance. Unless you are willing to spend the money to do it right yourself, it is a good idea to use vendors. Make sure your vendors can provide proof of such compliance and give you the documentation you need to maintain it.

Working with a Limited Budget

If you are working with a very limited budget and think you can’t even begin to do the basics, here are some tips to get started:

  • Make sure your organization’s leadership is committed to doing things in a more secure way.
  • Check to see if what you already have can do more. For example, if you already have a firewall, you might also have some web filtering capacity or other advanced features you haven’t used yet. Both Office 365 and Google offer multi-factor authentication free for nonprofits. Both have some level of email filtering, though probably not as robust as I’d like without additional licensing or expense; still, it is a start. Learn what is available and start making better, more complete use of the basic features included with what you already use.
  • For backup, your nonprofit may qualify for free or discounted use of Microsoft Azure services. Not only can you create virtual servers, there are some good backup products that you can use to back up servers located in your offices.
  • In addition to the cloud services mentioned above, techsoup.org offers software and hardware donations to qualified nonprofits for a very modest fee.
  • There are also a number of subscription services that can help create a more secure environment. When you look at the cost of purchasing software or hardware, maintenance, support, and the other related costs of an owned solution, a subscription may be a more cost-effective way to go.
  • Always ask for a nonprofit discount. Even if they don’t advertise it, many vendors will give you at least a 10% discount if you ask, and some offer even more.

Taking Security to the Next Level

Good security is multi-layered, each layer adding another measure of protection. Good security also isn’t “set it and forget it”: it needs to be maintained and monitored by organizational IT staff or an outside vendor. Most important of all is IT leadership; without it, even the best plans will fail.

When planning security for your organization, think about what it would cost to have a hacker gain access to your organization’s information, and what the loss of data and reputation would actually cost your organization. And then make a plan to prevent it.

Resources

Microsoft Trust Center: www.microsoft.com/en-us/trustcenter/Compliance/default.aspx

TechSoup: www.techsoup.org

Google 2-step verification: support.google.com/accounts/answer/185839?hl=en

 

Photo credit: Blogtrepreneur