
I know a lot of people who’re starting up new nonprofits, and most don’t have any prior experience with fundraising. That was me, back in 2007, when I took over the Wikimedia Foundation. And so, the purpose of this post is to share some of what I learned over the past eight years, both from my own experience and from talking with other EDs and with grantmakers. I’m focusing on restricted grants here because they’re the most obvious and common funding source for nonprofits, especially in their early stages of development.

Restricted grants can be great. Grantmaking institutions fund work that’s socially important, that’s coming out of organizations that may have no other access to funding, and that is often risky or experimental. They take chances on people and organizations with good ideas, who may not yet have a track record. That’s necessary and important.

But restricted grants also pose some specific problems for the organizations seeking them. This is well understood inside nonprofitland, but isn’t immediately obvious to people who’re new to it.

Here are the five main problems with restricted grants.

Restricted grants can be administratively burdensome. At the WMF, we actively sought out restricted grants for about two years, and afterwards accepted them only rarely. We had two rules of thumb: 1) We would only seek restricted grants from organizations we knew well and trusted to be good partners with us, and 2) We would only seek restricted grants from organizations that were roughly our size (by staff headcount) or smaller. Why? Because restricted grants can be a lot of work, particularly if the two organizations aren’t well aligned.

Big institutions have a big capacity to generate process: forms to fill out, procedures to follow, hoops to jump through. They have lots of staff time for meetings and calls and email exchanges. They operate at a slower pace than smaller orgs, and their processes are often inflexible. People who work at grantmaking institutions have a responsibility to be careful with their organization’s money, and they want to feel like they’re adding value to the work the nonprofit is doing. Too often, this results in nonprofits feeling burdened by expensive process as they procure and report on grants: time you want to spend achieving your mission instead risks getting eaten up by grantmakers’ administrative requirements.

Restricted grants risk overwriting the nonprofit’s priorities with the grantmakers’ priorities. At the WMF, we didn’t accept grants for things we weren’t planning to do anyway. Every year we developed our plan, and then we would sometimes seek funding for specific components of it from funders we trusted. We were happy to get those funders’ input on our priorities and on our plans for executing them, but we weren’t interested in advancing grantmakers’ goals, except insofar as they overlapped with ours.

Too often, especially with young or small non-profits, I see the opposite.

If an organization is cash-strapped, all money looks good. But it’s not. Here’s a crude example. Let’s say the WMF knows it needs to focus its energy on mobile, and a funder is interested in creating physical spaces for Wikipedians to get together face-to-face for editing parties. In that context, taking money for the set-up of editing cafes would be a distraction from the mobile work the WMF would need to be doing. An organization’s capacity and energies are always limited, and even grants that fully fund a new activity draw on executive and managerial attention, as well as on the organization’s support functions (human resources, accounting, admin, legal, PR). If what a restricted grant funds isn’t a near-perfect fit with what the organization hopes to accomplish regardless of the funding, you risk your organization getting pulled off-track.

Restricted grants pull focus from core work. Most grantmakers want their money to accomplish something new. They’re inclined to see their grants as seed money, funding experiments and new activity. Most successful nonprofits, though, have important core work that needs to get done. At the WMF, for example, that core work was the maintenance and continued availability of Wikipedia, the website, which meant things like hosting costs, the Ops team, site security and performance work, and lawyers to defend against censorship.

Because restricted grants are often aimed at funding new activity, nonprofits that depend on them are incentivized to continually launch new activities, and to abandon or only weakly support the ones that already exist. They develop a bias towards fragmentation, churn and divergence, at the expense of focus and excellence. An organization that funds itself solely or mainly through restricted grants risks starving its core.

Restricted grants pull the attention of the executive director. I am constantly recommending this excellent article by the nonprofit strategy consultancy Bridgespan, published in the Stanford Social Innovation Review. Its point is that the most effective and fastest-growing nonprofits focus their fundraising efforts on a single type of funder (e.g., crowdfunding, or foundations, or major donors). That’s counter-intuitive, because most people reflexively assume that diversification is good: stable, low-risk, prudent. Those people, though, are wrong. What works for, say, retirement savings is not the same as what works for nonprofit revenue strategy.

Why? Because organizations need to focus: they can’t be good at everything, and that’s as true when it comes to fundraising as it is with everything else. It’s also true for the executive director. An executive director whose organization is dependent on restricted grants will find him or herself focused on grantmaking institutions, which generally means attending conferences, serving on juries and publicly positioning him or herself as a thought leader in the space in which they work. That’s not necessarily the best use of the ED’s time.

Restricted grants are typically more waterfall than agile. Here’s how grants typically work. The nonprofit writes up a proposal that presumes it understands what it wants to do and how it will do it. It includes a goal statement, a scope statement, usually some kind of theory of change, a set of deliverables, a budget, timeline, and measures of success. There is some back-and-forth with the funder, which may take a few weeks or many months, and once the proposal is approved, funding is released. By the time the project starts, it may be as much as an entire year since it was first conceived. As the plan is executed the organization will learn new things, and it’s often not clear how what’s been learned can or should affect the plan, or who has the ability to make or approve changes to it.

This is how we used to do software development, and in a worst-case scenario it led to death-march projects building products that nobody ended up wanting. That’s why we shifted from waterfall to agile: you get a better, more-wanted product, faster and cheaper. It probably makes sense for grantmaking institutions to adapt their processes similarly, but I’m not aware of any that have done so yet. I don’t think it would be easy, or obvious, how to do it.

Upshot: If you’re a new nonprofit considering funding yourself via restricted grants, here’s my advice. Pick your funders carefully. Focus on ones whose goals have a large overlap with your own, and whose processes seem lightweight and smart. Aim to work with people who are willing to trust you, and who are careful with your time. Don’t look to foundations to set your priorities: figure out what you want to do, and then try to find a grantmaker who wants to support it.

About a week ago, I started running a little survey asking Wikimedians how we should approach target-setting for the next five years.

I did it because next month Wikimedia will finalize the targets that’ll guide our work for the next five years, and I wanted to gather some quick feedback on the thinking that’s been done to date. The survey’s close to wrapping up now, and the results thus far are terrific: there appears to be good consensus on what we want to measure, as well as on our general approach.

More detail below!  But first, some general background.

In July 2009, the Wikimedia Foundation kicked off a massive strategy development project, which is starting to wrap up now. [1] The one major set of decisions that remains to be finalized is how we will measure progress towards our goals.

The draft goals, measures of success and targets that have been developed via the strategy project are here. They were created over the past several months by Wikimedia community members, Bridgespan staff, and Wikimedia Foundation staff (thank you all) – and in my opinion, they’re pretty good.  They focus on what’s important, and they do a reasonably good job of figuring out how to measure things that don’t always lend themselves to easy measurement.

Before finalizing the targets and taking them to the Wikimedia Board of Trustees for approval, I wanted to gather some additional input, so I hacked together a quick, imperfect little survey. (You can read it, and fill it out if you want, here.) The purpose of this post is just to share the results; I will probably write more about the targets themselves later.

First, some methodology: I made the survey in Google Docs, and sent identical versions to i) the Wikimedia Board, ii) the Wikimedia staff, and iii) the “foundation-l” mailing list (a public list on which anyone can talk about the Wikimedia Foundation and Wikimedia projects), the Wikimedia Foundation Advisory Board list, and the “internal-l” mailing list (a private list intended for Wikimedia chapters representatives and Wikimedia Foundation board and staff). Then, for the purposes of this post, I aggregated all three sets of results, which total about 120 individual responses thus far.

If I’d been more serious I’d have used LimeSurvey, which is a better survey tool than Google Docs — but this is really just meant to be a structured solicitation of input, rather than a proper quantitative study.  For one thing, the “community” results reflect only a tiny fraction of active editors — those who read English, who are on Wikimedia’s mailing lists or are connected with people who are, and who self-selected to answer the survey.  So, please resist the temptation to over-interpret whatever numbers I’ve given here.

In general, I was happy to find that the survey surfaced lots of consensus.  A comfortable majority agrees with all of the following:

  • Wikimedia’s goals should be “ambitious but possible.” (Other less-popular options were: “definitely attainable, but not necessarily easily,” “audacious and probably not attainable, but inspiring,” and “fairly easily attainable.”)
  • We agree that the purpose of setting goals is “to create a shared understanding and alignment about what we’re trying to do, publicly and with everyone.” (Other options: “to create an audacious target that everyone can get excited about and rally behind,” and “to create accountability.”)
  • In setting goals, we believe “perfection is the enemy of the good: I would rather see us using imperfect measures than no measures at all.” (About 15% of respondents felt otherwise, believing that “imperfect measures are a waste of time and energy.”)
  • The Wikimedia Foundation’s goals should depend on efforts by both the Wikimedia Foundation and the Wikimedia community, not on efforts by the Foundation alone. (18% of respondents felt otherwise, saying that the targets should be “entirely within the control of the Wikimedia Foundation to influence.”)
  • If we exceed our goals, practically everyone will be “thrilled.” (About five percent of respondents felt otherwise, saying that they would be “disappointed: that would tell me our goals weren’t sufficiently challenging.”)
  • If we fail to meet our goals, about three quarters of respondents will feel “fine, because goals are meant to aspire/align: if we do good work but don’t meet them, that’s okay.” Interestingly, this is one of the few areas of the survey where there was a real division between the staff of the Wikimedia Foundation and other respondents. Only 17% of staff agreed they’d be okay with missing our targets. I think this is probably good, because it suggests that the staff feel a high sense of personal responsibility for their work.
  • Almost everyone agrees that “goal-setting for the Wikimedia Foundation is difficult. We should set goals now, but many measures and targets will be provisional, and we’ll definitely need to REFINE them over the next five years, possibly radically.” (Runner-up response: “we can set good goals, measures and targets now, and we should NOT need to change them much during the next five years.” And a very small number felt that we should refrain from setting targets for “things we’re still uncertain about,” and instead restrict ourselves to areas that are “straightforward.”)
  • The global unique visitors target is felt by most to be “attainable if the staff and community work together to achieve it.” (About 20% of respondents felt the target might “even happen without any particular intervention.”)

I wanted to get a sense of what measures people felt were most important. They’re below, in descending order of importance. (The number is the percentage of total respondents who characterized the measure as either “critical” or “important.” Other options were “somewhat important,” “not important,” and “don’t know/not sure.”)

It’s probably worth noting that consensus among community members, the board and the staff was very high.  For more than half the measures, the percentage of respondents rating the measure as “important” or “critical” varied by less than 10% among the different groups, and for the remainder, it varied by less than 20%.

Measure: avg. % rating it “critical” or “important”
Retention of active editors: 84
Number of active editors: 83
Site performance in different geographies: 80
Demographics of active editors: 80
Uptime of all key services: 78
Financial stability: 74
Global unique visitors: 66
Secure off-site copies: 65
Number of articles/objects/resources: 65
Regular snapshots/archives: 60
Thriving research community: 54
Offline reach: 53
Reader-submitted quality assessments: 41
Expert article assessments: 40
Community-originated gadgets/tools/extensions: 22
The survey’s still accepting input — if you’re interested you’ve got until roughly 7PM UTC, Wednesday August 18, to fill it out.

————————————————————————————–
[1] I launched the Wikimedia strategy project at the request of the Wikimedia Foundation Board of Trustees, and it was led by Eugene Eric Kim of Blue Oxen Associates, a consulting firm with a special focus on enabling collaborative process. Eugene worked with Philippe Beaudette, a longtime Wikipedian and the online facilitator for the project, and with The Bridgespan Group, a non-profit strategy consulting firm that provided data and analysis for us. The premise of the project was that the Wikimedia movement had achieved amazing things (the fifth most-used site in the world! 375 million visitors monthly!), and that it was now time to reflect on where we were making good progress towards fulfilling the mission and where we weren’t, with the goal of course-correcting where we were falling short.

To come up with a good plan, we wanted to stay true to our core and central premise: that open, mass collaboration is the most effective method for achieving high-quality decision-making. So we designed the process to be transparent, participatory and collaborative. Over the course of the project, more than a thousand volunteers worked together in 50+ languages, in teams and as individuals, mostly in public on the strategy wiki, supplemented by IRC meetings, Skype calls, e-mail exchanges, and face-to-face conversations (e.g., meetings were held in Berlin, Paris, Buenos Aires, San Francisco, Boston and Gdansk).

The project’s now entering its final phase, and you can see the near-final results here on the strategy wiki.  What remains to be done is the finalization of the measures of success, which will happen over the next six or so weeks. At that point, there will be some final wordsmithing, and the result will be brought to the Wikimedia Board of Trustees for approval.

I will probably write about the strategy project at a later date, because it is super-interesting. (Meanwhile, if you’re interested, you can read a little about it here in a story that Noam Cohen wrote from Wikimania 2010 in Gdansk.)