Are Your Trustees Satisfied?

I’m always interested in why donors choose different means of formalizing their giving and the ever-expanding set of options they have to do so. Because of that, I read the Center for Effective Philanthropy’s recent report What Donors Value: How Community Foundations Can Increase Donor Satisfaction, Referrals, and Future Giving and related blog posts.

In a survey of more than 6,000 donors at 47 community foundations, the Center found that:

  • The strongest predictors of donor satisfaction are donors’ sense of the foundation’s responsiveness when they need assistance and their perceptions of the foundation’s impact on the community. (The first predictor isn’t new information, and the second confirms community foundations’ hunches and hopes.)
  • 1 in 4 donors were “moderately satisfied” or less with their community foundation. (Oddly, the report glossed over this point. I don’t know any good business manager who would be happy with that metric.)
  • A donor’s level of engagement wasn’t a driver of her or his satisfaction. (This shouldn’t be a surprise, but it likely was to many community foundation staff.)

I steward a multi-generation family foundation and am writing this while attending a symposium by the National Center for Family Philanthropy. Family foundations don’t have the pressures that community foundations create for themselves to grow assets, grow the number of donors with funds, and grow the number of donors who give to the community foundation’s initiatives. However, the report surfaced important questions for those of us in the family philanthropy business.

Shouldn’t family foundation staff worry about the satisfaction of our board members and/or trustees? For the most part, they choose to participate in these roles, even if that choice is coerced by other family members.

  • What would customer satisfaction metrics look like for family foundation board members? How would those metrics be different from those for other nonprofit boards?
  • What if 25% of your family foundation’s board members weren’t very satisfied with their experience with the foundation? How would that damage family dynamics behind the scenes and harm discussions at the foundation? How would that change their description of their experience to their friends and colleagues who sit on nonprofit boards?

And, shouldn’t family foundations pay attention to the report’s third point about engagement? Community foundations often set metrics for increasing the level of engagement of donors. Those metrics are often drawn from university fundraising models. The community foundations falsely presume that success means more donors moving up a ladder of participation in the community foundation’s activities, communications tools, and goals. As the Center’s report notes, donors can be satisfied even when they have, or desire, little or no involvement from the foundation in their giving decisions.

The majority of family foundations (and many donor-advised funds) are established as vehicles for families to give together and learn together about giving. Ideally, the foundations are safe places to learn and grow together. But it is easy for staff and founders to fall into the same trap as community foundations – that all board and family members desire to be fully engaged in the foundation’s work.

  • How do we ensure that our family foundations are safe places to learn and don’t force a one-size-fits-all approach for participation?
  • How do we design customer-centric experiences that meet our volunteers where they are? Can we allow them to flexibly dive in or dial back over time, perhaps learning from techniques of good network management models?
  • How do we blend the engagement expectations of founders or other family leaders with trends in how younger people choose to interact with organizations and choose to give of their time and skills? Especially for endowed foundations, how do we ensure that institutional culture doesn’t automatically turn people away?

Unfortunately, I have more questions than answers at this point and NCFP’s forum didn’t have sessions addressing the topics. I’ll be doing my own research on the issues and hope that you’ll feel free to send me good ideas and your own experiences.

What Does Your Community’s Social Economy Look Like?

What if you convened a group as diverse as codefest winners, giving circle donors, librarians, start-up leaders, and fundraisers? And what if you asked them to describe all the ways they spend time and money for the public good? 

We did just that in Pittsburgh a couple of weeks ago. The Philanthropy Forum at GSPIA and Grantmakers of Western Pennsylvania invited scholar Dr. Lucy Bernholz to a two-day whirlwind of meetings related to her work on the social economy and digital civil society. Lucy’s brief definitions for those terms are:

  • Social economy – all the ways we use private resources to create public benefits or public good
  • Digital civil society – how we use private digital resources to organize, create, distribute and fund public benefits or public good

On the second day, Lucy met with a group of about 36 people who mostly work outside of the traditional grantmaking world. Many met each other for the first time, and we missed the voices of about 50 more invitees who weren’t able to attend. Lucy’s slide deck from the conversation is at

Lucy asked the attendees to list and post: 1) all of the actions they took for the public good, and 2) all of the groups through which they took those actions.


Pgh Social Economy Group

I used Wordle to create a summary of their actions for the public good (bigger words indicate a greater number of that response):

Pgh Social Economy Wordle
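Under the hood, a word cloud like this is just a frequency tally: each response is split into words, the words are counted, and the counts drive the font sizes. Here is a minimal sketch of that tally in Python, assuming the responses were collected as plain strings (the sample responses below are hypothetical, not the actual Pittsburgh data):

```python
from collections import Counter
import re

def tally_actions(responses):
    """Count how often each word appears across free-text responses.

    The counts are what size the words in a cloud like the one above:
    bigger count, bigger word.
    """
    words = []
    for response in responses:
        # Normalize: lowercase, keep only runs of letters (and apostrophes)
        words.extend(re.findall(r"[a-z']+", response.lower()))
    return Counter(words)

# Hypothetical sample responses, for illustration only
sample = [
    "Volunteering at a food bank",
    "Donating to causes",
    "Volunteering on a nonprofit board",
]
counts = tally_actions(sample)
print(counts.most_common(3))  # "volunteering" comes out on top
```

A real version would also drop stop words (“at”, “a”, “to”) before sizing; Wordle-style tools typically do that filtering automatically.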

A majority of the attendees listed volunteering for a nonprofit organization, many serving as board members. A majority listed donating financially to causes. Those traditions remain strong, even with the younger attendees.

But, only about half of the activities listed were in the nonprofit sector and the attendees used the word giving without regard for receipt of a charitable deduction. A larger picture of philanthropy (defined as “voluntary action for the common good”) emerged to include social enterprises, political activities, and the uses of crowdfunding sites. The group also made a clear connection between achieving the common good and taking such actions as: buying a farm share, using sharing services such as Lyft and Airbnb, participating in Meetups, and activating their social networks for causes. They described how those actions built stronger relationships, trust, and sense of community (translated for grantmakers – “community building and social capital outcomes”).

Of course, humans volunteered together for the common good long before tax laws defined charitable giving and charitable organizations. But, Lucy noted that today’s technology and digital environments allow people to more quickly return to those roots of collective action – of sharing in community well-being. The Grassroots Grantmakers team recently wrote about this activity in Citizen-to-Citizen: Funding, Sharing, and Generating Ideas. (Translated again for grantmakers – “this is great stuff that we’ll avoid because we don’t want to deal with expenditure responsibility rules and/or it doesn’t meet our definition of strategic philanthropy.”)

What Does This Mean for Pittsburgh (or your City)?

Admittedly, the group of attendees wasn’t a random sample of Pittsburghers. They were purposely invited to develop Pittsburgh’s first glimpse into its social economy and digital civil society beyond grantmakers and nonprofits. (If someone wants to develop a more complete picture, let’s talk!)

Since the conversations, I’ve been wondering about the intersections between foundations and the wider array of social economy activity. As we look ahead, how will we…

  • Act together? – Will traditional philanthropy associations such as Grantmakers of Western PA successfully include this broader set of social good doers? Or are those associations so dominated by large funders and “grants as the main tool” thinking that others won’t feel welcome? Will an alternate set of social good associations (or meetups and/or political action groups…) rise up around regional grantmaker groups?
  • Lead together? – Many of the attendees will likely become the next generation of community leaders. Could they become the next leaders of foundations and corporate giving programs? The legal structure of foundations has proven adaptable to forms of social good such as impact investing and grassroots grantmaking. But will the culture of professionalized philanthropy be ready for people who effortlessly deploy the full array of social good tools?
  • Grow together? – Groups of funders in Pittsburgh and many other cities have built capacity-building resources for 501(c)(3) public charities. Will they build similar resources to provide free and discounted management assistance, legal advice, tech support, and more to B-Corps, unincorporated groups, code-for-good groups, and other emerging players? And can funders build those resources in ways that don’t force those groups to follow the rules of 501(c)(3) land? Conversely, how will communities help nonprofits effectively adapt to the broader social economy and collaborate with these free agents on community problem-solving?
  • Know and learn together? – Lucy talked about the absence of a national conversation around the ethics, rules, and regulations of digital public goods – information produced and shared by charities, government agencies, and other social economy groups. The Brookings Institution has made the case that metropolitan areas, rather than nations, are now the main hubs of innovation and community problem-solving. Though digital information flows across borders, could or should communities such as Pittsburgh craft their own, shared codes of conduct around digital public goods?

Lucy and the team at Stanford PACS are tackling some of these policy and practice issues on a national and global level. My own problem-solving orientation leans more local. I’d love to see Pittsburgh – and any other community – tackle them. Maybe one day we’ll even see regional Social Economy Leagues that parallel the power of regional economic development organizations and chambers of commerce. Or, perhaps communities will create Digital Public Good Trusts that parallel the collective donor power and asset preservation of community foundations. (For philanthropy history geeks, who will write the Dead Hand Harnessed for the 21st century?)

What would a picture of your community’s social economy look like, and how would you grow that economy?

The one where the philanthropy data geeks got it wrong

Belushi food fight

Philanthropy data geek ready for battle (speculative artist rendition)

I guess you have to commend philanthropy’s paper of record – the Chronicle of Philanthropy – for trying to make fundraising data exciting with its January 17 article, Group Estimates Philanthropy Rose 13% in 2013, Clashing With ‘Giving USA’.

The article pitted estimates of charitable giving from the relatively new Atlas of Giving against the results of the long-time resource Giving USA. Forbes’ philanthropic and social good contributor, Tom Watson, described the article as a “philanthropic food fight” – a creative headline that quickly spread across philanthropic and nonprofit social media.

The Chronicle’s article drew defensive explanations from the CEO of the Atlas of Giving and the leaders of Giving USA.* If you don’t want to read the details, the arguments boil down to differing methodologies for tracking contributions and competition around monetizing those methodologies. If you do want to be geeky, you can read more about the methods at Atlas of Giving, Giving USA, and for good measure, the Blackbaud Index.

Here’s the larger issue: all of the methods miss the forest for the trees – or, to keep the food-fight theme, the buffet for the appetizer tray.

Ultimately, these resources primarily focus on giving to incorporated charitable (501(c)(3)) organizations. They reinforce a narrow definition of the term “philanthropy,” imprisoning it within artificial tax and legal boundaries. It’s the same mistake made by the pundits worried about the Stubborn 2% Giving Rate – U.S. giving to charity being fairly level at 2% of GDP and 2% of disposable income.

A classic definition of philanthropy is “voluntary action for the common good.” It is the generosity you and I feel and express, and it spills far beyond tax and legal boundaries (see my previous post on this issue). This recent “food fight” over philanthropic data doesn’t fully include:

  • Giving to charitable organizations that isn’t reported to the IRS – e.g. giving by people who don’t itemize on their taxes; the cash we drop into jars at counters, buckets on street corners, and collection plates of congregations; text message giving; and some crowdfunded gifts.
  • Giving to organizations that aren’t 501(c)(3)s – gifts to advocacy organizations, civic organizations, and other non-charitable nonprofits. This giving may not be charitable by some people’s definitions, but it is definitely a legitimate tool for achieving a public or social result.
  • Supporting the common good through gifts to individuals, unincorporated groups, artists, social enterprises, and even businesses through cash, crowdfunding platforms, and grassroots groups such as the Awesome Foundation and Sunday Soup. As just one example, an annual report on crowdfunding platforms shows about $2.7 billion raised in 2012, with 38% going to “social causes” and 25% going to arts and environment projects. The public version of the report doesn’t show the percentage going to traditional charities.
  • Support for people and families in need through remittances instead of charities (a World Bank report estimates worldwide remittances at $401 billion in 2012).

I’ll grant that Giving USA and others know that they don’t track those forms of philanthropy and that tracking those forms is much harder than tracking gifts to 501(c)(3) charitable organizations. However, I’m increasingly convinced that these data resources, and the media hits they generate:

  1. Can distort public perceptions about philanthropy, further separating the concept from the values and actions of everyday people and reinforcing philanthropy and its attendant tax benefits as a privilege of “the 1%.”

  2. May display a lack of cultural competency, as the data collection methods undercount the giving patterns of the growing percentages of non-Caucasian populations and immigrants in America.

  3. Will become less accurate over time as Millennials, and perhaps other generations, increasingly express their financial commitment to the common good outside of the charitable sector and tax-advantaged giving.

What do you think? Are these data resources useful as-is, or do we need something more expansive?

* Full transparency: I was born, raised, and educated in Indiana. Shortly after we Hoosiers learn to walk and/or drink beer, the Indiana University’s School of Philanthropy pours fundraising knowledge and data into our heads. I count I.U. staff as mentors, colleagues, and occasional fellow beer drinkers in my philanthropic career.

2/20/14 update: Giving USA and Atlas of Giving are facing off in an online radio show on Feb. 21 that will be archived for download.

Would Your Nonprofit Succeed With Donors Who “Give With Purpose”? (part 2)

Written for and cross-posted at

Let’s try a quick exercise: pull up the website of your favorite nonprofit. Can you answer any of these questions based on the information on that site?

  • Does the nonprofit demonstrate familiarity with evidence and best practices related to the need it addresses by citing research, data, reports, past experience, or other reliable sources of information?
  • Does the organization demonstrate cultural awareness and roots in the community?
  • Are its results likely to stick over time for the intended beneficiaries?
  • Do the board members play a meaningful role in supporting the organization through their work and financial support?

These are just four of the 35 questions from the new RISE Assessment Tool to evaluate nonprofits. The Learning By Giving Foundation offered this tool to thousands of people who participated in its Massive Online Open Course on philanthropy, Giving With Purpose, this summer. The course describes “giving with purpose” in two goals: a) satisfy your personal motivations for giving, and b) invest in high-performing organizations. In my last post, I gave my assessment of the course itself and its ability to meet the first goal. In this post, I’ll dig into the second goal.

The RISE Framework

Course instructor Rebecca Riccio created the RISE Framework for Social Change to define four “hallmarks of strong organizations”: Relevance, Impact, Sustainability, and Excellence. The RISE Assessment Tool has a set of questions and rating criteria for each hallmark. The Foundation hasn’t released a final version to the public yet, but I pasted the questions into a document (download the PDF – RISE Assessment Tool – 2013-Aug Vsn) as an example.

Hundreds of the participants in this summer’s course tried using the assessment tool on the websites of the 700+ nonprofits nominated by other participants. The nonprofits ranged from all-volunteer, faith-based efforts to large, long-time civic anchors. Ms. Riccio briefly encourages people to visit nonprofits in person. But the design of the course biases participants toward finding information on nonprofit websites and online services such as Guidestar and Charity Navigator.

What if Donors Use the Tool?

In my use of the RISE Assessment Tool during the course, and in feedback I saw from other participants, nonprofit websites mostly came up short on answers. The tool will likely inhibit donations if, as Ms. Riccio suggests, donors use it before they choose to make a donation.

First, most nonprofits are small and underinvest in their communications, technology, fundraising, and evaluation capabilities. Of the 1 million public charities in the U.S. that submit 990s to the IRS, about 75% have budgets of less than $500,000. (There are also 386,000 congregations and an unknown number of very tiny, local charities that don’t have to make information available publicly.) Only a small percentage of these nonprofits choose to focus their limited resources on the operational and program management criteria in the tool.

Second, not all nonprofits are in the business of “social change.” In the course, Ms. Riccio notes that the nonprofit sector is very diverse. However, I think the assessment tool and thrust of the course work best for nonprofits providing direct services (e.g. mental health, education, or international development). Historical societies, conservation groups, arts organizations, and others will fail many of the tool’s criteria for relevance and impact.

Lastly, nonprofits are receiving conflicting advice on the focus of their websites. Ms. Riccio tells participants to look past good stories and packaging to find answers to the assessment tool’s questions. Her advice, of course, isn’t new. The Better Business Bureau and others publish checklists of nonprofit information that should be publicly available. And, her questions about results and impact are similar to the new, controversial effort by Charity Navigator to rate nonprofits on the reporting of their results.

However, nonprofits hear a different story from experts who advise on fundraising communications, Millennial and Gen X giving, and engaging donors in social giving and crowdfunding models. Those advisors tell nonprofits to lead with compelling stories, share-able multimedia content, and pictures and quotes from donors’ peers. That content might describe results, but not in the ways that match the expectations of the RISE Assessment Tool and its peers.

With all respect for Ms. Riccio’s hopes for success with the course and tool, my bet is still on the nonprofit websites filled with compelling, share-able multimedia content. Here’s why…

Hearts Still Win Over Heads

Ms. Riccio joins a chorus of philanthropic advisors, authors, and consultants who earn money trying to convince donors to stop, think, research, and create criteria before they give. As an example, most issues of the Chronicle of Philanthropy carry a new opinion piece telling donors and foundations what they should do. All their work isn’t changing mainstream donor behavior, at least not yet.

Hope Consulting’s Money for Good reports, the 2013 Millennial Impact Report, the annual Burk Donor Surveys, and other sources report that donors say they want to see tangible results and good performance in nonprofits. This hope for giving with purpose reverberates through the philanthropy media and is amplified as a real trend. But, we donors are only human. Our behavior frequently doesn’t match our intentions and we’re susceptible to all types of cognitive and emotional biases.

We have a variety of motivations for making charitable gifts, and the emotional, social, and spiritual reasons win over the intellectual ones. One of the classic books on donor motivation, The Seven Faces of Philanthropy, showed that only 15% of donors selected nonprofits based on considerable research and evaluation. The more recent Money for Good research showed that only 16% of donors were driven to primarily support high impact nonprofits.

The Money for Good research also shows that donors don’t spend much time researching nonprofits before they give. Only 35% of the 5,000 donors surveyed reported doing any research. And, when donors did research, it was to validate their donation, not to find the “best” nonprofit. They were looking to see if the nonprofit seemed reputable, had low overhead, and did good work (not necessarily defined as theories of change, performance measures, business plans, etc.).

Do I Now Know How to Invest in High-Performing Nonprofits?

The Tool didn’t add to my knowledge or skills. But as a long-time philanthropoid, I’m not the target audience for the Giving With Purpose course and RISE Assessment Tool.

In my previous post, I identified some potential audiences for the course – college classes, giving circles, professional advisors, and nonprofit staff and board members. Those audiences may be predisposed to spend more time exploring their giving preferences and evaluating nonprofits. The RISE Assessment Tool could be helpful if they’re realistic about the capabilities most nonprofits have (or lack) and what information is publicly available (or not). If the recent research on Next Gen Donors is to be believed, Gen X and Y donors may be predisposed to ask nonprofits harder questions. The Tool’s questions wouldn’t be a bad start, though the five Charting Impact questions might be a simpler one.

Despite my critiques and challenges over these two posts, I very much hope the Learning By Giving Foundation sees the summer 2013 version of the course as a good beta test. Hopefully the foundation will take the course’s advice and evaluate the participants’ behaviors and knowledge over time, use the participants’ feedback to update the content and process, and offer future, improved versions.

Should you take the course? If you’re at the point in your life that you have more time to consistently think about and act on your generosity, the course could be a helpful launchpad for that process. If a MOOC isn’t your style, ask Nathaniel James or me about the other strategic philanthropy and philanthropic planning tools on our bookshelves and browser bookmarks.

If you took the Giving With Purpose course too, I’d love to hear your experience, especially if you disagree with my take on the course.

Giving With Purpose: Will It Stick? (part 1)

Written for and cross-posted at

This summer, the Learning By Giving Foundation launched its Massive Online Open Course (MOOC) on philanthropy, Giving With Purpose. News reports played up the fact that Doris and Warren Buffett were backing the course and that some nonprofits would receive “Buffett money” because of the course. That financial carrot likely drove the initial registration to upwards of 10,000 people.

I have to give kudos to the Learning By Giving Foundation and the course presenter, Rebecca Riccio. I’m thankful for their commitment to providing free philanthropy curricula, to openly experimenting with the still-evolving MOOC format for learning, and to being willing to learn from the feedback provided by thousands of course participants.

The course encourages people to “give with purpose” – described in two goals: a) satisfy their personal motivations for giving, and b) invest in high-performing organizations. In this post, I’ll cover my thoughts about the course as a whole and its ability to help participants understand their personal motivations for giving. I’ll write about the “invest in high-performing organizations” goal in the next post.

The Experience

The six-week course combines about an hour of video content with about an hour of homework each week. There are a few short quizzes, but no final test and no grades assigned. Participants can optionally spend more time on the “Giver” track. Givers complete an initial online profile of a nonprofit to nominate it for a grant and assess six to eight other nonprofit profiles based on what they learn through the course. This summer, at least 1,300 participants also posted thoughts and commented on others’ posts in a Google+ Community.

Ms. Riccio delivers all of the educational content in short video segments. She also interviews donors ranging from Warren and Doris Buffett to the founders of Ben & Jerry’s (previews are available on YouTube).

Ms. Riccio is often an engaging instructor, and the course mercifully lacks endless slide decks. That said, if a student isn’t attracted to Ms. Riccio’s teaching style, he or she is out of luck. Some points are reinforced by visuals. However, new ideas often go by quickly, or in quick lists of options, and should be reinforced by written materials or downloadable slides. And there isn’t always an easy connection between the videos and other online content. I don’t know if this disconnect is due to the Foundation’s first experimentation with online learning or to the limitations of the platform (Google’s Course Builder). I do know there are more sophisticated online learning platforms available (the instructional firm In The Telling offers one example).

At the end of the course, I was left with four questions.

1. Who is the target audience?

The course’s website says it is “for students who are passionate about or interested in philanthropy.” It doesn’t say whether that means the Learning By Giving Foundation’s traditional audience of college students or a broader audience. Judging by profile pictures in the Google+ community, most participants this summer were adults.

The content is too basic for practiced philanthropists, philanthropic advisors, and foundation staff. But, I think these audiences would find the course useful:

  • College classes – while the content seems designed to stand on its own, people unfamiliar with the nonprofit sector or new to giving would benefit from a teacher or tutor providing context and reinforcing some points.
  • Giving circles – the content could be useful fodder for discussion in a giving circle and/or for new circle members who aren’t familiar with the basics of the nonprofit sector and options for giving. It could serve the same function for corporate giving offices to educate employees and service clubs to educate members.
  • Professional advisors – accountants, financial planners, and lawyers who don’t have charitable giving as part of their regular practice could use the course as a quick grounding. That said, see the caveat in the next question.
  • Staff, volunteers, and board members of nonprofits – the majority of the participants this summer seemed to already be connected to one or more nonprofits (likely because of the incentive of grants at the end of the course). Many said the course provided useful background on the sector and how to look at their favorite nonprofit(s) through a new lens. A nonprofit CEO could use the content to help staff see the nonprofit sector through the eyes of donors.

2. Is it too much of not enough?

The instructor, Rebecca Riccio, does a good job of synthesizing dozens of ideas from the literature on strategic giving and on nonprofit assessment. And, she frequently counsels participants to be fair and realistic when looking at issues such as evaluation and nonprofit overhead.

That said, participants may end up with too much of not enough information. As an example, she spends less than four minutes total introducing three tools: theories of change, logic models, and performance measurement practices. Participants are encouraged to look for these tools as signs of an effective nonprofit, but aren’t shown examples or provided context to judge quality if they do see them. Even seasoned foundation staffers have trouble assessing these tools, and too few nonprofits have good ones.

This felt like the equivalent of telling average consumers to buy a car based on the engine components. Sure, we can memorize a couple basic facts and ask the auto dealer to pop open the hood so we look like we’re being smart buyers. But it doesn’t mean we really know what we’re looking at.

3. Do I better understand my personal motivations for giving?

Each week included content designed to help participants reach both course goals (learn to satisfy your personal motivations for giving and learn to invest in high-performing organizations). Unfortunately, I don’t think the course succeeds on the first goal.

Ms. Riccio poses good questions at the beginning of the course, for example: “Does a nonprofit have meaning for me?”, “Is investing in it a meaningful way for me to make a difference?”, and “Does a contribution fit into my financial plan for giving?” But the course doesn’t provide concrete tools to explore these and the other values-driven and personal finance questions it poses.

Truly understanding your personal motivations for giving takes time. In my experience working with donors and donor families, they’re far more likely to take time for self-reflection and soul-searching in two circumstances. The first is when they’ve encountered a big change (e.g. selling a business, losing a parent, or inheriting wealth). The second is when they have an ally (e.g. an advisor, pastor, or life coach) and/or a peer group regularly holding them accountable for the activity and documenting the results. Simply scanning the ideas, or presenting information freely online, won’t change donor behavior.

4. Will it stick?

Thousands of people have now viewed content designed to help them be more purposeful givers. A subset of them attempted to practice what they saw on a few nonprofits’ websites, 990s, and the often-incomplete nonprofit profiles submitted by participants. The course will accept another round of students at some point (TBD at the time of this post).

I’m not convinced that the content will stick for the majority. Like many great resources on strategic giving, the course provides a useful roadmap, but following the map takes too much work for the donations under $100 that make up most charitable giving. In addition, even for wealthy donors, the motivation to give to high-performing nonprofits is easily undermined by other motivations and emotions (see Money for Good and other research on donor behavior).

Returning to our car-buying analogy, we may know all the right technical features we should value, but our purchases end up being driven more by emotion – the status the car conveys, the feel of the drive, the hot new color or style, loyalty to a brand, etc.

What if it does stick?

But, what if I’m wrong? What if the Giving With Purpose content does stick for hundreds or thousands of donors over time? Then, the nonprofit sector could be in trouble. The course raises expectations for the information donors should be able to easily learn about nonprofits. Most small- and mid-sized nonprofits won’t meet those expectations. That’s the subject of my next post.

If you took Giving With Purpose, please feel free to weigh in with your experience and reactions by commenting on this post! 

Why Your Evidence Is Ignored

“The truth is rarely pure and never simple.” – Oscar Wilde, The Importance of Being Earnest 

Thanks to the wonders of social media (specifically Darin McKeever’s Twitter feed), I just read a fascinating article – Why Policymakers Ignore Evidence – by Gerry Stoker, Professor of Politics and Governance at the University of Southampton, U.K.

Stoker’s article is meant as advice to other academics. Based on my experiences, I’d argue his advice is equally useful to nonprofit and foundation staff trying to communicate with politicians, donors, and other community leaders. From Stoker’s many points, I’ve pulled out nine questions that could be swirling in someone’s brain when presented with your facts or your proven solution:

  1. Why is this important right now, and more important than other issues?
  2. Why are you bringing me a problem analysis and no solutions?
  3. What are the administrative, financial, and/or political challenges we’ll face in doing this?
  4. How much social and political capital will we burn trying to get this done?
  5. What are the opportunity costs of diverting time and resources to this?
  6. Is the information coming from a trusted resource or ally?
  7. Does the idea match my personal values and world view?
  8. What are the unintended consequences?
  9. Are you communicating to me in language I understand?

If you’re a program officer trying to sell your board on a strategy, could you predict their answers to these questions with confidence? If you’re a nonprofit advocating for a program, how would your city councilors or state legislators respond to these questions?

I see too many of my colleagues get so caught up in their causes – in the “rightness” of their ideas and evidence – that they forget to see the world through others’ viewpoints and experiences. Lord knows, I’ve made these mistakes too.

Stoker’s advice is similar to counsel provided by lobbyists and political communications strategists. Fortunately, there are free resources nonprofits and funders can use to be more successful in turning facts into persuasive communications. Two good places to start are Spitfire Strategies, and if you lean more liberal, the FrameWorks Institute.

What’s On Your Charitable BOLO List?

Part of the recent scandal surrounding the IRS’s Exempt Organizations office was its use of BOLO (“Be on the Lookout”) lists. IRS staffers used the lists to screen groups applying for tax-exempt status based on certain phrases and criteria. Groups that didn’t make it through the screen were subjected to longer and deeper reviews (see the Treasury Inspector General’s report here).

I’ll leave arguments about the appropriateness of the IRS’s processes and staff to others. But, the screening process and lists got me wondering about my grantmaking and philanthropoid peers: “What’s on your charitable BOLO list?”

Foundations and other grantmakers often have public guidelines that describe their priorities. At their best, the guidelines provide clear signals about how inquiries and proposals might be screened. In too few cases, they may even meet nonprofits’ hopes for transparency as recently reported by the Center for Effective Philanthropy.

The Treasury Inspector General’s report described BOLOs as an internal “shorthand way of referring to a group of cases.” I’m likely confirming the suspicions of nonprofit leaders everywhere: grantmakers have BOLOs too, even if unwritten or not labeled as such.

At their most benign, these short-hand BOLO lists are a way to streamline internal communications, e.g. “the founder’s favorites,” “the grantees from initiative X,” or “groups using evidence-based practices.” The lists might signal shorter due diligence processes or a shared sense of internal trust.

At their most constructive, the BOLOs are a means of prompting increased due diligence to better understand how the grantmaker can be most helpful, e.g. “groups that have cashflow problems” or “groups going through executive transition.”

At their most troublesome, the BOLOs can prevent good ideas and organizations from receiving fair consideration. The lists become an automatic gatekeeping mechanism that might not accurately describe the diversity of opinions and interests you’d expect from a group of people at a grantmaking organization. In my work with and around grantmaking groups over time, I’ve witnessed* BOLOs such as:

  • Thresholds around overhead ratios or other financial ratios that don’t take into account the variety of business models and stages of organizational growth
  • Expectations about board composition or behavior that may not fit the cultural norms of an ethnic group or community or 21st century, networked practices
  • Expectations of particular personality traits of nonprofit CEOs
  • Groups of nonprofits that are deemed “too small” or “too large” to be effective
  • Nonprofits’ policy or advocacy work unnecessarily preventing them from receiving consideration for program support

My Take

Grantmaker staff and board members, like all people, carry biases, experiences, and relationships that influence their perspectives on their work and on their potential customers. We’re all human – it’s impossible to avoid these influences completely. I suspect BOLOs exist in nonprofits as well – “clients that are abusing the system,” “ticket buyers that will never convert to donors,” “those irresponsible parents from that neighborhood,” etc.

Board, grantmaking committee, and staff members of grantmakers need to uncover and discuss our BOLOs on an ongoing basis, even if they seem benign or useful. I encourage new members to be brave enough to ask about them – to BOLO for BOLOs if you will. And, we veterans (myself included) have to challenge ourselves not to let cynicism and years of work create accidental BOLOs. My grantmaker mentors cautioned me always to question my own assumptions and biases before talking with a nonprofit or reading a proposal. I know I haven’t succeeded 100% of the time on this. But I’m willing to dig deeper to ensure any short-hand lists don’t get in the way of my being open to ideas that come my way.

So, what are the BOLOs, even unwritten, at your grantmaking shop? And, what are you going to do about them?


* DISCLAIMER: The examples are not and should not be interpreted as the views of my current or past employers or consulting clients. That said, ask me sometime about a former BOLO regarding audacious ideas mounted on foam core.