Published first on Harvard Business Review
As concerns about scarcity and inequality become increasingly urgent, many investors are eager to generate both business and social returns—to “do well by doing good.” One avenue is impact investing: directing capital to ventures that are expected to yield social and environmental benefits as well as profits. But there’s a problem: Although the business world has several universally accepted tools, such as the internal rate of return, for estimating a potential investment’s financial yields, no analogue exists for evaluating hoped-for social and environmental rewards in dollar terms. Forecasting gains is too often a matter of guesswork.
Investors hoping to use a company’s track record on social and environmental impact to assess future opportunities will similarly find little useful data to evaluate. The reporting of environmental, social, and governance issues is now standard practice at nearly three-quarters of the world’s large and mid-cap companies, but it is usually confined to information about commitments and process and rarely scores actual impact on customers or society.
Key industry players have recognized these analytical shortcomings and have stepped up their quest to better understand impact measurement and management. Notable among them are Root Capital, the MacArthur Foundation, the Omidyar Network, Skopos Impact Fund, Bridges Impact+, the World Economic Forum, and the Rockefeller Foundation. This work has produced a number of interesting metrics, including social return on investment (SROI). The Impact Management Project, a collaborative launched in 2016 involving foundations and major investment managers, aims to weave all these threads together into a shared language about impact management and to develop a set of practical tools to implement best practices. Building on this work, the organizations we work for—the Rise Fund, a $2 billion impact-investing fund for growth-stage companies managed by TPG Growth, and the Bridgespan Group, a global social impact advisory firm—have attempted over the past two years to bring the rigor of financial performance measurement to the assessment of social and environmental impact. Through trial and error, and in collaboration with experts who have been working for years in the field, the partnership between Rise and Bridgespan has produced a forward-looking methodology to estimate—before any money is committed—the financial value of the social and environmental good that is likely to result from each dollar invested. Thus social-impact investors, whether corporations or institutions, can evaluate the projected return on an opportunity. We call our new metric the impact multiple of money (IMM).
Calculating an IMM is not a trivial undertaking, so any business that wishes to use it must first determine which products, services, or projects warrant the effort. As an equity investor, Rise does a qualitative assessment of potential investments to filter out deals that are unlikely to pass the IMM hurdle, just as it filters out deals that are not financially promising. Companies with a social purpose and a potentially measurable impact get a green light for IMM evaluation. Rise will invest in a company only if the IMM calculation suggests a minimum social return on investment of $2.50 for every $1 invested. Businesses that adopt this metric can set their own minimum thresholds.
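At its core, the hurdle test reduces to dividing the estimated social value created by the capital invested. The sketch below shows that mechanic; the function name and dollar figures are our own illustration, not Rise's internal model.

```python
def impact_multiple_of_money(social_value: float, investment: float) -> float:
    """Estimated social/environmental value created per dollar invested."""
    return social_value / investment

# Rise's published hurdle: at least $2.50 of social value per $1 invested.
HURDLE = 2.50

# Hypothetical deal: $25M invested, $75M of projected social value.
imm = impact_multiple_of_money(social_value=75_000_000, investment=25_000_000)
print(f"IMM = {imm:.2f}x -> {'passes' if imm >= HURDLE else 'fails'} the hurdle")
# IMM = 3.00x -> passes the hurdle
```

A business adopting the metric would substitute its own threshold for `HURDLE`.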
To be clear, numerous assumptions and choices are involved in this process, precluding any claim that our method can provide a definitive number. But we believe that this approach provides valuable guidance regarding which investments will or will not have a significant social impact.
In the following pages we explain how to calculate an IMM during an investment-selection process. The method consists of six steps.
1. Assess the Relevance and Scale
Investors should begin by considering the relevance and scale of a product, a service, or a project for evaluation. A manufacturer of home appliances may want to consider investing in energy-saving features in its product lines. A health clinic provider may want to assess the potential social benefits of expanding into low-income neighborhoods.
With regard to scale, ask, How many people will the product or service reach, and how deep will its impact be? Rise’s experience with calculating the product reach of the educational-technology company EverFi, one of its first impact investments, provides a good example. (The financial and participation data in this article is representative; the actual numbers are confidential.) Rise identified three EverFi programs that already had significant reach: AlcoholEdu, an online course designed to deter alcohol abuse among college students, which was given at more than 400 universities; Haven, which educates college students about dating violence and sexual harassment and is used at some 650 universities; and a financial literacy program that introduces students to credit cards, interest rates, taxes, and insurance, and is offered at more than 6,100 high schools. On the basis of projected annual student enrollments in these programs, Rise estimated that an investment in EverFi could affect 6.1 million students over a five-year period beginning in 2017.
Of course, a program’s impact is not just about the number of people touched; it’s about the improvement achieved. Fewer people touched deeply may be worth more than many people hardly affected. Consider another Rise investment, Dodla Dairy, which procures and processes fresh milk every day from more than 220,000 smallholder farmers across rural southern India. The number of farmers affected was known, so what Rise needed to assess was how much milk Dodla was likely to buy from them and at what price. With projected sales of 2.6 billion liters of milk over five years, Rise estimated that investments in Dodla would increase farm families’ annual incomes by 73%, from $425 to $735. Smallholder farmers with a reliable buyer for their milk spend less time and money marketing and have the predictability and support needed to make long-term investments, increasing milk yields and, therefore, income.
2. Identify Target Social or Environmental Outcomes
The second step in calculating an IMM is identifying the desired social or environmental outcomes and determining whether existing research verifies that they are achievable and measurable. Fortunately, investors can draw on a huge array of social science reports to estimate a company’s impact potential. Over the past decade foundations, nonprofits, and some policy makers (including the U.S. Department of Education’s Investing in Innovation Fund) have relied heavily on research results to guide funding for social programs. This “what works” movement has spurred the development of an industry around social-outcome measurement, led by organizations such as MDRC, a nonprofit social-policy research organization; the Abdul Latif Jameel Poverty Action Lab (J-PAL), at MIT; and Mathematica Policy Research, based in Princeton, New Jersey.
For AlcoholEdu we drew on a 2010 randomized controlled trial demonstrating that students who had been exposed to the program experienced an 11% reduction in "alcohol-related incidents" such as engaging in risky behaviors, doing or saying embarrassing things, or feeling bad about themselves because of their drinking. That would amount to some 239,350 fewer incidents. According to the National Institutes of Health, about 0.015% of alcohol-related incidents among college students in the United States prove fatal. On that basis, Rise estimated that AlcoholEdu would save 36 lives among the approximately 2.2 million students who were projected to engage with the program over a five-year period. (Lives saved, arguably the most important impact of less drinking, are relatively straightforward to monetize. But reducing alcohol abuse clearly has additional benefits for individuals and society.)
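The arithmetic behind that estimate can be sketched as follows. We read the NIH figure as the share of alcohol-related incidents that prove fatal; the inputs are the representative figures quoted above.

```python
# Representative inputs from the article (actual figures are confidential)
incident_reduction = 0.11       # RCT: 11% fewer alcohol-related incidents
incidents_averted = 239_350     # projected over five years
fatality_share = 0.00015        # ~0.015% of incidents prove fatal (NIH)

# Sanity check: the implied baseline is roughly one incident per
# participating student (239,350 / 0.11 ~ 2.18M incidents across 2.2M students).
lives_saved = incidents_averted * fatality_share
print(round(lives_saved))  # -> 36
```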
For Haven we focused on the prevention of sexual assault. Some 10.3% of undergraduate women and 2.5% of undergraduate men experience sexual assault every year. According to a 2007 study that evaluated the effects of an in-person course on preventing sexual assault that was taught at a college in the northeastern United States, assault declined by about 19% for women and 36% for men among those who took the course.
Applying this data to 2.6 million students expected to experience the Haven program over five years, and assuming that an equal number of college women and men participated, Rise estimated that the program would avert 25,869 incidents of sexual assault among women, and 12,029 incidents among men.
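A back-of-the-envelope version of that calculation, using the rounded rates quoted above, lands close to (though not exactly on) Rise's figures, which presumably rest on more precise inputs:

```python
students = 2_600_000
women = men = students // 2     # article assumes an even gender split

annual_assault_rate = {"women": 0.103, "men": 0.025}
course_reduction = {"women": 0.19, "men": 0.36}  # 2007 in-person course study

averted_women = round(women * annual_assault_rate["women"] * course_reduction["women"])
averted_men = round(men * annual_assault_rate["men"] * course_reduction["men"])
print(averted_women, averted_men)  # -> 25441 11700 (article: 25,869 and 12,029)
```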
3. Estimate the Economic Value of Those Outcomes to Society
Once they have identified the target outcomes, social-impact investors need to find an "anchor study" that robustly translates those outcomes into economic terms. Cellulant, a regional African provider of a mobile payments platform used by banks, major retailers, telecommunications companies, and governments, is a good example. Cellulant worked with the Nigerian Ministry of Agriculture to redesign a corruption-plagued program that provided seed and fertilizer subsidies. The company developed a cell phone app that allows farmers to pick up their subsidized goods directly from local merchants, reducing the opportunity for graft. The program had been losing 89% of funds to mismanagement and corruption. Cellulant's app now enables delivery of 90% of the intended aid.
Our task was to understand the economic impact on farmers when they received the subsidized seed and fertilizer. We used a reliable study that compared one season’s outcomes for farmers enrolled in the subsidy program with those for similar farmers who were not enrolled. The study found that participating farmers earned an additional $99 that season by improving maize yields.
To choose an anchor study we look at several key features. First, its rigor: Is it a meta-analysis that systematically evaluates previous research to derive conclusions about that body of work? Alternatively, does it present findings from a randomized controlled trial, which compares groups with and without a designated intervention? Both types of research are preferable to observational or case studies. Just as important is relevance: Does the study include people living in similar contexts (urban, say, or rural) and in the same income bracket? The closer the match, the better. Recent studies are better than older ones. And studies frequently cited in the research literature deserve extra consideration.
When uncertainty or a lack of reliable research stalls your work, seek guidance from an expert in the field. For example, we sought advice from the Center for Financial Services Innovation, in Chicago, when we could not locate appropriate studies demonstrating the impact of helping people establish a regular savings habit—one of three impact pathways we were examining for Acorns, a fintech company for low- and middle-income individuals. That call led us to research showing that even modest savings among the target group can reduce the use of high-cost payday loans.
To translate the outcomes of AlcoholEdu into dollar terms, we turned to the U.S. Department of Transportation’s guidance on valuing the reduction of fatalities or injuries, which uses a measure called the value of a statistical life. According to this anchor study, a fatality is worth $5.4 million. Thus AlcoholEdu could expect to generate social value of at least $194 million by saving 36 lives.
In the case of Haven we found that researchers at the National Institutes of Health have done quite a bit of work on the economic impact of sexual assault. In fact, the NIH has pegged the legal, health, and economic costs of a single assault at $16,657, adjusted for inflation. Rise multiplied the NIH figure by the estimated number of sexual assaults Haven would avert (37,898) to get close to $632 million. Because sexual assault is underreported, Rise believes that Haven’s impact may be even greater.
Wrestling with Moral Issues
At times, monetizing social or environmental benefits and costs raises complex questions. For instance:
- Does an extra dollar of income have greater impact on someone in an emerging market versus someone in a developed market?
- When increased income is the target outcome, should we count that impact no matter how much the family was earning before, or only when it earned below a certain threshold?
- When saving lives is the desired outcome, can we put a dollar value on each person who benefits?
- Health economists’ estimates of the value of a statistical life (VSL) vary dramatically by country—but should human lives be valued differently just because of an accident of geography?
To address such questions, Rise relies on research to ground its decisions in evidence rather than intuition. For instance, for some IMMs Rise has created a global weighted average value of a life saved rather than using a country-specific metric, to avoid the unintended consequence of tipping investments in favor of developed countries. For other IMM calculations Rise has looked at how impoverished people actually spend incremental dollars in contrast with those in a higher income bracket. Such difficult issues merit ongoing attention from the investment and research communities.
For EverFi’s financial literacy program we relied on a 2016 study that looked at a similar program for high school students. It found that program participants had an average of $538 less in consumer debt at the age of 22 than a similar group of students who hadn’t been exposed to the program. On average, interest paid on that additional debt came to about $81 over five years. Assuming that 1.3 million students completed the EverFi program over five years and they all saved $81, the economic value of the program would total $105 million.
We estimated that the social impact of the three EverFi programs combined had a five-year economic value of about $931 million: $194 million for AlcoholEdu, $632 million for Haven, and $105 million for financial literacy.
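Pulling the three monetizations together, using the representative figures quoted above (small rounding in the inputs explains why our Haven product comes in just under the article's $632 million):

```python
VSL = 5_400_000                       # DOT value of a statistical life
COST_PER_ASSAULT = 16_657             # NIH estimate, inflation-adjusted
INTEREST_SAVED = 81                   # five-year interest saved per student

alcohol_edu = 36 * VSL                # -> $194.4M
haven = 37_898 * COST_PER_ASSAULT     # -> ~$631.3M (article rounds to $632M)
fin_lit = 1_300_000 * INTEREST_SAVED  # -> $105.3M

total = alcohol_edu + haven + fin_lit
print(f"${total / 1e6:.0f}M")         # -> $931M
```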
4. Adjust for Risks
Although we have proved to our satisfaction that social science research can be used to monetize social and environmental benefits, we recognize the risk in applying findings from research that is not directly linked to a given investment opportunity. Therefore we adjust the social values derived from applying the anchor study to reflect the quality and relevance of the research. We do this by calculating an “impact realization” index. We assign values to six risk categories and total them to arrive at an impact-probability score on a 100-point scale.
Two of the index components relate to the quality of the anchor study and how directly it is linked to the product or service. Together these account for 60 of the possible 100 points. Anchor studies based on a meta-analysis or a randomized controlled trial merit top scores, whereas observational studies rate lower. AlcoholEdu’s study was in the former category; Haven’s and the financial literacy program’s studies were in the latter.
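The article does not enumerate the six risk categories or their exact weights, so the breakdown below is purely illustrative (the category names and point splits beyond the 60-point study-quality share are our assumptions). What it shows is the mechanics: component scores summing to a 100-point probability score, which then discounts the raw social-value estimate. Multiplying raw value by the score is one natural reading of the adjustment.

```python
# Maximum points per category. Only the 60-point weight of the two
# study-quality components comes from the article; the rest is illustrative.
MAX_POINTS = {
    "anchor_study_rigor": 30,      # meta-analysis/RCT scores highest
    "anchor_study_relevance": 30,  # how directly it maps to the product
    "other_risk_1": 10,
    "other_risk_2": 10,
    "other_risk_3": 10,
    "other_risk_4": 10,
}

def risk_adjust(raw_value: float, scores: dict) -> float:
    """Discount a raw social-value estimate by the impact-probability score."""
    assert all(scores[k] <= MAX_POINTS[k] for k in scores)
    probability = sum(scores.values()) / sum(MAX_POINTS.values())
    return raw_value * probability

# An RCT-backed but imperfectly matched anchor study might score like this:
example = {"anchor_study_rigor": 30, "anchor_study_relevance": 20,
           "other_risk_1": 8, "other_risk_2": 7,
           "other_risk_3": 5, "other_risk_4": 5}

print(risk_adjust(194_400_000, example))  # 75/100 -> 145800000.0
```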