One of the things I’ve found most striking about working in philanthropy is how much time we spend focused on what we do wrong, what we could do better, what we’re not achieving, and so on. To be honest, it can be a little deflating, especially in a field like ours that can and ought to be uplifting 24/7. Fortunately, every once in a while, one does get the opportunity to read something really inspiring—the kind of thing that reaffirms one’s pride about working in this sector, with these people.
I’m referring to the letter published today by Darren Walker, president of the Ford Foundation, on “What’s Next for the Ford Foundation?” I won’t rehearse the letter’s content here—you should just read it. Darren does a wonderful job knitting Ford’s great past together with its present and future aspirations, underscoring the foundation’s continuing commitment to social justice and equality, and sketching out important shifts in how the foundation will do its work going forward. One of the criticisms most often leveled at big foundations is that they can’t or won’t change. Certainly that’s not true here. The sorts of changes Darren describes, particularly in how Ford makes grants, will not be easy to implement. They are, however, the right kinds of change—not just for Ford, but for the whole sector. I was particularly excited to see Ford’s commitment to “a concerted effort to support stronger, more sustainable, and more durable organizations,” including through making “larger, longer-term grants that can be used more flexibly.”
I, for one, am eager to follow “what’s next.” The Ford Foundation’s storied past gives it a special place in American history and U.S. philanthropy. Under Darren’s leadership, its future looks bright, and it seems well positioned to preserve that place. And its success will be everyone’s success.
When I joined the Hewlett Foundation in 2013, a renewed attention to transparency was high on my list of priorities. The Foundation has long been committed to openness and sharing information. As my very first post for this blog noted, we were among the first foundations to publish the results of our Grantee Perception Reports, and we have made a practice of sharing information about our grantmaking over the years. And we went still further after I arrived, sharing things like the grant descriptions we provide to our Board and a detailed study we prepared for them of grant trends in the past decade. We launched this blog, in fact, to offer Hewlett staff members a way to share their thinking, so others would know what we are up to and could challenge us.
But philanthropy is—or should be—all about learning, and we’ve learned that there really can be too much of a good thing. Even when it comes to transparency. We are thus replacing, or perhaps I should say modifying, our Transparency Initiative, which will now become our Translucency Initiative.
Quite simply, over the past few months, my colleagues and I have grown concerned that we may have gone too far with all this sharing. We wonder whether providing so much information just adds to the noise—pouring more and more data into the ceaseless flood of infographics, spreadsheets, and “must-read” thought pieces that we all confront each day. I confess that sometimes, staring at my ceaselessly replenished inbox late at night, I feel a little overwhelmed by it all. And I’m not even on Twitter!
So rather than continue contributing to the endless stream of humdrum data, we’re ready to make a change. We’re still committed to sharing information about our work, of course, but we’ll do so in ways that are hopefully easier for us to manage and you to digest. With apologies to the Foundation Center, we think of it as having “frosted glass pockets.”
To give you some idea of what this will look like in practice, I’ve included an image of our reimagined grants database, which reflects our new policy of translucency. We asked ourselves: is there really that much difference between a $200,000 grant and a $250,000 grant? As you can see, in the new database, grants of both sizes will be categorized as “A lot of money.” All grants over $5 million will simply be listed under “This better work.”
Our new translucent grants database.
We’ll do something similar for evaluations of our strategies. Rather than diving deep into the weeds of performance indicators, metrics, M&E, and ROI, we’ll ask our evaluators to share their findings using a clear three-point scale: “Getting out the checkbook,” “Meh,” and “Let us never speak of this again.” Simple, wouldn’t you agree?
Consistent with our new commitment to translucency, I’ve shared only the broad, somewhat fuzzy (but colorful!) outlines of our new policy. There’s much more to it, of course. How could there not be? We’re still a foundation, after all.
If you simply must know the intricate details of our current thinking on the topic, I encourage you to read the whole darn thing.
Members of the Nairobi (Kenya) Young and Old cooperative group gather in their small center to make products to sell. (Photo Credit: Jonathan Torgovnik/Reportage by Getty Images, licensed under CC BY-NC 4.0)
Among the Hewlett Foundation’s oldest, strongest, and most enduring priorities has been to help women gain control over crucial decisions in their lives. In the beginning, our attention was on expanding access to family planning services, so women in the U.S. and in developing countries could control the number of children they bore—benefitting not only the women themselves, but their families, their countries, and the environment.
That commitment remains as strong as ever, but our vision has broadened with experience. Under the leadership of Anne Firth Murray (who directed the Foundation’s Population Program from 1978 to 1987 and later was the founding president of the Global Fund for Women), the Hewlett Foundation widened its focus and placed family planning within a broader framework of women’s health and human rights. That vision continues today to inform the work of our Global Development and Population Program.
But learning never stops, and with enthusiastic encouragement from our Board, the Global Development and Population Program has fashioned plans for a new line of grantmaking—expanding our efforts still further to encompass advancing women’s economic opportunities in developing countries. The importance of economic opportunity in promoting the values that have always motivated our efforts to support women is now evident, a connection established by decades of social science research as well as our own experience.
I encourage you to take a look at a more complete description of the new strategy. Our concerns are not the usual ones—microfinance, vocational training, and the like. Such efforts can be important, but we hope to empower women by showing economic decision makers in international agencies and developing countries how women contribute to growth and how they are affected by relevant policies, from taxation to employment regulation (Ruth Levine’s Friday Note from last week addresses this). We want to see better data, focused and relevant research, and informed advocacy, so the true value of women’s economic contributions can and will be fully and properly recognized—and further contributions encouraged.
Like our continued investments in expanding access to contraception and safe abortion, we believe these sorts of changes will help women realize their full potential as citizens, as workers, as parents, and as people.
Last March, our Board approved a new “Cyber Initiative” with a budget of $20 million over five years. The Initiative aims to build a field of policy analysis for problems relating to security and technological trustworthiness on the Internet. While government and industry are both already spending vast sums of money to deal with such problems, their focus is overwhelmingly on present needs and problems and mainly involves developing technologies to combat hackers, thieves, and enemies. Hardly anyone is thinking about the lasting consequences of today’s solutions, much less about developing overarching policy frameworks for long-term global governance and security.
The importance of having such frameworks cannot be overstated. Our lives increasingly depend on the Internet, and choices we are making today about Internet governance and security have profound implications for the future. To make those choices well, it is imperative that they be made with some sense of what lies ahead and, still more important, of where we want to go. Yet little or no thought is being given to such questions, partly because of avoidable obstacles. At present, few institutions treat questions of cybersecurity and Internet policy as a central or even important focus of their work. Individuals with proper training to address these questions are in short supply, and those who exist seldom speak to each other or share information. Nor has anyone given them reason to do so: funding for this sort of work is practically nonexistent.
The Cyber Initiative seeks to overcome these obstacles, and, in so doing, to build a “marketplace of ideas” about cyber policy—generating the kind of robust arguments and analytic frameworks needed to begin articulating sensible long-term public policy. We plan to do this by (1) supporting and/or building dependable, independent institutions capable of training, nurturing, and supporting experts with a sophisticated understanding of the problems; (2) convening experts from government, industry, academia, think tanks, and other arenas to share information and develop the trust needed to work collaboratively; and (3) attracting additional funders to help grow and develop the new field.
The announcement of the Cyber Initiative prompted a great deal of commentary about the need for, and importance of, our plan—much of it likewise emphasizing the absence of serious public policy analysis. These reactions came not just from potential grantees (whose statements may perhaps be taken with a grain of salt), but also from people working in government, industry, the media, and philanthropy. Our specific focus—creating opportunities for people coming from different sectors and disciplines to exchange information and work together, and developing multiple long-term policy options—was singled out for particular approbation. Given the modest size of the initiative, the attention it garnered came as a pleasant surprise, but it also says something about timing: public policy for cyber is a field ripe to be built.
Even so, building the field will not be easy—and not just because of the modest scale of our initiative, which is only $4 million per year for five years. As we explained to the Foundation’s Board in March, our plan has never been to build the field ourselves. Rather, we intend through our grantmaking to demonstrate what is possible, while working to attract additional funders and funding into the field. The reactions to our announcement were heartening precisely because they confirmed our sense that there is widespread interest and curiosity. The bigger problem turns out to be that the field is so underdeveloped that neither we nor other potential funders have adequate places even to begin.
An unanticipated partial solution to this last problem emerged during the summer, with the discovery that the Foundation needs to pay out more this year than originally budgeted if we want to keep the excise tax on our earnings at one percent. The Board agreed in July that we should do so. In September, I updated the Board, explaining that the Foundation needs to spend as much as $50 million before December 31 for this purpose; at the same meeting, the Board agreed to allocate $5 million of this amount for humanitarian aid in connection with the Ebola outbreak in Africa. Those funds are in the process of being disbursed.
At the time of the July meeting, the Board agreed to consider using the remaining funds in connection with the Cyber Initiative. More specifically, the Board gave permission to ask three universities—Berkeley, MIT, and Stanford—to submit proposals for grants to establish multidisciplinary, public policy programs focused on cyber issues broadly understood.
It is worth briefly recounting the reasons behind this decision. As we discussed in July, establishing a number of strong academic centers will powerfully kickstart our Cyber effort—making a potentially transformative difference in launching a new field of public policy analysis. These grants will create critical centers of excellence that give other funders places to start building and enable us to use our limited resources more effectively. And if prior experience is any guide, we can expect other universities, think tanks, and funders to follow suit in launching their own cyber efforts.
A second, more easily answered question is, why Berkeley, MIT, and Stanford? To begin, it makes sense for us to start with major research universities. Eventually, we will need to support think tanks and other potential homes for policy development, including institutions that can attract participation from quirky and unorthodox technology types. But universities are likely to remain the foremost centers for developing long-term policy analysis, especially as our goal is to support analysts who are independent of both government and industry. Universities will also remain the key destination for training new people. Finally, and especially relevant for present purposes, major research universities are among the relatively small set of institutions capable of absorbing and making good use of $15 million grants in short order.
There were, of course, other universities to consider in addition to Berkeley, MIT, and Stanford. Based on our research, however, the three schools we selected seemed like the most promising places to start. All three have concentrations of world-class faculty and graduate students working in a variety of relevant programs and centers scattered across their campuses—programs and centers that could and should be aligned to work collaboratively on cyber issues. What they have lacked are the resources and the impetus to do so, which the proposed grants will supply.
The full memo I shared with the Hewlett Foundation’s Board last month (from which this blog post is adapted) to provide them with the information they needed to approve these grants is now available on our website. It has more details about the proposal development process and the content of each school’s proposal, as well as thoughts on how we will measure success and the benefits and risks associated with making the grants.
If you’re interested in learning more about the thinking that went into making these grants, I hope you’ll take the time to read the whole thing.
Every November, the Board of the Hewlett Foundation authorizes a budget for the upcoming year, and, as part of that process, reviews what progress we have (or have not) made in our grantmaking strategies during the preceding year. As this requires Board members to absorb a great deal of complex detail, last November we rolled out a new version of the Board Book, designed to make the material easier to follow. (June Wang wrote a post about the Board Book redesign for our blog.) The revised Book included, among other things, a new “overview” that presented data for the past five years on the number of grants, their average size and duration, and the percentage that were for general operating support (GOS). Here is what the Board saw:
These figures raised questions for a number of Board members, who found them surprising in certain respects. Several asked whether our grants had become smaller in amount and/or shorter in duration than they used to be. Others wondered if we were drifting away from the Hewlett Foundation’s longstanding preference for GOS. Still others remarked that it was hard to draw conclusions without seeing the data broken down by program. They asked for a more thorough analysis of our grant trends.
The Board’s reaction stimulated a robust conversation among the staff. Had our grantmaking changed in ways that ought to concern us? Have our grants become smaller or shorter or both? Have we moved away from the tradition of helping institutions through general operating support toward a more controlling emphasis on discrete projects? If so, have these changes affected our staffing or the way we work?
Answering questions like these, we soon discovered, is anything but straightforward. On the contrary, our efforts to do so simply raised more questions. For example, the data we used in November presented GOS in terms of the number of grants, which can be misleading because GOS grants tend to be larger and therefore made to relatively fewer organizations. Would it be more accurate to measure GOS as a percentage of grant dollars? Should the data include Organizational Effectiveness grants, which have become numerous in recent years as part of a concerted effort to help grantees, but are—by definition—small and for a single year? How should we classify something like the extraordinary $500 million ClimateWorks grant, which was GOS and paid out over five years, but booked entirely in the year it was made (thus overstating GOS for that year and understating it for the following four)? Similar complications presented themselves when we focused on other measures, like grant size or duration. Even veteran program staff were surprised by the number of potential variations and complications that emerged in our conversations.
We concluded that a more thorough analysis of our grant trends was called for. To that end, we enlarged our review to cover the past ten years, instead of five. Beginning in 2004 made good sense: by then the Foundation’s endowment had recovered from the bursting tech bubble and incorporated the assets of Bill Hewlett’s estate, and the first stabs were being made to formulate and implement Hewlett’s distinctive brand of “outcome-focused grantmaking.” In addition to making it a ten-year review, we asked the programs to make separate presentations to explain how and why their grantmaking evolved as it did, incorporating a narrative alongside the statistics. We gave each program thirty minutes with the Board at our July meeting, during which they walked through the past decade of grantmaking and described the kinds of things that had shaped their particular outcomes. The memos they prepared for this purpose are included in my annual letter for 2014.
My task, at the conclusion of these presentations—which we interspersed with other business over the two-day meeting—was to draw things together and make some sense of the overall picture that emerged, if one emerged. (It did.)
My full letter shares what we found. I hope you’ll take the time to read the whole thing.
At bottom, philanthropy is about finding good ideas and providing the resources to see them tested, improved, implemented, and, if all goes well, brought to scale. Good ideas are essential but not by themselves enough. Even the best idea fades away without proper support, without a plan for making sure the right people hear about it, without effective advocates to press for its adoption. We do our best to use the resources at our disposal to help spread good ideas. And while it may sound peculiar to hear this from an organization with an endowment in excess of $8 billion, the fact is that there is only so much we can do by ourselves, and often it’s not enough. We need to find other ways to get the good ideas we support the widest possible hearing.
The changes we’ve made in our practices related to openness and transparency are in service of this goal—sharing what we learn with others so they can build on our successes (and avoid our failures). Now it’s time to take that one step further.
The Hewlett Foundation has for many years supported open licensing—a simple way to displace traditional copyright that facilitates and encourages sharing intellectual property. Grants to organizations like Creative Commons, which established and maintains a set of these open licenses, and to the many nonprofits that have received funds as part of our Open Educational Resources strategy, have helped to create the legal, cultural, and intellectual infrastructure for more open sharing of ideas. And we have long made information about our grantmaking available under one of Creative Commons’ licenses (as you can see by clicking the link at the bottom of this, and every, page of our website).
The benefits of open licenses are clear, and they are substantial. Reducing the burdens and removing the risks associated with ordinary copyright—making it easier for others to use, share, and build on work—magnifies the impact of new research and good ideas. Everybody wins.
For that reason, beginning this year we will ask grantees to openly license materials created with our grant dollars. More specifically, the Hewlett Foundation now requires that grantees receiving project-based grants—those made for a specific purpose—openly license the final materials created with those grants (reports, videos, white papers, and the like) under the most recent Creative Commons Attribution license. We also will require that the materials be made easily accessible to the public, such as by posting them to the grantee’s website. These requirements do not apply to grants made for general operating support of an organization or a program or center within an organization, because they are incompatible with the nature of general support. We do very much hope, however, that the positive experience of openly licensing materials created with project-based grants will lead grantees to do so for all their work.
That last paragraph comes from a new document in the “Values and Policies” section of our website, Commitment to Open Licensing. It explains why we are making this change, what the new policy covers, and how we intend to implement it.
We recognize that any time we place new requirements on grantees, we’re asking them to change practices, and we want to be sure that the benefit from the change outweighs the cost of making it. To that end, we’re rolling this new requirement out to our programs slowly—we’ll continue to refine our internal practices, learn from how grantees respond, and make adjustments as needed. If the open license we suggest isn’t appropriate for a particular grant, we’ll work with the grantee to find one that is.
As with all of our efforts aimed at increasing transparency and openness, we’re making this change because we believe that this kind of broad, open, and free sharing of ideas benefits not just the Hewlett Foundation, but also our grantees and, most important, the people their work is intended to help.
Solving the kinds of challenges the Hewlett Foundation chooses to address requires good ideas, but ideas are not enough. Asking grantees to make sure their ideas are shared, so others can learn from and build on them, will help those ideas go further, be challenged and strengthened, and, in the end, do more good.
Last week, I spent three hours on two long conference calls . . . and I can’t wait to do it again. Now maybe you read that and think something like: I bet that’s his entry for “Most Unlikely Opening for a Blog.” Except it happens to be true.
The idea for the calls came from Fay Twersky, director of our Effective Philanthropy Group. As part of the Foundation’s ongoing effort to work more openly and transparently, she suggested holding an open conference call with our grantees. We could share important news, she explained, and, more important, could let them ask questions they might have of me or of our program directors. The plan was to hold something akin to a corporate shareholders’ meeting—a town hall-like forum for conversation, inquiry, and dialogue.
We weren’t sure it would work. To begin with, everyone is busy and we didn’t know how many people would participate. Even if people did participate, we couldn’t be sure what kinds of questions they would ask. It’s an unfortunate fact that grantees are sometimes reluctant to ask us hard questions, presumably for fear the message will be unwelcome or perceived as biting the hand that feeds them. (To which I can say only that the concern is misplaced: we have thick skins here at the Hewlett Foundation and would much rather know what you think so we can make changes if necessary.) We also worried about whether the questions would be of broad enough interest to engage the whole audience. Our grantees work in disparate fields and face vastly different challenges in their day-to-day work. Would people who work on family planning in Africa find value in listening to questions about performing arts in the Bay Area or conserving the Boreal Forest?
Still, we thought the idea worth trying. So we sent an invitation to all our active grantees. To make participation as easy as possible we decided not to ask for RSVPs, and we blocked out time for two calls—one in the early morning, another the following day in the late afternoon—to accommodate grantees in different time zones around the world.
The first call was only so-so. There were fewer participants than we had hoped, and they asked only a handful of questions. The questions were good—touching on things like our new blog, our processes for evaluation, and our rationale for term limiting program officers—but it was clear that participants weren’t entirely sure what they could or should ask. We ran out of questions before the allotted time had passed and ended the call a few minutes early.
We used the day between the two calls to think about what we might do better. I had framed the first call around the results of our most recent Grantee Perception Report, which may have left participants unsure about posing questions that had nothing to do with the survey. So I opened the second call without any reference to the GPR and instead emphasized that we were eager to talk about anything and everything. At the urging of several colleagues, I also focused on speaking more slowly, as even my own staff said I was hard to follow, which probably discouraged some potential participants.
The second call went much better. The time of day made a big difference, and more than twice as many callers were on the line. Participants asked wonderful, challenging, interesting questions on topics ranging from how we can better support grantees during turbulent economic times to how we think about partnerships with government or with other foundations, how we balance short-term expectations for individual grantees with the long-term outcomes we hope to achieve, and more. We took eighteen questions, went a few minutes over the allotted hour and a half, and still had questions in the queue when we ended the call. (We answered these separately off-line.)
I found the calls challenging and interesting, nerve-wracking at moments, but exhilarating and informative. And fun. On balance, the experiment was a clear success, and we’ll definitely do it again. I hope to reach even more of our grantees as word spreads. It’s a wonderful opportunity to get and give feedback. It’s also an opportunity to foster a sense of community, to give grantees a feel for the breadth of the Foundation’s work and for each other. More than a few wrote me afterwards to say they hadn’t realized how many things we support and were surprised how much they got from hearing about wholly disparate fields. Most of all, it’s an opportunity for us to listen and learn.
Of course, no need to wait. If you have questions or comments about the Foundation and its work, please let us know. We want to hear from you.
It is, I believe, generally recognized that collaboration among funders is important, because none of us has the resources to make serious inroads on big problems by ourselves. Yet despite widespread acknowledgment of its value, I’ve been surprised by how difficult it can be to form fruitful collaborations with other funders. Collaboration happens, but less than it could—even though foundations are unregulated and not competing with each other.
The question is, why?
Many people say the fault lies with foundation presidents and board members, who some think discourage collaboration because of ego or to avoid sharing credit. I haven’t encountered this myself and doubt it as an explanation. More plausible—and this is something I have witnessed—is that presidents and boards can make collaboration difficult by becoming overly directive and inflexible about the specifics of where and how to make grants.
A bigger problem—bigger because it inheres in the structure of most foundations—is the agency cost associated with dispersed decision-making. As a practical matter, collaborating on any particular project or program requires buy-in from a foundation’s program officer, program director, president, and board—each with different degrees of knowledge, different levels of commitment to existing strategies, and different incentives to change. Not surprisingly, aligning so many differences can be challenging.
Ongoing collaboration is hard, for instance, when program officers from different foundations agree but must wait months for their respective boards to sign off. A president who wants to support a request from another foundation may be reluctant to override a more knowledgeable and informed program officer who does not want to veer from an ongoing strategy the program officer devised. A program director may discourage the president or board from collaborating on a particular strategy, even one the relevant program officer supports, if this could affect the allocation of resources among strategies within his or her particular program. And so on.
Funders can overcome these obstacles, as we see in collaborations that already happen. But it’s not easy, and I have increasingly come to believe that overcoming these hurdles—whether by persuading foundations to be more flexible about process or by aligning their internal constituencies—is difficult because there is a dearth of what is known in international relations theory as “diffuse reciprocity.”
I haven’t blogged in a while, and instead of devoting this space to a single item, I thought I’d catch you up on a number of recent developments at the Hewlett Foundation.
First, you may have noticed that we’ve renamed our blog. “Work In Progress” was a good name, but it felt a tad too earnest. So in the Silicon Valley spirit of rapid prototyping and iterating, popularized by design thinking, we decided to just change it. If we don’t like it, we’ll change it again.
I hope you’ve been reading Ruth Levine’s “Friday Notes.” While they are one of my favorite things about Fridays, I confess to being somewhat competitive. So I’ve decided to take Ruth on. Henceforth, I’ll be offering these “Tuesday Notelets.” I can’t write as fast as Ruth, so they’ll be shorter than Ruth’s Notes (that’s why we’re calling mine “notelets”). I’m sure you’ll find them interesting, though. I’ll be checking the Google Analytics on this, so please don’t let me down. Whether or not you actually read them, I’d appreciate it if you would just click on them a lot.
Speaking of strategic philanthropy (because we’re always speaking of strategic philanthropy around here), a recent evaluation that involved a robust randomized control trial suggests that our approach to grant making does no better than random chance. Before acting on this somewhat unexpected finding, I want to see if we can replicate it internally. So we’re going to take a portion of our grantmaking portfolio and throw darts at a dartboard to select a control group of grantees. We’ll be looking closely at these so-called “bullseye” grantees to see if they do better at achieving results than those we select through our normal rigorous strategic approach. Fingers crossed everyone!
It amazes me how many people think that I’m the president of HP. Anyone who really knows me, after all, knows how much I really, really hate computers. (And don’t get me started on tracked changes. But I digress.) Other people think I work for the Packard Foundation, or for any number of other organizations that have either Hewlett or Packard in the title. When I first told my mom about this job, she couldn’t understand why I would leave the law school to go into the home printer business. To solve this annoying brand confusion problem, we are going to propose a merger with the Packard Foundation, the Lucile Packard Children’s Hospital, the HP Foundation, and HP itself. The new entity will be named HF-PF-LPCH-HPF-HP.
Here’s another piece of really exciting news. If you read my initial post about transparency back in November, you know how truly committed I am to sharing everything I can about the Hewlett Foundation, and that should include what it’s like to be a foundation president. To that end, I’ve agreed to be a contestant on the TV show Big Brother (I wanted to do Survivor but worried that at my age I’d be voted off the Island too quickly). A little unorthodox, perhaps, but we pride ourselves on experimentation and risk, and this seems like a terrific opportunity in that light. I promise to watch my language and always wear a robe or towel around the hot tub.
Finally, inasmuch as we’re in the heart of the technology world in Silicon Valley, it seems crazy to make grants using such antiquated means of payment as money. So over the next eighteen months, we’ll be transitioning to Bitcoin. (But we’re already planning our move to Dogecoin. Dogecoin is the future, man, and we want to get with the program.) I don’t actually know how Bitcoin works, but I read that interview where Marc Andreessen takes Warren Buffett behind the shed for criticizing it and decided the whole cryptocurrency thing sounds really cool.
That’s all the news I have for now on this April 1. Hope you keep clicking, especially here.
Here are three statements that shouldn’t be controversial: climate change is real; human activities are a major factor contributing to it; and the path we are on will lead to massive social, economic, and human suffering here in the United States and around the world. Yet whenever the subject comes up, in venues large (like the Intergovernmental Panel on Climate Change’s 5th Report, which synthesizes 9,000 studies prepared by more than 800 authors from 40 countries) and small (a blog post last week by Erin Rogers of our Environment Program on the need to focus on carbon pollution rather than adaptation), some people, for whatever reason, want to continue a debate that should be regarded as settled.
In our case, critics challenged the scientific consensus on climate change, questioned our motives, and tried to change the subject. One pointed to the PBS NewsHour (which we support) as being unwilling to air the views of skeptics, and intimated that the Hewlett Foundation is responsible for silencing dissent on that program. But even if we were telling the NewsHour what to air (we’re not, of course—our grant to them is for general operating support, and long predates our work on climate change), what of the thousands of other news outlets, governments, and scientific academies around the world that have reached the same conclusion about the irrefutable reality of climate change?
Ours is just one small blog, of course, and it has attracted only a few critical comments. On a far larger scale, critics of climate science have seized on a finding by the IPCC as evidence that the threat posed by global warming is overstated if not downright fabricated. In particular, critics trill that the panel’s 5th report indicates that the Earth’s temperature has not increased as much as the 4th report, released six years earlier, predicted, and that the rate of warming has in fact slowed in recent years—a trend the skeptics say the IPCC cannot adequately explain.
This is, if you will pardon the pun, just so much hot air. In fact, the 5th Report differs from the 4th in precisely the ways we should expect (and want) from good science, relying on continued work and refinements in collecting data to produce better, more accurate models. More important, the IPCC has an explanation for the slowdown in rising temperatures that makes quite good sense.