CEP recently shared the full report and presented the findings to all staff. I asked Hewlett Foundation Evaluation Officer Amy Arbreton about the findings.

Q. These Grantee Perception Report (GPR) findings seem pretty consistent with our last survey, completed in 2013. Would you agree? What do you see as the major finding in these survey results?

Yes, there was very little change—which, given the relatively high ratings we received on many measures in 2013, we see as a good thing.  Overall, Hewlett Foundation grantees continue to have very positive perceptions of the Foundation, but there are areas for improvement.

The Foundation has maintained or improved on many of the already positive ratings we received in 2013. Transparency is a notable example: grantees now rate the Foundation significantly higher on overall transparency than they did in 2013.

As in 2013, Hewlett Foundation grantees rate the Foundation's impact on their organizations, and the helpfulness of its selection and reporting/evaluation processes, as similar to those of the typical funder. That's been consistent for the last few years, but we ranked higher on measures of impact in past surveys, so those are areas where we can continue to look for ways to improve.

Q. One of the reasons we value the GPR is that it provides comparative data, because CEP also surveys the grantees of other large philanthropies. What does that additional context tell us?  

Because CEP conducts the online survey, it ensures confidentiality for grantees and gives us an independent perspective on a wide range of areas we care about. The survey is also comprehensive: CEP surveys all of our active grantees, not just a selected sample.

Probably the most helpful part, though, is the comparative data that CEP provides. Grantees tend to give funders positive ratings, so the comparative data give us a sense of how we are doing relative to our peers and to our own past performance. When CEP reports the findings, we see ourselves compared with all the other foundations in its very large data set, with a set of large foundations that share similar characteristics with ours, and now with our own results from past years.

Some of the key points highlighted in the 2015 GPR are that our staff’s understanding of grantees’ fields, advancement of knowledge, and effect on public policy continue to be areas of strength for us. But as I mentioned, we have seen higher ratings in past years on things like the helpfulness of selection and reporting/evaluation processes.

Q.  That’s helpful context. Of course, our program staff have been looking at these data closely to see what we can do to improve our grantmaking. What would you highlight as areas that you expect the Foundation will be working on?

Across the Foundation, we take the findings seriously and look for ways to improve. Each program looks at its own results and makes plans to address the areas most relevant to its goals. Looking at the Foundation as a whole, a few areas stand out that are relevant to every program.

Though our overall transparency rating has increased, there is room to do more, particularly in sharing what we have tried that has not worked.

Impact on grantee organizations is another area that we care a great deal about and one where programs will continue to pay careful attention.  CEP results suggest that our general operating support grants and Organizational Effectiveness (OE) grants tend to have a bigger impact on grantee organizations, so I expect program staff will continue to consider that in thinking about the mix of grants we make.

Grantees' open-ended comments suggest that Hewlett might want to more fully include grantees and other relevant stakeholders in developing our grantmaking strategies, and to communicate more broadly and clearly about the opinions and input that inform our strategic choices. Programs will likely work on building those kinds of opportunities for input and feedback more directly into our practice of strategy development and implementation.