This post originally appeared on the GrantCraft blog. -ed.
One way I mark the passage of another year is the welcome arrival of the latest Blueprint — the annual industry forecast report written by Lucy Bernholz and published by GrantCraft, a service of Foundation Center. This year’s report, Philanthropy and the Social Economy: Blueprint 2015, provides us once again with a rich opportunity to look back at the past year and to ponder what’s to come in the year ahead. The Blueprint is a great marker of time and creates a moment to pause for reflection. As I read this year’s report, I found much to digest, understand, and learn. Like the five previous editions, Blueprint 2015 is provocative, and — as I settled in to read — I was humbled to discover that it brought up many more questions than answers. The report piqued my curiosity about the state of the social economy and more explicitly about organized philanthropy and how we do our work. Specifically:
Are we agile and flexible enough? Are our philanthropic organizations ready?
The words “dynamic” and “dynamism” show up throughout Blueprint 2015, and the pervasive thought I had while reading was that this is an exciting, creative, and expansive time for the social economy. Given this, I couldn’t help but wonder if philanthropic organizations are ready — will we be able to flex, bend, and adapt at the same pace as the change around us? Our ecosystem is evolving, moving, and reorganizing. In this time of globalization, disruptive technology, digital activism, new organizational forms, and even new language, are philanthropic organizations keeping pace? Do we have a picture of what “keeping pace” would really mean?
My experience is that folks doing the work of philanthropy take their role very seriously. It’s a tremendous responsibility to be entrusted with private resources in order to create public benefit. That we take that trust seriously is a good thing. In practice, this means that we tend to be careful, we analyze everything thoroughly, and we remain deliberate, trying hard not to make mistakes. This subtle — or not so subtle — perfectionism creates tension with our desire to also be nimble, innovative, creative, and dynamic. I wonder: how can we talk about and manage that tension? Are there times we should be using philanthropy as true risk capital, maybe leaping more and looking less? Can we be nimble enough to fail, learn, and course-correct quickly, and have that process be okay, even celebrated? It’s clear that many of the newer entrants in the social economy are working from this spirit of moment-to-moment dynamism. How can we collaborate with openness, adaptability, and readiness for change? Are we learning how to be more agile and flexible along the way?
Are the right people/skills at the table?
The other thing that struck me as I read the report is the variety of new skills and voices needed to work well within the changing social economy. We know, for example, that new technologies and digital data are emerging as important sources and byproducts for learning, innovation, and achieving results. It follows, then, that we need to make sure technology and data capacity are being fostered, used, and advanced within philanthropic organizations and across the sector. Together, we need to gain expertise as we take on challenging topics like intellectual property, open licensing, transparency, and privacy. Further, working in a digital world during this time of rapid change requires operational savvy. We need to build and maintain necessary infrastructure to execute well today, while also forging the space so we can adapt and shift easily in the future. Collectively, this is a tall order. Are we listening to the right experts to make this happen? Are we building the necessary capacity and knowledge?
As “pervasive digitization” has become the new normal, have we changed the way we think about technology and data expertise in our grantmaking? It doesn’t seem reasonable that all program officers now also need to be technology experts (though some are). How do we make sure the technologists are being included at the right times? How can our daily work be informed by data expertise and digital best practices, and how do we successfully integrate these into our grantmaking? Bernholz notes that “technologists are becoming part of the sectors that they serve” and imagines a future where “data analysis and sensemaking skills” are integrated into strategy and grantmaking. What new understandings do we need in order to know how we will do this? And who do we need to include in the conversation to live this out fully?
The 2015 Blueprint marks a time that is vibrant, rich, and exciting for us to be working in this sector. It also invites us to adapt, flex, and change — more than ever before. It’s not a perfect metaphor, but sometimes I find myself thinking about the proverb of the shoemaker whose children have no shoes. Those of us who work in philanthropy understand that our grantees need to adapt within changing circumstances and must constantly evolve. We know that executing well is the challenging standard we place upon grantees as we give them resources. I’m not sure we always hold ourselves to the same standard, or that we take the time to know what executing well might mean within our own changing context. Just as we offer capacity building support and technical assistance to the organizations we fund, it’s also important that we do our own capacity building work, making the necessary changes within our organizations to be effective, real-time participants in the social economy. Are we checking ourselves to make sure we have the skills, roles, knowledge, and processes needed to do that?
Our changing ecosystem will certainly require that we become comfortable with the continued blurring of lines and re-imagining of everything around us. As we strive to achieve impact and social benefit, it may mean we need to bring new people to the table, while developing new skills and new ways of working ourselves. My hope is that all of our good intentions and hard work continue to fuel the adaptability, learning, and dynamism that Bernholz points to so brilliantly.
There’s a great quote about “big data” from Dan Ariely of Duke University that’s been circulating for the last couple of years and is still spot on: “Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it.”
This seems particularly true of the philanthropic sector—we’ve all been talking about it, but we’ve yet to find the right way to apply the rhetoric of big data to our work. In part, this is because we don’t actually have many examples that follow the original definition of BIG data. The oft-cited example of Target using big data to determine when a customer is pregnant in order to focus their marketing is an interesting story, but it’s challenging to figure out what the corollary would be in philanthropy. In reality, most of our data is pretty small or—at most—medium in nature. And even with that, we’re still learning to effectively define, collect, use, and share data within our organizations and across the sector.
With all this in mind, it was incredibly refreshing to attend a session at the recent TAG (Technology Affinity Group) conference in Miami that was an honest dialogue about big data and philanthropy. C. Davis Parchment, Manager of the Electronic Reporting Program at the Foundation Center and moderator of the session Managing Big Data Across Foundation Roles: Identifying New Tools for New Teams, set the stage with a much-needed redefinition of big data. Rather than think of big data in its most literal definition as very high volume data sets, Davis described big data as a movement, one that focuses on analysis, rigor, metrics, and critical thinking. Big data as a movement calls on us to employ technology more effectively; to better collect, use, and share data to inform our work, make decisions, and ultimately, to create real and lasting impact through our grantmaking. If we do this well, it will change the way we work—transforming roles and processes within our organizations and redefining how we collaborate and share across the sector.
The first speaker, Kevin Rafter, gave us an example of this concept in action, focusing on creating new data about grantmaking. As Manager of Impact Assessment and Learning at the James Irvine Foundation, Rafter has designed a framework for collecting qualitative and quantitative data about grant results across Irvine’s diverse set of program areas. Irvine is implementing a new grant closing process through which staff will answer five straightforward questions about results before closing a grant. To make it easy, the user interface is elegantly built within their Foundation Connect grants management system. Collecting this information will create a new data set for analyzing their grantmaking impact and learning from grants. By collaborating, an internal project team was also able to simplify and clarify the grant closure process overall. This is a great example of a foundation giving thoughtful attention to how it defines and collects useful data. I’ll be eager to hear more from Rafter once staff have used the tool through several grant closure cycles and they’ve further developed their data set.
We know that an important function for grantmaking data is creating transparency when it is made publicly available by foundations. Suki O’Kane, Director of Administration at the Walter and Elise Haas Fund, brought this concern to the fore by sharing her organization’s experience as the 19th member to join the Reporting Commitment Initiative, which is managed by the Foundation Center as part of Glasspockets. Foundations that join the Reporting Commitment agree to make machine-readable grant information available to the public at least four times a year and to use a common geographic coding scheme for their grants, thus making timely and accurate reporting on the flow of philanthropic dollars more broadly available. Upon joining the Reporting Commitment, the Haas Fund developed an open source tool that makes it easier for other foundations to participate in the initiative. The tool, called Open hGrant, is a WordPress plug-in and is a great example of how innovative technology design can ease the sharing and use of data. As O’Kane artfully described, the contribution also embodies the spirit of collaboration: it improves data sharing practices at the Haas Fund, then takes that learning even further by advancing data sharing within the whole sector. The Open hGrant tool brings us one step closer to eliminating the barriers to better data use in our grantmaking.
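To make the idea of “machine-readable grant information” concrete, here is a minimal sketch of what publishing grant records in a common format might look like. The field names and the geographic coding scheme shown are illustrative assumptions only; the Reporting Commitment defines its own shared format, and Open hGrant publishes data in its own way.

```python
# Illustrative sketch: serializing grant records into a machine-readable
# file for public sharing. All field names and codes below are
# hypothetical examples, not the Reporting Commitment's actual schema.
import csv
import io

grants = [
    {"recipient": "Example Community Fund", "amount_usd": 50000,
     "geo_code": "US-CA", "date_awarded": "2014-03-15"},
    {"recipient": "Sample Arts Alliance", "amount_usd": 25000,
     "geo_code": "US-NY", "date_awarded": "2014-06-01"},
]

# Write the records as CSV with a fixed column order, so any consumer
# can parse the flow of grant dollars without manual cleanup.
buf = io.StringIO()
writer = csv.DictWriter(
    buf, fieldnames=["recipient", "amount_usd", "geo_code", "date_awarded"]
)
writer.writeheader()
writer.writerows(grants)

print(buf.getvalue())
```

The design point is simply that a stable, agreed-upon set of columns (or tags, in the case of an HTML microformat like hGrant) is what turns a foundation’s grant list into data other organizations can aggregate.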
According to Patrick Collins, Chief Information Officer at The William and Flora Hewlett Foundation, another way to ensure better data use is through automation—which makes data activities more efficient and will ultimately transform how we do our work. Specifically, Collins talked about the use of Application Programming Interfaces (APIs) to connect and share data across systems. By connecting with external systems via APIs, Hewlett has been able to automate many repetitive tasks and free up staff time to focus on more substantive challenges. One example Collins shared is how Hewlett now automates simple tax status checking via an API with GuideStar’s Charity Check system, eliminating the need for data entry and administrative staff time. As the use of system interfaces is adopted more widely, it will allow real-time data sharing and will deliver data to users in the right way—and at the right time—for decision-making. This, in turn, will change how people do their jobs, share information, and collaborate.
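To show what this kind of automation replaces, here is a minimal sketch of the decision a staff member would otherwise make by hand. Everything below is hypothetical: the record shape, field names, and eligibility rules are illustrative assumptions, not GuideStar’s actual Charity Check API or Hewlett’s real criteria.

```python
# Hypothetical sketch of automated tax-status checking. In practice a
# grants system would fetch a record like this from an API; here we use
# a hand-built dict with invented field names for illustration.

def is_grant_eligible(charity_check: dict) -> bool:
    """Apply simple (illustrative) eligibility rules to a record."""
    return (
        charity_check.get("public_charity_status") == "501(c)(3)"
        and not charity_check.get("on_ofac_list", False)
        and not charity_check.get("revoked", False)
    )

# A record as it might arrive from a lookup (placeholder values).
record = {
    "ein": "00-0000000",
    "public_charity_status": "501(c)(3)",
    "on_ofac_list": False,
    "revoked": False,
}

print(is_grant_eligible(record))  # True
```

The value of wiring this into the grants management system is that the check runs on every grant automatically, so staff see a flag only when something needs human judgment.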
We certainly don’t have big data completely figured out in the philanthropic sector, but this session at TAG was a nice step in the right direction. I was deeply heartened by the spirit of collaboration, pragmatism, and creativity throughout the presentations and discussion afterwards. There’s big potential on the horizon for data to be used more effectively and—through that—to improve our grantmaking practice and how we all achieve impact. I’m excited by this ongoing conversation and the transformative power possible if we all collect, use, and share data well.
Recently, I had the good fortune to sit down for a long conversation with Lucy Bernholz that ranged across a number of topics related to the practice of philanthropy: digital data, transparency, collaboration, operations, organizational models, and impact. One of the themes that kept coming up was the way lines or structures that have been seen as quite separate within large foundations are starting to break down—or even to blur together entirely. Lucy artfully documented this blurring—past and future—on her blog.
Lucy is a far more skilled and wise prognosticator than I am, and it was a treat to hear her thinking out loud about the future. I was particularly taken with her perspective that in the decades ahead, we will observe a continued blurring of organizational structures between what we’ve separated into “program” and “administration” in large foundations. The grant-giving work of program staff and the traditionally operational functions like Grants Management and IT are becoming less distinct from each other over time. What seemed like bright lines are less clear as digital data, infrastructure, intellectual property, and systems become the lifeblood of innovation, analysis, and learning within foundations and the social sector.
I have a front row seat for this shift as it’s happening within the Hewlett Foundation, so I thought it was worth sharing what it’s looking like in practice to us. If Lucy’s crystal ball is clear (and I have reason to believe it is) there will be lots more of this blurring of lines in the decades ahead.
At the Hewlett Foundation, the Grants Management team—responsible for the operational side of grantmaking—is assigned by program, so each of our programs has a Grants Officer working in partnership with their Program Officers and Program Associates. Program staff experience their Grants Officer as part of their team, attending meetings, conferences, and retreats and remaining engaged in their goals, work, and learning in an ongoing way. This relationship allows the operational side of grantmaking to be in constant dialogue with our programmatic colleagues. It also allows the Grants Management team to develop expertise and depth of experience about the content of our work. My team will never have the content-expertise of our program colleagues (not even close), but we can develop sufficient knowledge to ensure that the infrastructure and execution of grants fully supports and aligns with what our programs need to do their work and to do it well. The Grants Officers bring their own set of expertise in systems, data, project management, and grant requirements to the program teams.
At the same time, the Grants Officers are not solely members of their programmatic team(s). They are also members of our distinct Grants Management department, where together we are able to consolidate and centralize expertise and knowledge while holding a broad view of the organization. Together, we observe patterns, learn and share best practices, aggregate information, analyze and improve digital data, and design systems. We train all new program staff regarding grant practice, workflows, requirements, and technology. We have a broad perspective on the overall risk profile of all our grant activities and work closely with the legal department to manage risk. We observe best practices from each program and cross-pollinate them within other programs. In these ways, there is a constant dynamic interdependence between the program goals of the foundation and the team on the operational side of grantmaking. Our way of working and interacting is a real embodiment of the “blurring” Lucy and I spoke about.
In the past, I confess that I had an understanding of organizational design that was somewhat rigid and lacking in nuance. It was just boxes and lines on paper, or so I thought. But through my experience leading the implementation of this particular organizational design for our Grants Management department, I’ve come to understand how important organizational structures are in supporting effective working environments. Organizational design expert Daniel Kuzmycz taught me how effective organizational design actually supports and reinforces employee behavior. Working with him has helped me further understand that a well-designed organizational structure can act as an anchor to culture and establish norms for relationships as well as collaboration. Since we implemented a “program-assignment” model for grant operations at Hewlett, I’ve had the opportunity to watch this come to fruition beautifully, with more benefits than I could have imagined when we started back in 2011.
My colleague at the Ford Foundation, Susan Hairston, has organized her grants management team similarly. Susan likens the partnership between program staff and grants management to the relationship between a pilot and a flight tower. The program staff are moving towards their goals and in charge of their activities and relationships. At the same time, strong operational support “from the ground” gives the program experts guidance and perspective on how to best achieve their goals and make grants. Susan’s is a powerful metaphor and one that speaks to me of real interdependence and fluidity between organizational lines.
Organizational structure is but one obvious aspect of blurring lines between “program” and “administration” at the Hewlett Foundation. I also observe it all around me in activities large and small. I see it every day in the way systems are developed and utilized and in the strong partnership that exists between our Grants Management and IT teams, learning from and working with our program staff. I see it while using our paperless grant file system, which—because it is designed well—creates and encourages collaboration and transparency throughout the foundation and across organizational lines. I see it in how we use and discuss grant data to think about our grant practices, goals, and results. It’s an exciting time to be working at the intersection of programs and operations in philanthropy, primarily because the blurring of lines makes every day ripe with opportunity. I’m looking forward to seeing what comes next.
A version of this post appeared earlier this summer at GMNsight, the professional journal of the Grants Managers Network -Ed.
Nothing ventured, nothing gained. Or so our neighbors on Sand Hill Road here in the heart of Silicon Valley will tell you. They wear failure like a badge of honor, because it shows they were brave enough to take real risks. True to our roots, The Hewlett Foundation has long embraced failure, too. A perfect grantmaking record would be a sign that we’re not taking the risks that are necessary to accomplish our goals. So, we welcome failure. Not for itself, but for what it can teach us about our work, and how we can do it better.
That kind of growth requires real candor, introspection, and honesty. And there’s risk in that, too—exposing ourselves to criticism, pointing out our own errors in judgment, taking the chance that we might not like what we find. We don’t often think of it this way, but honesty is a risky virtue, maybe the riskiest. It’s one that can push us out to the fringe—of what’s comfortable for us and acceptable for our organizations. Real honesty about our failures takes real bravery. But it’s worth the risk.
In 2012 our Grants Management department was a brand new team—completely reorganized with me as the leader of a new staff working in newly defined roles within the Foundation. Our first priority was to develop trust and cooperation with the Foundation’s program staff. We were an unknown quantity, so it was important to build credibility. We had to prove ourselves as valuable assets to help the Foundation accomplish its goals.
We were inspired to demonstrate to our colleagues how helpful we could be by solving a nagging problem—longstanding confusion over the precise due date for the Foundation’s grantee reporting. The Foundation has two due dates for these reports: a defined due date and a grace period after that date (an additional 30 or 60 days). This grace period creates consistent confusion about the real due date and about when a report is truly late. Staff and grantees can (and do) point to either date as the “correct” one. Our team identified fixing this as a worthwhile and low-risk challenge, a relatively simple way we could bring more clarity and efficiency to the reporting process. Low-hanging fruit, so to speak. We thought clarifying the reporting date question would be an easy project to lead and implement. We were wrong.
Our initial conversations with program staff quickly showed that there were strong opinions about what to do. Staff differed widely in their opinions on how to improve the process—including a sizable contingent who felt the process was fine just the way it was. We learned from staff with international portfolios that the grace period was important for foreign organizations to manage document translations, exchange rate accounting, and the timing required for expenditure responsibility reporting. We learned from program staff how the grace period had been used to maintain goodwill and collaboration in their relationships. Other staff were frustrated with the convoluted dichotomy of a “real” due date and a “sort of” due date. They had a hard time explaining it to their grantees and didn’t fully understand it themselves. They wanted to eliminate the grace period and have just one clear and final due date.
Taken off guard by these unexpectedly strong opinions, my team had a hard decision to make. A project we thought would win us support turned out to put us right in the center of controversy and differing opinions. Ultimately, we left the reporting dates the way they were, because we decided the risk of confronting strong opinions wasn’t worth the potential damage to the relationships we were just starting to form.