This article explores what it looks like when a foundation attempts to integrate accountability and learning practices, and presents a framework for the unique and complementary contributions that accountability and learning can make to the work of foundations. The article also looks at the tensions that can arise when a foundation’s internal evaluation staff attempt to design, implement, and make use of accountability systems. Specifically, it identifies three problematic perspectives that can hold foundations back from full engagement in internally driven accountability initiatives, and offers practical guidance on how to shift these mindsets to more productive practices.
Kresge’s Strategic Learning, Research and Evaluation Practice this week launched its first-ever page on the Kresge website to share its knowledge and evaluations with the sector and with community partners across the country. The web page reflects the expansion of the practice with the hiring of Strategic Learning and Evaluation Officer Anna Cruz, who joined Director Chera Reid. The foundation’s strategic learning, research and evaluation function was established in late 2015 to help Kresge program and practice teams connect data and insights with action to expand opportunities for people with low incomes in America’s cities.
This article highlights the findings of an 18-month pilot project conducted by AcademyHealth to help the Robert Wood Johnson Foundation better understand the impact of a subset of the foundation’s research grants, across investment types, on health insurance coverage and health reform, and to help inform how the foundation may more systematically track and measure the impact of the research it funds.
This report shares what the Casey Foundation learned from Baltimore residents — both interviewees and consultants — and demonstrates that young adults and other community voices can be incorporated into the institutional decision-making process. It delivers a road map for implementing similar research models while helping leaders sharpen their strategies for engaging youth in educational, workforce development and employment opportunities.
A report by the Casey Foundation highlights that while two-generation approaches — efforts to create opportunities for parents and children together — have evolved and improved as a promising strategy to interrupt intergenerational poverty, gaps in the research base are hindering progress in bringing the best efforts to scale. The report offers recommendations on how public and private funders can target their evaluation and funding strategies to build evidence demonstrating which approaches and components work best; what kinds of research funders should support; and how to communicate evidence-based findings effectively to inform program leaders and policymakers.
This article shares insights from a five-year evaluation of the Oral Health 2020 network, an effort by the DentaQuest Foundation to align and strengthen efforts in service of a national movement to improve oral health. The evaluation helped to place the foundation’s journey in the context of a broader field seeking new approaches to achieve deep and sustainable social change.
One of the most effective things you can do to strengthen your dialogue-to-change program is to create an accurate process for documenting and evaluating the entire effort. Many people who take part in your program will want to know who participated, how effective the organizing strategies were, and what the outcomes were. Moreover, grant-making foundations, public officials, news media, and other people who can help you expand, strengthen, and institutionalize the dialogue-to-change program in your community will all want to know about your efforts and their impact.
1. Revisit your goals
2. Set benchmarks
3. Document the basics
4. Track
This article uses the evaluation of the Orfalea Foundation’s initiative to provide a case example of a rigorous and useful sunset evaluation, and discusses other possible extensions of these kinds of methods.
Atlantic Philanthropies’ exit strategy involved a formal partnership arrangement with the Northern Ireland Assembly. This article draws on qualitative data gathered through interviews with key stakeholders — the funder, government officials, and NGOs — and considers the consequences of this approach for sustaining and mainstreaming policies and practices. It also offers both specific and general lessons on partnering with government as an exit strategy.
As the need for scarce grant dollars grows more intense, so does the need to make certain those dollars are spent as effectively as possible. Hence the question of how to evaluate the consequences of grant-supported activities has risen to the forefront. This tool will help you evaluate those consequences and make sure grant money is spent as effectively as possible.
Program evaluations provide answers to a key question: “Did this program work in a particular population and setting?” For funders who haven’t been trained in reading evaluations, wading through such reports can be confusing and time-consuming. So, to get the most out of program evaluations, here are seven questions to ask when reading them.
Author: Center for High Impact Philanthropy, University of Pennsylvania
In an effort to support the broader mission investing field, Aligning Capital with Mission offers, for the first time publicly, findings that highlight the strong alignment of The Annie E. Casey Foundation’s Social Investment Program’s impact with the Foundation’s mission and programmatic goals. The report also offers lessons and insights surfaced during the course of the evaluation that can inform foundations and other impact investors in more effectively measuring the impact of their investments.
While evaluation has traditionally focused on assessing programmatic impact according to pre-determined indicators, a new approach is needed for evaluating complex initiatives, as well as initiatives operating in complex environments where progress is not linear, predictable, or controllable. Nine propositions can help evaluators navigate the unique characteristics of complex systems, improve their evaluation practice, and better serve the needs of the social sector.
This final impact assessment provides an inventory of Beldon’s legacy, five years after its grantmaking ended. It tells the story of a foundation that brought innovative ideas to its grantees, built capacity and infrastructure, and planted seeds that continue to bear fruit. As a spend-out foundation, it struggled with challenges that other spend-outs share, and the lessons learned from this assessment are intended to help others embarking on similar paths.
The Beldon Fund relied heavily on independent external evaluations to make early and mid-course corrections to our program strategies and to develop benchmarks to measure progress.
External evaluations (each assessment includes a summary of key findings as well as the full report):
- Evaluation of Beldon’s Program Strategies (PDF): a qualitative evaluation, based on confidential interviews, of the impact of our program strategies.
- Beldon Grantee Perception Report (PDF): anonymous survey of Beldon grantees.
- Beldon Grant Applicant Perception Report (PDF): anonymous survey of Beldon grant applicants.
Evaluation benchmarks: Evaluating Policy Advocacy Grant-Making Strategies.
In this video, participants discuss why so few market-based solutions to poverty are getting to scale, what can be done so they can deliver meaningful benefits to the poor, and more.
Goal-free evaluation (GFE), in program evaluation, is a model in which the official or stated program goals and objectives are withheld or screened from the evaluator. This article presents the case for GFE as a perspective that belongs in a foundation’s toolbox. In particular, this article demonstrates GFE’s actual use, highlights aspects of its methodology, and details its potential benefits.
Five Guidebooks for Grantmakers (2011): these concise guides provide resources for developing and expanding the use of evaluation and evaluative thinking in grantmaking organizations:
- Basic Concepts for Grantmakers
- Using Logic Models
- Evaluation Data Collection
- Evaluative Thinking for Grantmakers
- Commissioning Evaluation
Integrating Evaluative Capacity into Organizational Practice (2012): this manual offers concrete ways to integrate evaluation skills and evaluative thinking, beyond the program level, into everyday organizational practice, helping to ensure stronger programs and to support more effective organizations that are better able to deliver on their missions. The new guide also includes a review of Evaluation Essentials.
The second edition of The Logic Model Guidebook: Better Strategies for Great Results is a straightforward guide, with excellent and varied examples, that achieves its purpose of giving readers a “basic understanding of how to create and use logic models” (p. xii). As enthusiastic champions of logic models, the authors adhere to the assumption that articulating precise and detailed logic models will lead to better results.
The authors present an overview of the Kaiser Permanente Community Fund’s social determinants of health initiative and its theory of change. The fund is based at the Northwest Health Foundation. The authors introduce frameworks and methods used to conduct their evaluation. The fund reached multiple sectors and established new partners and relationships, but the lack of depth may limit opportunities to make a profound and measurable difference within any specific domain.
This article reports the results from Mathematica Policy Research’s evaluation of Consumer Voices for Coverage, a program funded by the Robert Wood Johnson Foundation to support the role of consumer health advocacy coalitions in 12 states. The authors propose that funders address three elements of coalition capacity: knowledge, infrastructure, and resources. Each requires different types of interventions.
This article describes the network-building strategy of the Mary Reynolds Babcock Foundation and the role that network officers play in carrying out this strategy. The author then assesses whether this strategy adds value for networks and discusses a range of complications that the strategy introduces, especially with regard to the grantmaker-grantee relationship. Assuming that the foundation can meet these challenges, this approach may turn out to be the most effective way for a foundation to assist networks in achieving their full potential.
The authors concentrate on how the nature of the intervention affects evaluation design. They outline a framework for selecting evaluation approaches for two types of grantmaking programs used to achieve far-reaching impact: models and adaptive initiatives. Evaluation that is attuned to the transformations in models and adaptive initiatives will continue to help fuel these two powerful engines of social change.
Innovation Network developed these three introductory evaluation documents as part of Building Nonprofit Capacity to Evaluate, Learn, and Grow Impact, a workshop we presented in partnership with Grantmakers for Effective Organizations’ Scaling What Works initiative.
The Annie E. Casey Foundation developed a flexible but rigorous “results framework” tool that helped focus its investments and choose grantees that shared its aims by defining success, specifying results, tracking progress, and aligning its work. The framework includes an understanding of population and program accountability and lays out overarching categories for thinking about results: impact, influence, and leverage. This paper describes how the tool was developed and tested with grantees and shares lessons learned for other philanthropies in the field of education.
This article examines a Gates Foundation report on a new Program Return on Investment (PROI) tool that simplifies monetizing the value of program outcomes and may be able to serve as a valuation solution for all human service programs.
This article reports on the accomplishments, challenges, and lessons learned in creating a new Department of Research and Evaluation at the California HealthCare Foundation. Different tools were developed to address each of three key areas: performance assessment, organizational learning, and program evaluation.
This article offers a theory-of-change framework to help those engaged in social-justice advocacy reflect on whether social-justice values are being retained in the process. A reproductive rights effort in South Africa provides an example of how social-justice values can be lost in the advocacy process.
This article tells how real-time evaluation memos provide data-based feedback in a timely manner to inform decision making. Memos must be concise and include both data and expert synthesis and interpretation. The foundation must have a learning culture if the memos are to be most useful; there must be time to reflect on the content and implications.
This article examines the principles that guided the development of a participatory outcome evaluation framework for the Wachovia Regional Foundation’s neighborhood revitalization work. The framework has enabled grantees and residents to better understand and capitalize on market dynamics, enhance their participation in revitalization activities, and begin to demonstrate the impact of sustained, strategic interventions.
Pathfinder is a practical guide to the advocacy evaluation process. This edition guides funders through the advocacy evaluation process from start to finish. Editions for advocates and evaluators are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages the adoption of a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement.
The Colorado Trust has provided three years of general operating support to nine advocacy organizations working to increase access to health through policy change work. The Colorado Trust has worked with Innovation Network to design an evaluation that 1) builds grantees’ capacity to evaluate their work and incorporate real-time feedback into their strategies; 2) monitors the progress of each grantee toward its unique policy goals; and 3) assesses growth in capacity of the health advocacy community in Colorado as a whole.
A Ford Foundation-funded project to promote mixed-income housing in Atlanta resulted in an increase in mixed-income housing; a participatory evaluation also documented social outcomes such as increased knowledge about housing issues.
More foundations are funding public policy change, and standardized policy advocacy evaluation methods are emerging. This article takes a look at the federal policy efforts funded by Atlantic Philanthropies and the W.K. Kellogg Foundation. It discusses two rising grantmaking issues that arise as key to understanding the “why” behind the outcomes of public policy advocacy: 1) creating boundaries around the definition of the problem (not easy when it is truly wicked); and 2) having the right stakeholders identify those boundaries and the solutions to the problem within them.
The John S. and James L. Knight Foundation supplemented its standard evaluation approach by engaging professional journalists to elaborate on evaluation findings. The resulting reports are more direct, even critical, than any prior Knight Foundation effort to evaluate and assess. The approach produced deeper looks into the intent and outcomes of major initiatives, analyzing and addressing flaws in the theories of change underlying them.
A tool for measuring impact can reduce the barriers to funding advocacy and policy work. The tool draws upon the literature on evaluating advocacy and organizing, social capital building efforts, and return on investment approaches to evaluation. The tool was applied in two sites, where funders found it useful to understand advocacy impacts and learn how advocacy can enhance their grantmaking goals.
Participatory evaluation has set the standard for cooperation between program evaluators and stakeholders. Coalition evaluation, however, calls for more extensive collaboration with the community at large. There are many different labels for participatory evaluation, but none specific to coalition work. This article describes Community-Based Participatory Evaluation (CBPE), which takes time, money, and skilled personnel but can lead to more accurate results and coalition sustainability. Recommendations on how foundations and grantee organizations interested in coalition work should fund CBPE are discussed.
The Robert Wood Johnson Foundation has placed a high priority on program evaluation since its inception as a national philanthropy in 1972. It has developed a four-tiered system of evaluation that ranges from the evaluation of individual grants and clusters of grants to the qualitative assessments found in The Robert Wood Johnson Foundation Anthology series.