One of the questions members ask most frequently of CASE is this: "How much do I need to spend to raise a dollar?" However, there is no simple or universal answer. An institution building its fundraising infrastructure or preparing to launch a campaign, for example, may need to spend significantly more per dollar than one with a mature fundraising operation. Institution type, size, location, mission, fundraising goals, donor base and many other factors influence how much a college or university should invest in fundraising operations. The only certainty is that a lack of investment will yield a lack of results.
Complicating the question is the fact that fundraising success is supported by investments in other areas of advancement, including alumni relations and communications and marketing. Should these investments be considered and, if so, how should they be factored in?
In short, a seemingly simple question is actually quite complex and difficult to answer, given the many variables to consider, the lack of consensus on which expenditures to include and the lack of comparable data. CASE hopes to fill this vacuum with the practical AIMS survey tool, which will give members common definitions for what to count; the opportunity to enter their own data on expenses, staffing and other factors; and the ability to select variables for comparing their expenditures and results, anonymously, with those of peer institutions at similar stages of development.
The closest CASE has come to measuring fundraising investments by educational institutions is a study that resulted in the book Expenditures in Fundraising, Alumni Relations, and Other Constituent (Public) Relations. That work, published in 1990 and no longer in print, was funded by the Lilly Endowment and is often referred to as the "Lilly Study." However, the Lilly Study's methodology was never applied to any broad and systematic survey. Furthermore, the world of advancement has changed significantly since the time of the Lilly Study, which called for including the cost of typewriter ribbons but did not account for investments in technology. After more than two decades, CASE thought it was time for a fresh approach.
The specific objectives of the AIMS pilot study were to:
Update CASE's 1990 Lilly Study on the cost of fundraising,
Update the methodology used in the Lilly Study by taking into account, for example, changes in information technology and
Refine the methodology and approach on the basis of experience with a pilot group of volunteers.
The specific objectives of the post-pilot AIMS study are to:
Ascertain the real present value of investing in fundraising and how it varies by type and size of institution, campaign status, staffing and other factors,
Demonstrate the value of advancement to institutions in terms of return on investment and give members systematic, comparable, well-grounded information for making the case to invest,
Encourage the strategic business use of data for data-driven planning and decision-making in the advancement field,
Develop practical methods and strategies for predicting return on investment that can be used by members when planning and expanding their advancement operations to meet given goals,
Help CASE members articulate the rationale for investing in advancement, assess the effectiveness of their fundraising operations and understand how to interpret resulting numbers and use the information for continuous internal improvement of programs,
Give members the tools to benchmark their programs against their peers through the CASE Benchmarking Toolkit and
Gather information on overall advancement investment for the benefit of members and the profession.
The primary purpose of the AIMS project, and its greatest benefit, has been the development of standardized guidelines, definitions and a methodology to gather expenditure data. Each institution can use these tools to:
Assemble its advancement program cost information in the same way each year—that is, by the same rules—and measure progress from one year to the next in generating the appropriate net return on the dollars invested and
Make informed comparisons of program costs and benefits using data from other, peer institutions.
An additional benefit of the project has been the resulting dataset produced by participating institutions using study guidelines and definitions. This report presents these data in aggregate form so that other universities and colleges have, in effect, a ready set of peer institutions with which to compare themselves. As AIMS is repeated in the future, CASE and its members will also be able to track trends over time.
University and college advancement professionals now have access to expenditure data from private and public institutions of three major types: doctoral, master's and baccalaureate. Data are also broken down by stage of fundraising program maturity (start-up, emerging and mature) and by campaign status (whether or not the institution is in a campaign). The AIMS data will enable participants and others to benchmark their programs in a variety of ways.
Beginning with the FY2011-2012 study, CASE-member participants will receive a complimentary copy of the AIMS report, which will be available for purchase by non-participants.
The AIMS study will be conducted using the CASE Benchmarking Toolkit, an online survey tool that helps advancement professionals benchmark activities, staffing, expenditures and other aspects of their programs. Use of the toolkit is a benefit of CASE membership.
Participation is open to CASE-member independent schools, colleges and universities in the United States and Canada.
At most institutions, individuals from several offices will need to contribute data for the survey, but only one individual may be designated to enter the data into the online survey tool. For example, the institution may designate someone in the advancement services office (or another office) as the respondent, and that person may work with appropriate staff members in alumni relations, communications and marketing and development to collect data before filling out the survey.
No. You may fill out parts of the survey as the data becomes available and return to it as many times as you need before completing and submitting it.
While the survey itself is not complex, the questions will require a commitment to gather data from across the advancement offices and to perform the calculations requested. (Not all costs for an activity are necessarily carried in the budget of the department that performs it.) This will require significant legwork on the part of the individual filling out the survey and significant collaboration with staff colleagues in other offices. The amount of time it takes to gather this information will vary widely with the size and structure of the institution. We recommend that institutions allow several weeks to collect data before completing the survey.
CASE will notify members of the opportunity to participate by email and in CASE communications outlets, including BriefCASE and the CASE website.
To sign up, please send the name of the institution and the name and contact information, including email address, of the individual designated to fill out the survey to the CASE research department. Please include "AIMS higher education" or "AIMS independent schools" in the subject line of the email. The CASE research staff will activate the designee's access to the appropriate survey and notify him or her directly when the survey opens.
The person designated by each institution to respond to the survey will see his or her own institution's data in full, but the data of other institutions participating in the survey and the summary statistics will not be reported in any way that connects actual data to an individual institution by name.
To help institutions benchmark with their peers, however, participants will be able to select a group of institutions by name (as long as there are at least six in the group, including the participating institution) within the CASE Benchmarking Toolkit and compare their institution's results with those of the group as a whole (but not with individual institutions by name). In fact, CASE suggests that institutions interested in benchmarking their responses against a peer group encourage their peers to participate. Peer institutions might be defined as those in a particular athletics conference, those within a particular region, or those that share a particular characteristic, such as size or affiliation.
Participants will be able to benchmark their responses to individual questions with all respondents (higher education or independent school, depending on the survey group) as well as with groups of respondents that they define. Data are available in three ways:
Participating institutions can benchmark their responses to individual questions within the CASE Benchmarking Toolkit. They can compare responses with those of all respondents or with a group of selected institutions, as long as there are at least six in the group. Respondents who select a specific group of institutions can compare with the aggregate of responses within the group; in other words, data will not be attached to individual institutions by name.
CASE reports summary data, including calculations on return on investment, by institution type and other factors through a variety of publications and outlets for the benefit of the profession. All participating institutions receive a complimentary copy of this report.
If 10 or more institutions in a peer group agree to participate and notify CASE of this agreement, CASE provides a report of findings, including calculations on return on investment, to the members of that group. The report looks at the aggregate responses of the group and does not include details associated with any individual institution by name.
The basic calculation looks at expenses for fundraising/development and advancement services against total funds raised. However, the survey asks participants to report expenses and staffing in alumni relations and communications and marketing to allow for benchmarking within those advancement disciplines and to allow CASE and institutions to look at overall investments in advancement in multiple ways.
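As a rough, hypothetical illustration of that basic calculation (not an official CASE formula), the cost to raise a dollar and the corresponding return on investment can be sketched as follows; the function name and dollar figures below are invented for the example.

# Illustrative sketch only: a simplified cost-per-dollar-raised and
# return-on-investment calculation reflecting the basic comparison described
# above. The expense categories, function name and dollar figures are
# hypothetical examples, not CASE-prescribed values.

def fundraising_ratios(fundraising_expenses, advancement_services_expenses, total_funds_raised):
    """Return (cost per dollar raised, return on investment) for one fiscal year."""
    total_expenses = fundraising_expenses + advancement_services_expenses
    cost_per_dollar = total_expenses / total_funds_raised  # dollars spent per dollar raised
    roi = total_funds_raised / total_expenses              # dollars raised per dollar spent
    return cost_per_dollar, roi

# Hypothetical example: $1.5M in fundraising/development expenses,
# $0.5M in advancement services expenses and $10M in total funds raised.
cost, roi = fundraising_ratios(1_500_000, 500_000, 10_000_000)
print(f"Cost to raise a dollar: ${cost:.2f}")     # $0.20
print(f"Return on investment:   {roi:.1f} to 1")  # 5.0 to 1

Because the survey also collects expenses for alumni relations and communications and marketing, those figures could be added to the expense total for a broader, advancement-wide view of investment.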
CASE strongly believes that there is no single, universally "correct" figure for how much an institution should invest to raise a dollar, since institutions are at different phases in their fundraising programs, have different missions and goals, and draw on different donor bases. CASE will make this point in all communications related to the project. CASE will not release any data identifying any institution by name, and released data sorted by, for example, institution type are expressed in ranges rather than in absolute, single figures.
Participants decide what to share with their own constituents about how they compare with their peers and with respondents as a whole.
CASE will also provide materials that will give context to the findings and help members understand, explain and use the data.
CASE appointed a volunteer advisory group of seasoned fundraising professionals and expert association representatives to lead the process with assistance from CASE staff. The group had extensive discussions around the difficult questions that needed to be answered to reach consensus on the methodology that would best serve the profession. Several senior independent school advancement professionals reviewed, and modified as needed, the higher education survey instrument and accompanying definitions and guidelines to make sure the resulting documents resonated with the schools community.
The advisory group reviewed the Lilly Study in detail and scanned other relevant literature to identify modern developments in cost-of-fundraising analysis and the key questions still open to interpretation. Members then agreed on scope, detail, working definitions and boundaries to resolve those questions before developing a pilot survey and a supporting document to assist respondents.
Members conducted a pre-pilot test using their own institutional data before recruiting other institutions to participate in a pilot test that took place in spring 2009. The pilot study allowed wider testing of the approach and the survey questions and provided preliminary calculations with real results.
Members of the group were chosen because of their extensive personal experience with educational fundraising and their deep understanding of institutional issues and data.
They include:
Public colleges and universities
Thomas J. Mitchell, chair, vice chancellor, University Advancement, University of California, Irvine
Lori Redfearn, assistant vice chancellor, advancement services, Division of Business and Finance, California State University
Connie Kravas, vice president for development and alumni relations, University of Washington
Private colleges and universities
Richard Boardman, associate dean for development and alumni relations and senior adviser, Harvard Law School
Dr. Peyton R. Helm, president, Muhlenberg College
David T. Blasingame, executive vice chancellor for alumni and development programs, Washington University in St. Louis
CASE Educational Partners
John Glier, principal, Grenzebach Glier & Associates Inc.
Donald M. Fellows, president and CEO, Marts and Lundy Inc.
Other associations
Ann E. Kaplan, director of the Voluntary Support of Education survey, Council for Aid to Education
Matt Hamill, senior vice president, advocacy and issues analysis, National Association of College and University Business Officers
The inaugural study of the 2009-2010 fiscal year benchmarked investments and staffing in each of the advancement disciplines (advancement services, alumni relations, communications and marketing, fundraising and advancement management) as well as the return on the investment in fundraising specifically.