Thursday, August 23, 2012

R to the O to the I

In March, we released More to Most: Scaling Up Effective Community College Practices, a guidebook that lays out a process for identifying promising practices and developing an expansion plan so that good programs reach as many students as possible. Central to the process is the SCALERS framework, developed at Duke University and discussed extensively on this blog. There are other steps that form the foundation for SCALERS application. We’re going to tell you about some of them in our next few posts. To download a copy of More to Most, visit the website: www.more2most.org.

The first step in the process is determining program value; you can read a rundown of that step here. One part of that rationale—and a rather popular one in these austere days of budget cuts—is demonstrating the return on investment (ROI), the sometimes-hard-to-quantify case that shows program spending generates a financial return (in increased FTE or other revenues) that offsets operating costs. Standardizing this process, where possible, can help you compare different programs and make decisions about which programs should be discontinued and which should be expanded. This kind of information, of course, needs to be combined with other qualitative data about program effects, and considerations of faculty and student engagement and institutional priorities.

Program ROI
There are a number of ways to analyze the connection between a program’s results and its costs. In 2009, Jobs for the Future and the Delta Cost Project released Calculating Cost-Return on Investments in Student Success. The report determines cost-return on investment by assessing student retention for program and non-program students, the resources required for program operation, and the revenue gained from additional retention, using the following data and calculations:
  1. Additional students enrolled in the next year because of the program: Calculated from the number of students served, the one-year retention rate for program participants, the number of participating students retained, and the one-year retention rate for non-program students.
  2. Direct costs of program operation: Calculated from expenditures for personnel, supplies and equipment, stipends for students, facilities, etc.
  3. Spending and revenue data: Calculated from average expenditures per student and the additional tuition, state, and federal revenue gained from increased retention.

These simple calculations could be the beginning of a formula that is unique to your institution, one that incorporates the costs, revenue, and priorities that are most relevant to your program goals.
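
To make the arithmetic concrete, here is a minimal sketch of that cost-return logic in Python. Every figure is hypothetical, and the report’s own worksheets are more detailed than this:

```python
# A minimal sketch of the cost-return logic above. All figures are
# hypothetical, and the report's own worksheets are more detailed.

def additional_students_retained(served, program_rate, baseline_rate):
    """Step 1: students retained into the next year beyond what the
    non-program retention rate would predict for the same group."""
    return served * (program_rate - baseline_rate)

def cost_return(served, program_rate, baseline_rate,
                program_cost, revenue_per_student, spending_per_student):
    """Steps 2 and 3: weigh direct program costs against the net revenue
    (tuition plus state and federal funds) generated by extra retention."""
    extra = additional_students_retained(served, program_rate, baseline_rate)
    # Net revenue per retained student: what the college takes in, less
    # the average cost of educating that student for another year.
    net_revenue = extra * (revenue_per_student - spending_per_student)
    return {
        "additional_students_retained": round(extra, 1),
        "net_revenue_from_retention": round(net_revenue),
        "net_return": round(net_revenue - program_cost),
    }

# Hypothetical example: 200 students served, 68% vs. 55% retention,
# $90,000 program cost, $7,500 revenue and $5,000 spending per student.
print(cost_return(200, 0.68, 0.55, 90_000, 7_500, 5_000))
# -> 26 additional students retained, $65,000 net revenue, -$25,000 net return
```

Even this toy version makes the tradeoff visible: a program can retain more students and still run a negative net return if its direct costs outpace the revenue those students bring in.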

Local and State ROI
You can also think about ROI beyond the bounds of a program. The various student success programs and policies at a college add up to increasing numbers of students completing credentials. (At least, that’s where we’re headed, right?) And increased completion outcomes can lead to positive returns for communities and states, even the nation as a whole. Being able to make such a case at your college—and across a system—could garner support from policy makers at the state and federal level. The Center for Law and Social Policy (CLASP) partnered with the National Center for Higher Education Management Systems (NCHEMS) to create a Return on Investment Dashboard. The dashboard combines data from the Census Bureau, National Center for Education Statistics, and Department of Education to project short- and long-term effects of maintaining the educational attainment status quo vs. increasing the number of credentialed adults in a particular state.

In addition to credential attainment, state summaries include projections for personal economic gain, as well as returns to the state in the form of state income, property, and sales taxes. The dashboard also includes figures for Medicaid savings and correctional system expenditures, and it lets you manipulate high school completion, college-going rates, credential attainment, and enrollment patterns to see how increases or decreases might affect individual and state returns on investment.
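
If you want a feel for the underlying arithmetic before opening the dashboard, here is a toy projection in the same spirit. This is not the CLASP/NCHEMS model; the cohort size, earnings premium, and tax rate are invented for illustration:

```python
# A toy projection in the spirit of the dashboard: status-quo attainment
# vs. an improved rate. This is NOT the CLASP/NCHEMS model; the cohort,
# earnings premium, and tax rate are invented for illustration.

def projected_state_return(cohort, attainment_rate, earnings_premium,
                           effective_tax_rate):
    """Annual state tax gain from a cohort's credentialed adults, assuming
    credential holders earn a fixed premium over non-holders."""
    credentialed = cohort * attainment_rate
    return credentialed * earnings_premium * effective_tax_rate

cohort = 10_000   # adults in the state's pipeline (hypothetical)
premium = 8_000   # assumed annual earnings premium per credential
tax = 0.06        # assumed combined state income/sales/property tax share

status_quo = projected_state_return(cohort, 0.30, premium, tax)
improved = projected_state_return(cohort, 0.40, premium, tax)
print(f"Added annual state revenue: ${improved - status_quo:,.0f}")
# 1,000 more credentialed adults x $8,000 x 6% -> $480,000 per year
```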

ROI and External Funding

The growth of social innovation financing, or social impact bonds, ties demonstrated return on investment directly to external funding. Organizations cover initial program costs by borrowing money from foundations and other investors. If the organization meets the agreed-upon outcomes, it receives additional funds from the state, and the original investors receive a portion of those funds. While this form of financing is primarily occurring in the nonprofit and social enterprise sectors, it could have implications for the education sector as well. You can read a great summary of social impact bonds on the Nonprofit Finance Fund website and check out how some social programs in Massachusetts are implementing this type of financing.
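
In rough outline, the payment logic works like the sketch below. This is a generic pay-for-success illustration, not the terms of any actual deal; the outcome target, state payment, and return rate are all invented:

```python
# Simplified pay-for-success logic: investors front the program cost; the
# state pays only if the agreed outcome target is met, and investors recoup
# principal plus a return out of that payment. All terms here are invented.

def settle_impact_bond(invested, outcomes_achieved, outcome_target,
                       state_payment, investor_return_rate):
    """Return (payout to investors, savings retained by the state)."""
    if outcomes_achieved < outcome_target:
        # Target missed: the state pays nothing and investors bear the loss.
        return 0.0, 0.0
    payout = min(state_payment, invested * (1 + investor_return_rate))
    return payout, state_payment - payout

# Hypothetical deal: $1M invested, target of 500 completions, a $1.3M state
# payment funded by projected savings, and a 5% return if the target is met.
payout, retained = settle_impact_bond(1_000_000, 520, 500, 1_300_000, 0.05)
print(payout, retained)  # 1050000.0 to investors, 250000.0 kept by the state
```

The design choice that matters here is who carries the risk: if outcomes fall short, investors absorb the loss rather than the state, which is what makes the model attractive to budget-conscious policy makers.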

Whether you’re quantifying program costs, making a statewide case for investing in a particular innovation, or looking for new ways to fund your efforts, understanding and articulating return on investment is an important piece of the scaling-up puzzle.

Tuesday, August 21, 2012

Is it worth it?

In March, we released More to Most: Scaling Up Effective Community College Practices, a guidebook that lays out a process for identifying promising practices and developing an expansion plan so that good programs reach as many students as possible. Central to the process is the SCALERS framework, developed at Duke University and discussed extensively on this blog. There are other steps that form the foundation for SCALERS application. We’re going to tell you about some of them in our next few posts. To download a copy of More to Most, visit the website: www.more2most.org.

Before you create a plan for scaling up, you need to decide if the program or practice you want to scale is actually effective. The last thing you need is to do more of something that isn’t even worthwhile. (Such logic only applies to things like eating Doritos or watching episodes of Chopped.) Chapter 2 of More to Most, Determining Program Value, lays out a process for assessing the value of a program.

The first stage of this process is defining the problem you are trying to address with your program and identifying your desired outcome. An example of a well-defined problem:
Too many students who test into three developmental education courses never successfully complete a college-level math or English course.
An example of a concrete outcome:
Currently, X percent of students successfully complete gateway math in one year. We will increase this by X percentage points by [YEAR].
You’ve got a program that’s designed to address this problem and help you reach this outcome. During the next stage, you collect evidence that will help you decide how well the program is achieving the desired outcome. What evidence of the program’s impact, both quantitative and qualitative, is available? Possible sources include:
  • Basic demographic profile of student body, a target group, and/or program participants
  • Course completion data: by course; by student cohort
  • Focus group data from students, staff, and/or faculty
  • Data from national surveys like CCSSE, SENSE, Noel-Levitz, etc.
Now, you can determine whether the program or practice meets criteria for effectiveness. We focus on two important criteria:
  • Does the evidence show that the program delivers the desired outcome?
  • Would scaling up the program align with institutional objectives?
It is important to consider both of these questions when determining the value of scaling up the program. Many institutions analyze data disaggregated by race, income, and other demographic factors to identify achievement gaps among student populations. If closing these gaps is an institutional priority for your college and one of the desired outcomes of your program, make sure you analyze the evidence for how effectively the program accomplishes this goal. When comparing the evidence for several programs, keep in mind that a program with positive outcomes for a designated population that is generally less successful may show a smaller impact on overall student success outcomes, at least in the short term. This does not make the program less valuable. The value comes from how well the program matches your desired outcomes and the institution’s priorities, or how strongly it makes a case for changing those priorities.
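
One way to ground that comparison is to compute the outcome rate separately for each student group, for participants and non-participants alike. Here is a minimal sketch, with made-up records standing in for your student data system:

```python
# Disaggregated completion rates for program vs. non-program students.
# The records and field layout are hypothetical; substitute an extract
# from your own student records system.

from collections import defaultdict

records = [
    # (student group, in program?, completed gateway math?)
    ("Group A", True, True), ("Group A", True, False),
    ("Group A", False, False), ("Group B", True, True),
    ("Group B", False, True), ("Group B", False, False),
]

totals = defaultdict(lambda: [0, 0])  # (group, in_program) -> [completers, n]
for group, in_program, completed in records:
    totals[(group, in_program)][0] += completed
    totals[(group, in_program)][1] += 1

for (group, in_program), (completers, n) in sorted(totals.items()):
    label = "program" if in_program else "non-program"
    print(f"{group}  {label:12}  completion rate: {completers / n:.0%}")
```

The within-group gap between program and non-program students can make the equity case visible even when the program’s overall average effect looks modest.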

Once you’ve collected your evidence, it’s time to make the case for expansion and develop a scaling strategy. We’ll discuss those steps in upcoming posts.

Wednesday, August 8, 2012

Guest Post: A Closer Look at Accelerating Opportunity

Today, Rachel Pleasants, senior project manager at Jobs for the Future, shares the inside scoop on a national effort to restructure adult basic education, another part of the postsecondary pipeline that shares some characteristics—and students—with developmental education programs.

Accelerating Opportunity, an initiative managed by Jobs for the Future, has ambitious goals: to change the way adult basic education (ABE) is structured and delivered at the state and college levels so that substantially more low-skilled adults get the education and credentials they need to access family-supporting careers. Building on Washington’s Integrated Basic Education and Skills Training (I-BEST) program and the Breaking Through initiative, Accelerating Opportunity promotes the development of integrated pathway models that combine ABE with career and technical training.

It’s clear that postsecondary credentials are essential for accessing jobs that pay a living wage, but these credentials are out of reach for many adults without a high school diploma or GED. Low-skilled adults seeking to advance their education and career face numerous barriers to success, including a lack of career guidance and disconnected educational systems. Like developmental education students, ABE students often find themselves in long remedial sequences, with very few ultimately transitioning to postsecondary credit-bearing programs.

Through Accelerating Opportunity, JFF and our partners and funders aim to address the systemic barriers that prevent low-skilled adults from achieving their goals. We believe that in order for this to happen, states and their colleges have to focus on three areas: developing career pathways, shifting their culture to one that views ABE as an important part of the postsecondary pipeline, and building in plans for scale and sustainability. This is a major undertaking that includes changes in policy as well as practice. And not only are we asking states and colleges to engage in systems change, we are asking them to do it at scale: each implementation state in the initiative (five so far) has committed to awarding at least 3,600 credentials to students in the target population within three years.

The initiative began with a one-year design phase; in November 2011, the leadership team selected Illinois, Kansas, Kentucky, and North Carolina to move into the implementation phase. In May 2012, we added Louisiana as a fifth state. Across these five states more than forty colleges are developing and implementing integrated pathways.

Far from being deterred by the ambitious goals set out by Jobs for the Future and its funders and partners, the states have embraced the Accelerating Opportunity vision and are already producing results. Less than a year into the implementation work, nearly all the participating colleges have pathway programs in place, enrolling a total of more than 800 students. Students and faculty are beginning to see the benefits of an integrated pathway approach and the team-teaching model. Partnerships between ABE, career and technical education, the workforce system, and TANF agencies are being developed and strengthened. Some states have even begun to move toward policy changes. In Illinois, for example, ABE outcomes, including transition to postsecondary education, are now part of the state’s performance funding formula. In Kansas, the eligibility criteria of a state scholarship fund have been revamped to better target Accelerating Opportunity students.

There are still many challenges ahead for the five states, including funding (especially given the loss of Pell’s Ability to Benefit provision), recruitment, professional development, and stakeholder engagement. But we see a remarkable commitment on the part of state and college leaders to developing the types of pathways and structures that will enable many more low-skilled adults to access and succeed in postsecondary training. For example, the governors in many of the implementation states have supported JFF and other national organizations in advocating for an exception to the Ability to Benefit change for students enrolled in career pathway programs.

There is a growing national emphasis on career pathway development and an increasing awareness of the importance of postsecondary education, and the goals of Accelerating Opportunity are aligned with these national trends. JFF and our partners and funders believe that Accelerating Opportunity has the potential to raise the profile of adult basic education, ensure its inclusion in the college completion agenda, and ultimately provide thousands of adults with access to economic opportunity. In all this work, there is shared commitment with other national initiatives like Achieving the Dream and DEI as well as collaboration and peer learning toward a shared goal: accelerating progress for all students toward postsecondary credentials.