Wednesday, September 26, 2012

New Workshop Helps Peer Leaders Keep Students On Course!

Today’s post comes from Ruth Silon, executive director of the Ohio Association of Community Colleges’ Student Success Center and former DEI director and English faculty member at Cuyahoga Community College (CCC). Ruth describes how CCC combined what they learned from two of their DEI initiatives to increase the impact of student leaders on their campuses.

As Cuyahoga Community College’s Developmental Education Initiative (DEI) grant came to an end, making good use of the remaining funds was not a challenge. The College’s grant used peer leaders in two of its major initiatives: Supplemental Instruction (SI) and Peer Mentoring. In a third initiative, a redesign of Math 0850, Skip Downing’s On Course principles were central to the course. Most math faculty who taught the course attended On Course Level I and II workshops during the college’s involvement with Achieving the Dream and DEI. During DEI, the workshops were offered to all faculty.

Two developments led us to the decision to hold an On Course workshop for student peer leaders. First, during the second year of DEI, English professor Mary Ward, who used On Course in her class, asked if her SI leader, Tiffany, could be trained. Having Tiffany in the workshop with the faculty was a real benefit: her presence added a valuable perspective to the training. Instead of talking about On Course principles in the abstract, we discussed their value for students with a student in the room. Tiffany then went on to promote On Course as an SI leader with students in her English class. Then, during the third year of DEI, Robin Middleton from Jamestown Community College (JCC) facilitated the Level II workshop. From her we learned about a new workshop, “Creating a Culture of Success: On Course for Front Line Staff.” To promote culture change across the college, front-line staff who work in admissions and financial aid were trained in On Course principles and techniques. Although a workshop for front-line peer student leaders had not been tested, Skip and Robin agreed to put one together.

On September 7, 2012, over forty student leaders from multiple CCC campuses attended the one-day workshop. The students were supplemental instruction leaders, student ambassadors, peer advocates with the C2C program, student coaches with Cleveland Transfer Connection, and an AmeriCorps coach. The students dove deep into discussions of the following topics: “Eight Choices of Successful Students,” “On Course Core Beliefs,” “The Language of Responsibility,” “Victim versus Creator Language,” and “Staying On Course When at a Fork in the Road of Life.” They practiced the wise choice process and the problem resolution process with each other. One of the students told me that she worries about students who are very belligerent: “They don’t know how to react to anything except by getting angry and using their street behavior.” Student leaders quickly saw that using the wise choice process would slow down students’ reactions and get them to think more critically. Another student said, “I think On Course will help me save my marriage!”

The group activity that impressed many attendees was “Creating a Mission Statement for Front Line Staff.”  The task was to create a statement of “purpose you would be proud to post in your work space for all to see…. a Mission Statement that will guide your important work with students.”  Below are a few statements from the student collaboration:
  • To serve as a support system with students to help them get their Associate’s Degree and achieve a rewarding career
  • To provide an environment where students feel that they are part of a community
  • Help students to learn and achieve the tools to be successful students and citizens

Giving these leaders the opportunity to put their goals into words and to recognize how important their positions are was quite moving. They left feeling empowered and ready to empower students in turn. Clearly, this is a workshop that other colleges should consider bringing to their institutions.

It is also important to mention that staff who work with these student leaders attended the workshop and many have attended On Course Level I and II training. If this program for student leaders is to work, trained staff will have to continue the conversation with them and encourage the use of On Course principles and methods.

Often as a grant ends, you find yourself saying, “We wish we knew then what we know now; we could have done some things differently or sooner.” Blending two of our initiatives, student leaders and On Course training, was a great example of the culmination of our learning. I am so pleased that we were able to make it happen.

Tuesday, September 4, 2012

A President's Reflection: Scaling with Fidelity

Last week, we discussed different ways to describe the return on investment from a particular program. A convincing ROI can go a long way in generating support for a scaling strategy. In today’s post, Sanford C. “Sandy” Shugart, president of Valencia College, reminds us of how important it is to keep testing that ROI. The returns from a pilot might change dramatically when you expand to more people and disciplines. Maintaining fidelity to the proven model is essential if you want to maintain (and continue improving) results.

Valencia College entered DEI with a history of innovation and a clear sense of areas to scale for effect in developmental programs. Specifically, the college planned to:
  • expand dramatically the use of supplemental instruction (SI) in developmental mathematics and other gateway courses 
  • grow the number of paired courses designed to function as learning communities 
  • expand the offering of student success courses while embedding the success strategies in other first-year courses 
  • expand a transition program for certain at-risk students called Bridges to Success
In the early going, we believed the initial stages of scaling—clarifying the model, recruiting adopters from the faculty and students, training the implementers, and having the will and support to reallocate scarce resources—would present the greatest challenges. We were wrong. Early stages of implementation were marked by enthusiasm, collaboration, and high expectations for success. The performance data were eagerly anticipated each term and became important levers in the conversations and revisions to the programs that followed.

Scaling went rather well, with each treatment showing positive early results and the general trend in student performance moving toward accelerated success—more credit hours earned in fewer academic terms by each succeeding cohort of students. With this success and the word of mouth among both students and faculty about the work, recruiting adopters became much easier. In fact, demand for SI-supported sections quickly grew beyond the initial disciplines and courses selected. (Valencia served more than 11,000 students in SI-supported course sections last year.) These were generally positive signs. However, as we scaled to greater and greater numbers of course sections, faculty, and students, we began to discover a new set of challenges.

SI will serve as an example, though we had similar challenges in other innovations. As we reviewed data from the SI sections versus “non-SI” sections, we began to find significant differences in the effect on indicators like student persistence to the end of the course and grade distribution. Some of these seemed almost random, while others clearly showed a pattern (by campus, by discipline, etc.). Deeper inquiry revealed that the effects were almost uniformly strong for students who actually participated in the supplemental activities that make SI work. What the discrepancy in the data revealed were major differences in implementation. It seems some faculty had found ways to encourage, perhaps even require, active student engagement in supplemental learning activities, while others had managed to achieve very little of this engagement. These differences existed in spite of aggressive training efforts prior to implementation.

Similarly, we found that while SI was especially effective in many of the mathematics gateway courses, the effect was much less striking in many of the courses to which it was added in the later years of our work (economics, political science, etc.). Again, we inquired more deeply and discovered not only differences in methods, but very different understandings among the faculty (and students) about the purposes of SI and the outcomes we were seeking.

We had, as our projects reached maturity, encountered the problem of scaling with fidelity to the model. It seems that our discipline of innovation needs to include ongoing evaluation of the purposes and steps to implementation, continuing staff training, and rigorous data analysis to assure that the treatment in which we are investing doesn’t take on a life of its own after institutionalization. A useful lesson!

Thursday, August 23, 2012

R to the O to the I

In March, we released More to Most: Scaling Up Effective Community College Practices, a guidebook that lays out a process for identifying promising practices and developing an expansion plan so that good programs reach as many students as possible. Central to the process is the SCALERS framework, developed at Duke University and discussed extensively on this blog. There are other steps that form the foundation for SCALERS application. We’re going to tell you about some of them in our next few posts. To download a copy of More to Most, visit the website: www.more2most.org.

The first step in the process is determining program value; you can read a rundown of that step here. One part of that rationale—and a rather popular one in these austere days of budget cuts—is demonstrating the return on investment (ROI), the sometimes-hard-to-quantify case that shows program spending generates a financial return (in increased FTE or other revenues) that offsets operating costs. Standardizing this process, where possible, can help you compare different programs and make decisions about which programs should be discontinued and which should be expanded. This kind of information, of course, needs to be combined with other qualitative data about program effects, and considerations of faculty and student engagement and institutional priorities.

Program ROI
There are a number of ways to analyze the connection between a program’s results and its costs. In 2009, Jobs for the Future and the Delta Cost Project released Calculating Cost-Return on Investments in Student Success. The report determines cost-return on investment by assessing student retention for program and non-program students, the resources required for program operation, and the revenue gained by additional retention using the following data and calculations:
  1. Additional number of students enrolled into the next year because of the program: Calculated using number of students served, one-year retention rates for program participants, number of participating students retained, and one-year retention rates for non-program students.
  2. Direct costs of program operation: Calculated based on expenditures for personnel, supplies and equipment, stipends for students, facilities, etc.
  3. Spending and revenue data: Calculated from average expenditures per student, and additional tuition, state, and federal revenue gained from increased retention.

These simple calculations could be the beginning of a formula that is unique to your institution, one that incorporates the costs, revenue, and priorities that are most relevant to your program goals.
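To make the three steps above concrete, here is a minimal sketch of the cost-return arithmetic in Python. All names and figures are hypothetical, invented for illustration; substitute your institution’s own enrollment, retention, cost, and revenue data.

```python
def additional_students_retained(served, program_rate, baseline_rate):
    """Step 1: students retained into the next year beyond what the
    non-program (baseline) retention rate would predict."""
    return round(served * (program_rate - baseline_rate))

def program_roi(served, program_rate, baseline_rate,
                direct_costs, revenue_per_retained_student):
    """Steps 2-3: net return = revenue gained from additional
    retention minus the direct costs of operating the program."""
    extra = additional_students_retained(served, program_rate, baseline_rate)
    net_return = extra * revenue_per_retained_student - direct_costs
    return extra, net_return

# Hypothetical example: 400 students served, 68% vs. 60% one-year
# retention, $90,000 in direct costs, $4,500 in tuition and state
# revenue per additionally retained student.
extra, net = program_roi(400, 0.68, 0.60, 90_000, 4_500)
print(extra)  # 32 additional students retained
print(net)    # 54000 net return
```

In this invented scenario the program more than pays for itself; a negative net return would not by itself condemn a program, for the qualitative reasons noted above, but it sharpens the comparison across programs.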

Local and State ROI
You can also think about ROI beyond the bounds of a program. The various student success programs and policies at a college add up to increasing numbers of students completing credentials. (At least, that’s where we’re headed, right?) And increased completion outcomes can lead to positive returns for communities and states, even the nation as a whole. Being able to make such a case at your college—and across a system—could garner support from policy makers at the state and federal level. The Center for Law and Social Policy (CLASP) partnered with the National Center for Higher Education Management Systems (NCHEMS) to create a Return on Investment Dashboard. The dashboard combines data from the Census Bureau, National Center for Education Statistics, and Department of Education to project short- and long-term effects of maintaining the educational attainment status quo vs. increasing the number of credentialed adults in a particular state.

In addition to credential attainment, state summaries include projections for personal economic gain, as well as returns to the state in the form of state income, property, and sales taxes. The dashboard also has figures for Medicaid savings and correctional system expenditures. The dashboard allows you to manipulate high school completion, college going rates, credential attainment, and enrollment patterns to see how increases or decreases might affect individual and state returns on investment.

ROI and External Funding
The growth of social innovation financing or social impact bonds is one way that demonstrating return-on-investment is tied directly to external funding. Organizations cover initial program costs by borrowing money from foundations and other investors. If the organization meets agreed upon outcomes, they’ll receive additional funds from the state, and original investors will receive a portion of those funds. While this form of financing is primarily occurring in the nonprofit and social enterprise sectors, it could have implications for the education sector as well. You can read a great summary of social impact bonds on the Nonprofit Finance Fund website and check out how some social programs in Massachusetts are implementing this type of financing.

Whether you’re quantifying program costs, making a statewide case for investing in a particular innovation, or looking for new ways to fund your efforts, understanding and articulating the return on investment is an important piece of the scaling-up puzzle.

Tuesday, August 21, 2012

Is it worth it?

Before you create a plan for scaling up, you need to decide if the program or practice you want to scale is actually effective. The last thing you need is to do more of something that isn’t even worthwhile. (Such logic only applies to things like eating Doritos or watching episodes of Chopped.) Chapter 2 of More to Most, Determining Program Value, lays out a process for assessing the value of a program.

The first stage of this process is defining the problem you are trying to address with your program and identifying your desired outcome. An example of a well-defined problem:
Too many students who test into three developmental education courses never successfully complete a college-level math or English course.
An example of a concrete outcome:
Currently, X percent of students successfully complete gateway math in one year. We will increase this by X percentage points by [YEAR].
You’ve got a program that’s designed to address this problem and help you reach this outcome. During the next stage, you collect evidence that will help you decide how well the program is achieving the desired outcome. What evidence of the program’s impact, both quantitative and qualitative, is available? Possible sources include:
  • Basic demographic profile of student body, a target group, and/or program participants
  • Course completion data: by course; by student cohort
  • Focus group data from students, staff, and/or faculty
  • Data from national surveys like CCSSE, SENSE, Noel-Levitz, etc.
Now, you can determine whether the program or practice meets criteria for effectiveness. We focus on two important criteria:
  • Does the evidence show that the program delivers the desired outcome?
  • Would scaling up the program align with institutional objectives?
It is important to consider both of these questions when determining the value of scaling up the program. Many institutions analyze data disaggregated by race, income, and other demographic factors and identify achievement gaps among student populations. If closing these gaps is an institutional priority for your college and one of the desired outcomes of your program, then make sure you analyze the evidence for how effectively the program accomplishes this goal. When comparing the evidence for several programs, keep in mind that if the program has positive outcomes for a designated population that is generally less successful, it may show a lesser impact on overall student success outcomes, at least in the short term. This does not make the program less valuable, however. The value comes from how well the program matches your desired outcomes and the institution’s priorities — or makes a case for changing those priorities.
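A quick, hypothetical calculation shows why a gap-closing program can look modest in overall numbers. All figures below are invented: a target group of 200 students succeeds at 40% versus 60% for the other 800, and the program lifts only the target group, to 55%.

```python
def overall_rate(successes_by_group, group_sizes):
    """Enrollment-weighted success rate across student groups."""
    return sum(successes_by_group) / sum(group_sizes)

sizes = [200, 800]     # target group, all other students
before = [80, 480]     # successful students per group (40%, 60%)
after = [110, 480]     # program lifts the target group to 55%

print(overall_rate(before, sizes))  # 0.56
print(overall_rate(after, sizes))   # 0.59
```

A fifteen-point gain for the target group moves the overall rate only three points, which is why disaggregated evidence, not just the institution-wide number, should drive the value judgment.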

Once you’ve collected your evidence, it’s time to make the case for expansion and develop a scaling strategy. We’ll discuss those steps in upcoming posts.

Wednesday, August 8, 2012

Guest Post: A Closer Look at Accelerating Opportunity

Today, Rachel Pleasants, senior project manager at Jobs for the Future, shares the inside scoop on a national effort to restructure adult basic education, another part of the postsecondary pipeline that shares some characteristics—and students—with developmental education programs.

Accelerating Opportunity, an initiative managed by Jobs for the Future, has ambitious goals: to change the way adult basic education (ABE) is structured and delivered at the state and college levels so that substantially more low-skilled adults get the education and credentials they need to access family-supporting careers. Building on Washington’s Integrated Basic Education and Skills Training (I-BEST) program and the Breaking Through initiative, Accelerating Opportunity promotes the development of integrated pathway models that combine ABE with career and technical training.

It’s clear that postsecondary credentials are essential for accessing jobs that pay a living wage, but these credentials are out of reach for many adults without a high school diploma or GED. Low-skilled adults seeking to advance their education and career face numerous barriers to success, including a lack of career guidance and disconnected educational systems. Like developmental education students, ABE students often find themselves in long remedial sequences, with very few ultimately transitioning to postsecondary credit-bearing programs.

Through Accelerating Opportunity, JFF and our partners and funders aim to address the systemic barriers that prevent low-skilled adults from achieving their goals. We believe that in order for this to happen, states and their colleges have to focus on three areas: developing career pathways, shifting their culture to one that views ABE as an important part of the postsecondary pipeline, and building in plans for scale and sustainability. This is a major undertaking that includes changes in policy as well as practice. And not only are we asking states and colleges to engage in systems change, we are asking them to do it at scale: each implementation state in the initiative (five so far) has committed to awarding at least 3,600 credentials to students in the target population within three years.

The initiative began with a one-year design phase; in November 2011, the leadership team selected Illinois, Kansas, Kentucky, and North Carolina to move into the implementation phase. In May 2012, we added Louisiana as a fifth state. Across these five states more than forty colleges are developing and implementing integrated pathways.

Far from being deterred by the ambitious goals set out by Jobs for the Future and its funders and partners, the states have embraced the Accelerating Opportunity vision and are already producing results. Less than a year into the implementation work, nearly all the participating colleges have pathway programs in place, enrolling a total of more than 800 students. Students and faculty are beginning to see the benefits of an integrated pathway approach and the team teaching model. Partnerships between ABE, career and technical education, the workforce system, and TANF agencies are being developed and strengthened. Some states have even begun to move toward policy changes. In Illinois, for example, ABE outcomes, including transition to postsecondary education, are now part of the state’s performance funding formula. In Kansas, the eligibility criteria of a state scholarship fund have been revamped to better target AO students.

There are still many challenges ahead for the five states, including funding (especially given the loss of Pell’s Ability to Benefit provision), recruitment, professional development, and stakeholder engagement. But we see a remarkable commitment on the part of state and college leaders to developing the types of pathways and structures that will enable many more low-skilled adults to access and succeed in postsecondary training. For example, the governors in many of the implementation states have supported JFF and other national organizations in advocating for an exception to the Ability to Benefit change for students enrolled in career pathway programs.

There is a growing national emphasis on career pathway development and an increasing awareness of the importance of postsecondary education, and the goals of Accelerating Opportunity are aligned with these national trends. JFF and our partners and funders believe that Accelerating Opportunity has the potential to raise the profile of adult basic education, ensure its inclusion in the college completion agenda, and ultimately provide thousands of adults with access to economic opportunity. In all this work, there is shared commitment with other national initiatives like Achieving the Dream and DEI as well as collaboration and peer learning toward a shared goal: accelerating progress for all students toward postsecondary credentials.