Thursday, December 13, 2012

Y'all Have Been Busy!

We want to make sure you know about three new publications that have come out in the last few weeks, all courtesy of our DEI state policy partner, Jobs for the Future (and some of their esteemed colleagues):

  • In Ahead of the Curve: State Success in the Developmental Education Initiative, David Altstadt presents the major reforms that the six DEI states enacted over the course of the three-year initiative. The report follows the DEI State Policy Framework, with examples of policy change on all three fronts: data-driven improvement, commitment to innovation, and policy supports. Each of the six states is featured in a case study; Altstadt also includes an analysis of the states’ progress as measured by the DEI State Policy Framework Self-Assessment Tool. You can read about the new data systems, curricula, and assessment practices deployed across the states that are paving the way for college innovation and improved outcomes for students who are underprepared for postsecondary study.
 
  • In Cornerstones of Completion: State Policy Support for Accelerated, Structured Pathways to College Credentials and Transfer, Lara K. Couturier recommends ten state policies that can support colleges that are creating “accelerated, structured pathways to completion” (just like the title says!). Using the colleges participating in Completion by Design as a backdrop, the report begins by contrasting the current experience of a typical community college student with what that student might experience in a more structured pathway program. Couturier then lays out the ten policy recommendations—from transfer agreements, to use of labor market information, to faculty professional development—and backs each one up with a summary of recent research.
 
  • Just yesterday, Jobs for the Future, along with the Charles A. Dana Center, Complete College America, and the Education Commission of the States, released Core Principles for Transforming Remedial Education: A Joint Statement. The statement lays out seven principles for a new approach to ensure “that all students are ready for and can successfully complete college-level work that leads to a postsecondary credential of value.” Many of the principles will be familiar to those who have been following DEI states and colleges and other national reform efforts, among them more accurate assessment and course options that accelerate students’ progression through remediation to gateway courses. The challenge will be to provide the institutional and state support for faculty, staff, and administration as they find the mix of policy and practice that works best for their students. Also imperative is ensuring that new methods of assessment and instruction don’t leave the students facing the most significant barriers without a way to access postsecondary education and training. You can read more commentary on the statement in today’s Chronicle of Higher Education.

Thursday, November 1, 2012

Coming Soon: Publications Highlight Lessons Learned from the Developmental Education Initiative

The three-year Developmental Education Initiative (DEI) is drawing to a close. While the participating colleges and states are moving ahead with many of their expanded developmental education efforts, this also is a time to reflect on what we’ve learned over the last three years. We wanted to alert you to four upcoming publications from DEI partners that will delve into questions about success, challenges, and insights into where college, state, and funder priorities ought to be going forward. These publications, summarized below, will be released over the next three months. We’ll alert you when they hit the streets! We hope you’ll read them, share them, and make connections to your own work and learning.
 

Bringing Developmental Education to Scale: Lessons from the Developmental Education Initiative
Janet C. Quint, Shanna S. Jaggars, D. Crystal Byndloss, and Asya Magazinnik, MDRC and Community College Research Center
This second and final report from the official evaluation of the DEI colleges examines the degree to which the institutions scaled up their chosen developmental education reforms to serve more students, the factors that affected their ability to expand these programs and practices, and the extent to which these strategies were associated with improved student outcomes. It also considers ways that participation in DEI influenced the colleges more broadly. For these reasons, it may be of interest to other colleges looking to scale up reforms, especially those related to instruction and the provision of student supports, as well as to funders concerned about how best to help community colleges bring promising ideas to scale. The evaluation, conducted by MDRC and its partner, the Community College Research Center at Teachers College, Columbia University, draws on both qualitative data (primarily interviews with key personnel at all 15 institutions) and quantitative data (information on participation and on student outcomes that the colleges regularly collected).
 

Ahead of the Curve: State Success in the Developmental Education Initiative
David Altstadt for Jobs for the Future
Building on their work through Achieving the Dream, six states and 15 community colleges joined the Developmental Education Initiative in 2009 to take on the daunting challenge of improving the success of students who enter the community college academically underprepared. Teams from the six DEI states—Connecticut, Florida, North Carolina, Ohio, Texas, and Virginia—working with Jobs for the Future, which has managed the DEI state policy effort, co-developed an ambitious, evidence-based state policy framework to guide large-scale, multi-faceted reforms in how community colleges provide underprepared students with the skills they need to succeed in college courses. Three years later, with the initiative winding down, these states have made significant progress in adopting the DEI policy recommendations and, as a result, they have augmented, accelerated, and spread developmental education systems change across their community colleges. Ahead of the Curve spotlights the major policy accomplishments of the Developmental Education Initiative by profiling specific innovations in each of the six states and by documenting the degree to which these states have pursued common strategies and policy levers contained in the initiative’s systems-change framework.
 

Presidential Reflections: What DEI Taught Us
Edited by Madeline Patton for MDC
The Developmental Education Initiative asked 15 college leaders to take what they’d learned in early Achieving the Dream efforts and apply that to the challenge of scaling up: what resources, policies, and practices are essential to scaling up effective developmental education efforts? Finding ways to move more students through developmental education more quickly—or bypass it altogether—while maintaining successful student outcomes required leadership and commitment from every level of the organization. In this essay collection, the presidents of the 15 DEI colleges reflect on what they learned about building, embedding, and maintaining systemic change in their institutions—particularly in the difficult field of developmental education—through work with their trustees, students, faculty, staff, and community. They discuss how they and their colleges took on identifying successful innovations and scaling them up in the midst of leadership transitions, serious reductions in financial resources, and major changes in organizational structure.


What We Know: A Synthesis of DEI College Learning
Abby Parcell, MDC, and Cynthia Ferrell, Community College Leadership Program
In February 2012, MDC convened DEI college teams composed of faculty, administrators, and presidents. We mixed them up—different colleges, different states, different roles—and asked them to create the ideal path for underprepared students to get from college entry to credential completion. Drawing on their collective knowledge, particularly what they’d learned during DEI, the teams considered four points of interaction with students or potential students: early intervention and access, advising and support services, developmental education instruction, and alignment with credential and degree programs. Six teams and six hours later, we had six designs that displayed a remarkable amount of consensus about the programs, policies, and institutional supports needed to help any student be successful on the path from college enrollment to credential completion. What We Know is a synthesis of our DEI experts’ recommended best program bets, and related critical institutional policies for helping all students succeed at what they set out to accomplish in community college.

Wednesday, September 26, 2012

New Workshop Helps Peer Leaders Keep Students On Course!

Today’s post comes from Ruth Silon, executive director of the Ohio Association of Community Colleges’ Student Success Center and former DEI director and English faculty member at Cuyahoga Community College (CCC). Ruth describes how CCC combined what they learned from two of their DEI initiatives to increase the impact of student leaders on their campuses.

As Cuyahoga Community College’s Developmental Education Initiative (DEI) grant came to an end, making good use of the remaining funds was not a challenge. The College’s grant used peer leaders in two of its major initiatives: Supplemental Instruction (SI) and Peer Mentoring. For another initiative, a redesign of Math 0850, Skip Downing’s On Course principles were a major part of the course. Most math faculty who taught the course attended On Course Level I and II workshops during the college’s involvement with Achieving the Dream and DEI. During DEI, the workshops were offered to all faculty.

Two occurrences led us to the decision to hold an On Course workshop for student peer leaders. First, during the second year of DEI, English professor Mary Ward, who used On Course in her class, asked if her SI leader, Tiffany, could be trained. Having Tiffany in the workshop with the faculty was a real benefit, as her presence added a valuable perspective to the training. Instead of discussing On Course in the abstract, we could talk through its principles, and their value for students, with a student in the room. Tiffany then went on to promote On Course as an SI Leader with students in her English class. Then, during the third year of DEI, Robin Middleton from Jamestown Community College (JCC) facilitated the Level II workshop. From her we learned about a new workshop, “Creating a Culture of Success: On Course for Front Line Staff.” In order to promote culture change across the college, front-line staff who work in admissions and financial aid were trained in On Course principles and techniques. Although a workshop for front-line peer student leaders had not been tested, Skip and Robin agreed to put it together.

On September 7, 2012, over forty student leaders from multiple CCC campuses attended the one-day workshop. The students were supplemental instruction leaders, student ambassadors, peer advocates with the C2C program, student coaches with Cleveland Transfer Connection, and an AmeriCorps coach. The students dove deep into discussions about the following topics: “Eight Choices of Successful Students,” “On Course Core Beliefs,” “The Language of Responsibility,” “Victim versus Creator Language,” and “Staying On Course When at a Fork in the Road of Life.” They practiced working with each other on the wise choice process and the problem resolution process. One of the students told me that she worries about students who are very belligerent: “They don’t know how to react to anything except by getting angry and using their street behavior.” Student leaders quickly saw that using the wise choice process would slow down students’ reaction process and get them to think more critically. Another student said, “I think On Course will help me save my marriage!”

The group activity that impressed many attendees was “Creating a Mission Statement for Front Line Staff.” The task was to create a statement of “purpose you would be proud to post in your work space for all to see… a Mission Statement that will guide your important work with students.” Below are a few statements from the student collaboration:
  • To serve as a support system with students to help them get their Associate’s Degree and achieve a rewarding career
  • To provide an environment where students feel that they are part of a community
  • Help students to learn and achieve the tools to be successful students and citizens

Giving these leaders the opportunity to put their goals into words and recognize how important their positions are was quite moving. They left the workshop feeling empowered, ready to empower students in turn. Clearly, this is a workshop that other colleges should consider bringing to their institutions.

It is also important to mention that staff who work with these student leaders attended the workshop and many have attended On Course Level I and II training. If this program for student leaders is to work, trained staff will have to continue the conversation with them and encourage the use of On Course principles and methods.

Often as a grant ends, you find yourself saying, “We wish we knew then what we know now; we could have done some things differently or sooner.”  Blending together two of our initiatives, using student leaders and On Course training, was a great example of the culmination of our learning. I am so pleased that we were able to make it happen.

Tuesday, September 4, 2012

A President's Reflection: Scaling with Fidelity

Last week, we discussed different ways to describe the return on investment from a particular program. A convincing ROI can go a long way in generating support for a scaling strategy. In today’s post, Sanford C. “Sandy” Shugart, president of Valencia College, reminds us of how important it is to keep testing that ROI. The returns from a pilot might change dramatically when you expand to more people and disciplines. Maintaining fidelity to the proven model is essential if you want to maintain (and continue improving) results.

Valencia College entered DEI with a history of innovation and a clear sense of areas to scale for effect in developmental programs. Specifically, the college planned to:
  • expand dramatically the use of supplemental instruction (SI) in developmental mathematics and other gateway courses 
  • grow the number of paired courses designed to function as learning communities 
  • expand the offering of student success courses while embedding the success strategies in other first-year courses 
  • expand a transition program for certain at-risk students called Bridges to Success
In the early going, we believed the initial stages of scaling—clarifying the model, recruiting adopters from the faculty and students, training the implementers, and having the will and support to reallocate scarce resources—would present the greatest challenges. We were wrong. Early stages of implementation were marked by enthusiasm, collaboration, and high expectations for success. The performance data were eagerly anticipated each term and became important levers in the conversations and revisions to the programs that followed.

Scaling went rather well, with each treatment showing positive early results and the general trend in student performance moving toward accelerated success—more credit hours earned in fewer academic terms by each succeeding cohort of students. With this success and the word of mouth among both students and faculty about the work, recruiting adopters became much easier. In fact, demand for SI-supported sections quickly grew beyond the initial disciplines and courses selected. (Valencia served more than 11,000 students in SI-supported course sections last year.) These were generally positive signs. However, as we scaled to greater and greater numbers of course sections, faculty, and students, we began to discover a new set of challenges.

SI will serve as an example, though we had similar challenges in other innovations. As we reviewed data from the SI sections versus “non-SI” sections, we began to find significant differences in the effect on indicators like student persistence to the end of course and grade distribution. Some of these seemed almost random, while others clearly showed a pattern (by campus, by discipline, etc.). Deeper inquiry revealed that the effect for students who actually participated in the supplemental activities that make SI effective was almost uniformly high. What the discrepancy in the data revealed were major differences in implementation. It seems some faculty had found ways to encourage, perhaps even require, active student engagement in supplemental learning activities, while others had managed to achieve very little of this engagement. These differences existed in spite of aggressive training efforts prior to implementation.

Similarly, we found that while SI was especially effective in many of the mathematics gateway courses, the effect was much less striking in many of the courses to which it was added in the later years of our work (economics, political science, etc.). Again, we inquired more deeply and discovered not only differences in methods, but very different understandings among the faculty (and students) about the purposes of SI and the outcomes we were seeking.

We had, as our projects reached maturity, encountered the problem of scaling with fidelity to the model. It seems that our discipline of innovation needs to include ongoing evaluation of the purposes and steps to implementation, continuing staff training, and rigorous data analysis to assure that the treatment in which we are investing doesn’t take on a life of its own after institutionalization. A useful lesson!

Thursday, August 23, 2012

R to the O to the I

In March, we released More to Most: Scaling Up Effective Community College Practices, a guidebook that lays out a process for identifying promising practices and developing an expansion plan so that good programs reach as many students as possible. Central to the process is the SCALERS framework, developed at Duke University and discussed extensively on this blog. There are other steps that form the foundation for SCALERS application. We’re going to tell you about some of them in our next few posts. To download a copy of More to Most, visit the website: www.more2most.org.

The first step in the process is determining program value; you can read a rundown of that step here. One part of that rationale—and a rather popular one in these austere days of budget cuts—is demonstrating the return on investment (ROI), the sometimes-hard-to-quantify case that shows program spending generates a financial return (in increased FTE or other revenues) that offsets operating costs. Standardizing this process, where possible, can help you compare different programs and make decisions about which programs should be discontinued and which should be expanded. This kind of information, of course, needs to be combined with other qualitative data about program effects, and considerations of faculty and student engagement and institutional priorities.

Program ROI
There are a number of ways to analyze the connection between a program’s results and its costs. In 2009, Jobs for the Future and the Delta Cost Project released Calculating Cost-Return on Investments in Student Success. The report determines cost-return on investment by assessing student retention for program and non-program students, the resources required for program operation, and the revenue gained by additional retention using the following data and calculations:
  1. Additional number of students enrolled into the next year because of the program: Calculated using number of students served, one-year retention rates for program participants, number of participating students retained, and one-year retention rates for non-program students.
  2. Direct costs of program operation: Calculated based on expenditures for personnel, supplies and equipment, stipends for students, facilities, etc.
  3. Spending and revenue data: Calculated from average expenditures per student, and additional tuition, state, and federal revenue gained from increased retention.

These simple calculations could be the beginning of a formula that is unique to your institution, one that incorporates the costs, revenue, and priorities that are most relevant to your program goals.
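As a concrete illustration, the three data points above reduce to a short calculation. The sketch below is a simplified rendering of that logic with entirely hypothetical figures; the function name, variable names, and numbers are ours, not taken from the JFF/Delta Cost Project report:

```python
def cost_return(students_served, program_retention, baseline_retention,
                direct_costs, revenue_per_student):
    """Simplified cost-return sketch; hypothetical, not the report's exact formula."""
    # 1. Additional students enrolled into the next year because of the program
    additional_students = students_served * (program_retention - baseline_retention)
    # 3. Revenue gained from increased retention (tuition plus state/federal funds)
    added_revenue = additional_students * revenue_per_student
    # Net return after subtracting (2), the program's direct operating costs
    return {
        "additional_students": additional_students,
        "added_revenue": added_revenue,
        "net_return": added_revenue - direct_costs,
    }

# Hypothetical program: 200 students served, 65% vs. 55% one-year retention,
# $60,000 in direct costs, $4,500 in revenue per retained student
result = cost_return(
    students_served=200,
    program_retention=0.65,
    baseline_retention=0.55,
    direct_costs=60_000,
    revenue_per_student=4_500,
)
```

On these made-up figures, the program retains roughly 20 additional students and nets about $30,000 after costs; a real analysis would substitute your institution’s own enrollment, retention, cost, and revenue data.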

Local and State ROI
You can also think about ROI beyond the bounds of a program. The various student success programs and policies at a college add up to increasing numbers of students completing credentials. (At least, that’s where we’re headed, right?) And increased completion outcomes can lead to positive returns for communities and states, even the nation as a whole. Being able to make such a case at your college—and across a system—could garner support from policy makers at the state and federal level. The Center for Law and Social Policy (CLASP) partnered with the National Center for Higher Education Management Systems (NCHEMS) to create a Return on Investment Dashboard. The dashboard combines data from the Census Bureau, National Center for Education Statistics, and Department of Education to project short- and long-term effects of maintaining the educational attainment status quo vs. increasing the number of credentialed adults in a particular state.

In addition to credential attainment, state summaries include projections for personal economic gain, as well as returns to the state in the form of state income, property, and sales taxes. The dashboard also has figures for Medicaid savings and correctional system expenditures. The dashboard allows you to manipulate high school completion, college going rates, credential attainment, and enrollment patterns to see how increases or decreases might affect individual and state returns on investment.

ROI and External Funding

The growth of social innovation financing or social impact bonds is one way that demonstrating return-on-investment is tied directly to external funding. Organizations cover initial program costs by borrowing money from foundations and other investors. If the organization meets agreed-upon outcomes, it will receive additional funds from the state, and original investors will receive a portion of those funds. While this form of financing is primarily occurring in the nonprofit and social enterprise sectors, it could have implications for the education sector as well. You can read a great summary of social impact bonds on the Nonprofit Finance Fund website and check out how some social programs in Massachusetts are implementing this type of financing.

Whether you’re trying to quantify program costs, make a state-wide case for investing in a particular innovation, or looking for new ways to fund your efforts, understanding and articulating the return-on-investment is an important piece of the scaling up puzzle.

Tuesday, August 21, 2012

Is it worth it?

In March, we released More to Most: Scaling Up Effective Community College Practices, a guidebook that lays out a process for identifying promising practices and developing an expansion plan so that good programs reach as many students as possible. Central to the process is the SCALERS framework, developed at Duke University and discussed extensively on this blog. There are other steps that form the foundation for SCALERS application. We’re going to tell you about some of them in our next few posts. To download a copy of More to Most, visit the website: www.more2most.org.

Before you create a plan for scaling up, you need to decide if the program or practice you want to scale is actually effective. The last thing you need is to do more of something that isn’t even worthwhile. (Such logic only applies to things like eating Doritos or watching episodes of Chopped.) Chapter 2 of More to Most, Determining Program Value, lays out a process for assessing the value of a program.

The first stage of this process is defining the problem you are trying to address with your program and identifying your desired outcome. An example of a well-defined problem:
Too many students who test into three developmental education courses never successfully complete a college-level math or English course.
An example of a concrete outcome:
Currently, X percent of students successfully complete gateway math in one year. We will increase this by X percentage points by [YEAR].
You’ve got a program that’s designed to address this problem and help you reach this outcome. During the next stage, you collect evidence that will help you decide how well the program is achieving the desired outcome. What evidence of the program’s impact, both quantitative and qualitative, is available? Possible sources include:
  • Basic demographic profile of student body, a target group, and/or program participants
  • Course completion data: by course; by student cohort
  • Focus group data from students, staff, and/or faculty
  • Data from national surveys like CCSSE, SENSE, Noel-Levitz, etc.
Now, you can determine whether the program or practice meets criteria for effectiveness. We focus on two important criteria:
  • Does the evidence show that the program delivers the desired outcome?
  • Would scaling up the program align with institutional objectives?
It is important to consider both of these questions when determining the value of scaling up the program. Many institutions analyze data disaggregated by race, income, and other demographic factors and identify achievement gaps among student populations. If closing these gaps is an institutional priority for your college and one of the desired outcomes of your program, then make sure you analyze the evidence for how effectively the program accomplishes this goal. When comparing the evidence for several programs, keep in mind that if the program has positive outcomes for a designated population that is generally less successful, it may show a lesser impact on overall student success outcomes, at least in the short term. This does not make the program less valuable, however. The value comes from how well the program matches your desired outcomes and the institution’s priorities — or makes a case for changing those priorities.

Once you’ve collected your evidence, it’s time to make the case for expansion and develop a scaling strategy. We’ll discuss those steps in upcoming posts.

Wednesday, August 8, 2012

Guest Post: A Closer Look at Accelerating Opportunity

Today, Rachel Pleasants, senior project manager at Jobs for the Future, shares the inside scoop on a national effort to restructure adult basic education, another part of the postsecondary pipeline that shares some characteristics—and students—with developmental education programs.

Accelerating Opportunity, an initiative managed by Jobs for the Future, has ambitious goals: to change the way adult basic education (ABE) is structured and delivered at the state and college levels so that substantially more low-skilled adults get the education and credentials they need to access family-supporting careers. Building on Washington’s Integrated Basic Education and Skills Training (I-BEST) program and the Breaking Through initiative, Accelerating Opportunity promotes the development of integrated pathway models that combine ABE with career and technical training.

It’s clear that postsecondary credentials are essential for accessing jobs that pay a living wage, but these credentials are out of reach for many adults without a high school diploma or GED. Low-skilled adults seeking to advance their education and career face numerous barriers to success, including a lack of career guidance and disconnected educational systems. Like developmental education students, ABE students often find themselves in long remedial sequences, with very few ultimately transitioning to postsecondary credit-bearing programs.

Through Accelerating Opportunity, JFF and our partners and funders aim to address the systemic barriers that prevent low-skilled adults from achieving their goals. We believe that in order for this to happen, states and their colleges have to focus on three areas: developing career pathways, shifting their culture to one that views ABE as an important part of the postsecondary pipeline, and building in plans for scale and sustainability. This is a major undertaking that includes changes in policy as well as practice. And not only are we asking states and colleges to engage in systems change, we are asking them to do it at scale: each implementation state in the initiative (five so far) has committed to awarding at least 3,600 credentials to students in the target population within three years.

The initiative began with a one-year design phase; in November 2011, the leadership team selected Illinois, Kansas, Kentucky, and North Carolina to move into the implementation phase. In May 2012, we added Louisiana as a fifth state. Across these five states more than forty colleges are developing and implementing integrated pathways.

Far from being deterred from the ambitious goals set out by Jobs for the Future along with its funders and partners, the states have embraced the Accelerating Opportunity vision and are already producing results. Less than a year into the implementation work, nearly all the participating colleges have pathway programs in place, enrolling a total of more than 800 students. Students and faculty are beginning to see the benefits of an integrated pathway approach and the team teaching model. Partnerships between ABE, career and technical education, the workforce system, and TANF agencies are being developed and strengthened. Some states have even begun to move toward policy changes. In Illinois, for example, ABE outcomes, including transition to postsecondary, are now part of the state’s performance funding formula. In Kansas, the eligibility criteria of a state scholarship fund have been revamped to better target AO students.

There are still many challenges ahead for the five states, including funding (especially given the loss of Pell’s Ability to Benefit provision), recruitment, professional development, and stakeholder engagement. But we see a remarkable commitment on the part of state and college leaders to developing the types of pathways and structures that will enable many more low-skilled adults to access and succeed in postsecondary training. For example, the governors in many of the implementation states have supported JFF and other national organizations in advocating for the inclusion of an exception to the Ability to Benefit change for students enrolled in career pathway programs.

There is a growing national emphasis on career pathway development and an increasing awareness of the importance of postsecondary education, and the goals of Accelerating Opportunity are aligned with these national trends. JFF and our partners and funders believe that Accelerating Opportunity has the potential to raise the profile of adult basic education, ensure its inclusion in the college completion agenda, and ultimately provide thousands of adults with access to economic opportunity. In all this work, there is shared commitment with other national initiatives like Achieving the Dream and DEI as well as collaboration and peer learning toward a shared goal: accelerating progress for all students toward postsecondary credentials.

Tuesday, July 31, 2012

Improving Student Success at Sinclair CC: Lessons from Three Initiatives

In today’s post, Kathleen Cleary, associate provost for student success at Sinclair Community College, recaps her presentation from the June 2012 Conference on Developmental Education sponsored by the National Center for Postsecondary Research. Kathleen recounts how Sinclair’s student success efforts have evolved through the institutional change work of Achieving the Dream and the Developmental Education Initiative, setting them up for statewide transformation as part of the Completion By Design network.

When Sinclair began our work with Achieving the Dream (ATD) in 2005, we opted to infuse ATD principles and goals into standing college processes and systems to avoid perceptions that this was an “add-on” to existing work. The strength of this work is that we approached improving student success, particularly for underserved populations, as a way of life at the college, rather than a program that would have a beginning and ending. With the Developmental Education Initiative (DEI), we pushed the envelope of possibility even further and began to make bolder, more aggressive changes in our pedagogy, structures, and curriculum. When we learned that we were granted funding for Completion by Design (CBD), we took a different approach and made the conscious decision to be high profile, even creating a statewide Completion by Design office on campus. Our student success work has evolved from the solid, data-driven, culture-changing work of ATD, through groundbreaking efforts in DEI, to a systemic, campus-wide ownership in Completion by Design of the need to move the needle on student outcomes.

The gains for Achieving the Dream centered on use of data, policy changes, and a commitment to enhancing teaching and learning. Faculty began tracking student success in gatekeeper courses in developmental and college-level English, reading, and math. When faculty saw their success rates, they began to experiment with new ways of teaching and structuring courses. Policy changes such as the no late registration policy were watershed moments for the college as we made a cultural shift from an access-centric institution to an access and success focus. Another hallmark of our ATD work was the creation of the Center for Teaching and Learning, which has provided professional development on topics including student engagement, diversity in the classroom, and increasing student success and completion. Through such DEI initiatives as boot camps, math modules, accelerated English, and early support in high school, we began to accelerate students’ progress through developmental education, while continually tracking student success into the next course in the sequence. The work of ATD and DEI became the cornerstone for Completion by Design both at Sinclair and across the state of Ohio. The goal is to create a seamless pathway for students that helps them graduate in higher numbers and more quickly, with fewer excess credits.

The four strategic priorities for the state have their roots in the ATD/DEI colleges of Ohio:
1.    Academic Program Redesign and Contextualization
2.    Accelerating Students through the Pathway
3.    Integrated Student Support and Development
4.    Institutional and State Policies
All four of these priorities have already been addressed through the ATD and DEI colleges in Ohio and are the natural choices for the state’s continuing student success work with the Completion by Design colleges. While Sinclair is the only Ohio CBD college to be involved in all three initiatives, the work reflects the lessons learned by colleges across the state in all three initiatives. It is an exciting time to be working with community colleges, as students across Ohio and the nation are poised to graduate in greater numbers as a result of the promising findings of this work.

Friday, July 27, 2012

Link Link Link

  • Just in case you missed The New York Times piece on CUNY’s New Community College, you can read about efforts to build a community college from scratch here. Dean Dad provided some commentary this week on his Inside Higher Ed blog.
  • MDRC released two studies about learning communities this week: The Effects of Learning Communities for Students in Developmental Education: A Synthesis of Findings from Six Community Colleges and Commencement Day: Six-Year Effects of a Freshman Learning Community Program at Kingsborough Community College. You can read a summary of both reports here, along with MDRC’s take on what this research suggests about implementing and scaling up this approach at community colleges. 
  • More webinar fun next week, this time from the Tennessee College Access and Success Network. They’ve got two offerings focused on adult learners coming up. On July 30 at 10am CDT, you can dial in for a presentation about Roane State’s H20 Program, which connects adult learners with opportunities in both higher education and the workforce. The presenters will also lead a discussion about the importance of connecting workforce needs with learning among adult learners. Sign up here. On July 31 at 9am CDT, Dr. Doyle Brinson of East Tennessee State University weighs in on integrating instructional resources and student services for adult learners. Sign up here.

Friday, July 13, 2012

Reporting for Duty

There’s always a long list of reports you could be reading, but here are a few we think are worth downloading. C’mon—you’ve at least got time for the executive summary!
  • Our colleagues here at MDC have just released a report examining the recent experiences of community colleges across the United States that are implementing the Center for Working Families (CWF) approach to help low-income families attain financial stability and move up the economic ladder. The approach combines what community colleges do so well—provide individuals with training that connects them to dynamic careers—with the financial support necessary to complete education and connect with a career path. “Center for Working Families at Community Colleges: Clearing the Financial Barriers to Student Success” takes a closer look at the emerging CWF Community College learning network and shows how the individual colleges provide their CWF services, whom they seek to serve, how the CWF fits and adapts within local college contexts, and what outcomes they are achieving, along with answers to other key learning questions.
  • Jack Rotman, over at Developmental Math Revival, has been sharing a thoughtful exploration and critique of recent developmental ed press and research.

Friday, July 6, 2012

Let's Try This Again: Links!

  • On June 21-22, the National Center for Postsecondary Research at Columbia University hosted a conference, Strengthening Developmental Education: What Have We Learned and What’s Next, featuring dev ed experts from across the country, including some from our own DEI colleges and states. You can check out an overview of the conference and download materials here. Jennifer Gonzales at the Chronicle of Higher Ed gave this recap of a conference keynote address from U.S. undersecretary of education, Martha Kanter: “Rather than abolish remedial education, Ms. Kanter implored the scholars to continue their work to reform and improve it.” Hear! Hear!
  • Last month, Jobs for the Future published a new policy brief about financial aid that needs to be on your reading list. Aid and Innovation identifies financial aid rules and regulations that block innovations meant to help low-income students; describes how policy leaders and financial aid experts in several states are building work-arounds to those issues; and recommends how states can work together to better meet students’ financial aid needs. You can download it here.
  • In addition to that policy brief, JFF also released another edition of Achieving Success, the newsletter all about Achieving the Dream and Developmental Education Initiative state policy. This one’s got more on the Virginia Community College System dev math redesign, Florida legislation that is strengthening community college transfer pathways, and more. Download it here.
  • Recently, the MDC-led Financial Empowerment for Student Success Learning Network hosted an enlightening discussion on how to use a college’s student success course to teach students about financial literacy and management, with speakers from two Achieving the Dream institutions. The presentations from Debby King of Phillips Community College of the University of Arkansas and Sonya Caesar of the Community College of Baltimore County answered questions about how they have effectively made financial capability an integral part of course curriculum. They shared many lessons, including some important advice on how to involve administration and how to get student input to develop the most effective course of study. You can link to a recording of the webinar here. (Go ahead and follow the registration instructions—that will set you up to view the recording!) 
  • Just can’t get enough of webinars? Inside Higher Ed is hosting one next Tuesday, July 10 at 1pm Eastern to share results from a new study of faculty perspectives on online education. “Conflicted: Faculty and Online Education 2012” will feature Scott Jaschik, editor of Inside Higher Ed, Joshua Kim, director of learning and technology, Master of Health Care Delivery Science program, Dartmouth College, and blogger at Inside Higher Ed; Steve Kolowich, technology reporter at Inside Higher Ed; and Jeff Seaman, co-director, Babson Survey Research Group. Register here.

Tuesday, July 3, 2012

Scaling Social Impact: NYC Edition

Earlier this month, we attended the Social Impact Exchange’s annual Symposium on Scaling Social Impact. The Symposium brought together nonprofit organizations, funders, consultants, and evaluators to share knowledge about bringing social solutions to scale. We were there to share what we learned as we created More to Most, a guidebook on scaling up effective community college practices, and to learn from the experiences of others. Here are a few themes from the conference that we’d like to share:
  • If “scaling social impact” sounds like a nebulous phrase, that’s because it is. As the meeting organizers readily admitted, “There are many ways to achieve scaled impact—from replicating programs in new locations to developing breakthrough products and services; from scaling policy initiatives and social movements to online expansions through the use of toolkits and platforms. And there are other types of expansions too that include knowledge sharing, network building and collaborations.” While many of the symposium attendees were focused on replicating a nonprofit’s services across multiple geographies, More to Most is focused on scaling within a system. Our big question: how can community colleges go from serving some students in effective programs, to expanding those programs to more students, to finally reaching most of those who can benefit from them?
  • Is the ideal funder a thought partner, too? According to a panel on grantmakers and nonprofits working in partnership, yes. One panelist used a food metaphor (always our favorite) to explain: she said funders should help build the kitchen, but they don’t need to be in there cooking with you; in other words, there’s a place for funder input in program design, but implementation should be left to the delivery organization. Another pointed out that when funders are engaged as thought partners, they are usually willing to be more flexible with timelines and shifting plans. 
  • Making the most of opportunities is usually good, but being too opportunistic and losing coherent priorities is bad. Know yourself and your non-negotiables. If you adjust a working solution to meet the preferences of a new funder or for the sake of simply making the program larger, you may risk your effectiveness. 
  • To get results, make talent development a priority. A panel of philanthropic leaders drove the point home at the symposium, reminding us that a key indicator of job satisfaction across sectors is the feeling of continual challenge. The panelists recommended that the social sector pay more attention to cultivating talent and leadership over time. For community colleges, this means that any successful scaling effort should be linked with professional development and faculty engagement.
  • Sustainable funding is crucial to scaling success. A program can’t rely on a continuous stream of grants to operate. While grant funds work well for start-up and proof of concept, a program needs to identify a long-term sustainable funding stream. Many grantmakers are hesitant to fund an idea that doesn’t have a plan for revenue generation. At the community college, this usually means finding a way to get resources reallocated in the general fund. Use the grant money to demonstrate the effectiveness of your program, so that college administrators recognize the value of incorporating the program into the annual budget.

Friday, May 25, 2012

You Can't Handle the Links

  • College strategic plans for increasing student success are by nature long-term efforts, so concrete measures of progress often take years to appear. What do you do when your work to “move the needle” is slow going, or when initiative fatigue sets in at your institution? According to Inside Higher Ed, Monroe Community College has “started a series of modest but tangible 100-day projects to improve the college.” These projects are intended as small steps toward larger goals, but they also foster broad engagement and keep motivation high. Their first project is to “streamline the application and enrollment process so that prospective students have to create one password instead of three.” As blogger Dean Dad points out, this idea requires widespread institutional buy-in. If people don’t take it seriously, it won’t work. Wondering how to get that buy-in? Nick Bekas of Valencia College offers his advice on building alliances in an Accelerating Achievement post earlier this year. 
  • A substantial portion of our nation’s workforce is unemployed or underemployed, but many companies can’t find the workers they need to fill high-skill jobs. Why are we struggling to train workers for existing positions when so many are in need of work? Maureen Conway of the Aspen Institute says that workforce training and education programs don’t do enough to address the real-world challenges adult students face. She sees a need for increased funding and budgetary flexibility for integrated student support services. Check out Colin Austin’s guest post on an approach that weaves together education and training, income supports, and financial services. Another reason that we can’t fill those open positions? It isn’t readily apparent to students or colleges what employers are looking for. Jobs for the Future’s Credentials that Work initiative uses real-time labor market information to help students choose credentials that will get them jobs, and to help institutions craft programs with local labor market value.  
  • Last week in The Chronicle of Higher Ed, developmental English professor Brian Hall of Cuyahoga Community College (a DEI Institution) shared some insight into “What Really Matters to Working Students.” Frustrated by students’ seemingly constant absence and inattention, Hall asked one of his developmental English classes to explain why so few students are successful. The biggest reason his students gave: the difficulty of balancing academics with life. Between work schedules and family responsibilities, many students feel that their motivation to do well in class is eclipsed by unforeseen hurdles. While developmental educators can’t eliminate these hurdles (see the previous bullet for what colleges can do), Hall and his students recommend ways professors can keep students on track, and caution against behavior that could knock them off course permanently. They propose that professors should make expectations and rules apparent from the start, treat students with the respect they require in return, make class work relevant and engaging, and show students that it is ok to make mistakes if you learn from them. 
  • The College Board has released a new “web based tool that provides quick and easy access to national, state and initiative-level data that describe the progress and success of community college students.” The Completion Arch establishes indicators for five areas: enrollment, developmental education placement, progress, transfer and completion, and workforce preparation and employment outcomes. You can filter the indicators by data source, state, and student characteristics. The site is easy to navigate, so check it out for yourself
  • The Hartford Courant ran an op-ed from the Community College Research Center at Columbia University last week on Connecticut’s recent developmental education legislation. Tom Bailey, Katherine Hughes, and Shana Smith Jaggers expressed their concerns over the potential negative impact that the legislation could have on students in need of significant skill development before they are ready for college-level coursework. They also noted their concerns about buy-in from college faculty and staff: “A policy that gives community college practitioners flexibility and support to try out new models — and that includes accountability measures to accelerate real change — would make them far more likely to embrace reforms on an institutional and state level.”

Wednesday, May 23, 2012

Patrick Henry Community College: Predicting Success

We’re indebted to Greg Hodges, dean of developmental education and transitional programs, and Kevin Shropshire, director of institutional research, at Patrick Henry Community College for their contributions to this post.

In 2009, at the beginning of their Developmental Education Initiative grant, Patrick Henry Community College planned two different strategies for accelerating students’ progress through developmental education: Fast Track math that allowed students to complete two dev math courses in one semester and Accelerated Learning Program (ALP) courses, based on the model pioneered at the Community College of Baltimore County. PHCC knew that these different accelerated strategies weren’t a perfect fit for every student and wanted a way for advisors to assess which students would be most likely to succeed in these courses. They created a computer-based predictor tool with an easy-to-use interface that was built on top of a database that calculated a student’s likelihood of success based on his/her response to a series of questions asked during an advising visit. Preliminary results with the tool were promising. And then the Virginia Community College System launched a system-wide redesign of developmental mathematics that essentially fast-tracked all students, with a requirement to complete all dev math in one year, using a series of nine modules and a new diagnostic instrument. Here’s the story of the early days of the advising tool and how PHCC plans for revisions that will enable them to continue using it to place students in the appropriate type of acceleration.

Version One
PHCC first deployed the predictor tool in fall 2010. In the past, the assessment of advising at the college was difficult because different advisors focused on different characteristics to guide students to specific courses; thus, trying to sort out which student characteristics and situations would affect success in a particular instructional method was nearly impossible. The advising tool—and the database on which it is built—provides for consistency of some key variables and documents decision-making of different advisors. Variables included: 
  • expected absences
  • expected use of tutorial services
  • placement test scores, including math, reading, and writing
  • previous developmental math success
The database returned a statistical likelihood (high, medium, low) of a student’s success in an accelerated course. (The key characteristics and probabilities were based on a risk model of prior students’ behaviors and rates of success.) Here’s a screen shot of the interface created by the college's webmaster:

An advisor considered this computer-driven recommendation, along with his/her own professional judgment and the student’s perspective, to make a proactive decision about acceleration possibilities that were in the best interest of the student. 

Preliminary data suggested promise—and the need for some refinements. In fall 2010, the pass rate for Fast Track students with a record in the database (meaning the advisors used the advising database), was 82 percent. The rate for students with no records in the database (those that enrolled on their own or whose advisors did not use the advising database) was 65 percent. This suggests the possibility that the advising system helped advisors identify students who were more likely to be successful in this accelerated course.

The input predictor model was 80-85 percent accurate, driven mainly by student absences. The predictor can also be applied to all students at the end of a semester, though it must be acknowledged that some of the inputs—like math lab visits and expected absences—are student-driven. These estimates can then be correlated with student success measures such as course success and retention.
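To make the mechanics concrete: the tool maps a handful of advising inputs to a coarse likelihood category. Here is a minimal sketch of that kind of scoring logic in Python. The function name, weights, and thresholds below are invented for illustration only; PHCC’s actual risk model was fit statistically to prior students’ behaviors and rates of success, not hand-tuned like this.

```python
# Hypothetical sketch of an advising predictor along the lines PHCC describes.
# All weights, thresholds, and category cutoffs are invented for illustration;
# the college's real model was derived from historical student data.

def predict_success(expected_absences, expects_tutoring,
                    placement_score, passed_prior_dev_math):
    """Return a coarse likelihood ('high' | 'medium' | 'low') that a
    student will succeed in an accelerated course."""
    score = 0.0
    # Absences dominated the real model's accuracy, so weight them heavily.
    score -= 2.0 * expected_absences
    # Intent to use tutorial services counts in the student's favor.
    if expects_tutoring:
        score += 3.0
    # Scale a 0-100 placement test score to a 0-5 contribution.
    score += placement_score / 20.0
    # Prior success in developmental math is a strong positive signal.
    if passed_prior_dev_math:
        score += 4.0

    if score >= 8.0:
        return "high"
    elif score >= 4.0:
        return "medium"
    return "low"
```

As at PHCC, a category like this would be one input among several: the advisor weighs it alongside professional judgment and the student’s own perspective rather than treating it as a placement verdict.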

Redesign the Curriculum – Redesign the Predictor
The folks at Patrick Henry weren’t the only Virginians doing DEI work. In 2009-2010, the Virginia Community College System convened a state-wide task force and redesigned the entire developmental mathematics curriculum. (You can read about the redesign here.) When the final curriculum was launched at all VCCS colleges in January of 2012, content that was once delivered in 3-4 credit developmental courses was now captured in nine one-credit modules, with the aim that all developmental math course work could be completed in one year. That meant that acceleration became the norm—not just one of many options—at Patrick Henry. And that meant that a predictor that might recommend a traditional, non-accelerated course for a student was no longer the right tool for advisors.
However, PHCC was confident that there was still great value in advisors having more information on which to base advising recommendations. Since VCCS allowed colleges to decide on the delivery mechanisms for the new developmental math modules, PHCC implemented three different delivery options for the new VA math: face-to-face instruction, computer-based instruction, and ALP (combining developmental with on-level coursework). This summer and fall they will be hard at work creating and implementing a revised predictor that will help advisors counsel students about their best instructional option. They have already identified some key characteristics that affect student success in the different course models.

We’re looking forward to seeing how the next iteration of the predictor shapes up. We’ll be sure to update our blog readers. If you’re interested in more specifics about how the college built their model, please contact Greg Hodges at ghodges@patrickhenry.edu or Kevin Shropshire at kshropshire@patrickhenry.edu. (Achieving the Dream colleges: they’re even willing to share SAS code that’s already set up for ATD data!)

Wednesday, May 16, 2012

The Blog is Back in Town

Accelerating Achievement has been a bit quiet for a couple of weeks. The elves have been called to other areas of the workshop, but they’re back in business now. Good thing our colleagues across the dev ed field have still been cranking out all sorts of stuff that’s worth reading:
  • This month, Getting Past Go, a developmental education state policy project of the Education Commission of the States, released a brief titled “Using State Policies to Ensure Effective Assessment and Placement in Remedial Education.” Mary Fulton, policy analyst at ECS, gives an excellent explanation of community college assessment and placement procedures and then walks through current research evaluating the effectiveness (or ineffectiveness) of current instruments and efforts to test the validity of those instruments. She then gives examples from states that are enacting policies to improve assessment by implementing multiple measures and clarifying the student intake process. The piece concludes with recommended assessment and placement policies for both states and postsecondary systems—and a chart that shows you which policies are currently in place in your state!
  • The California Acceleration Project (CAP) just released their May newsletter, Acceleration News. You can read about progress among the CAP Community of Practice—how they’re scaling up and who’s joining the ranks. There also are details on how California community colleges are shrinking developmental course sequences, negotiating articulation agreements with four-year schools, and publishing new data on student performance in accelerated English and math courses. And a whole lot more—so check it out.
Looking for some professional development opportunities? Want to meet more folks that are committed to improving outcomes for community college students? Here are two upcoming events you should consider:
  • The Community College of Baltimore County is hosting the Fourth Annual Conference on Acceleration in Baltimore, MD, June 7-8, with pre-conference workshops on writing, reading, and math on June 6. You can check out the conference schedule here and register here.
  • The National Center for Postsecondary Research 2012 conference, “Strengthening Developmental Education: What Have We Learned and What’s Next,” is scheduled for June 21-22, 2012 at Columbia University in New York. See the agenda here and register here.

Wednesday, April 25, 2012

Crowdsourcing the Placement Test Dilemma

On Monday, the Inside Higher Ed blog Confessions of a Community College Dean was all about how you know a student needs remediation. Blogger Dean Dad gives a succinct overview of the often frustrating process: cutoff scores, preparing (or not preparing) students for a high-stakes test, and mandated tests that have little predictive value for a student’s performance. And then there’s “thousands of new students showing up in a compressed timeframe, ranging in age from fresh out of high school to retirement, and you need to place them all quickly.” He includes a few possible responses—using high school GPA and other diagnostics, embedded remediation, or the “let them fail” approach.

He then tosses the ball to his readers, requesting examples of efficient methods for placing a lot of students in the right place in a relatively short time. While it’s often dangerous to read comments on blog posts, there are some interesting suggestions in the mix—from software solutions to diagnostic tests that are directly tied to instructional modules. We’ve covered some of these approaches that are happening in DEI colleges and states here on Accelerating Achievement:
What’s working on your campus? Obviously, there are a lot of inquiring minds that want to know!

Tuesday, April 24, 2012

Closing Doors

The title of a Kevin Carey article in The New Republic is a question we’ve all been pondering for the past few years: “Why Are Community Colleges Being Treated Worst When They’re Needed Most?” Since the recession began, community colleges have increasingly been looked to as engines of economic recovery and as providers of training for unemployed and low-income workers. Last week, President Obama once again touted their value during a speech at Lorain County Community College, an Achieving the Dream institution, in Ohio:
“When you take classes at a community college like this one and you learn the skills that you need to get a job right away, that does not just benefit you; it benefits the company that ends up hiring and profiting from your skills. It makes the entire region stronger economically. It makes this country stronger economically.”
Carey outlines at least three facets of a community college mission: “continuing education for adults, job training for local labor markets, and the first two years of a baccalaureate education.” Shining a spotlight on that mission and asking colleges to increase their productivity and flexibility isn’t a bad thing, but it has been accompanied by unprecedented resource cuts in state legislatures across the country. Calls for community colleges to do a better job are matched with slashed budgets, rather than with the investment and support that are needed for successful reform.

For decades, we’ve been working to expand access to higher education, while simultaneously trying to improve student success rates. Rising costs and reduced public investment are now threatening to reverse hard-won progress in higher education access and success. A new report by Gary Rhoades of the Center for the Future of Higher Education reveals that as the use of enrollment caps expands and the number of educational programs narrows, many lower-income students and students of color are losing access points to postsecondary education. Rhoades explains:
“In a complicated ‘cascade effect,’ higher tuition and enrollment limitations at four-year institutions have pushed middle-class and upper middle-class students toward community colleges. This, in turn, increases competition for seats in community college classrooms at a time when funding for community colleges is being slashed and fees are increasing. As community colleges draw more affluent students, opportunity is being rationed and lower-income students (many of whom are students of color) are being denied access to higher education.”
Pushing low-income students out of the educational pipeline can only further entrench an increasingly immobile class system. A New York Times column from last month assembled some alarming data on the relationship between education and inequality. In 1970, 6.2 percent of students from low-income families attained a bachelor’s degree by the age of 24, compared to 40.2 percent of students from high-income families. By 2009, 82.4 percent of students from high-income families had completed a bachelor’s by age 24, but only 8.3 percent of students from low-income families were able to do so. Given that workers with a bachelor’s degree earn 82.8 percent more annually than workers with only a high school diploma, low-income youth are increasingly fated to remain low-income for their entire lives.

Remember, the Truman Commission warned us about this in 1947. When education is “prerequisite to occupational and social advance,” but is available only to the affluent, it will “become the means, not of eliminating race and class distinctions, but of deepening and solidifying them.”

With just over half of all entering community college students requiring developmental education courses, dev ed remains a main point of access to higher education. As we continue our push to improve the outcomes for students in these programs, we must not allow these programs to be rationed or slashed. Well-structured reforms can lead developmental education programs to accomplish what they are intended to do: help students, regardless of background and level of preparation, obtain a credential or degree and put them on the path to economic stability. As Carey explains, “opening the doors of higher education to ever more Americans is a perpetually unfinished project. But it’s a tragedy that we are simply choosing to watch some of those doors swing shut.”

Friday, April 20, 2012

Burning Bridges

A new report released this week by Complete College America (CCA), drawing on recent Community College Research Center studies and data from CCA states, recommends major changes to developmental education programs at two- and four-year colleges to secure better outcomes—in both cost and credentials—for students. From administering assessments in high school, to using multiple diagnostics for placement, to instituting co-requisite models that place students in college-level courses with built-in support, the report highlights promising practices from across the country. The title, REMEDIATION: Higher Education’s Bridge to Nowhere, is provocative, to be sure, and the recommendations are bold—and boldness is certainly required when so few students who begin their postsecondary journey in developmental education actually complete it. However, that “bridge to nowhere” language undervalues the successes of reformers cheered in the report who are making the system work for their students—and doesn’t fully appreciate the hard work of students who are doing the best they can with the options they’ve got. We know sometimes the truth hurts, that sometimes someone has to schedule An Intervention, and sometimes feelings are going to be hurt before you can see the hard work that really has to be done. But if we’re going to burn this “bridge to nowhere,” we’re going to need some help rebuilding it. This new report has solid recommendations for the pieces of the structure, but this isn’t a “no assembly required” kind of project. Colleges and universities will need equally solid support for the execution of these new approaches.

Maybe what we’ve got is not so much a bridge to nowhere as a rickety one-lane bridge that needs to be an eight-lane superhighway (or maybe a track for a bullet train, or an easily accessible and frequently used bike path if we want to be carbon neutral). Whatever the path, it needs to lead to credentials that set students up for family-sustaining employment and career advancement, a point noted in the report, particularly regarding getting students into career-tracked programs of study as soon as possible. While CCA encourages the critical commitment of state-level government in “Governors Who Get It,” and of state legislators, reform can’t be driven only from that arena. And it can’t be driven without any gas in the tank—the resources (financial and human) to implement it. We tend to agree with Nancy Zimpher, chancellor of SUNY, in Inside Higher Ed last week. Responding to legislation in Connecticut to end most remedial education in public higher education, Zimpher said, “I applaud Connecticut’s intent to abolish remediation, but this is not a legislative issue. It’s a community issue….” Mandates like the one proposed in Connecticut (which would eliminate traditional dev ed, replace it with embedded support in credit-bearing courses, and require college readiness bridge programs), while drawing on practices that are showing positive preliminary results, must also consider the complexity of this kind of change in a higher ed institution—especially given increasing resource constraints. Knowing what to do doesn’t mean you know how to do it, or that your current funding allocations can support it. Such reform must take into consideration the extensive professional development that will be required to ensure that new programs are delivered effectively, as well as the cost of that training.
Also essential to successful implementation will be federal, state, and institutional policy changes that align funding with new models of instruction, among other structural changes. Finally, bringing faculty along in the process (or leaving them out) can have a significant impact on the success of new initiatives and the students who participate. (Katie Hern of the California Acceleration Project, one of the reformers lifted up in the CCA report—and deservedly so!—wrote about the necessity of faculty engagement in reform efforts here.) Changing the delivery mechanism probably won’t be much more effective if those doing the delivery feel they haven’t been part of the decision-making process and don’t have the support they need to be successful.

DEI colleges and states have been hard at work on many of the CCA-recommended practices over the course of the initiative, and they’ve seen good results for their students. They’ve also experienced the sometimes labored process of building relationships across previously siloed departments, of responding at the system and institutional level to statewide changes, and of fine-tuning messaging—repeatedly—for students, faculty, and staff so everyone really understands what is at stake and how they can benefit from doing things in a new way. We commend Complete College America for making the case for a new approach and declaring their vision for the best way forward; we hope that policymakers responding to these recommendations will carefully consider an approach to implementing these bold changes that draws on experience throughout a college, and that they’ll provide the resources—human, financial, and capital—to build a bridge to college completion that is long-lasting and gets individuals, institutions, and our nation where we need to go.

Wednesday, April 18, 2012

Replicating Impact

Today, we’re returning to our “SCALERS: Round 2” series. Originally created by Paul Bloom, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. (You can read an introduction to each driver in our first SCALERS series.) Now, we’re asking DEI colleges about how particular SCALERS drivers have contributed to their scaling efforts. We’ve looked at staffing, communicating, alliance-building, lobbying, and earnings generation. In the post below, Becky Ament, associate dean for developmental education at Zane State College, discusses how Zane State went about replicating the impact of their intrusive advising approach for developmental education students.

When Zane State College joined Achieving the Dream in 2005, initial data analysis suggested interventions with two groups of students had the greatest potential to improve the year-to-year retention rates:
  • Students who tested two levels below college-level math but failed to complete at least one developmental class during their first year in college; nearly 100 percent of students who fell into this category were not retained from one fall to the next.
  • Students most at risk for dropping out as measured by the Noel-Levitz College Student Inventory (CSI); students scoring highest (7, 8, or 9) in “dropout proneness” on the CSI were significantly less likely to be retained from one fall to the next.
The resulting interventions were:
  • Developmental math advising: Developmental student outcome data indicated strong course retention and successful completion rates as well as strong success rates in targeted gatekeeper courses. Confident that the curriculum was well aligned and meeting students’ needs, new intervention strategies focused on intrusive advising. Academic advisors developed an unmet prerequisite intervention process that monitors students’ participation in developmental mathematics through enrollment in and completion of a college-level math course. Any student who stopped attending or dropped a developmental mathematics course was targeted for intervention advising and required to continue the appropriate sequence of developmental and college-level math courses. In two years this intervention resulted in a 5 percent increase in students successfully completing developmental math courses within their first year. By 2008, the increase had grown to 10 percent.
  • Advising for students most at-risk: This program provides personal contacts and individual support to students scoring in a high profile range for “drop-out proneness.” Advisors interpret the CSI results for the students immediately upon completion during the placement test session and discuss support options for counseling, tutoring, and communicating with a contact person who cares about the student’s entire experience and success. Student Success Center personnel maintain contacts within the first three weeks of the student’s first term and then at least quarterly throughout the student’s first year to assist them with support plans as needed. Analysis of the 2007 cohort showed a 16 percent increase in fall-to-fall retention of the high-risk group as a result of this intervention. 
Coupled with both of these approaches are early alert referrals from faculty to initiate intervention advising during the course of a term.

The Developmental Education Initiative afforded Zane State the opportunity to build on these initial successes and scale the intrusive advising program. The comprehensive intervention program touches all developmental students in some way, from the initial group placement test and CSI interpretation sessions to the intrusive interventions. Despite significant enrollment growth, the CSI case-management-style intervention has been maintained by employing three part-time paraprofessional advisors in the Success Center to make the personal student contacts and refer students to professional support services as needed. Their work frees time for the professional advisors to focus on the other interventions. To ensure quality service delivery, advisors and paraprofessional advisors participated in orientation sessions with the director of the Student Success Center. The academic advisors in the Success Center who had been working with the various interventions then trained the new advisors. Additionally, the new academic advisor attended a National Academic Advising Association event for further professional development. The unmet prerequisite intervention for math has been expanded to developmental reading and English with the addition of another academic advisor.

Collectively, all of these initiatives are contributing to the goal of improved first year fall-to-fall retention rates: data have shown that students who began one fall and returned the following fall had a three-year graduation rate of 87 percent.

Wednesday, April 11, 2012

Come and Get It!

Hey blog readers! There's some new material in the Resources section of the Developmental Education Initiative website. The DEI team at Cuyahoga Community College has agreed to share three pieces that they've developed as part of their DEI work. In the "Curricular and Instructional Revisions" category you'll find:
  • two manuals chock full of cooperative learning activities for developmental English courses. There are exercises to get students talking about reading and writing, as well as ideas for forming groups and evaluating group dynamics--and much more!
  • a supplemental instruction leader training manual. This one covers student-to-student interactions, session planning, session activities, and case studies.
Many thanks to Cuyahoga for being willing to share this great work. Remember, if you use the manuals, please give the authors credit for their effort.

Thursday, April 5, 2012

This Week in Links: Equity, Policy, Analytics, and Peeps!

  • Sara Goldrick-Rab posted this week about the assault on community colleges (and, by extension, on equity): “That's right—students are showing up at ‘open door’ colleges and being effectively turned away.  Welcome to the ‘new normal.’”
  • Getting Past Go posted a video on Tuesday of Katie Hern’s presentation to the National Association of Latino Elected Officials (NALEO). Katie, previously featured on Accelerating Achievement and director of the California Acceleration Project, gives a five-point policy agenda:
  1. Set a statewide policy directive that limits the amount of time students spend in remediation
  2. Incentivize colleges to develop accelerated pathways in reading/writing, ESL, and math
  3. Fund professional development to train faculty to develop and teach in new accelerated models
  4. Maintain a commitment to access while increasing completion – we need to cut the lower levels from our remedial sequences, not the students unlucky enough to be placed there
  5. Reject solutions focusing on the need for more and better placement testing, including “diagnostic testing.”
  • The Carnegie Foundation for the Advancement of Teaching and Learning is “designing a system that will incorporate institutional records going back to 2008 on the longitudinal performance of cohorts of students designated for developmental mathematics at each of the 30 colleges participating in [their] community college mathematics pathways initiative. These data constitute a baseline for understanding institutional performance over time, for establishing college-by-college improvement targets, and for exploring the antecedents and conditions of performance going forward.” They are also “prototyping continuous data feed reports to faculty on their classroom context and individual student progress.” Pretty cool stuff.
  • Remember last year, when we were inspired by the Washington Post’s Peeps Diorama Contest to use peeps to demonstrate key developmental education reforms, like contextualization and a strong peer support network? Sadly, we don’t have any new dev ed peep-oramas this year. But be sure to check out the winners of this year’s Washington Post Peeps Diorama contest.

Thursday, March 29, 2012

For the Reading List

It’s a newsy week on Accelerating Achievement. Today, we’re highlighting recent releases from the Community College Research Center. And while we’ve not figured out how to provide a link that downloads directly to your brain, we hope that you’ll be able to find time to read the pieces that are relevant to your work.