Showing posts with label Scaling Up. Show all posts

Tuesday, September 4, 2012

A President's Reflection: Scaling with Fidelity

Last week, we discussed different ways to describe the return on investment from a particular program. A convincing ROI can go a long way in generating support for a scaling strategy. In today’s post, Sanford C. “Sandy” Shugart, president of Valencia College, reminds us of how important it is to keep testing that ROI. The returns from a pilot might change dramatically when you expand to more people and disciplines. Maintaining fidelity to the proven model is essential if you want to maintain (and continue improving) results.

Valencia College entered DEI with a history of innovation and a clear sense of areas to scale for effect in developmental programs. Specifically, the college planned to:
  • expand dramatically the use of supplemental instruction (SI) in developmental mathematics and other gateway courses 
  • grow the number of paired courses designed to function as learning communities 
  • expand the offering of student success courses while embedding the success strategies in other first-year courses 
  • expand a transition program for certain at-risk students called Bridges to Success
In the early going, we believed the initial stages of scaling—clarifying the model, recruiting adopters from the faculty and students, training the implementers, and having the will and support to reallocate scarce resources—would present the greatest challenges. We were wrong. Early stages of implementation were marked by enthusiasm, collaboration, and high expectations for success. The performance data were eagerly anticipated each term and became important levers in the conversations and revisions to the programs that followed.

Scaling went rather well, with each treatment showing positive early results and the general trend in student performance moving toward accelerated success—more credit hours earned in fewer academic terms by each succeeding cohort of students. With this success and the word of mouth among both students and faculty about the work, recruiting adopters became much easier. In fact, demand for SI-supported sections quickly grew beyond the initial disciplines and courses selected. (Valencia served more than 11,000 students in SI-supported course sections last year.) These were generally positive signs. However, as we scaled to greater and greater numbers of course sections, faculty, and students, we began to discover a new set of challenges.

SI will serve as an example, though we had similar challenges in other innovations. As we reviewed data from the SI sections versus “non-SI” sections, we began to find significant differences in the effect on indicators like student persistence to the end of course and grade distribution. Some of these seemed almost random, while others clearly showed a pattern (by campus, by discipline, etc.). Deeper inquiry revealed that the effect for students who actually participated in the supplemental activities that make SI effective was almost uniformly strong. What the discrepancy in the data revealed were major differences in implementation. It seems some faculty had found ways to encourage, perhaps even require, active student engagement in supplemental learning activities, while others had managed to achieve very little of this engagement. These differences existed in spite of aggressive training efforts prior to implementation.

Similarly, we found that while SI was especially effective in many of the mathematics gateway courses, the effect was much less striking in many of the courses to which it was added in the later years of our work (economics, political science, etc.). Again, we inquired more deeply and discovered not only differences in methods, but very different understandings among the faculty (and students) about the purposes of SI and the outcomes we were seeking.

We had, as our projects reached maturity, encountered the problem of scaling with fidelity to the model. It seems that our discipline of innovation needs to include ongoing evaluation of the purposes and steps to implementation, continuing staff training, and rigorous data analysis to assure that the treatment in which we are investing doesn’t take on a life of its own after institutionalization. A useful lesson!

Thursday, August 23, 2012

R to the O to the I

In March, we released More to Most: Scaling Up Effective Community College Practices, a guidebook that lays out a process for identifying promising practices and developing an expansion plan so that good programs reach as many students as possible. Central to the process is the SCALERS framework, developed at Duke University and discussed extensively on this blog. There are other steps that form the foundation for SCALERS application. We’re going to tell you about some of them in our next few posts. To download a copy of More to Most, visit the website: www.more2most.org.

The first step in the process is determining program value; you can read a rundown of that step here. One part of that rationale—and a rather popular one in these austere days of budget cuts—is demonstrating the return on investment (ROI), the sometimes-hard-to-quantify case that shows program spending generates a financial return (in increased FTE or other revenues) that offsets operating costs. Standardizing this process, where possible, can help you compare different programs and make decisions about which programs should be discontinued and which should be expanded. This kind of information, of course, needs to be combined with other qualitative data about program effects, and considerations of faculty and student engagement and institutional priorities.

Program ROI
There are a number of ways to analyze the connection between a program’s results and its costs. In 2009, Jobs for the Future and the Delta Cost Project released Calculating Cost-Return on Investments in Student Success. The report determines cost-return on investment by assessing student retention for program and non-program students, the resources required for program operation, and the revenue gained by additional retention using the following data and calculations:
  1. Additional number of students enrolled into the next year because of the program: Calculated using number of students served, one-year retention rates for program participants, number of participating students retained, and one-year retention rates for non-program students.
  2. Direct costs of program operation: Calculated based on expenditures for personnel, supplies and equipment, stipends for students, facilities, etc.
  3. Spending and revenue data: Calculated from average expenditures per student, and additional tuition, state, and federal revenue gained from increased retention.
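To make the arithmetic behind those three steps concrete, here is a minimal Python sketch of the cost-return calculation. All figures, function names, and the revenue-per-student value are hypothetical illustrations, not numbers from the JFF/Delta Cost Project report; you would substitute your institution’s own data.

```python
# A hypothetical sketch of the three-part cost-return calculation.
# Every number below is made up for illustration only.

def additional_students_retained(served, program_rate, baseline_rate):
    """Step 1: students retained into the next year beyond what the
    non-program (baseline) retention rate would predict."""
    return served * (program_rate - baseline_rate)

def cost_return(served, program_rate, baseline_rate,
                program_cost, revenue_per_retained_student):
    """Steps 2-3: revenue gained from additionally retained students
    (tuition plus state/federal enrollment revenue) minus the direct
    costs of operating the program."""
    extra = additional_students_retained(served, program_rate, baseline_rate)
    return extra * revenue_per_retained_student - program_cost

# Example: 200 students served; 68% one-year retention for participants
# vs. 55% for non-participants; $90,000 in direct program costs; $4,500
# in combined revenue per additionally retained student.
extra = additional_students_retained(200, 0.68, 0.55)   # 26 students
net = cost_return(200, 0.68, 0.55, 90_000, 4_500)       # $27,000 net return
```

Even this toy version makes the key sensitivity visible: the result hinges on the retention-rate gap and on how much revenue you attribute to each retained student, which is why those assumptions deserve the most scrutiny.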

These simple calculations could be the beginning of a formula that is unique to your institution, one that incorporates the costs, revenue, and priorities that are most relevant to your program goals.

Local and State ROI
You can also think about ROI beyond the bounds of a program. The various student success programs and policies at a college add up to increasing numbers of students completing credentials. (At least, that’s where we’re headed, right?) And increased completion outcomes can lead to positive returns for communities and states, even the nation as a whole. Being able to make such a case at your college—and across a system—could garner support from policy makers at the state and federal level. The Center for Law and Social Policy (CLASP) partnered with the National Center for Higher Education Management Systems (NCHEMS) to create a Return on Investment Dashboard. The dashboard combines data from the Census Bureau, National Center for Education Statistics, and Department of Education to project short- and long-term effects of maintaining the educational attainment status quo vs. increasing the number of credentialed adults in a particular state.

In addition to credential attainment, state summaries include projections for personal economic gain, as well as returns to the state in the form of state income, property, and sales taxes. The dashboard also has figures for Medicaid savings and correctional system expenditures. The dashboard allows you to manipulate high school completion, college going rates, credential attainment, and enrollment patterns to see how increases or decreases might affect individual and state returns on investment.

ROI and External Funding

The growth of social innovation financing, or social impact bonds, is one way that demonstrating return on investment is tied directly to external funding. Organizations cover initial program costs by borrowing money from foundations and other investors. If the organization meets agreed-upon outcomes, it receives additional funds from the state, and the original investors receive a portion of those funds. While this form of financing is primarily occurring in the nonprofit and social enterprise sectors, it could have implications for the education sector as well. You can read a great summary of social impact bonds on the Nonprofit Finance Fund website and check out how some social programs in Massachusetts are implementing this type of financing.

Whether you’re quantifying program costs, making a statewide case for investing in a particular innovation, or looking for new ways to fund your efforts, understanding and articulating the return on investment is an important piece of the scaling-up puzzle.

Tuesday, August 21, 2012

Is it worth it?

In March, we released More to Most: Scaling Up Effective Community College Practices, a guidebook that lays out a process for identifying promising practices and developing an expansion plan so that good programs reach as many students as possible. Central to the process is the SCALERS framework, developed at Duke University and discussed extensively on this blog. There are other steps that form the foundation for SCALERS application. We’re going to tell you about some of them in our next few posts. To download a copy of More to Most, visit the website: www.more2most.org.

Before you create a plan for scaling up, you need to decide if the program or practice you want to scale is actually effective. The last thing you need is to do more of something that isn’t even worthwhile. (Such logic only applies to things like eating Doritos or watching episodes of Chopped.) Chapter 2 of More to Most, Determining Program Value, lays out a process for assessing the value of a program.

The first stage of this process is defining the problem you are trying to address with your program and identifying your desired outcome. An example of a well-defined problem:
Too many students who test into three developmental education courses never successfully complete a college-level math or English course.
An example of a concrete outcome:
Currently, X percent of students successfully complete gateway math in one year. We will increase this by X percentage points by [YEAR].
You’ve got a program that’s designed to address this problem and help you reach this outcome. During the next stage, you collect evidence that will help you decide how well the program is achieving the desired outcome. What evidence of the program’s impact, both quantitative and qualitative, is available? Possible sources include:
  • Basic demographic profile of student body, a target group, and/or program participants
  • Course completion data: by course; by student cohort
  • Focus group data from students, staff, and/or faculty
  • Data from national surveys like CCSSE, SENSE, Noel-Levitz, etc.
Now, you can determine whether the program or practice meets criteria for effectiveness. We focus on two important criteria:
  • Does the evidence show that the program delivers the desired outcome?
  • Would scaling up the program align with institutional objectives?
It is important to consider both of these questions when determining the value of scaling up the program. Many institutions analyze data disaggregated by race, income, and other demographic factors and identify achievement gaps among student populations. If closing these gaps is an institutional priority for your college and one of the desired outcomes of your program, then make sure you analyze the evidence for how effectively the program accomplishes this goal. When comparing the evidence for several programs, keep in mind that if the program has positive outcomes for a designated population that is generally less successful, it may show a lesser impact on overall student success outcomes, at least in the short term. This does not make the program less valuable, however. The value comes from how well the program matches your desired outcomes and the institution’s priorities — or makes a case for changing those priorities.

Once you’ve collected your evidence, it’s time to make the case for expansion and develop a scaling strategy. We’ll discuss those steps in upcoming posts.

Tuesday, July 3, 2012

Scaling Social Impact: NYC Edition

Earlier this month, we attended the Social Impact Exchange’s annual Symposium on Scaling Social Impact. The Symposium brought together nonprofit organizations, funders, consultants, and evaluators to share knowledge about bringing social solutions to scale. We were there to share what we learned as we created More to Most, a guidebook on scaling up effective community college practices, and to learn from the experiences of others. Here are a few themes from the conference that we’d like to share:
  • If “scaling social impact” sounds like a nebulous phrase, that’s because it is. As the meeting organizers readily admitted, “There are many ways to achieve scaled impact—from replicating programs in new locations to developing breakthrough products and services; from scaling policy initiatives and social movements to online expansions through the use of toolkits and platforms. And there are other types of expansions too that include knowledge sharing, network building and collaborations.” While many of the symposium attendees were focused on replicating a nonprofit’s services across multiple geographies, More to Most is focused on scaling within a system. Our big question: how can community colleges go from serving some students in effective programs, to expanding those programs to more students, and finally reaching most of those who can benefit from them?
  • Is the ideal funder a thought partner, too? According to a panel on grantmakers and nonprofits working in partnership, yes. One panelist used a food metaphor (always our favorite) to explain: she said funders should help build the kitchen, but they don’t need to be in there cooking with you; in other words, there’s a place for funder input in program design, but implementation should be left to the delivery organization. Another pointed out that when funders are engaged as thought partners, they are usually willing to be more flexible with timelines and shifting plans. 
  • Making the most of opportunities is usually good, but being too opportunistic and losing coherent priorities is bad. Know yourself and your non-negotiables. If you adjust a working solution to meet the preferences of a new funder or for the sake of simply making the program larger, you may risk your effectiveness. 
  • To get results, make talent development a priority. A panel of philanthropic leaders drove the point home at the symposium, reminding us that a key indicator of job satisfaction across sectors is the feeling of continual challenge. The panelists recommended that the social sector pay more attention to cultivating talent and leadership over time. For community colleges, this means that any successful scaling effort should be linked with professional development and faculty engagement.
  • Sustainable funding is crucial to scaling success. A program can’t rely on a continuous stream of grants to operate. While grant funds work well for start-up and proof of concept, a program needs to identify a long-term sustainable funding stream. Many grantmakers are hesitant to fund an idea that doesn’t have a plan for revenue generation. At the community college, this usually means finding a way to get resources reallocated in the general fund. Use the grant money to demonstrate the effectiveness of your program, so that college administrators recognize the value of incorporating the program into the annual budget.

Wednesday, April 18, 2012

Replicating Impact

Today, we’re returning to our “SCALERS: Round 2” series. Originally created by Paul Bloom, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. (You can read an introduction to each driver in our first SCALERS series.) Now, we’re asking DEI colleges about how particular SCALERS drivers have contributed to their scaling efforts. We’ve looked at staffing, communicating, alliance-building, lobbying, and earnings generation. In the post below, Becky Ament, associate dean for developmental education at Zane State College, discusses how Zane State went about replicating the impact of its intrusive advising approach for developmental education students.

When Zane State College joined Achieving the Dream in 2005, initial data analysis suggested interventions with two groups of students had the greatest potential to improve the year-to-year retention rates:
  • Students who tested two levels below college level math, but failed to complete at least one developmental class during their first year in college; nearly 100 percent of students who fell into this category were not retained from one fall to the next.
  • Students most at risk for dropping out as measured by the Noel-Levitz College Student Inventory (CSI); students scoring highest (7, 8, or 9) in “dropout proneness” on the CSI were significantly less likely to be retained from one fall to the next.
The resulting interventions were:
  • Developmental math advising: Developmental student outcome data indicated strong course retention and successful completion rates as well as strong success rates in targeted gatekeeper courses. Confident that the curriculum was well aligned and meeting students’ needs, new intervention strategies focused on intrusive advising. Academic advisors developed an unmet prerequisite intervention process that monitors students’ participation in developmental mathematics through enrollment and completion of a college-level math course. Any student not attending or dropping out of a developmental mathematics course was targeted for intervention advising and required to continue the appropriate sequence of developmental and college-level math courses. In two years this intervention resulted in a 5 percent increase in students successfully completing developmental math courses within their first year. By 2008, the increase grew to 10 percent.
  • Advising for students most at-risk: This program provides personal contacts and individual support to students scoring in a high profile range for “drop-out proneness.” Advisors interpret the CSI results for the students immediately upon completion during the placement test session and discuss support options for counseling, tutoring, and communicating with a contact person who cares about the student’s entire experience and success. Student Success Center personnel maintain contacts within the first three weeks of the student’s first term and then at least quarterly throughout the student’s first year to assist them with support plans as needed. Analysis of the 2007 cohort showed a 16 percent increase in fall-to-fall retention of the high-risk group as a result of this intervention. 
Coupled with both of these approaches are early alert referrals from faculty to initiate intervention advising during the course of a term.

The Developmental Education Initiative afforded Zane State the opportunity to build on these initial successes and scale the intrusive advising program. The comprehensive intervention program touches all developmental students in some way, from the initial group placement test and CSI interpretation sessions to the intrusive interventions. Despite significant enrollment growth, the CSI case management style intervention has been maintained by employing three part-time paraprofessional advisors in the Success Center to make the personal student contacts and refer students to professional support services as needed. Their work frees time for the professional advisors to focus on the other interventions. To ensure quality service delivery, advisors and paraprofessional advisors participated in orientation sessions with the director of the Student Success Center. The academic advisors in the Success Center who had been working with the various interventions then trained the new advisors. Additionally, the new academic advisor attended a National Academic Advising Association event for further professional development. The unmet prerequisite intervention for math has been expanded to developmental reading and English with the addition of another academic advisor.

Collectively, all of these initiatives are contributing to the goal of improved first year fall-to-fall retention rates: data have shown that students who began one fall and returned the following fall had a three-year graduation rate of 87 percent.

Wednesday, March 28, 2012

State Policy Update

Our colleagues at Jobs for the Future have been compiling newsletters, writing reports, and posting blogs, all about developmental education, state policy, and every good thing. We wanted to be sure that you had the latest links to all this great material:
  • The March 2012 edition of Achieving Success is available here. Achieving Success is the state policy newsletter of Achieving the Dream and the Developmental Education Initiative, with features on all three elements of the DEI state policy framework. In data driven improvement, you'll find a synopsis of a recent convening on faculty engagement; in investment in innovation, there's a conversation with Shanna Smith Jaggars from CCRC discussing state policy implications of her recent work on the opposing forces that shape developmental education (she blogged about it here); and finally, in policy supports, you'll find viewpoints about placement policies from both SMARTER Balanced and PARCC common core assessment consortia.
  • JFF has also released a new set of tools to help states design performance-based funding systems. You can download Tying Funding to Community College Outcomes: Models, Tools, and Recommendations for States here.

Wednesday, March 21, 2012

Redesigning & Restructuring Math Emporium Facilities: Challenging, But Not Impossible

Today, we’re returning to our “SCALERS: Round 2” series. Originally created by Paul Bloom, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. (You can read an introduction to each driver in our first SCALERS series.) Now, we’re asking DEI colleges about how particular SCALERS drivers have contributed to their scaling efforts. We’ve looked at staffing, communicating, alliance-building, and lobbying. In the post below, Lucy Michal, mathematics professor at El Paso Community College, discusses earnings and resource generation as she shows how EPCC had to tackle the redesign of physical spaces to scale up the college’s math emporiums.

When El Paso Community College’s Developmental Education (DE) Math Standing Committee faculty restructured the DE math course sequence, redesigning curriculum presented familiar challenges, but redesigning facilities presented unfamiliar challenges. To help with this, college administrators sent committee members and deans to visit colleges that had undergone redesign. After several site visits, committee members did not see any one design that would work for EPCC’s population of students and its multi-campus district. They had a lot of planning and resource building to do before scaling up DE math emporium course offerings.

The Committee’s initial proposal had three options: joining two classrooms to create mini-emporium areas, identifying a store or warehouse to transform into a math emporium, or using an area in the Administrative Services Center to construct a math emporium for the District.

While college administrators reviewed these options, El Paso’s Developmental Education Initiative went into its first year with funding for identified technology needs—computers and printers—but redesigning facilities presented a greater challenge. One of the campus deans volunteered her campus to be the first to tear down a wall and construct a math emporium. Reconstruction occurred during winter break and in Spring 2010; the first mini-emporium offered DE math emporium courses in an area with 48 computers and one printer. The second emporium facility was made possible when funding for a new science wing included expansion of the math lab, an emporium with 34 computers, a tutoring area, and a seminar room. The third campus identified two classrooms and underwent the same construction used in the first campus to create a mini-emporium with 48 computers.

The fourth campus in downtown El Paso presented a challenge because of its older buildings. At that time, the college purchased a building to expand the downtown student service offices; however, it was too small. The college president identified a different building for the downtown campus math emporium. The building, an old bakery across the street from the downtown campus, was transformed into a large area for the math emporium, a math lab, a tutoring area, three faculty offices, a math faculty meeting room, and a computer classroom. The old bakery is now an innovative Learning Emporium.

And finally, for its largest campus, the College purchased a portable building that is currently being constructed to house two large emporium rooms with 40 computers in each room, one computer classroom, and an area for tutoring and student study tables. Construction will be completed with enough time to allow for Fall 2012 DE math courses.

The DE math committee learned more than just how to restructure their courses; they also learned about:
  • expanding and reconstructing facilities
  • furnishing learning areas
  • wiring for new technological classrooms
  • rescheduling class sections
  • designing learning spaces for students. 

Among the most important lessons learned: always have a plan B in case construction deadlines are not met. With the emporium now in place on the final campus, the college will be offering over 80 percent of its DE math courses in campus-based math mini-emporium areas.

Tuesday, March 20, 2012

Scaling Up at Grad Nation

We’re at the 2012 Building a Grad Nation Summit, hosted by America’s Promise Alliance. According to board chair Alma Powell, building a grad nation is all about “bringing people together from all sectors of our communities to work in new, more coordinated ways and to inspire action.” You can read about their goals here. While they’ve got immense energy focused on the high school dropout crisis, there have been multiple sessions discussing connections to postsecondary education and careers (though we’re hoping to see a larger community college presence next year!).

In addition to postsecondary connection, there's also been talk about “scaling what works.” Since we’ve just released a guidebook on scaling up effective practices (check it out!), I’m interested in how other organizations are thinking and talking about this topic. (Really, I’m always interested in it—let’s be honest.) A session yesterday included panelists from large national organizations (the Corporation for Public Broadcasting, Big Brothers Big Sisters), organizations taking the first steps to start affiliated organizations in a new city (Self Enhancement, Inc.), and smaller organizations beginning the process at the community level (Boston Rising, Zone 126).

One thing that’s been on my mind is the idea of starting small, but thinking big—in other words, how scale fits into early program design. I wanted to hear what these organizations had learned about planning for scale and ensuring that the resources invested in pilots went into something scale-able. Lucky for me, there was a Q&A session. Here’s what I heard:
 
  • Limit the number of moving parts. Reducing the complexity of your approach will make it easier to replicate.
  •  Articulate a simple, clear theory of change. Identify the inputs necessary to reach the outcome you want. Then determine the supports you’ll need to do just that. 
  • Consider cost from the beginning. Pilot design should not just be about the mechanisms for delivering the program—it’s about the entire operation. And that includes resources. 
  • Think about who you can partner with to disseminate your strategy and expand your reach. Are there technology options? Are there organizations that already do something that you want to integrate?

I like this idea of keeping it simple—both the theory of change and the program. But I know that it’s difficult to do when you’re addressing complex issues and working in equally complex organizations. Just one more reason to tackle these big questions from the very beginning and do everything you can to set your program up for success.

Wednesday, March 14, 2012

Announcing More to Most!


We are delighted to announce the release of MDC’s latest publication, More to Most: Scaling Up Effective Community College Practices. More to Most is a guide for community colleges that are expanding small or pilot programs into larger, sustainable efforts that serve most—if not all—of the students who can benefit from them. You can watch an introductory slide show and download a copy from www.more2most.org.

As we discuss often on this blog, community colleges across the country have developed unique programs that help students succeed and put them on a path to a better life. The Developmental Education Initiative colleges and states have been vital partners as we’ve learned together about the resources and practices that are required to scale up effective developmental education programming—and to help students accelerate through or bypass remediation altogether. Much of that learning is reflected in the pages of More to Most, including examples of promising practices from many of the DEI colleges and states. We’ve also included material that’s been featured on Accelerating Achievement, including the SCALERS model developed by the Center for the Advancement of Social Enterprise at Duke University's Fuqua School of Business and case examples from Kingsborough Community College, the Academy for College Excellence, and Chaffey College.

Even in focused efforts like DEI and Achieving the Dream, deciding which programs to expand and how to do it is a complex process that can waste valuable time and resources if not conducted thoughtfully. The comprehensive—but not prescriptive!—process outlined in More to Most helps you assess which programs are ripe for expansion, and gives direction on how to design a scale-up plan—and it’s all designed to dovetail with planning structures already in place.

To test out the process, we reached beyond the DEI network to Jackson Community College, an Achieving the Dream college in Jackson, MI. JCC recently used More to Most in its decision to expand three student success initiatives. With the strategies outlined in the guidebook, faculty, staff, and administrators at Jackson demonstrated the programs’ effectiveness and connected that success to the college’s strategic plan. They examined the expansion’s budget implications, determined how it would be evaluated, and created a work plan. Finally, they examined the policy implications of the expansion. The undertaking involved deep conversations among faculty, student services, and business office staff, and the resulting plan is a great example of how the process can be customized to dovetail with any college’s specific needs or culture. (Click here to learn more about Jackson’s experience from a March 1 spotlight session at the recent ATD D.R.E.A.M. event.)

If you want to see how the process might work on your campus, head on over to more2most.org and download your copy. Let us know what you think!

Wednesday, February 15, 2012

This can save you money.

Today’s post is our fourth installment of “SCALERS: Round 2.” Originally created by Paul Bloom at the Duke University Fuqua School of Business’s Center for the Advancement of Social Entrepreneurship, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: staffing, communicating, alliance-building, lobbying, earnings generation, replicating impact, and stimulating market forces. (You can read an introduction to each driver in our first SCALERS series.)

Now, we’re asking DEI colleges about how particular SCALERS drivers have contributed to their scaling efforts. So far, we’ve covered staffing, communicating, and alliance-building. Below, Ginger Miller of Guilford Technical Community College shares what GTCC has learned about lobbying, or demonstrating impact, as we like to call it.



With a headcount enrollment of about 15,100 students, Guilford Technical Community College (GTCC) is the third largest community college in North Carolina. Its developmental education program serves 4,780 students on three campuses. As with any developmental education program, the primary goal at GTCC is for students to take the fewest developmental education courses necessary and complete them as quickly as possible. Our focus here is to describe how COMPASS testing supports this goal. Reviewing for and re-taking the COMPASS placement test can save students time and money; that is the message we emphasize to incoming students. For example, if a student places out of two developmental classes in English, that translates into tuition savings for the credit hours as well as two semesters—a full academic year—of their time.

As part of the application process, students complete the COMPASS placement test. One of the biggest obstacles to proper placement in developmental education classes is student misconception about the test itself. Students may mistake COMPASS for an entrance test rather than a placement test; since they know they have already been accepted by the community college, they may not try their best to score well. To address this, we emphasize during registration the importance of taking the review workshop and re-testing, depending on their score. Students must complete a review workshop before they are permitted to re-test. This workshop, available online or face-to-face, reviews the question format and the content for the math, English, and reading sections of the test.

As a result of completing a review workshop, followed by a re-test, 1,288 students tested out of one or more developmental classes from fall 2010 through fall 2011. This represents a total estimated tuition savings of about $370,600 over those three semesters. The largest percentages of students testing out of developmental coursework appear in English and reading. Across the three semesters from fall 2010 through fall 2011, an average of about 61 percent placed out of at least one developmental course in English, and about 59 percent placed out of reading courses. In math, the results are lower: about 33, 39, and 29 percent for fall 2010, spring 2011, and fall 2011, respectively.
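The savings estimate above is simple multiplication, and it is easy to adapt to another college's situation. Here is a minimal sketch; the per-credit rate, courses avoided, and credit hours below are hypothetical placeholders, not GTCC's actual figures:

```python
# Back-of-the-envelope estimate of tuition saved when students test out of
# developmental courses. All inputs here are hypothetical examples.

def estimated_savings(students_placing_out, avg_courses_avoided,
                      credits_per_course, cost_per_credit):
    """Total tuition avoided = students x courses avoided x credits x rate."""
    return students_placing_out * avg_courses_avoided * credits_per_course * cost_per_credit

# Hypothetical figures: 1,288 students each avoid about 1.5 three-credit
# courses at an assumed $50 per credit hour.
total = estimated_savings(1288, 1.5, 3, 50)
print(f"Estimated tuition savings: ${total:,.0f}")
```

Plugging in a college's real per-credit tuition and average course load makes this the same kind of "this can save you money" message GTCC shares with incoming students.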

At GTCC, the number of students using these review workshops has increased from 981 in fall 2010 to 1,241 in fall 2011. The greatest growth has been in the online reviews as compared to face-to-face workshops: in fall 2011, 73 students attended the face-to-face reviews, compared to 1,168 who worked online. We also continue to reach out to local high schools, explaining the importance of the placement test; we have arranged for graduating seniors to take the test, complete the review workshop, and re-test.

Friday, January 20, 2012

Many Happy Returns

Today’s blog birthday post is brought to you by the letter S. We look back at Accelerating Achievement posts on two of our favorite topics: scaling and state policy.

In Scaling Up, we’ve been harvesting the latest thinking on scaling from the social innovation field, calling attention to tools and resources that can help colleges and states increase the impact of developmental education advancements. We’ve also highlighted stories of colleges and states that have found ways to expand the reach of promising practices. The Joy of Scaling launched a seven-week series on seven organizational capacities that support successful scaling of a social enterprise, represented by the acronym SCALERS: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. MDC adapted this model from Duke University’s Fuqua School of Business for use in community colleges. An ongoing series is delving deeper into the individual SCALERS, seeing how they apply to supplemental instruction, self-paced courses, and faculty engagement.

Supportive state policy is an essential component in any institutional plan to expand innovation to more students. This year’s Statewise posts have followed DEI state policy teams, coordinated by Jobs for the Future, as they work within state community college systems and legislatures to change outdated rules, funding, and incentive structures that stand in the way of innovation. Michael Collins, associate vice president of postsecondary state policy at JFF, laid out the Developmental Education Initiative State Policy strategy in a three-part series. The first segment showed how collecting the right data can inform state policy to accelerate dev ed innovation across a system. Part two detailed how states are investing resources in that innovation. The final installment, our most read Statewise post, made the case for a continuous improvement cycle focused on strengthening policy supports.

Finding ways to bring what works to more students will remain a vital concern for higher education—and for Accelerating Achievement—as colleges and states continue to face increasing enrollments, diminishing resources, and intensifying pressure to move students to credentials more quickly and efficiently.

Wednesday, December 14, 2011

Guest Post: Allied Forces

Today’s post is our third installment of “SCALERS: Round 2.” Originally created by Paul Bloom at the Duke University Fuqua School of Business’s Center for the Advancement of Social Entrepreneurship, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: staffing, communicating, alliance-building, lobbying, earnings generation, replicating impact, and stimulating market forces. (You can read an introduction to each driver in our first SCALERS series.)

Now, we’re asking DEI colleges about how particular SCALERS drivers have contributed to their scaling efforts. So far, we’ve covered staffing and communicating. Below, Nick Bekas of Valencia College in Orlando, Florida, shares what he’s learned about successful alliance-building over the years.


Ten years ago, we had an initiative at Valencia focused on developmental math. It failed. However, it did not fail because of its ineffectiveness in improving student learning. It failed because of people, and no one person could be blamed. It was really an organizational failure. I watched from the sidelines as a promising initiative that showed real learning gains for students unraveled because the key players never developed working relationships. (Note: Watching from the sidelines makes you part of the failure. Had I stepped on the field and tripped someone, I might have made a difference.) One would think that professors who are committed to learning could put aside philosophical differences or perceived slights for the sake of improving student learning. This one failure taught me a very important lesson about the importance of building alliances. All relationships are personal even if they are professional. In my checkered past at Valencia directing a variety of initiatives both successful and not so successful, here is what I have learned about building alliances.

1. Find the acid drippers.
We use this term for people at our institution who will criticize everything even if they proposed it. By seeking their advice and participation, I head off issues down the road. This is not to say that we will agree on the direction and scope of the project or that they will actively participate, but it does make them part of the conversation and validates their voice. I don’t try and convert them; I listen to them, hear what they have to say, and tell them how they can help. If they choose not to, it’s on them, but at least I tried. I have found that I get less interference and more cooperation even though it is mostly passive. And maybe on the next project, they will participate.

2. Engage people on the ideas, not just the process or the product.
If I want an initiative to get off the ground, I don’t talk just about the initiative. I focus on the ideas informing the initiative and get people to have a conversation about those ideas. When you give someone a finished product and ask them to comment on it, you have already cut them out of it. Educational initiatives are not new products; you are not showing them the latest version of a Snickers bar and asking for comment. You are asking their opinion on something they are experts on, so they want to be part of the process, not just the product. Ideas excite people.

3. Build on natural alliances.
You have to know your institution and your people. Find people of like mind and purpose and put them at the core of your work. I am not advocating for a “clone” army, but for a group of people who are philosophically aligned with the goals of your project. This group should be the “true believers” who help you shape the scope and direction of your work. You can then use them as “subversive” agents to help build support for your project. The director of a project or the lead on a project is often at a disadvantage when it comes to getting buy-in simply because he or she is the face of the project and not a person. I am not “Nick” but DEI. However, someone else talking about DEI is perceived differently and may get more of a response. A project director is perceived as having an agenda, which is true, but sometimes this perception precludes engagement with different groups, especially if it is not immediately clear how the goals of the initiative align with their everyday work.

4. Tap the “newbies” and “oldies.”
New faculty are always willing to participate and bring fresh ideas to the game. They see things through a different lens because they have not been part of the organization long enough to have been assimilated into its culture. They are also willing to be “exploited” for a small stipend and food because they are excited to be part of something, and if they are adjuncts, they need the small stipend and food. Also, veteran faculty are sometimes not involved because they are not asked to be. You can’t assume that they just don’t want to participate because they don’t respond to an all call. You have to give a personal invitation and tell them why you need their experience and expertise.

5. Be persistent and consistent.

I learned this from my kids. It applies to other forms of life as well. People crave consistency and reward persistence. I don’t stalk, but I do suddenly show up at an office to say, “Hi.”  You have to work at building relationships, and you have to be consistent in your message. This is the only way to change behaviors and to get wider participation. I have failed at this with my kids up to this point, but I have been pretty good with my colleagues.

Nick Bekas is DEI project director and professor of English at Valencia College.

Wednesday, November 16, 2011

Guest Post: Getting the Message Right

It’s time for the second installment of “SCALERS: Round 2.” Originally created by Paul Bloom at the Duke University Fuqua School of Business Center for the Advancement of Social Entrepreneurship, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. (You can read an introduction to each driver in our first SCALERS series.) 

Now, we’re asking DEI colleges about how particular SCALERS drivers have contributed to their scaling efforts. Last month, we looked at staffing. Today, Becky Samberg of Housatonic Community College in Bridgeport, CT, shows what HCC is learning about how changing the message can change behavior.

Remember playing the childhood game of telephone? As the message was whispered from one child’s ear to another, one thing was certain: the game always ended with peals of laughter, because the message whispered by the first child was never the same as the message repeated by the last in the chain. From this childhood game, we learn a very valuable lesson: communication needs to be deliberate and precise.

In our DEI work, we strive to be deliberate and precise in communicating with our HCC community about the initiatives on which we are working. Over the last year, however, lower-than-expected success rates in self-paced courses led us to change how we communicate to our students the nature of the self-paced program and the unique expectations of students enrolled in these courses.

Our self-paced program began in 2007 with developmental math courses. Originally, students could enroll in self-paced math courses—what we then called “Open Entry/Open Exit”—at any time in the semester and exit whenever they finished the course. Students also could start the next course in the same semester. We learned, however, that “Open Entry/Open Exit” was a misnomer. Students were not accelerating as quickly or as successfully as we anticipated, and students’ financial aid obligations and status as full- or part-time students impeded their ability to move from one course to the next during a semester. As we considered these challenges, we concluded that the individualized instructional format and our expectations of students in these courses were not made explicit by the Open Entry/Open Exit title, so we made the following changes:

  • We adopted the name “Self-Paced,” emphasizing the focus on individualized mathematics and English instruction. This new course name more explicitly communicates that students enrolling in these courses will have individualized instruction at a self-determined pace.  
  • We defined our expectations of students and developed an orientation so students hear a consistent message from their instructors and the Self-Paced studies lab coordinator, who conducts the in-class orientation and oversees students’ visits to the lab. 
  • We made lab visits mandatory, deliberately delivering the message that self-paced does not mean no pace, and regular engagement with the course material is essential to student success. 
  • We re-designed the courses, communicating to students their obligation to make consistent progress throughout the semester and establishing the expectation that students work toward the goal of successfully completing the course within or in less than a traditional semester.  
  • We created a schedule for completing the self-paced courses in a single semester, shared it with students, and embedded it in the course software. Students are told the pace at which they need to work and the benchmarks they need to reach to successfully complete the course in one semester.
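The pacing benchmarks in that last change amount to spreading a course's modules evenly across the semester. A rough sketch of the idea, where the start date, semester length, and module names are hypothetical examples rather than Housatonic's actual schedule:

```python
# Sketch: generate evenly spaced completion benchmarks for a self-paced course.
# Start date, week count, and module list are hypothetical placeholders.
from datetime import date, timedelta

def pacing_schedule(start, weeks, modules):
    """Assign each module a deadline, spaced evenly across the semester."""
    step = timedelta(weeks=weeks) / len(modules)
    return [(module, start + step * (i + 1)) for i, module in enumerate(modules)]

schedule = pacing_schedule(date(2012, 1, 23), 15,
                           ["Whole Numbers", "Fractions", "Decimals",
                            "Ratios", "Percents"])
for module, deadline in schedule:
    print(f"{module}: complete by {deadline}")
```

Publishing deadlines like these, whether on paper or inside the course software, is one concrete way to signal that self-paced does not mean no pace.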
Moving forward, we hope that the changes we have made will increase student success in our self-paced courses. In changing the course title to better communicate the nature of the course and in deliberately and precisely communicating class policies and our expectations of students, we hope to avoid the inevitable outcome of a game of telephone.

Becky Samberg is the chair of developmental studies and DEI director at Housatonic Community College.

Wednesday, October 12, 2011

Guest Post: Supplemental Instruction Leaders Don't Do Optional Either

In the first installment of our in-depth look at each of the seven SCALERS drivers, Ruth Silon of Cuyahoga Community College (Tri-C) delves into the first driver, staffing, as it applies to a supplemental instruction (SI) program. The SCALERS staffing driver calls for effective use of resources to meet personnel needs, from administration to faculty to student services to student employees. In this candid post, Ruth describes the ups and downs of Tri-C’s approach to training student SI leaders in a one-credit special topics course.

 “Students don’t do optional.”

Where have we heard this before? It certainly applies to many developmental students’ use of the tutoring labs, optional orientations, and attendance at Supplemental Instruction (SI) sessions.

But what about the SI leaders themselves? Although we at Cuyahoga Community College (Tri-C) have a well-thought-out hiring and training process, I have found that if we do not have a very concrete way to manage and observe our SI leaders, they, too, will not do optional.

In June 2010, I attended the International Conference on Supplemental Instruction and listened to Joyce Zaritsky from LaGuardia Community College discuss her one-credit class for SI leaders. This approach seemed to make sense. The students could not be leaders unless they attended a weekly SI course. Here was the place where leaders could share, debrief, and experience ongoing training.

During fall 2010, faculty and SI staff met to design the course and it was first implemented in spring 2011 as a special topics course – one session at each campus. The course would meet once a week for one credit.
  • First Problem: The students had to pay for the course, which at that time cost $84.00. Solution: Pay SI leaders for an extra hour and hope that is enough to offset the cost of the course.
  • Second Problem: One of our leaders had already graduated. Solution: She still had to attend the class in order to be an SI leader.
  • Third Problem: There was not a common time available for all the leaders to take the course. No good solution here: Not all the leaders attended, but about 80 percent did participate.

Meeting every week was a great experience, both for me, the teacher, and the leaders. I got to know almost everything that was going on in the SI-supported classes and the related sessions. I learned firsthand about the struggles leaders were having with their students and also with the classroom teachers. The class was more like a support group for the leaders than an academic class. This learning is really important as we are asking students to perform tasks that may be well out of their comfort zone. If we talk to each other about our students and our pedagogy, shouldn’t SI leaders be afforded that same experience?

This was a course, so the students had to complete certain tasks to get a grade. I asked them to turn in weekly journals, telling me what happened in their sessions. (This could be the basis of our conversations each week.) They also had to visit each other’s classes and write up an observation. At the end of the semester, they wrote an essay to a new SI leader, explaining the high and low points of the job and offering the new leader advice. The end result was twofold: a deeper understanding of what goes on in SI for both me and the leaders, and a very supportive environment to help the leaders do a better job.

Even though the course went well, we decided not to offer it this semester. Why not? I did not want to make SI leaders pay for the course again. I did not want to ask for more money for the SI leaders. And I thought “Last semester’s meetings went well. Of course this semester’s leaders will come to a weekly session, even if it’s not for credit!”

I was wrong. Sadly, I forgot that students, SI leaders included, may not do optional! Just like in any other class, some student leaders come every week, some attend occasionally, and others never show up at all. I am sorry to have had to learn the same lesson again: accountability is everything. But I have learned the lesson, so next semester we’ll be offering the course for our SI leaders again.

Ruth Silon is an associate professor of English and DEI project director at Cuyahoga Community College.

Thursday, October 6, 2011

S-C-A-L-E-R-S: Round 2

One of our favorite topics here on Accelerating Achievement is scaling up. Regular readers will remember our multi-week SCALERS series. Originally created by Paul Bloom, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. In the Accelerating Achievement SCALERS blog series, we translated the model for application at community colleges.

Next week, we’ll launch a new SCALERS series. Each month, we’ll have a guest post from a DEI college about how a particular SCALERS driver has contributed to its scaling efforts. A little less conversation, a little more action! Today, we’re applying all seven SCALERS to a program from Chaffey College that has successfully scaled up. Thanks to Ricardo Diaz at Chaffey for sharing his story with us!


The goal of Opening Doors to Excellence (ODE) at Chaffey College in Rancho Cucamonga, CA, is to move students off academic probation and back into good standing with the college. Participants develop an educational plan with an advisor, take a student success course, and complete a series of directed activities in the college’s student success center. Chaffey defined scale as an institutionalized program that, when fully implemented, would serve all students on academic probation college-wide; by this definition, the program is, in fact, scaled up. According to Ricardo Diaz, ODE coordinator, the successful expansion of the program has required attention to all seven SCALERS drivers:

Staffing. Since there are 300 to 400 students in the program each semester, Diaz is able to meet with each student only once prior to enrolling in the student success course. To address the need for continuous student follow-up, ODE is staffed by counselor apprentices. These counselors are paid graduate students from local universities who use the experience to complete required clinical hours for their program of study. Chaffey’s Human Resources department provides structure and support for hiring the apprentices; program leadership and coordination functions have been integrated into existing staff workloads.

Communicating. To expand ODE, Chaffey embarked on a strategic planning process that drew together key parties from across the college. The plan they constructed involved integrating services into existing programs, rather than creating a program with a stand-alone structure. During program development, the core planning committee held regular discussions with governance departments.

Alliance-Building. As mentioned above, ODE was developed with input from college-wide representatives. The program had the support of the president and board of trustees from the beginning. A crucial alliance for ODE was the purposeful collaboration between academic affairs and student services.

Lobbying/Demonstrating Impact. Chaffey’s Institutional Research department collaborated with MDRC to establish outcomes and evaluate ODE as part of MDRC’s Opening Doors project. When MDRC concluded their study, Chaffey’s institutional research continued. The strength of the evaluation allowed the program to obtain additional resources, recognition, and support for expansion.

Earnings Generation/Resource Generation. The initial MDRC funding for the program was matched by college funding commitments. With future expansion in mind, Chaffey integrated the core expenditures for the program into the college’s general fund. The MDRC grant was used as start-up money, funding program development, paraprofessional staff, books, supplies, travel, and training.

Replicating Impact. As the program grew, the core planning committee developed a continuous improvement process. Student learning outcomes and focus group feedback were used to refine program design. The committee encouraged regular sharing of practices among instructors along with professional development activities.

Stimulating Market Forces/Sustaining Engagement. Because ODE was integrated into the college’s core operational components from the beginning, it quickly became a regular function of how the college operates. Students embraced the program because enrollment incentives were put into place. The MDRC study allowed for easy dissemination of the model to other colleges. This gained national recognition for Chaffey, which ensured continued buy-in from leadership and the campus community.

What’s next? Chaffey has created a solution to its initial problem: ODE moves students from academic probation back into good standing. However, an MDRC study looking at ODE’s impact on moving students to completion revealed that the intervention does not result in increased rates of graduation or certificate attainment. While completion was not the original intent of this intervention, it is nonetheless a critical objective that presents a new challenge in program development and scaling. Now that Chaffey has a broad strategy that reaches the entire target population, it’s time to look at ways to scale the depth of the program’s impact, intensifying the intervention to amplify the impact or reach a new aim. The college intends to reconvene the core planning committee to explore strategies that can improve the likelihood that students who overcome their probationary standing also complete a degree and/or certificate.

Thursday, July 7, 2011

Guest Post: Choose Your Own Adventure

Today’s post comes from Sharon Miller, professor of transitional English at Lone Star College-CyFair in Cypress, Texas, outside of Houston. The Lone Star Community College District joined Achieving the Dream in 2006. Sharon’s post describes a classroom-focused intervention that has improved student success in CyFair’s transitional (or developmental) English department.

When an Achieving the Dream pilot at Lone Star College’s CyFair campus resulted in a 7 percent increase in overall student success, it got our attention. Applying what we learned, our success rates have continued to improve to above the 90th percentile of national benchmarks. The project is affordable, doable, scalable, and sustainable in any discipline, anywhere. The secret: target the most important contact a student has with an institution: the classroom.

A Non-Boutique Project
Two years ago, our college system’s Achieving the Dream grant funding was coming to an end just as the Transitional English department was charged with devising a pilot to improve student success. The state legislature was slashing budgets just as our college system was experiencing an “enrollment tsunami” that hit our campus particularly hard. With a 29/71 full-time to adjunct professor ratio, there was no one available to take on a new project and no space to host a boutique student success center. We needed an ATD intervention that would work for us.
    
The Classroom-Embedded Intervention Pilot
Our Classroom-Embedded Intervention Project was “choose your own adventure” style. The concept was deceptively simple:
  • Ask the faculty to share what strategies were working to improve student success.
  • List the best and distribute them.
  • Track out-of-class interventions and withdrawals on student profile cards. 
  • Look at the results measured by end-of-course grades and performance on System Common Final Exams. 
Sharing success strategies for our ATD pilot was intense, energizing, and fun. Faculty circulated ideas via e-mail, and adjunct instructors joined the conversation. Soon we had an amazing list of strategies which we distributed at our spring kick-off meeting and posted in our online repository, The Sandbox. We all started the semester focused on good teaching and intensively intervening according to each instructor’s best professional judgment. Interventions were recorded on a student profile card that also included student contact information.

Analyze Results; Build On What Works
When student success rose 7 percent, we knew that we were on the right track. Looking at the student profile cards helped us to see patterns in interventions and withdrawals, and we have used this information to devise more carefully targeted interventions. We continue to share resources on our online learning repository and have added one Saturday work day per month during the semester to focus on pedagogy. Week 11 of each semester is now designated as “Advising Week.” Instructors use a one-page handout to tell students about the next course in their sequence and course delivery options, and to answer questions.

Ongoing professional development is imperative and challenging. Many of us are pursuing the International Alliance for Learning Certification in Accelerated Learning, and we are sharing the program’s research-based, brain-friendly methods that improve student success and retention.

CyFair has expanded the Classroom-Embedded Intervention throughout the Transitional English department. The Transitional Math and ESOL departments are considering implementing the approach in the fall semester, and plans are under way for a presentation at an all-college faculty forum on student success. Faculty will have an opportunity to learn about what the Transitional English department has accomplished and to consider how the approach can be expanded college-wide in appropriate, discipline-specific ways.

Interested?
You are invited to download our list of interventions, advising handout, and contacts for accelerated learning, which are posted on my faculty webpage. Together, we can achieve the dream one classroom at a time!

Sharon Miller is a professor of transitional English at Lone Star-CyFair. The Classroom-Embedded Intervention has been nominated for a Texas Higher Education Coordinating Board STAR Award.

Thursday, June 16, 2011

Guest Post: What’s Next for SCALERS?

Today’s post comes from Paul Bloom, Faculty Director and Adjunct Professor of Social Entrepreneurship and Marketing with the Center for the Advancement of Social Entrepreneurship (CASE) at Duke University’s Fuqua School of Business. Paul is also the creator of the SCALERS model. (Thanks, Paul!) Below, he shares the genesis of the model and his ideas for the next iteration of the work.

The idea of developing a model like SCALERS came to me a few years ago while reading the best-selling book, Made to Stick, written by the brothers Chip and Dan Heath. Among other things, they point out that acronyms can help ideas catch on and be remembered. I thought of my home discipline of marketing, a subject I taught for years and years, and how it has used the device of the “4P’s” (Product, Price, Place, and Promotion) to help students remember the essence of marketing. It occurred to me that the emerging field of social entrepreneurship needed a similar hook to help students and practitioners understand what it was all about.

As a newcomer to social entrepreneurship, I was struck by how obsessed everyone was with the concept of “scaling.” Indeed, I have come to believe that more than anything else, what distinguishes social entrepreneurs from more conventional leaders of social-purpose organizations is the former group’s obsession with scaling social impact. These folks want to change the world, not just run a sustainable and effective do-gooder organization. So whatever acronym or words I generated to help the field “stick” had to relate to the concept of scaling.

Fortunately, the letters of the acronym SCALERS matched the organizational capabilities that my own research and the research of others had identified as the key drivers of successful scaling (i.e., Staffing, Communicating, Alliance-Building, Lobbying, Earnings-Generation, Replicating, and Stimulating Market Forces). I had to make some compromises – the word “Staffing” does not completely cover the range of human resource management capabilities needed for scaling, and “Lobbying” is just a portion of “Advocacy” (but the letter “A” was already taken). Nevertheless, I am very pleased by all the positive feedback and attention my original writing on the model has received (see Bloom and Chatterji, California Management Review, 2009 and Bloom and Smith, Journal of Social Entrepreneurship, 2010).

Still, my thinking about scaling is evolving, and there are aspects of the SCALERS model that I have modified since the earlier articles were published. I am currently putting the finishing touches on a short book that will introduce these modifications and explain the model to a wider audience. More than anything, the modified model stresses that a capability or driver that matters greatly in one situation may matter little in another. The importance of any particular SCALER will depend on the resources the organization possesses as it begins scaling and on the theory of change underlying its initiatives. For example, if the organization is poorly endowed with certain types of human or financial resources, then building the capabilities that replenish those resources (i.e., Staffing, Earnings-Generation) becomes all the more critical to scaling success. And if the organization theorizes that its desired social impacts will come from informing more people about a problem or from the introduction of new regulations, then building capabilities in Communicating or Lobbying becomes paramount. Conducting an honest assessment of your organization’s unique situation is necessary for getting the most out of the SCALERS model.

Paul Bloom is Faculty Director and Adjunct Professor of Social Entrepreneurship and Marketing with the Center for the Advancement of Social Entrepreneurship (CASE) at Duke University’s Fuqua School of Business.