Showing posts with label SCALERS. Show all posts

Thursday, August 23, 2012

R to the O to the I

In March, we released More to Most: Scaling Up Effective Community College Practices, a guidebook that lays out a process for identifying promising practices and developing an expansion plan so that good programs reach as many students as possible. Central to the process is the SCALERS framework, developed at Duke University and discussed extensively on this blog. There are other steps that form the foundation for SCALERS application. We’re going to tell you about some of them in our next few posts. To download a copy of More to Most, visit the website: www.more2most.org.

The first step in the process is determining program value; you can read a rundown of that step here. One part of that value case—and a rather popular one in these austere days of budget cuts—is demonstrating the return on investment (ROI): the sometimes-hard-to-quantify case that program spending generates a financial return (in increased FTE or other revenues) that offsets operating costs. Standardizing this process, where possible, can help you compare different programs and make decisions about which programs should be discontinued and which should be expanded. This kind of information, of course, needs to be combined with other qualitative data about program effects, and with considerations of faculty and student engagement and institutional priorities.

Program ROI
There are a number of ways to analyze the connection between a program’s results and its costs. In 2009, Jobs for the Future and the Delta Cost Project released Calculating Cost-Return on Investments in Student Success. The report determines cost-return on investment by assessing student retention for program and non-program students, the resources required for program operation, and the revenue gained by additional retention, using the following data and calculations:
  1. Additional number of students enrolled into the next year because of the program: Calculated using number of students served, one-year retention rates for program participants, number of participating students retained, and one-year retention rates for non-program students.
  2. Direct costs of program operation: Calculated based on expenditures for personnel, supplies and equipment, stipends for students, facilities, etc.
  3. Spending and revenue data: Calculated from average expenditures per student, and additional tuition, state, and federal revenue gained from increased retention.
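The three calculations above can be sketched in a few lines of Python. This is only an illustrative back-of-the-envelope version of the report's approach; the function name and every figure below are hypothetical placeholders, not data from the report or from any college:

```python
# Illustrative sketch of a cost-return calculation in the spirit of the
# JFF/Delta Cost Project report. All numbers are hypothetical placeholders.

def cost_return_on_investment(
    students_served,          # number of students the program serves
    program_retention_rate,   # one-year retention rate for participants
    baseline_retention_rate,  # one-year retention rate for non-participants
    program_cost,             # direct costs: personnel, supplies, stipends, facilities
    revenue_per_student,      # tuition + state + federal revenue per retained student
):
    # 1. Additional students retained into the next year because of the program
    extra_retained = students_served * (program_retention_rate - baseline_retention_rate)
    # 3. Revenue gained from that additional retention
    added_revenue = extra_retained * revenue_per_student
    # Net return after subtracting direct program costs (item 2)
    net_return = added_revenue - program_cost
    return extra_retained, added_revenue, net_return

extra, revenue, net = cost_return_on_investment(
    students_served=200,
    program_retention_rate=0.65,
    baseline_retention_rate=0.50,
    program_cost=75_000,
    revenue_per_student=4_000,
)
print(f"Additional students retained: {extra:.0f}")
print(f"Revenue from added retention: ${revenue:,.0f}")
print(f"Net return after program costs: ${net:,.0f}")
```

With these made-up inputs, a 15-point retention gain across 200 students yields 30 additional retained students, and the revenue they generate more than covers the program's direct costs.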

These simple calculations could be the beginning of a formula that is unique to your institution, one that incorporates the costs, revenue, and priorities that are most relevant to your program goals.

Local and State ROI
You can also think about ROI beyond the bounds of a program. The various student success programs and policies at a college add up to increasing numbers of students completing credentials. (At least, that’s where we’re headed, right?) And increased completion outcomes can lead to positive returns for communities and states, even the nation as a whole. Being able to make such a case at your college—and across a system—could garner support from policy makers at the state and federal level. The Center for Law and Social Policy (CLASP) partnered with the National Center for Higher Education Management Systems (NCHEMS) to create a Return on Investment Dashboard. The dashboard combines data from the Census Bureau, National Center for Education Statistics, and Department of Education to project short- and long-term effects of maintaining the educational attainment status quo vs. increasing the number of credentialed adults in a particular state.

In addition to credential attainment, state summaries include projections for personal economic gain, as well as returns to the state in the form of state income, property, and sales taxes. The dashboard also has figures for Medicaid savings and correctional system expenditures. The dashboard allows you to manipulate high school completion, college going rates, credential attainment, and enrollment patterns to see how increases or decreases might affect individual and state returns on investment.

ROI and External Funding

The growth of social innovation financing, or social impact bonds, is one way that demonstrating return on investment is tied directly to external funding. Organizations cover initial program costs by borrowing money from foundations and other investors. If the organization meets agreed-upon outcomes, it receives additional funds from the state, and the original investors receive a portion of those funds. While this form of financing is primarily occurring in the nonprofit and social enterprise sectors, it could have implications for the education sector as well. You can read a great summary of social impact bonds on the Nonprofit Finance Fund website and check out how some social programs in Massachusetts are implementing this type of financing.

Whether you’re trying to quantify program costs, making a state-wide case for investing in a particular innovation, or looking for new ways to fund your efforts, understanding and articulating return on investment is an important piece of the scaling-up puzzle.

Wednesday, March 21, 2012

Redesigning & Restructuring Math Emporium Facilities: Challenging, But Not Impossible

Today, we’re returning to our “SCALERS: Round 2” series. Originally created by Paul Bloom, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. (You can read an introduction to each driver in our first SCALERS series.) Now, we’re asking DEI colleges about how particular SCALERS drivers have contributed to their scaling efforts. We’ve looked at staffing, communicating, alliance-building, and lobbying. In the post below, Lucy Michal, mathematics professor at El Paso Community College, discusses earnings and resource generation as she shows how EPCC had to tackle the redesign of physical spaces to scale up the college’s math emporiums.

When El Paso Community College’s Developmental Education (DE) Math Standing Committee faculty restructured the DE math course sequence, redesigning curriculum presented familiar challenges, but redesigning facilities presented unfamiliar ones. To help with this, college administrators sent committee members and deans to visit colleges that had undergone redesign. After several site visits, committee members did not see any one design that would work for EPCC’s population of students and its multi-campus district. They had a lot of planning and resource building to do before scaling up DE math emporium course offerings.

The Committee’s initial proposal had three options:  joining two classrooms to create mini-emporium areas, identifying a store or warehouse to transform into a math emporium, or using an area in the Administrative Services Center to construct a math emporium for the District.

While college administrators reviewed these options, El Paso’s Developmental Education Initiative went into its first year with funding for identified technology needs—computers and printers—but redesigning facilities presented a greater challenge. One of the campus deans volunteered her campus to be the first to tear down a wall and construct a math emporium. Reconstruction occurred during winter break and in Spring 2010; the first mini-emporium offered DE math emporium courses in an area with 48 computers and one printer. The second emporium facility was made possible when funding for a new science wing included expansion of the math lab, an emporium with 34 computers, a tutoring area, and a seminar room. The third campus identified two classrooms and underwent the same construction used at the first campus to create a mini-emporium with 48 computers.

The fourth campus in downtown El Paso presented a challenge because of its older buildings. At that time, the college purchased a building to expand the downtown student service offices; however, it was too small. The college president identified a different building for the downtown campus math emporium. The building, an old bakery across the street from the downtown campus, was transformed into a large area for the math emporium, a math lab, a tutoring area, three faculty offices, a math faculty meeting room, and a computer classroom. The old bakery is now an innovative Learning Emporium.

And finally, for its largest campus, the college purchased a portable building, which is currently being constructed to house two large emporium rooms with 40 computers in each room, one computer classroom, and an area for tutoring and student study tables. Construction will be completed in time for Fall 2012 DE math courses.

The DE math committee learned more than just how to restructure their courses; they also learned about:
  • expanding and reconstructing facilities
  • furnishing learning areas
  • wiring for new technological classrooms
  • rescheduling class sections
  • designing learning spaces for students. 

Among the most important lessons learned: always have a plan B in case construction deadlines are not met. With the emporium now in place on the final campus, the college will be offering over 80 percent of its DE math courses in campus-based math mini-emporium areas.

Wednesday, March 14, 2012

Announcing More to Most!


We are delighted to announce the release of MDC’s latest publication, More to Most: Scaling Effective Community College Practices. More to Most is a guide for community colleges that are expanding small or pilot programs into larger, sustainable efforts that serve most—if not all—of the students who can benefit from them. You can watch an introductory slide show and download a copy from www.more2most.org.

As we discuss often on this blog, community colleges across the country have developed unique programs that help students succeed and put them on a path to a better life. The Developmental Education Initiative colleges and states have been vital partners as we’ve learned together about the resources and practices that are required to scale-up effective developmental education programming—and to help students accelerate through or bypass remediation altogether. Much of that learning is reflected in the pages of More to Most, including examples of promising practices from many of the DEI colleges and states. We’ve also included material that’s been featured on Accelerating Achievement, including the SCALERS model developed by the Center for the Advancement of Social Enterprise at Duke University's Fuqua School of Business and case examples from Kingsborough Community College, the Academy for College Excellence, and Chaffey College.

Even in focused efforts like DEI and Achieving the Dream, deciding which programs to expand and how to do it is a complex process that can waste valuable time and resources if not conducted thoughtfully. The comprehensive—but not prescriptive!—process outlined in More to Most helps you assess which programs are ripe for expansion, and gives direction on how to design a scale-up plan—and it’s all designed to dovetail with planning structures already in place.

To test out the process, we reached beyond the DEI network to Jackson Community College, an Achieving the Dream college in Jackson, MI. JCC recently used More to Most in its decision to expand three student success initiatives. With the strategies outlined in the guidebook, faculty, staff, and administrators at Jackson demonstrated the program’s effectiveness and connected that success to the college’s strategic plan. They examined the expansion’s budget implications and how it would be evaluated, and created a work plan. Finally, they examined the policy implications of the expansion. The undertaking involved deep conversations among faculty, student services, and business office staff, and the resulting plan is a great example of how the process can be customized to dovetail with any college’s specific needs or culture. (Click here to learn more about Jackson’s experience from a March 1 spotlight session at the recent ATD D.R.E.A.M. event.)

If you want to see how the process might work on your campus, head on over to more2most.org and download your copy. Let us know what you think!

Wednesday, February 15, 2012

This can save you money.

Today’s post is our fourth installment of “SCALERS: Round 2.” Originally created by Paul Bloom at the Duke University Fuqua School of Business’s Center for the Advancement of Social Entrepreneurship, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: staffing, communicating, alliance-building, lobbying, earnings generation, replicating impact, and stimulating market forces. (You can read an introduction to each driver in our first SCALERS series.)

Now, we’re asking DEI colleges about how particular SCALERS drivers have contributed to their scaling efforts. So far, we’ve covered staffing, communicating, and alliance-building. Below, Ginger Miller of Guilford Technical Community College shares what GTCC has learned about lobbying—or demonstrating impact, as we like to call it.



With a headcount enrollment of about 15,100 students, Guilford Technical Community College (GTCC) is the third largest community college in North Carolina. Its developmental education program serves 4,780 students on three campuses. As with any developmental education program, the primary goal at GTCC is for students to take the fewest developmental education courses necessary and complete them as quickly as possible. Our focus here is to describe how COMPASS testing supports this goal. Reviewing for and re-taking the COMPASS test can save you time and money. That is the message we emphasize to incoming students. As an example, if a student places out of two developmental classes in English, that translates into tuition savings for the credit hours as well as two semesters—a full academic year—of their time.

As part of the application process, students complete the COMPASS placement test. Among the biggest obstacles to proper placement in developmental education classes is student misconception about the test itself. Students may mistake COMPASS for an entrance test, rather than a placement test. Since they know they are accepted by a community college, they may not do their best to score well. To address this, we emphasize during registration the importance of taking the review workshop and re-testing, depending on their score. Students must complete a review workshop before they are permitted to re-test. This workshop, available online or face-to-face, reviews the question format and the content for the math, English, and reading sections of the test.

As a result of completing a review workshop, followed by a re-test, 1,288 students tested out of one or more developmental classes from fall 2010 through fall 2011. This represents a total estimated tuition savings of about $370,600 over those three semesters. The largest percentages of students testing out of developmental coursework are in English and reading. For the three semesters between fall 2010 and fall 2011, an average of about 61 percent placed out of at least one developmental course in English, and about 59 percent placed out of reading courses. In math, the results are lower: about 33, 39, and 29 percent for fall 2010, spring 2011, and fall 2011, respectively.
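As a rough illustration of the arithmetic behind those aggregates, the average savings per student implied by the reported totals can be computed directly. This is our back-of-the-envelope reading of the two figures GTCC reports, not a number from GTCC itself:

```python
# GTCC's reported aggregates: 1,288 students tested out of at least one
# developmental class, for an estimated $370,600 in tuition savings.
students_testing_out = 1288
total_estimated_savings = 370_600

# Average savings per student implied by those two figures
avg_savings_per_student = total_estimated_savings / students_testing_out
print(f"Average savings per student: ${avg_savings_per_student:,.2f}")
```

Per-student savings will of course vary with how many courses a student places out of and with per-credit tuition rates; the average is just a quick sanity check on the aggregate claim.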

At GTCC, the number of students using these review workshops has increased from 981 in fall 2010 to 1,241 in fall 2011. The greatest increase has been in the online reviews, compared to face-to-face workshops. For example, in fall 2011, 73 students attended face-to-face reviews, compared to 1,168 who worked online. We also continue to reach out to local high schools, explaining the importance of the placement test; we have arranged for graduating seniors to take the test, complete the review workshop, and re-test.

Friday, January 20, 2012

Many Happy Returns

Today’s blog birthday post is brought to you by the letter S. We look back at Accelerating Achievement posts on two of our favorite topics: scaling and state policy.

In Scaling Up, we’ve been harvesting the latest thinking on scaling from the social innovation field, calling attention to tools and resources that can help colleges and states increase the impact of developmental education advancements. We’ve also highlighted stories of colleges and states that have found ways to expand the reach of promising practices. The Joy of Scaling launched a seven-week series on seven organizational capacities that support successful scaling of a social enterprise, represented by the acronym SCALERS: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. MDC adapted this model from Duke University’s Fuqua School of Business for use in community colleges. An ongoing series is delving deeper into the individual SCALERS, seeing how they apply to supplemental instruction, self-paced courses, and faculty engagement.

Supportive state policy is an essential component in any institutional plan to expand innovation to more students. This year’s Statewise posts have followed DEI state policy teams, coordinated by Jobs for the Future, as they work within state community college systems and legislatures to change outdated rules, funding, and incentive structures that stand in the way of innovation. Michael Collins, associate vice president of postsecondary state policy at JFF, laid out the Developmental Education Initiative State Policy strategy in a three-part series. The first segment showed how collecting the right data can inform state policy to accelerate dev ed innovation across a system. Part two detailed how states are investing resources in that innovation. The final installment, our most read Statewise post, made the case for a continuous improvement cycle focused on strengthening policy supports.

Finding ways to bring what works to more students will remain a vital concern for higher education—and for Accelerating Achievement—as colleges and states continue to face increasing enrollments, diminishing resources, and intensifying pressure to move students to credentials more quickly and efficiently.

Wednesday, November 16, 2011

Guest Post: Getting the Message Right

It’s time for the second installment of “SCALERS: Round 2.” Originally created by Paul Bloom at the Duke University Fuqua School of Business Center for the Advancement of Social Entrepreneurship, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. (You can read an introduction to each driver in our first SCALERS series.) 

Now, we’re asking DEI colleges about how particular SCALERS drivers have contributed to their scaling efforts. Last month, we looked at staffing. Today, Becky Samberg of Housatonic Community College in Bridgeport, CT, shows what HCC is learning about how changing the message can change behavior.

Remember playing the childhood game of telephone? As the message was whispered from one child’s ear to another, one thing was certain: the game always ended in peals of laughter, because the message whispered by the first child was never the same as the message repeated by the last in the chain. From this childhood game, we learn a very valuable lesson: communication needs to be deliberate and precise.

In our DEI work, we strive to be deliberate and precise in communicating with our HCC community about the initiatives on which we are working. Over the last year, however, lower-than-expected success rates in self-paced courses led us to change how we communicate to our students the nature of the self-paced program and the unique expectations of students enrolled in these courses.

Our self-paced program began in 2007 with developmental math courses. Originally, students could enroll in self-paced math courses—what we then called “Open Entry/Open Exit”—at any time in the semester and exit whenever they finished the course. Students also could start the next course in the same semester. We learned, however, that “Open Entry/Open Exit” was a misnomer. Students were not accelerating as quickly or as successfully as we anticipated, and students’ financial aid obligations and their full- or part-time status impeded their ability to move from one course to the next during a semester. As we considered these challenges, we concluded that the individualized instructional format and our expectations of students in these courses were not made explicit by the Open Entry/Open Exit title, so we made the following changes:

  • We adopted the name “Self-Paced,” emphasizing the focus on individualized mathematics and English instruction. This new course name more explicitly communicates that students enrolling in these courses will have individualized instruction at a self-determined pace.  
  • We defined our expectations of students and developed an orientation so students hear a consistent message from their instructors and the Self-Paced studies lab coordinator, who conducts the in-class orientation and oversees students’ visits to the lab. 
  • We made lab visits mandatory, deliberately delivering the message that self-paced does not mean no pace, and regular engagement with the course material is essential to student success. 
  • We re-designed the courses, communicating to students their obligation to make consistent progress throughout the semester and establishing the expectation that students work toward successfully completing the course within a traditional semester or less.
  • We created a schedule for completing the self-paced courses in a single semester, sharing it with students, and embedding it in the course software. Students are told the pace at which they need to work and the benchmarks they need to reach to successfully complete the course in one semester.
Moving forward, we hope that the changes we have made will increase student success in our self-paced courses. In changing the course title to better communicate the nature of the course and in deliberately and precisely communicating class policies and our expectations of students, we hope to avoid the inevitable outcome of a game of telephone.

Becky Samberg is the chair of developmental studies and DEI director at Housatonic Community College.

Wednesday, October 12, 2011

Guest Post: Supplemental Instruction Leaders Don't Do Optional Either

In the first of our in-depth looks at each of the seven SCALERS drivers, Ruth Silon of Cuyahoga Community College (Tri-C) delves into staffing a supplemental instruction (SI) program. The SCALERS staffing driver calls for effective use of resources to meet personnel needs, from administration to faculty to student services to student employees. In this candid post, Ruth describes the ups and downs of Tri-C’s approach to training student SI leaders in a one-credit special topics course.

 “Students don’t do optional.”

Where have we heard this before? It certainly applies to many developmental students’ use of the tutoring labs, optional orientations, and attendance at Supplemental Instruction (SI) sessions.

But what about the SI leaders themselves? Although we at Cuyahoga Community College (Tri-C) have a well-thought-out hiring and training process, I have found that if we do not have a very concrete way to manage and observe our SI leaders, they, too, will not do optional.

In June 2010, I attended the International Conference on Supplemental Instruction, and listened to Joyce Zaritsky from LaGuardia Community College discuss her one credit class for SI leaders. This approach seemed to make sense. The students could not be leaders unless they attended a weekly SI course. Here was the place where leaders could share and debrief and experience ongoing training.

During fall 2010, faculty and SI staff met to design the course, and it was first implemented in spring 2011 as a special topics course—one session at each campus. The course would meet once a week for one credit.
  • First Problem: The students had to pay for the course, which at that time cost $84.00.
    Solution: Pay SI leaders for an extra hour and hope that is enough to offset the cost of the course.
  • Second Problem: One of our leaders had already graduated.
    Solution: She still had to attend the class in order to be an SI leader.
  • Third Problem: There was not a common time available for all the leaders to take the course.
    No good solution here: Not all the leaders attended, but about 80 percent did participate.

Meeting every week was a great experience, both for me, the teacher, and the leaders. I got to know almost everything that was going on in the SI-supported classes and the related sessions. I learned firsthand about the struggles leaders were having with their students and also with the classroom teachers. The class was more like a support group for the leaders than an academic class. This learning is really important as we are asking students to perform tasks that may be well out of their comfort zone. If we talk to each other about our students and our pedagogy, shouldn’t SI leaders be afforded that same experience?

This was a course, so the students had to complete certain tasks to get a grade. I asked them to turn in weekly journals, telling me what happened in their sessions. (This could be the basis of our conversations each week.) They also had to visit each other’s classes and write up an observation. At the end of the semester, they wrote an essay to a new SI leader, explaining the high and low points of the job and offering the new leader advice. The end result was twofold: a deeper understanding of what goes on in SI for both me and the leaders, and a very supportive environment to help the leaders do a better job.

Even though the course went well, we decided not to offer it this semester. Why not? I did not want to make SI leaders pay for the course again. I did not want to ask for more money for the SI leaders. And I thought “Last semester’s meetings went well. Of course this semester’s leaders will come to a weekly session, even if it’s not for credit!”

I was wrong. Sadly, I forgot that students, SI leaders included, may not do optional! Just like in any other class, some student leaders come every week, some attend occasionally, and others never show up at all. I am sorry to have had to learn the same lesson again: accountability is everything. But I have learned the lesson, so next semester we’ll be offering the course for our SI leaders again.

Ruth Silon is an associate professor of English and DEI project director at Cuyahoga Community College.

Thursday, October 6, 2011

S-C-A-L-E-R-S: Round 2

One of our favorite topics here on Accelerating Achievement is scaling up. Regular readers will remember our multi-week SCALERS series. Originally created by Paul Bloom, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. In the Accelerating Achievement SCALERS blog series, we translated the model for application at community colleges.

Next week, we’ll launch a new SCALERS series. Each month, we’ll have a guest post from a DEI college about how a particular SCALERS driver has contributed to their scaling efforts. A little less conversation, a little more action! Today, we’re applying all seven SCALERS to a program from Chaffey College that has successfully scaled up. Thanks to Ricardo Diaz at Chaffey for sharing his story with us!


The goal of Opening Doors to Excellence (ODE) at Chaffey College in Rancho Cucamonga, CA, is to move students off of academic probation and back into good standing with the college. Participants develop an educational plan with an advisor, take a student success course, and complete a series of directed activities in the college’s student success center. Chaffey defined scale as an institutionalized program that, when fully implemented, would serve all students on academic probation college-wide; by this definition, the program is, in fact, scaled up. According to Ricardo Diaz, ODE coordinator, the successful expansion of the program has required attention to all seven SCALERS drivers:

Staffing. Since there are 300 to 400 students in the program each semester, Diaz is able to meet with each student only once before they enroll in the student success course. To address the need for continuous student follow-up, ODE is staffed by counselor apprentices. These counselors are paid graduate students from local universities who use the experience to complete required clinical hours for their program of study. Chaffey’s Human Resources department provides structure and support for hiring the apprentices; program leadership and coordination functions have been integrated into existing staff workloads.

Communicating. To expand ODE, Chaffey embarked on a strategic planning process that drew together key parties from across the college. The plan they constructed involved integrating services into existing programs, rather than creating a program with a stand-alone structure. During program development, the core planning committee held regular discussions with governance departments.

Alliance-Building. As mentioned above, ODE was developed with input from college-wide representatives. The program had the support of the president and board of trustees from the beginning. A crucial alliance for ODE was the purposeful collaboration between academic affairs and student services.

Lobbying/Demonstrating Impact. Chaffey’s Institutional Research department collaborated with MDRC to establish outcomes and evaluate ODE as part of MDRC’s Opening Doors project. When MDRC concluded their study, Chaffey’s institutional research continued. The strength of the evaluation allowed the program to obtain additional resources, recognition, and support for expansion.

Earnings Generation/Resource Generation. The initial MDRC funding for the program was matched by college funding commitments. With future expansion in mind, Chaffey integrated the core expenditures for the program into the college’s general fund. The MDRC grant was used as start-up money, funding program development, paraprofessional staff, books, supplies, travel, and training.

Replicating Impact. As the program grew, the core planning committee developed a continuous improvement process. Student learning outcomes and focus group feedback were used to refine program design. The committee encouraged regular sharing of practices among instructors along with professional development activities.

Stimulating Market Forces/Sustaining Engagement. Because ODE was integrated into the college’s core operational components from the beginning, it quickly became a regular function of how the college operates. Students embraced the program because enrollment incentives were put into place. The MDRC study allowed for easy dissemination of the model to other colleges. This gained national recognition for Chaffey, which ensured continued buy-in from leadership and the campus community.

What’s next? Chaffey has created a solution to its initial problem: ODE moves students from academic probation back into good standing. However, an MDRC study looking at ODE’s impact on moving students to completion revealed that the intervention does not result in increased rates of graduation or certificate attainment. While not the original intent of this intervention, it is nonetheless a critical objective that presents a new challenge in program development and scaling. Now that Chaffey has a broad strategy that reaches the entire target population, it’s time to look at ways to scale the depth of the program’s impact, intensifying the intervention to amplify the impact or reach a new aim. The college intends to reconvene the core planning committee to explore strategies that can improve the likelihood that students who overcome their probationary standing also complete a degree and/or certificate.

Thursday, June 16, 2011

Guest Post: What’s Next for SCALERS?

Today’s post comes from Paul Bloom, Faculty Director and Adjunct Professor of Social Entrepreneurship and Marketing with the Center for the Advancement of Social Entrepreneurship (CASE) at Duke University’s Fuqua School of Business. Paul is also the creator of the SCALERS model. (Thanks, Paul!) Below, he shares the genesis of the model and his ideas for the next iteration of the work.

The idea of developing a model like SCALERS came to me a few years ago while reading the best-selling book, Made to Stick, written by the brothers Chip and Dan Heath. Among other things, they point out that acronyms can help ideas catch on and be remembered. I thought of my home discipline of marketing, a subject I taught for years and years, and how it has used the device of the “4P’s” (Product, Price, Place, and Promotion) to help students remember the essence of marketing. It occurred to me that the emerging field of social entrepreneurship needed a similar hook to help students and practitioners understand what it was all about.

As a newcomer to social entrepreneurship, I was struck by how obsessed everyone seemed to be with the concept of “scaling.” Indeed, I have come to believe that more than anything else, what distinguishes social entrepreneurs from more conventional leaders of social-purpose organizations is the former group’s obsession with scaling social impact. These folks want to change the world, not just run a sustainable and effective do-gooder organization. So whatever acronym or words I generated to help the field “stick” had to relate to the concept of scaling.

Fortunately, the letters of the acronym SCALERS suited what I was detecting from my own research and the research of others as the organizational capabilities that were the key drivers of successful scaling (i.e., Staffing, Communicating, Alliance-Building, Lobbying, Earnings-Generation, Replicating, and Stimulating Market Forces). I had to make some compromises – the word “Staffing” does not completely cover the range of human resource management capabilities that are needed for scaling, and “Lobbying” is just a portion of “Advocacy” (but the letter “A” was taken already). Nevertheless, I am very pleased by all the positive feedback and attention my original writing on the model has received (see Bloom and Chatterji, California Management Review, 2009 and Bloom and Smith, Journal of Social Entrepreneurship, 2010).

Still, my thinking about scaling is evolving, and there are aspects of the SCALERS model that I have modified since the earlier articles were published. I am currently putting the finishing touches on a short book that will introduce these modifications and also try to explain the model to a wider audience. As much as anything, the modified model stresses that a capability or driver that is important in one situation may matter less in another. The importance of any particular SCALER will depend on the resources the organization possesses as it starts its scaling and the theory of change on which the organization is building its initiatives. For example, if the organization is poorly endowed with certain types of human or financial resources, then building capabilities to improve those resources (i.e., Staffing, Earnings-Generation) becomes even more important for scaling success. And if the organization is theorizing that desired social impacts will occur if it informs more people about a problem or if new regulations are introduced, then building capabilities in Communicating or Lobbying becomes paramount. Conducting an honest assessment of your organization’s unique situation is necessary for getting the most out of the SCALERS model.

Paul Bloom is Faculty Director and Adjunct Professor of Social Entrepreneurship and Marketing with the Center for the Advancement of Social Entrepreneurship (CASE) at Duke University’s Fuqua School of Business.

Wednesday, June 15, 2011

SCALERS: S is for Stimulating Market Forces


Welcome to the final week of our seven-week series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling-up effective programs. If you’re joining us for the first time, check out the series intro and the posts on the first six drivers: Staffing, Communicating, Alliance-Building, Lobbying/Demonstrating Impact, Earnings Generation/Resources, and Replicating Impact.

As with earnings generation, many community colleges may not think of their work in terms of market forces. However, the concept of creating demand for a product or service still applies. We named this driver “sustaining engagement” and define it as the effectiveness with which a college can create incentives that encourage institutional leadership, program staff, faculty, and students to be involved in and value the expanded solution.

The college should consider the types of incentives that will appeal to different constituent groups: while everyone will want to hear about positive program outcomes, leaders might be most interested in access to return-on-investment calculations; program staff and faculty might want flexibility, support, and time for their own development; students might want to see a direct connection between individual needs and program services, or even monetary incentives. The incentives may change depending on the phase of implementation: encouraging adoption and enrollment requires different motivators than encouraging support for expansion; continuing support and participation may require still others.

Sustaining engagement has significant overlap with other drivers, particularly communicating, alliance building, and demonstrating impact. An evaluation plan with clear short-, intermediate-, and long-term outcome targets enables an organization to routinely measure, report, and make necessary revisions. A systematic approach to professional development ensures that these revisions are incorporated into curricula, training, and implementation practices. When the evaluation data and professional development learning are tied to a communication plan that addresses both marketing and internal messaging, leadership, program staff, and students are all made aware of the program or practice, know about its positive outcomes, and know how to participate.

A restructuring of leadership teams at El Paso Community College (EPCC) provides an example of how to sustain engagement at one level of the institution. When EPCC was selected to participate in the Developmental Education Initiative, they initially planned to create a DEI-specific core team of relevant faculty, staff, and administrators, similar to their Achieving the Dream Core Team. However, in an effort to increase coordination and reduce overlap, the college created the President’s Student Success Core Team, comprised of representatives from all of the college’s major developmental education reform efforts. The following chart shows the organization and membership of the Student Success Core Team:



This organizational structure allows representatives from each major initiative to be at the table with the president and his cabinet to share updates and discuss their impact before final decisions are made. This structure also makes it easy to bring new initiatives to the table and integrate the work into existing efforts. 

Abby Parcell is a Program Manager at MDC.

Wednesday, June 8, 2011

SCALERS Series: R is for Replicating Impact


Welcome to the sixth week of our seven-week series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling-up effective programs. If you’re joining us for the first time, check out the series intro and the posts on the first five drivers: Staffing, Communicating, Alliance-Building, Lobbying/Demonstrating Impact, and Earnings Generation/Resources.

“Replicating impact” means developing and maintaining institutional expertise and commitment as you scale up a program. This driver is an important part of sustainability planning and broader institutional improvement. Maintaining quality while scaling up effective programs is part of the college getting better at what it does.

Consider your college’s track record at expanding interventions in the past:
  • How do you capture organizational learning? 
  • What is your system for process improvement?
  • How do you involve the individuals responsible for implementing the strategy in learning and process improvement?
Some of this learning will be gleaned from your analysis of program outcome data, as discussed in the demonstrating impact post, but your institution should make space for interpretation of these data and integration with qualitative information.

To ensure the continuous improvement of your expanded strategy, you will need to take a systematic approach to professional development. Expectations for participation in professional development should be clearly communicated to everyone involved in program delivery and management. Another essential piece is a plan to capture learning—both program and process-related—that can be incorporated into an existing continuous improvement strategy. Such knowledge development can actually be part of professional development, encouraging those who are implementing the strategy to innovate. The college should compare pre- and post-expansion data and take time to consider necessary modifications. All of these processes and relationships will incorporate parts of other SCALERS drivers, including staffing, communicating, and sustaining engagement.

The Faculty Inquiry Group model from the Carnegie Foundation for the Advancement of Teaching is an example of an approach to professional development that sets the stage for continuous learning.  As defined by the Foundation, faculty inquiry is:
“…a form of professional development by which teachers identify and investigate questions about their students’ learning. The inquiry process is ongoing, informed by evidence of student learning, and undertaken in a collaborative setting. Findings from the process come back in the form of new curricula, new assessments, and new pedagogies, which in turn become subjects for further inquiry.”
Danville Community College has employed this model in their Developmental Education Initiative work, convening faculty inquiry groups to pursue curriculum alignment among local high schools, adult basic ed programs, dev ed faculty, and college-level faculty. The groups have also proved vital to the college’s response to major dev ed redesign efforts led by the Virginia Community College System.

Abby Parcell is a Program Manager at MDC.

Wednesday, June 1, 2011

SCALERS: E is for Earnings Generation


Welcome to the fifth week of our seven-week series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling-up effective programs. If you’re joining us for the first time, check out the series intro and the posts on the first four drivers: Staffing, Communicating, Alliance-Building, and Lobbying.

In the original SCALERS model, the “Earnings Generation” driver is focused on creating additional revenue to support a particular enterprise; public institutions’ revenue-generating activity will be limited and not likely associated with the specific program or practice being scaled up. However, the expansion team still needs to consider the resources required to grow and sustain the program—and not just financial resources. We call this the “Resources” driver, and it helps focus the institution on securing and managing a program’s necessary staffing, space, technology, and other infrastructure needs.

Your team does have to think about funds: how do grants—local, state, and federal—influence the design of your program and the strategy you use to take it to scale? Make sure that funds for expansion are included in an approved budget. Have a sustainability plan to secure continued funding over the life of the program. Consider a 2-3 year plan, as well as a longer-term plan looking out 5-10 years. This attention to funding is especially important if the program was launched with time-limited monies. The institution will need to consider staffing here, too; an individual responsible for expansion must understand the hiring process and have the authority to make hiring decisions and authorize related expenditures.

If a program is expanding, its space and technology needs likely will expand, too. A scaling plan must include time to secure necessary office, training, and service accommodations. Depending on the nature of the program, the college may also need to acquire additional hardware, software, and telecommunications equipment. It is not just a matter of purchasing equipment and clearing out space; the college also should ensure vital facilities and technical support are available. Clear communication is essential here as well; affected individuals must be apprised of any space or technology modifications, and the organization should secure their commitment to support expansion.

Developmental Education Initiative campuses have dealt with resource issues in a variety of ways. Cuyahoga Community College has stretched professional development dollars by bringing trainers to campus rather than sending individuals to offsite conferences and courses. This allows more faculty to take advantage of professional development opportunities. Zane State College addressed limited computer lab space by purchasing laptop computers to create mobile labs that can move from classroom to classroom.   

Abby Parcell is a Program Manager at MDC.

Wednesday, May 25, 2011

SCALERS Series: L is for Lobbying




Welcome to the fourth week of our seven-week series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling up effective programs. If you’re joining us for the first time, check out the series intro and the posts on the first three drivers: Staffing, Communicating, and Alliance-Building.

Since “lobbying” has very specific—and sometimes negative—connotations for some people, we like to call this driver “demonstrating impact.” In order to secure and sustain support for an expansion plan, you’ve got to articulate to institutional, state, and federal decision makers that expanding (and/or continuing) a particular practice or program will have substantial benefits relative to costs. These same arguments must be made to individuals delivering the program as well as program participants. Scaling up a program or practice that has been successful on a small scale may require some disruption of organizational culture; this intensifies the imperative to clearly demonstrate how such change will advance institutional priorities—or why those institutional priorities need to change.

No matter what program you’re expanding, you should start by articulating the rationale for expansion and the connection to the college’s larger strategic plan. Then, consider what data you need to show how effective the strategy is at meeting the specified goal for the specified target population. Since it’s Equity Week at Accelerating Achievement, we encourage you to analyze data disaggregated by race, income, and other demographic factors and identify achievement gaps among student populations. If closing these gaps is an institutional priority for your college and one of the desired outcomes of your program, then it is essential that you analyze the evidence for how effectively the program accomplishes this goal. You should also ensure that your organization has the institutional research capacity to collect, measure, and communicate all of these data elements.

Collecting and analyzing data only serves this driver if you get to the demonstrating step. Make a plan to share information about program outcomes—within the organization, within the broader community, and with individuals who are in positions to influence program continuation, innovation, and further expansion. Your team should include individuals who can connect to state and federal policy decision makers; these individuals must have access to up-to-date information about program outcomes. Consider ways that those delivering program services and those participating can inform policy decisions through advocacy and information sharing. All these relationships and practices require that the organization consider other SCALERS drivers, in particular Communicating, Alliance-Building, and Sustaining Engagement.

For an example of the power of data, we refer you back to one of this blog’s first posts from Michael Collins, program director at Jobs for the Future.  JFF developed the DEI State Policy Strategy, a state-level developmental education improvement strategy, with three action priorities:

  • A data-driven improvement process that ensures the right conditions for innovation.
  • A state-level innovation investment strategy that helps states align and coordinate support from multiple sources to provide incentives for the development, testing, and scaling up of effective models for helping underprepared students succeed.
  • Policy supports that provide a foundation for improved outcomes for underprepared students, facilitate the implementation of effective and promising models, and encourage the spread of successful practices.

By focusing on data-driven planning, resource coordination, and policy that supports effective practice, JFF’s strategy provides a framework for demonstrating impact at the college, system, and state level.

Abby Parcell is a Program Manager at MDC.

Wednesday, May 18, 2011

SCALERS Series: A is for Alliance-Building



Welcome to the third week of our series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling up effective programs. If you’re joining us for the first time, check out the series intro and the posts on the first two drivers, Staffing and Communicating.

We all need somebody to lean on. Alliance-building, the third driver of the SCALERS model, focuses on the importance of a network of individuals and groups that will support your scaling effort. As defined by Paul Bloom and Aaron Chatterji, the model’s creators from the Fuqua School of Business at Duke University, alliance-building is “the effectiveness with which the organization has forged partnerships, coalitions, joint ventures, and other linkages to bring about desired social changes.” Colleges need the same ability to create partnerships and coalitions, engaging the necessary parties to support the expansion of a particular strategy.

Start by conducting an analysis of potential alliances that you could build to increase the likelihood of successful scaling up. These can be existing or new relationships, and can include individuals or groups representing faculty, staff, students, and departments; they might include people outside the college, too. Consider parties that will champion the work, as well as those likely to resist change. If you invite those who could present roadblocks to participate in the planning process early on, you may prevent them from turning into opposition.

Once you have identified the necessary parties, develop a plan for engaging each group or individual. Secure commitments of implementation support from as many as possible. To do this, you’ll need to have an individual on your team who has the necessary positional authority to convene and invite new allies to participate. As the program expansion begins, put a system in place to provide for regular convenings to keep allies informed about program progress and changes. Your alliance-building plan should be informed by your plan for the other SCALERS drivers, especially communicating, demonstrating impact, and sustaining engagement.

We’ve blogged previously about an example of effective alliance-building. In April, Karen Scheid, director of the Developmental Education Initiative for the Ohio Board of Regents, described Ohio’s efforts to align adult basic and literacy education (ABLE) programs with developmental education. This effort has required the integration of the state policy team, the colleges, and the local basic education providers. As Karen told us, this alliance has already started to bear fruit: “Since the launch of the pilot at the end of July 2010, 22 of Ohio’s 23 community colleges and their ABLE partners have submitted agreements for colleges to make ABLE referrals for students who score below an agreed level on a placement test.”

Check back tomorrow for a guest post from Gay Clyburn, associate vice president for public affairs at Carnegie Foundation for the Advancement of Teaching, to learn more about how Carnegie is using alliance-building to perfect and scale an initiative to develop a one-year pathway from remedial math to college statistics.

Wednesday, May 11, 2011

SCALERS Series: C is for Communicating



Last week, we began our series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling up effective programs. You can read the series intro here, and you can read the post about Staffing, the first driver, here.

When you think communications, think beyond marketing: it’s telling the story in a way that will make the value of your work clear to everyone on campus. A compelling message will help students, faculty, and staff understand that your change strategy is essential to student success and worth adopting and supporting.

In order to ensure the necessary participation in scaling up your strategy, you’ll need to clearly articulate the rationale, expectations, commitment, and process for the expansion. Once you figure out how to say it, figure out how to share it. What formats are appropriate for getting your information out to faculty, staff, and students? Consider websites and course catalogs, as well as program-specific convenings and marketing materials.

Communication is an ongoing need, so put processes in place to share up-to-date information about the program to responsible faculty and staff as well as students and all departments and individuals responsible for enrolling, counseling, and advising students. Pay close attention to making sure individuals with authority understand the enrollment, registration, and scheduling changes that are required for successful expansion of your program.

When Patrick Henry Community College began the Developmental Education Initiative, they formed a committee to launch and maintain a marketing campaign for their DEI work, known as the Progress Initiative. The Progress Initiative focuses on fast-tracking students through developmental education in the Accelerated Learning Program (ALP), which also incorporates cooperative learning and case-management advising. To create buy-in across the campus for this program, the committee developed an exciting verbal and visual identity for the Progress Initiative. They launched the campaign with a public event featuring a nationally known speaker, and the team made presentations at a variety of campus meetings to acquaint faculty and staff with the initiative. Once PHCC had effectively established an identity for the Progress Initiative, they worked to reinforce it over time. All faculty who present about the initiative are given a thumb drive loaded with the logo and the theme music, as well as T-shirts with the logo on them.

You probably don’t need a full marketing campaign for every program you expand, but you do need to create a communications plan that determines the appropriate methods and processes for sharing the necessary information with your campus. PHCC is a stellar example of using communications to persuade faculty, staff, and students to support a strategy. What are your communication success stories? What problems have you encountered in getting the word out?

Wednesday, May 4, 2011

SCALERS Series: S is for Staffing




Today, we begin our series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling-up effective programs. You can read the series intro here.

People who need people are, indeed, the luckiest people; but it may not feel that way when you’re trying to find the resources and individuals to expand a program. The SCALERS staffing driver calls for effective use of resources to meet labor needs; in a community college setting, this includes administration, faculty, student services, and student employee positions, as well as individuals responsible for data collection, analysis, and evaluation.

As you look at a program slated for expansion, you must consider how labor-intensive it is and whether it requires skilled services. This necessitates a clear definition of the labor needs and the local labor market. An organization also must look at the existing recruitment pool and the institution’s ability to recruit sufficient staff to sustain expansion. Such efforts are supported by a staffing plan that includes job descriptions, detailing the essential knowledge, skills, and abilities, for all requisite administrative, student services, academic, and student employee positions. It’s also important to review current staffing levels and identify any existing positions that may need to be redeployed or that will see additional work volume under expansion.

While a team responsible for day-to-day implementation of a particular program can make a good start on a staffing plan, there are broader organizational considerations that may require support from administration. Adding or redeploying positions necessitates discussions about a broader human resources strategy; does the organization have capacity (and will) to recruit, train, retain, and sustain the requisite expertise? You must ensure that HR processes for recruitment and hiring are in place; someone on the “scaling-up team” should be familiar with these processes and have the authority to initiate and execute hiring.

Of course, once individuals are hired, the organization should see to their continued development and training. Another part of the staffing consideration is the organization’s approach to professional development; a sustainable scaled-up solution requires a professional development system that specifically addresses the needs of the faculty and staff implementing the program, as well as the processes and resources to ensure quality delivery and continuous improvement. These concerns are closely related to other SCALERS drivers that will be featured in coming weeks, including communicating, alliance-building, resources, and sustaining engagement.

Chaffey College came up with a unique solution to a staffing issue as they expanded their Opening Doors to Excellence (ODE) program. The goal of ODE is to move students off of probation and back into good standing. Participating students develop an educational plan with an advisor, take a student success course, and complete a series of directed activities in the college’s student success center. The director of the program meets with every student (between 300 and 400 students per semester), but student follow-up is carried out by a cadre of Counselor Apprentices. These Counselor Apprentices are graduate students from a local university who can apply the experience to completing required clinical hours, allowing the college to expand its advising force. For more information about Opening Doors to Excellence, check out the presentation ODE Program Director Ricardo Diaz made at the 2011 Achieving the Dream Strategy Institute pre-institute workshop, “Bringing Innovation to Scale.” You can find the presentation in the Resources section of our website, under the “Scaling Up” category.

Abby Parcell is MDC's Program Manager for the Developmental Education Initiative.