Thursday, June 30, 2011

Guest Post: Good, Better, I-BEST

Today, we welcome John Wachen of the Community College Research Center to Accelerating Achievement. Below, John summarizes findings from a recent CCRC study of Washington State’s I-BEST model for integrating basic skills and workforce training.

To meet the ambitious goals set forth by the federal government and private foundations to substantially increase the number of students with high-quality postsecondary credentials, the higher education system must focus on retaining students and accelerating completion, particularly among underrepresented populations. One promising model for serving basic skills students is Integrated Basic Education and Skills Training (I-BEST), used in Washington State’s two-year colleges.

I-BEST was developed to increase the rate at which adult basic education (ABE) and English as a second language (ESL) students advance to college-level coursework and on to completion by integrating basic skills and career-technical instruction. In the model, basic skills instructors and career-technical instructors jointly teach college-level occupational courses that admit basic skills students. An I-BEST program is a series of these integrated courses in a career-technical field that leads to a credential. The I-BEST model has received significant attention in recent years as policy makers and practitioners in other states look for effective strategies to help low-skilled students move farther and faster along educational pathways.

Researchers at the Community College Research Center (CCRC) conducted several studies of the I-BEST model over the past few years, including a field study of implemented programs in Washington State’s colleges. Findings from our field study include information about program structure and management, integrated instruction and support services, and program costs and sustainability.

Below are some highlights from our findings:
  • Structured pathways. All I-BEST programs are part of long-term educational pathways that can yield increasingly valuable credentials. Our research suggests that it is important to provide structured, coherent pathways for basic skills students, who might otherwise find it difficult to navigate a broad array of choices.
  • Integrated instruction. For each I-BEST course, the basic skills instructor and career-technical instructor must jointly teach in the same classroom with at least a 50 percent overlap of instructional time. The degree to which basic skills and career-technical instruction are integrated in the I-BEST classroom varies considerably across programs. Fully integrated instruction is uncommon and difficult to achieve, but we did find several examples of highly collaborative team teaching.
  • Faculty selection and collaboration. The team-teaching model is challenging for instructors, and facility with it often takes time to develop. The relationship between the instructors is critical, so it is important to identify and select instructors who are willing and able to work with a co-instructor.
  • Funding and sustaining I-BEST. Approved programs are funded at 1.75 times the normal rate per full-time equivalent student (FTE) to help cover the higher program costs (illustrated just below). At many colleges, however, the expense of running the programs was a primary concern. Colleges identified several factors needed to sustain I-BEST programs, including maintaining strong enrollments, securing solid commitment from senior administrators, and continuing financial support through enhanced FTE funding.
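For illustration only, here is the arithmetic behind the enhanced rate; the base allocation below is a hypothetical figure, not a number from the study:

    enhanced funding per FTE = 1.75 × base rate per FTE
    e.g., 1.75 × $4,000 (hypothetical base rate) = $7,000 per I-BEST FTE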

Our report in Community College Review contains additional discussion of these and other findings, including a profile of I-BEST students, information on student financial support, and lessons for other states and colleges.

Monday, June 27, 2011

Stuck on You

Problem Solved!

Last week, DEI college teams, partners, and Achieving the Dream coaches gathered in Durham, N.C., for the Annual DEI Summer Workshop. As we mentioned last Thursday, one goal of the workshop was to address current implementation challenges by giving college innovators time to share their ideas and problem-solve.

We organized a group problem-solving activity we like to call "Sticky Issues." On Friday afternoon, we broke into small groups of five to seven people and gave each person a chance to share one implementation frustration, obstacle, or difficulty they were “stuck” on. The rest of the group then had ten minutes to brainstorm solutions to help the person overcome the stated challenge. The traditional rules of brainstorming applied: no idea was too off-the-wall, and no one was allowed to say “that will never work!”

The Sticky Issues session isn’t about commiseration; it’s about colleges comparing ideas and spreading good practices. Here’s a sampling of the challenges raised during the session and some of the interesting ideas that were generated:

  • Sticky Issue: How can we get students to sign up and show up for boot camps?
    Possible Solution: The message you use to market a boot camp is important. Put a price tag on it by sharing a cost/benefit analysis and focusing on student success data. Market it to parents. Charge students and pay them back upon completion.
  • Sticky Issue: How can I get faculty to believe the data about program results?
    Possible Solution: Host an annual half-day Data Summit for faculty. Take them off-site and make it fun. Ask faculty champions to present on how they use data. There is a fear of judgment, so these kinds of events can help demystify the data.

It was great to see the wisdom of the DEI crowd targeted at specific challenges and to see the enthusiasm of participants as they thought about implementing the new ideas on their own campuses. We might just try bringing sticky issues to Accelerating Achievement…stay tuned!

Thursday, June 23, 2011

Wish You Were Here

Hello from the DEI Summer Workshop! We've gathered the DEI college teams, partners, and Achieving the Dream coaches in Durham, N.C., to cogitate, collaborate, and come up with answers to big questions about developmental education, including questions about scaling and sustainability.

When we began DEI, we focused on expanding effective pilots and policies for students needing developmental education. At that point, colleges were serving some of the students who could benefit from those programs, and they designed their DEI work to serve even more of them. The next challenge will be to sustain this work so that it serves most of the students who could benefit, because a solution that isn't available to most of those who need it isn't really a solution.

We began our day by asking attendees to envision this goal of extending their work from more to most of the students who could benefit. What would that look like? What would change on the campus? What would change among faculty? What would change for students?

One college participant shared her vision of the future: there is very little need for developmental education at the college, because most students arrive prepared or accelerate quickly to college-level work. Every student has a case manager, who serves as a friend and an advisor to make sure that they "start right, stay right, and end right." 

With that mantra still echoing in our heads, we're spending the rest of our time here reflecting on the first 18 months of DEI and capturing our accomplishments, addressing current implementation challenges, making plans for incorporating DEI learning into other student success efforts, and building a strong, sustainable learning community of developmental education innovators. Check back Monday for a full recap of the event.

Alyson Zandt is a Program Associate at MDC.

Wednesday, June 22, 2011

GSCC Says Pedagogy Matters

Are you a developmental education instructor interested in improving pedagogy? Global Skills for College Completion (GSCC) wants you to get involved:
  • Join the GSCC Adjunct Faculty Community. We invite adjunct faculty to help launch the community and connect with the GSCC project. You will help advance and build upon GSCC's approach to improving developmental education pedagogy by connecting with adjunct faculty from across the country. This summer, GSCC is offering a stipend to developmental math and English adjunct instructors who will design and launch the online community. For details and to apply to the GSCC Adjunct Community workgroup, click here.
  • GSCC is preparing for the fall launch of the Pedagogy Matters Campaign Community. GSCC will hire and pay a stipend to ten developmental education math and English faculty to plan the Pedagogy Matters Campaign. This community will use crowdsourcing to build a viral campaign of faculty advocates to raise national awareness about pedagogy and advance a Pedagogy Matters Manifesto. If you want to help the country improve developmental education pedagogy to achieve better student outcomes, click here to apply to the Pedagogy Matters Campaign community workgroup.

Thursday, June 16, 2011

Guest Post: What’s Next for SCALERS?

Today’s post comes from Paul Bloom, Faculty Director and Adjunct Professor of Social Entrepreneurship and Marketing with the Center for the Advancement of Social Entrepreneurship (CASE) at Duke University’s Fuqua School of Business. Paul is also the creator of the SCALERS model. (Thanks, Paul!) Below, he shares the genesis of the model and his ideas for the next iteration of the work.

The idea of developing a model like SCALERS came to me a few years ago while reading the best-selling book, Made to Stick, written by the brothers Chip and Dan Heath. Among other things, they point out that acronyms can help ideas catch on and be remembered. I thought of my home discipline of marketing, a subject I taught for years and years, and how it has used the device of the “4P’s” (Product, Price, Place, and Promotion) to help students remember the essence of marketing. It occurred to me that the emerging field of social entrepreneurship needed a similar hook to help students and practitioners understand what it was all about.

As a newcomer to social entrepreneurship, I was struck by how obsessed everyone was with the concept of “scaling.” Indeed, I have come to believe that more than anything else, what distinguishes social entrepreneurs from more conventional leaders of social-purpose organizations is the former groups’ obsession with scaling social impact. These folks want to change the world, not just run a sustainable and effective do-gooder organization. So whatever acronym or words I generated to help the field “stick” had to relate to the concept of scaling.

Fortunately, the letters of the acronym SCALERS suited what I was detecting from my own research and the research of others as the organizational capabilities that were the key drivers of successful scaling (i.e., Staffing, Communicating, Alliance-Building, Lobbying, Earnings-Generation, Replicating, and Stimulating Market Forces). I had to make some compromises – the word “Staffing” does not completely cover the range of human resource management capabilities that are needed for scaling, and “Lobbying” is just a portion of “Advocacy” (but the letter “A” was taken already). Nevertheless, I am very pleased by all the positive feedback and attention my original writing on the model has received (see Bloom and Chatterji, California Management Review, 2009 and Bloom and Smith, Journal of Social Entrepreneurship, 2010).

Still, my thinking about scaling is evolving, and there are aspects of the SCALERS model that I have modified since the earlier articles were published. I am currently putting the finishing touches on a short book that will introduce these modifications and also try to explain the model to a wider audience. As much as anything, the modified model stresses that a capability or driver that is important in one situation may matter less in another. The importance of any particular SCALER will depend on the resources the organization possesses as it starts its scaling and the theory of change on which the organization is building its initiatives. For example, if the organization is poorly endowed with certain types of human or financial resources, then building capabilities to improve those resources (i.e., Staffing, Earnings-Generation) becomes even more important for scaling success. And if the organization is theorizing that desired social impacts will occur if it informs more people about a problem or if new regulations are introduced, then building capabilities in Communicating or Lobbying becomes paramount. Conducting an honest assessment of your organization’s unique situation is necessary for getting the most out of the SCALERS model.

Paul Bloom is Faculty Director and Adjunct Professor of Social Entrepreneurship and Marketing with the Center for the Advancement of Social Entrepreneurship (CASE) at Duke University’s Fuqua School of Business.

Wednesday, June 15, 2011

SCALERS: S is for Stimulating Market Forces


Welcome to the final week of our seven-week series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling up effective programs. If you’re joining us for the first time, check out the series intro and the posts on the first six drivers: Staffing, Communicating, Alliance-Building, Lobbying/Demonstrating Impact, Earnings Generation/Resources, and Replicating Impact.

As with earnings generation, most community colleges may not think of their work in terms of market forces. However, the concept of creating demand for a product or service still applies. We named this driver “sustaining engagement” and define it as the effectiveness with which a college can create incentives that encourage institutional leadership, program staff, faculty, and students to be involved in and value the expanded solution.

The college should consider the types of incentives that will appeal to different constituent groups: while everyone will want to hear about positive program outcomes, leaders might be most interested in access to return-on-investment calculations; program staff and faculty might want flexibility, support, and time for their own development; students might want to see a direct connection between individual needs and program services, or even monetary incentives. The incentives may change depending on the phase of implementation: encouraging adoption and enrollment requires different motivators than encouraging support for expansion, and continuing support and participation may require still others.

Sustaining engagement overlaps significantly with other drivers, particularly communicating, alliance building, and demonstrating impact. An evaluation plan with clear short-, intermediate-, and long-term outcome targets enables an organization to routinely measure, report, and make necessary revisions. A systematic approach to professional development ensures that these revisions are incorporated into curricula, training, and implementation practices. And when the evaluation data and professional development learning are tied to a communication plan that addresses both marketing and internal messaging, leadership, program staff, and students all become aware of the program or practice, learn about its positive outcomes, and know how to participate.

A restructuring of leadership teams at El Paso Community College (EPCC) provides an example of how to sustain engagement at one level of the institution. When EPCC was selected to participate in the Developmental Education Initiative, they initially planned to create a DEI-specific core team of relevant faculty, staff, and administrators, similar to their Achieving the Dream Core Team. However, in an effort to increase coordination and reduce overlap, the college created the President’s Student Success Core Team, composed of representatives from all of the college’s major developmental education reform efforts. The following chart shows the organization and membership of the Student Success Core Team:

[Chart: organization and membership of the President’s Student Success Core Team]

This organizational structure allows representatives from each major initiative to be at the table with the president and his cabinet to share updates and discuss their impact before final decisions are made. This structure also makes it easy to bring new initiatives to the table and integrate the work into existing efforts. 

Abby Parcell is a Program Manager at MDC.

Tuesday, June 14, 2011

Guest Post: You’ve Got a Friend—Supportive State Policy for Dev Ed Innovation

Today’s post comes from Michael Collins, associate vice president of postsecondary state policy at Jobs for the Future. JFF leads DEI’s state policy initiative by supporting policy teams in CT, FL, NC, OH, TX, and VA, who are implementing the three-pronged Developmental Education Initiative State Policy Framework. In the first of a three-part series about the framework, Michael shows how collecting the right data can inform state policy that can accelerate dev ed innovation across a system—and introduces a new JFF publication that details how states are doing just that.

There is a lot of talk about fixing developmental education these days. In most people’s minds, this looks like a teacher in a classroom teaching better, and thus students learning better. That’s not a bad thing. Better teaching is an important part of the solution. For all the energy devoted to technological innovation in instructional delivery, the classroom continues to hold its own as the most widely used venue for serving academically underprepared students. Teaching and learning are central to improving outcomes in developmental education.

State policy, on the other hand, does not usually come to mind when people think about improving outcomes in developmental education. But policy can make a big difference in identifying and spreading effective teaching practices, which can contribute to deeper and more sustained learning.

Now, teaching and learning is a somewhat private enterprise between a teacher and a student, carried on behind closed classroom doors. Smartly designed state policy can open those doors. For example, states, particularly those with community college systems, can use their data and performance measurement systems to identify institutions that consistently do the best job of accelerating students through developmental education, through math and English gatekeeper courses, and on to programs tied to credentials and degrees. The hypothesis is that analyzing student outcomes will lead to conversations with high-performing colleges, and those conversations will yield a better understanding of effective practices, including good teaching.
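To make that hypothesis concrete, here is a minimal sketch, in Python, of the kind of comparison such a data system might support. Everything in it is hypothetical: the records table, the column names, and the top-quartile cutoff are illustrations, not features of any state's actual performance measurement system.

    # Hypothetical sketch: the data, column names, and cutoff are illustrative
    # only, not drawn from any state's actual data system.
    import pandas as pd

    # One row per developmental education student; 1 = milestone reached
    # within the tracking window, 0 = not reached.
    records = pd.DataFrame({
        "college": ["A", "A", "A", "B", "B", "C", "C", "C"],
        "completed_dev_sequence": [1, 0, 1, 1, 1, 0, 0, 1],
        "passed_gatekeeper_math": [1, 0, 0, 1, 1, 0, 0, 1],
    })

    # Milestone completion rates by college.
    rates = records.groupby("college").mean(numeric_only=True)

    # Flag colleges in the top quartile on gatekeeper math completion as a
    # starting point for conversations about what they do differently.
    cutoff = rates["passed_gatekeeper_math"].quantile(0.75)
    high_performers = rates[rates["passed_gatekeeper_math"] >= cutoff]
    print(high_performers)

In practice, a state would run this kind of calculation on full cohort records and then follow up with the flagged colleges to understand the practices behind their results.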

States can go further to document and disseminate effective teaching practices once they are identified. This might include developing and maintaining a central repository of teaching resources, or an innovators network in which faculty are convened on a systematic basis to share proven practices and serve as resources to each other. Ideally, it would include both.

State policy can go a long way to support better teaching and learning. Currently, there are few examples of states using their data and performance measurement systems in a robust way to identify and spread successful practices. The six states participating in the Developmental Education Initiative (DEI) are changing this. They are implementing a three-part change strategy featuring data-driven improvement, investment in innovation, and policy supports.

Under the first component of the strategy—data-driven improvement—the states are actively fashioning their data and performance measurement systems so that they can identify high performing colleges and begin to pinpoint effective strategies as described above. You can find out more about their progress by checking out Driving Innovation: How Six States Are Organizing to Improve Outcomes in Developmental Education here.

We’ll take up the second strategy, investment in innovation, next time! Until then…

Michael Collins is associate vice president for postsecondary state policy at Jobs for the Future.

Friday, June 10, 2011

Linky, Linky!

  • Yesterday on Confessions of a Community College Dean, Dean Dad blogged about boutique student success programs. “They can't scale up,” he says. “They work only so long as their per-student cost is off the charts. And those costs are covered by cutting other things.” Here at Accelerating Achievement, we agree that large-scale problems won’t be solved by small-scale programs. We’re hoping that our running series on the SCALERS model can help colleges think about program value AND feasibility as they scale up small programs.
  • Over at the Getting Past Go blog, Bruce Vandal is keeping us updated about his visit to New Zealand to meet with postsecondary leaders. “New Zealand, like the United States, has recognized that it must increase the percentage of their residents who pursue and achieve postsecondary credentials,” he tells us. “And, like the U.S., they find that far too many of their residents do not possess the skills they need to succeed at the higher levels of their tertiary education system.”
  • Inside Higher Ed brings us the saga of developmental math at California State University at Bakersfield. Because of budget cuts in 2009, CSU Bakersfield moved from classroom-based developmental math to offering these courses only online. Things didn’t work out as they’d hoped. “In one course, the student pass rate plummeted from 74 percent to 45 percent. In the other, the rate fell from 61 percent to 37 percent.” While the beginning of this story has all the makings of a tragic tale of students left behind in an era of extreme resource constraints, the university has since managed to raise student pass rates above where they were before the cuts. You’ll have to read the whole story to find out how.

Wednesday, June 8, 2011

SCALERS Series: R is for Replicating Impact


Welcome to the sixth week of our seven-week series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling up effective programs. If you’re joining us for the first time, check out the series intro and the posts on the first five drivers: Staffing, Communicating, Alliance-Building, Lobbying/Demonstrating Impact, and Earnings Generation/Resources.

“Replicating impact” means developing and maintaining institutional expertise and commitment as you scale up a program. This driver is an important part of sustainability planning and broader institutional improvement. Maintaining quality while scaling up effective programs is part of how the college gets better at what it does.

Consider your college’s track record at expanding interventions in the past:
  • How do you capture organizational learning? 
  • What is your system for process improvement?
  • How do you involve the individuals responsible for implementing the strategy in learning and process improvement?
Some of this learning will be gleaned from your analysis of program outcome data, as discussed in the demonstrating impact post, but your institution should make space for interpretation of these data and integration with qualitative information.

To ensure the continuous improvement of your expanded strategy, you will need to approach professional development systematically. Expectations for participation in professional development should be clearly communicated to everyone involved in program delivery and management. Another essential piece is a plan to capture learning, both program- and process-related, that can be incorporated into an existing continuous improvement strategy. Such knowledge development can itself be part of professional development, encouraging those who are implementing the strategy to innovate. The college should compare pre- and post-expansion data and take time to consider necessary modifications. All of these processes and relationships will incorporate parts of other SCALERS drivers, including staffing, communicating, and sustaining engagement.

The Faculty Inquiry Group model from the Carnegie Foundation for the Advancement of Teaching is an example of an approach to professional development that sets the stage for continuous learning.  As defined by the Foundation, faculty inquiry is:
“…a form of professional development by which teachers identify and investigate questions about their students’ learning. The inquiry process is ongoing, informed by evidence of student learning, and undertaken in a collaborative setting. Findings from the process come back in the form of new curricula, new assessments, and new pedagogies, which in turn become subjects for further inquiry.”
Danville Community College has employed this model in their Developmental Education Initiative work, convening faculty inquiry groups to pursue curriculum alignment among local high schools, adult basic ed programs, dev ed faculty, and college-level faculty. The groups have also proved vital to the college’s response to major dev ed redesign efforts led by the Virginia Community College System.

Abby Parcell is a Program Manager at MDC.

Friday, June 3, 2011

We Heart Wonks: A Friday DEI State Policy Review

  • Jobs for the Future just released their latest edition of Achieving Success, the state policy newsletter of Achieving the Dream and the Developmental Education Initiative, which highlights and profiles new state-level innovations, policy supports, and data-driven improvements. This issue features Florida’s new student success dashboard, Connecticut’s dev ed redesign task force, and Arkansas’ plans for developmental educator professional development. You can find it on the DEI website’s Resources page under “State Policy.”
  • In other news, it was a banner week for developmental education bills in the Texas Legislature. We posted in April about the impact that the DEI state policy team in Texas was having on the content of two bills. This week, both bills passed the Texas House and Senate. Cynthia Ferrell, director of the Texas Developmental Education Initiative state policy team, tells us, “After learning about the Community College Research Center work on the ineffectiveness of online developmental education offerings, the language in HB 1244 and SB 1564 was changed from mandating online DE to include support for ‘a range of developmental coursework or instructional support that includes the integration of technology to efficiently address the particular developmental needs of the student.’” The bills also address the need for new research-based and diagnostic assessments, encourage the use of course-based and non-course-based dev ed, establish alternative dev ed funding structures, and promote professional development for developmental educators.

Thursday, June 2, 2011

Scaling Up is Hard to Do

In May, MDRC released an evaluation of our 15 colleges’ first year of the Developmental Education Initiative. The evaluation examined the progress that DEI colleges are making as they work to scale up successful developmental education strategies. To structure their analysis of the factors that obstruct and reinforce college scaling efforts, MDRC used the SCALERS model that we’ve been blogging about. The evaluators found that three of the SCALERS drivers were especially important: staffing, communicating, and alliance-building. “In particular,” they write, “scaling-up was more likely to proceed smoothly when the right people could readily be found to put the strategies in place, when there was ample communication with faculty members, when the necessary parties were engaged in alliances, and when the colleges could capitalize on preexisting working relationships.”

The MDRC evaluators also distilled six lessons for colleges as they seek to scale up their strategies:
  • Make sure that the best available data are used in intervention planning.
  • Find numerous occasions for the college president to express early and public support for the intervention.
  • Recognize that involving adjunct staff is likely to be critical for going to scale.
  • Consider making staff participation in professional development activities mandatory.
  • Actively market new strategies to students.
  • Anticipate complexities in scheduling and arranging space.
You can download the full report here.

Wednesday, June 1, 2011

SCALERS: E is for Earnings Generation


Welcome to the fifth week of our seven-week series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling up effective programs. If you’re joining us for the first time, check out the series intro and the posts on the first four drivers: Staffing, Communicating, Alliance-Building, and Lobbying.

In the original SCALERS model, the “Earnings Generation” driver focuses on creating additional revenue to support a particular enterprise. Public institutions’ revenue-generating activity, however, will be limited and not likely tied to the specific program or practice being scaled up. Even so, the expansion team still needs to consider the resources, and not just the financial resources, required to grow and sustain the program. We call this the “Resources” driver, and it helps focus the institution on securing and managing a program’s necessary staffing, space, technology, and other infrastructure.

Your team does have to think about funds: how do grants (local, state, and federal) influence the design of your program and the strategy you use to take it to scale? Make sure that funds for expansion are included in an approved budget. Have a sustainability plan to secure continued funding over the life of the program; consider a 2-3 year plan as well as a longer-term plan looking out 5-10 years. This attention to funding is especially important if the program was launched with time-limited monies. The institution will need to consider staffing here, too: the individual responsible for expansion must understand the hiring process and have the authority to make hiring decisions and authorize related expenditures.

If a program is expanding, its space and technology needs will likely expand as well. A scaling plan must include time to secure necessary office, training, and service accommodations. Depending on the nature of the program, the college may also need to acquire additional hardware, software, and telecommunications equipment. It is not just a matter of purchasing equipment and clearing out space; the college also should ensure that facilities and technical support are available. Clear communication is essential here as well: affected individuals must be apprised of any space or technology changes, and the organization should secure their commitment to support the expansion.

Developmental Education Initiative campuses have dealt with resource issues in a variety of ways. Cuyahoga Community College has stretched professional development dollars by bringing trainers to campus rather than sending individuals to offsite conferences and courses. This allows more faculty to take advantage of professional development opportunities. Zane State College addressed limited computer lab space by purchasing laptop computers to create mobile labs that can move from classroom to classroom.   

Abby Parcell is a Program Manager at MDC.