Showing posts with label Lobbying. Show all posts

Wednesday, February 15, 2012

This can save you money.

Today’s post is our fourth installment of “SCALERS: Round 2.” Originally created by Paul Bloom at the Duke University Fuqua School of Business’s Center for the Advancement of Social Entrepreneurship, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: staffing, communicating, alliance-building, lobbying, earnings generation, replicating impact, and stimulating market forces. (You can read an introduction to each driver in our first SCALERS series.)

Now, we’re asking DEI colleges how particular SCALERS drivers have contributed to their scaling efforts. So far, we’ve covered staffing, communicating, and alliance-building. Below, Ginger Miller of Guilford Technical Community College shares what GTCC has learned about lobbying, or demonstrating impact, as we like to call it.



With a headcount enrollment of about 15,100 students, Guilford Technical Community College (GTCC) is the third largest community college in North Carolina. Its developmental education program serves 4,780 students on three campuses. As with any developmental education program, the primary goal at GTCC is for students to take the fewest developmental education courses necessary and to complete them as quickly as possible. Our focus here is to describe how COMPASS testing supports this goal. Reviewing for and re-taking the COMPASS test can save students time and money, and that is the message we emphasize to incoming students. For example, if a student places out of two developmental English classes, that translates into tuition savings for the credit hours as well as two semesters—a full academic year—of their time.

As part of the application process, students complete the COMPASS placement test. Among the biggest obstacles to proper placement in developmental education classes is student misconception about the test itself. Students may mistake COMPASS for an entrance test rather than a placement test; knowing they have already been accepted by a community college, they may not try their best to score well. To address this, we emphasize during registration the importance of taking the review workshop and re-testing, depending on their score. Students must complete a review workshop before they are permitted to re-test. The workshop, available online or face-to-face, reviews the question format and the content for the math, English, and reading sections of the test.

As a result of completing a review workshop followed by a re-test, 1,288 students tested out of one or more developmental classes from fall 2010 through fall 2011. This represents a total estimated tuition savings of about $370,600 over those three semesters. The largest percentage of students testing out of developmental coursework appears in English and reading. Across the three semesters from fall 2010 through fall 2011, an average of about 61 percent placed out of at least one developmental English course, and about 59 percent placed out of reading courses. In math the results are lower: about 33, 39, and 29 percent for fall 2010, spring 2011, and fall 2011, respectively.
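The savings figures above imply a rough per-student average. A minimal sketch of that back-of-the-envelope arithmetic, using only the totals reported in the post (the helper function name is ours, not GTCC's):

```python
# Sketch: average tuition saved per student who tested out of at least one
# developmental course after the review workshop and re-test. The totals
# (1,288 students, ~$370,600) come from the post; everything else is
# illustrative.

def estimated_savings_per_student(students_placed_out, total_savings):
    """Average dollars saved per student who placed out of coursework."""
    return total_savings / students_placed_out

avg = estimated_savings_per_student(1288, 370600)
print(f"Average savings per student: ${avg:,.2f}")  # roughly $288
```

Per-student savings would of course vary with how many courses and credit hours each student placed out of; this only recovers the average implied by the reported totals.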

At GTCC, the number of students using these review workshops has increased from 981 in fall 2010 to 1,241 in fall 2011. The greatest growth has been in the online reviews rather than the face-to-face workshops: in fall 2011, 73 students attended face-to-face reviews, compared to 1,168 who worked online. We also continue to reach out to local high schools, explaining the importance of the placement test; we have arranged for graduating seniors to take the test, complete the review workshop, and then re-test.

Wednesday, May 25, 2011

SCALERS Series: L is for Lobbying

Welcome to the fourth week of our seven-week series exploring the seven drivers of the SCALERS model, a framework of organizational capacities that are essential for successfully scaling up effective programs. If you’re joining us for the first time, check out the series intro and the posts on the first three drivers: Staffing, Communicating, and Alliance-Building.

Since “lobbying” has very specific—and sometimes negative—connotations for some people, we like to call this driver “demonstrating impact.” To secure and sustain support for an expansion plan, you’ve got to show institutional, state, and federal decision makers that expanding (and/or continuing) a particular practice or program will have substantial benefits relative to costs. These same arguments must be made to the individuals delivering the program as well as to program participants. Scaling up a program or practice that has been successful on a small scale may require some disruption of organizational culture; this intensifies the imperative to clearly demonstrate how such change will advance institutional priorities—or why those institutional priorities need to change.

No matter what program you’re expanding, you should start by articulating the rationale for expansion and the connection to the college’s larger strategic plan. Then, consider what data you need to show how effective the strategy is at meeting the specified goal for the specified target population. Since it’s Equity Week at Accelerating Achievement, we encourage you to analyze data disaggregated by race, income, and other demographic factors and identify achievement gaps among student populations. If closing these gaps is an institutional priority for your college and one of the desired outcomes of your program, then it is essential that you analyze the evidence for how effectively the program accomplishes this goal. You should also ensure that your organization has the institutional research capacity to collect, measure, and communicate all of these data elements.
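The disaggregation step described above can be sketched concretely. The records and field names below are hypothetical; a real analysis would draw on the college's institutional research data, but the mechanics (group by a demographic field, compute a completion rate per group, measure the gap) are the same:

```python
# Sketch: disaggregating a completion metric by a demographic field to
# surface achievement gaps. All records and field names are illustrative.
from collections import defaultdict

records = [
    {"group": "A", "completed": True},
    {"group": "A", "completed": False},
    {"group": "B", "completed": True},
    {"group": "B", "completed": True},
]

totals = defaultdict(lambda: [0, 0])  # group -> [completed count, total count]
for r in records:
    totals[r["group"]][1] += 1
    if r["completed"]:
        totals[r["group"]][0] += 1

rates = {g: done / n for g, (done, n) in totals.items()}
# The spread between the highest and lowest rate flags an equity gap
# worth investigating and reporting.
gap = max(rates.values()) - min(rates.values())
```

The same pattern extends to any disaggregation field (race, income, Pell status) and any outcome metric (course completion, persistence, placement).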

Collecting and analyzing data only serves this driver if you get to the demonstrating step. Make a plan to share information about program outcomes—within the organization, within the broader community, and with individuals who are in positions to influence program continuation, innovation, and further expansion. Your team should include individuals who can connect to state and federal policy decision makers; these individuals must have access to up-to-date information about program outcomes. Consider ways that those delivering program services and those participating can inform policy decisions through advocacy and information sharing. All these relationships and practices require that the organization consider other SCALERS drivers, in particular Communicating, Alliance-Building, and Sustaining Engagement.

For an example of the power of data, we refer you back to one of this blog’s first posts from Michael Collins, program director at Jobs for the Future. JFF developed the DEI State Policy Strategy, a state-level developmental education improvement strategy, with three action priorities:

A data-driven improvement process that ensures the right conditions for innovation.
A state-level innovation investment strategy that helps states align and coordinate support from multiple sources to provide incentives for the development, testing, and scaling up of effective models for helping underprepared students succeed.
Policy supports that provide a foundation for improved outcomes for underprepared students, facilitate the implementation of effective and promising models, and encourage the spread of successful practices.

By focusing on data-driven planning, resource coordination, and policy that supports effective practice, JFF’s strategy provides a framework for demonstrating impact at the college, system, and state level.

Abby Parcell is a Program Manager at MDC.