Friday, May 25, 2012
You Can't Handle the Links
- College strategic plans for increasing student success are by nature long-term efforts, so concrete measures of progress often take years to appear. What do you do when your work to “move the needle” is slow going, or when initiative fatigue sets in at your institution? According to Inside Higher Ed, Monroe Community College has “started a series of modest but tangible 100-day projects to improve the college.” These projects are intended as small steps toward larger goals, but they also foster broad engagement and keep motivation high. Their first project is to “streamline the application and enrollment process so that prospective students have to create one password instead of three.” As blogger Dean Dad points out, this idea requires widespread institutional buy-in: if people don’t take it seriously, it won’t work. Wondering how to get that buy-in? Nick Bekas of Valencia College offered his advice on building alliances in an Accelerating Achievement post earlier this year.
- A substantial portion of our nation’s workforce is unemployed or underemployed, but many companies can’t find the workers they need to fill high-skill jobs. Why are we struggling to train workers for existing positions when so many are in need of work? Maureen Conway of the Aspen Institute says that workforce training and education programs don’t do enough to address the real-world challenges adult students face. She sees a need for increased funding and budgetary flexibility for integrated student support services. Check out Colin Austin’s guest post on an approach that weaves together education and training, income supports, and financial services. Another reason that we can’t fill those open positions? It isn’t readily apparent to students or colleges what employers are looking for. Jobs for the Future’s Credentials that Work initiative uses real-time labor market information to help students choose credentials that will get them jobs, and to help institutions craft programs with local labor market value.
- Last week in The Chronicle of Higher Education, developmental English professor Brian Hall of Cuyahoga Community College (a DEI institution) shared some insight into “What Really Matters to Working Students.” Frustrated by students’ seemingly constant absences and inattention, Hall asked one of his developmental English classes to explain why so few students are successful. The biggest reason his students gave: the difficulty of balancing academics with life. Between work schedules and family responsibilities, many students feel that their motivation to do well in class is eclipsed by unforeseen hurdles. While developmental educators can’t eliminate these hurdles (see the previous bullet for what colleges can do), Hall and his students recommend ways professors can keep students on track, and they caution against behavior that could knock students off course permanently. They propose that professors make expectations and rules clear from the start, treat students with the same respect they expect in return, make class work relevant and engaging, and show students that it’s OK to make mistakes as long as they learn from them.
- The College Board has released a new “web-based tool that provides quick and easy access to national, state and initiative-level data that describe the progress and success of community college students.” The Completion Arch establishes indicators in five areas: enrollment, developmental education placement, progress, transfer and completion, and workforce preparation and employment outcomes. You can filter the indicators by data source, state, and student characteristics. The site is easy to navigate, so check it out for yourself!
- The Hartford Courant ran an op-ed from the Community College Research Center at Columbia University last week on Connecticut’s recent developmental education legislation. Tom Bailey, Katherine Hughes, and Shanna Smith Jaggars expressed concern over the potential negative impact the legislation could have on students who need significant skill development before they are ready for college-level coursework. They also noted their concerns about buy-in from college faculty and staff: “A policy that gives community college practitioners flexibility and support to try out new models — and that includes accountability measures to accelerate real change — would make them far more likely to embrace reforms on an institutional and state level.”
Wednesday, May 23, 2012
Patrick Henry Community College: Predicting Success
We’re indebted to Greg Hodges, dean of developmental education and transitional programs, and Kevin Shropshire, director of institutional research, at Patrick Henry Community College for their contributions to this post.
In 2009, at the beginning of its Developmental Education Initiative grant, Patrick Henry Community College planned two strategies for accelerating students’ progress through developmental education: Fast Track math, which allowed students to complete two developmental math courses in one semester, and Accelerated Learning Program (ALP) courses, based on the model pioneered at the Community College of Baltimore County. PHCC knew these accelerated strategies weren’t a perfect fit for every student and wanted a way for advisors to assess which students would be most likely to succeed in them. The college created a computer-based predictor tool with an easy-to-use interface, built on top of a database, that calculated a student’s likelihood of success from his or her responses to a series of questions asked during an advising visit. Preliminary results with the tool were promising. And then the Virginia Community College System launched a system-wide redesign of developmental mathematics that essentially fast-tracked all students, with a requirement to complete all developmental math in one year, using a series of nine modules and a new diagnostic instrument. Here’s the story of the early days of the advising tool and how PHCC plans to revise it so advisors can continue using it to place students in the appropriate type of acceleration.
Version One
PHCC first deployed the predictor tool in fall 2010. In the past, assessing advising at the college was difficult because different advisors focused on different characteristics when guiding students to specific courses; sorting out which student characteristics and situations affected success in a particular instructional method was nearly impossible. The advising tool—and the database on which it is built—ensures consistency across some key variables and documents the decision-making of different advisors. Variables included:
- expected absences
- expected use of tutorial services
- placement test scores, including math, reading, and writing
- previous developmental math success
An advisor considered this computer-driven recommendation, along with his or her own professional judgment and the student’s perspective, to make a proactive decision about the acceleration options in the student’s best interest.
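The post describes the tool only at a high level, so the sketch below is purely illustrative: a minimal Python rendering of how advising responses like those listed above might be turned into a likelihood-of-success score and a recommendation. The variable names, model form, coefficients, and threshold are all assumptions, not PHCC's actual model.

```python
# Hypothetical sketch only: PHCC's real model, weights, and interface are not
# published in this post. Variable names follow the advising list above.
from dataclasses import dataclass
from math import exp

@dataclass
class AdvisingResponses:
    expected_absences: int         # student's own estimate for the semester
    expected_tutoring_visits: int  # planned use of tutorial services
    placement_math: float          # placement test scores, scaled 0-100
    placement_reading: float
    placement_writing: float
    passed_prior_dev_math: bool    # previous developmental math success

def success_probability(r: AdvisingResponses) -> float:
    """Logistic-style score; higher means more likely to succeed in an
    accelerated course. All coefficients are placeholders."""
    z = (
        -0.5
        - 0.40 * r.expected_absences        # absences drove the real model
        + 0.15 * r.expected_tutoring_visits
        + 0.02 * r.placement_math
        + 0.01 * r.placement_reading
        + 0.01 * r.placement_writing
        + (0.60 if r.passed_prior_dev_math else 0.0)
    )
    return 1.0 / (1.0 + exp(-z))

def recommend(r: AdvisingResponses, threshold: float = 0.7) -> str:
    """Turn the score into the kind of recommendation an advisor would weigh
    alongside professional judgment and the student's perspective."""
    p = success_probability(r)
    if p >= threshold:
        return f"Fast Track recommended (estimated success {p:.0%})"
    return f"Consider a slower option (estimated success {p:.0%})"
```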
Preliminary data suggested promise—and the need for some refinements. In fall 2010, the pass rate for Fast Track students with a record in the database (meaning their advisors used the advising database) was 82 percent. The rate for students with no record in the database (those who enrolled on their own or whose advisors did not use it) was 65 percent. This suggests the possibility that the advising system helped advisors identify students who were more likely to succeed in this accelerated course.
The predictor model was 80 to 85 percent accurate, with expected student absences the strongest driver. The predictor can also be applied to all students at the end of a semester, though it must be acknowledged that some of the inputs—like math lab visits and expected absences—are student-reported. These estimates can then be correlated with student success measures such as course success and retention.
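For readers who want to sanity-check a gap like 82 versus 65 percent, here is a minimal sketch of a standard two-proportion z test. Only the pass rates come from the post; the cohort sizes are invented, since the post does not report them.

```python
# Back-of-the-envelope check of a pass-rate gap like the one reported above.
# The 82% and 65% rates come from the post; the cohort sizes are invented.
from math import sqrt

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """Two-proportion z statistic: roughly, how many standard errors apart
    the two pass rates are, given the sample sizes."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical cohorts of 100 students each:
z = two_proportion_z(0.82, 100, 0.65, 100)
print(f"z = {z:.2f}")  # about 2.7 here; |z| > 1.96 suggests more than noise
```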
Redesign the Curriculum – Redesign the Predictor
The folks at Patrick Henry weren’t the only Virginians doing DEI work. In 2009-2010, the Virginia Community College System convened a statewide task force and redesigned the entire developmental mathematics curriculum. (You can read about the redesign here.) When the final curriculum launched at all VCCS colleges in January 2012, content once delivered in three- and four-credit developmental courses was captured in nine one-credit modules, with the aim that all developmental math coursework could be completed in one year. That meant acceleration became the norm—not just one of many options—at Patrick Henry. And it meant that a predictor that might recommend a traditional, non-accelerated course was no longer the right tool for advisors.
However, PHCC was confident there was still great value in giving advisors more information on which to base recommendations. Since VCCS allowed colleges to decide on the delivery mechanisms for the new developmental math modules, PHCC adopted three delivery options for the new Virginia math: face-to-face instruction, computer-based instruction, and ALP (combining developmental with college-level coursework). This summer and fall, the college will be hard at work creating and implementing a revised predictor to help advisors counsel students on their best instructional option. They have already identified some key characteristics that affect student success in the different course models.
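Since the revised predictor is still being built, any code can only be speculative. As a sketch of the shape it might take, the fragment below scores a student under each of the three delivery options named above and ranks them for an advisor; the per-option scoring functions are placeholder assumptions, not PHCC's models.

```python
# Speculative sketch: rank the three delivery options the post names for a
# given student. The per-option scoring functions are placeholders; a real
# version would be fit to PHCC outcome data for each delivery mode.
from typing import Callable, Dict, List, Tuple

Student = Dict[str, float]  # advising responses keyed by variable name

def rank_options(
    student: Student,
    models: Dict[str, Callable[[Student], float]],
) -> List[Tuple[str, float]]:
    """Return delivery options sorted from most to least promising."""
    scored = [(name, model(student)) for name, model in models.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Placeholder scoring functions, one per delivery mode named in the post:
models = {
    "face-to-face": lambda s: 0.60 + 0.02 * s.get("tutoring_visits", 0.0),
    "computer-based": lambda s: 0.50 + 0.30 * s.get("placement_math", 0.0) / 100,
    "ALP": lambda s: 0.55 + 0.10 * s.get("passed_prior_dev_math", 0.0),
}

student = {"tutoring_visits": 3, "placement_math": 85, "passed_prior_dev_math": 1}
for name, score in rank_options(student, models):
    print(f"{name}: {score:.2f}")
```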
We’re looking forward to seeing how the next iteration of the predictor shapes up. We’ll be sure to update our blog readers. If you’re interested in more specifics about how the college built their model, please contact Greg Hodges at ghodges@patrickhenry.edu or Kevin Shropshire at kshropshire@patrickhenry.edu. (Achieving the Dream colleges: they’re even willing to share SAS code that’s already set up for ATD data!)
Wednesday, May 16, 2012
The Blog is Back in Town
Accelerating Achievement has been a bit quiet for a couple of weeks. The elves were called to other areas of the workshop, but they’re back in business now. Good thing our colleagues across the dev ed field have kept cranking out all sorts of stuff that’s worth reading:
- This month, Getting Past Go, a developmental education state policy project of the Education Commission of the States, released a brief titled “Using State Policies to Ensure Effective Assessment and Placement in Remedial Education.” Mary Fulton, policy analyst at ECS, gives an excellent explanation of community college assessment and placement procedures and then walks through recent research evaluating the effectiveness (or ineffectiveness) of current instruments and efforts to test their validity. She then gives examples of states enacting policies to improve assessment by implementing multiple measures and clarifying the student intake process. The piece concludes with recommended assessment and placement policies for both states and postsecondary systems—and a chart that shows which policies are currently in place in your state!
- The California Acceleration Project (CAP) just released its May newsletter, Acceleration News. You can read about progress among the CAP Community of Practice—how they’re scaling up and who’s joining the ranks. There are also details on how California community colleges are shrinking developmental course sequences, negotiating articulation agreements with four-year schools, and publishing new data on student performance in accelerated English and math courses. And a whole lot more—so check it out.
Looking for some professional development opportunities? Want to meet more folks who are committed to improving outcomes for community college students? Here are two upcoming events you should consider:
- The Community College of Baltimore County is hosting the Fourth Annual Conference on Acceleration in Baltimore, MD, June 7-8, with pre-conference workshops on writing, reading, and math on June 6. You can check out the conference schedule here and register here.
- The National Center for Postsecondary Research 2012 conference, “Strengthening Developmental Education: What Have We Learned and What’s Next,” is scheduled for June 21-22, 2012 at Columbia University in New York. See the agenda here and register here.