Wednesday, May 23, 2012

Patrick Henry Community College: Predicting Success

We’re indebted to Greg Hodges, dean of developmental education and transitional programs, and Kevin Shropshire, director of institutional research, at Patrick Henry Community College for their contributions to this post.

In 2009, at the beginning of its Developmental Education Initiative grant, Patrick Henry Community College planned two strategies for accelerating students' progress through developmental education: Fast Track math, which allowed students to complete two developmental math courses in one semester, and Accelerated Learning Program (ALP) courses, based on the model pioneered at the Community College of Baltimore County. PHCC knew that these accelerated strategies weren't a perfect fit for every student and wanted a way for advisors to assess which students would be most likely to succeed in them. The college created a computer-based predictor tool: an easy-to-use interface built on top of a database that calculated a student's likelihood of success from his or her responses to a series of questions asked during an advising visit.

Preliminary results with the tool were promising. Then the Virginia Community College System launched a system-wide redesign of developmental mathematics that essentially fast-tracked all students, requiring them to complete all developmental math in one year using a series of nine modules and a new diagnostic instrument. Here's the story of the early days of the advising tool and how PHCC plans revisions that will enable advisors to continue using it to place students in the appropriate type of acceleration.

Version One
PHCC first deployed the predictor tool in fall 2010. In the past, assessing advising at the college was difficult because different advisors focused on different characteristics when guiding students to specific courses; sorting out which student characteristics and situations affected success in a particular instructional method was nearly impossible. The advising tool, and the database on which it is built, provides consistency on some key variables and documents the decision-making of different advisors. The variables included:
  • expected absences
  • expected use of tutorial services
  • placement test scores, including math, reading, and writing
  • previous developmental math success
The database returned a statistical likelihood (high, medium, or low) of a student's success in an accelerated course. (The key characteristics and probabilities were based on a risk model built from prior students' behaviors and rates of success.) Here's a screenshot of the interface, created by the college's webmaster:

An advisor weighed this computer-driven recommendation along with his or her own professional judgment and the student's perspective to make a proactive decision about acceleration options in the best interest of the student.
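The post doesn't describe the model's internals, only its inputs and its high/medium/low output, so here is a minimal Python sketch of the general idea: advisor-entered answers feed an additive risk score that is bucketed into a likelihood. All variable names, weights, and thresholds below are illustrative assumptions, not PHCC's actual model (which was built in SAS from historical student data).

```python
# Hypothetical sketch only: weights and cutoffs are invented for illustration,
# not taken from PHCC's risk model.

def likelihood_of_success(expected_absences: int,
                          expects_tutoring: bool,
                          placement_score: int,
                          passed_prior_dev_math: bool) -> str:
    """Map advisor-entered answers to a 'high'/'medium'/'low' likelihood bucket."""
    score = 0
    # Fewer expected absences -> higher score (absences were the strongest driver).
    if expected_absences <= 2:
        score += 2
    elif expected_absences <= 5:
        score += 1
    # Planning to use tutorial services counts in the student's favor.
    if expects_tutoring:
        score += 1
    # A single placement score stands in for the math/reading/writing scores.
    if placement_score >= 75:
        score += 2
    elif placement_score >= 50:
        score += 1
    # Prior success in developmental math is a strong positive signal.
    if passed_prior_dev_math:
        score += 2
    # Bucket the additive score into the likelihood the advisor sees.
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

print(likelihood_of_success(1, True, 80, True))   # prints "high"
```

In the real tool, the interface collected these answers during the advising visit and the database did the lookup; the advisor then combined the result with professional judgment rather than following it mechanically.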

Preliminary data suggested promise, and the need for some refinements. In fall 2010, the pass rate for Fast Track students with a record in the database (meaning their advisors used the advising database) was 82 percent. The rate for students with no record in the database (those who enrolled on their own or whose advisors did not use the advising database) was 65 percent. This suggests that the advising system may have helped advisors identify students who were more likely to succeed in this accelerated course.

The input predictor model was 80-85 percent accurate, driven mainly by student absences. The predictor can also be applied to all students at the end of a semester, though some of the inputs, like math lab visits and expected absences, are student-driven. These estimates can then be correlated with student success measures such as course success and retention.
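Checking the predictor at the end of a semester amounts to comparing its likelihood buckets with actual outcomes. A small Python sketch, using made-up data (the post reports only the 80-85 percent headline figure, not the underlying records):

```python
# Hypothetical end-of-semester check: did students the tool rated favorably
# actually pass? The data below is invented for illustration.
predictions = ["high", "high", "low", "medium", "high", "low"]
passed      = [True,   True,   False, True,     False,  False]

# Treat "high"/"medium" as a predicted pass and "low" as a predicted non-pass,
# then count how often the prediction matched the outcome.
correct = sum((p != "low") == outcome
              for p, outcome in zip(predictions, passed))
accuracy = correct / len(predictions)
print(f"{accuracy:.0%}")   # prints "83%" for this made-up data
```

The same comparison can be run against retention or other success measures by swapping out the outcome list.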

Redesign the Curriculum – Redesign the Predictor
The folks at Patrick Henry weren't the only Virginians doing DEI work. In 2009-2010, the Virginia Community College System convened a statewide task force and redesigned the entire developmental mathematics curriculum. (You can read about the redesign here.) When the final curriculum launched at all VCCS colleges in January 2012, content that was once delivered in 3-4 credit developmental courses was captured in nine one-credit modules, with the aim that all developmental math coursework could be completed in one year. That meant acceleration became the norm, not just one of many options, at Patrick Henry. And it meant that a predictor that might recommend a traditional, non-accelerated course for a student was no longer the right tool for advisors.
However, PHCC was confident that there was still great value in giving advisors more information on which to base recommendations. Since VCCS let colleges decide on the delivery mechanisms for the new developmental math modules, PHCC adopted three delivery options for the new Virginia math: face-to-face instruction, computer-based instruction, and ALP (combining developmental with on-level coursework). This summer and fall they will be hard at work creating and implementing a revised predictor that will help advisors counsel students about their best instructional option. They have already identified some key characteristics that affect student success in the different course models.

We’re looking forward to seeing how the next iteration of the predictor shapes up. We’ll be sure to update our blog readers. If you’re interested in more specifics about how the college built their model, please contact Greg Hodges at ghodges@patrickhenry.edu or Kevin Shropshire at kshropshire@patrickhenry.edu. (Achieving the Dream colleges: they’re even willing to share SAS code that’s already set up for ATD data!)
