
Wednesday, May 23, 2012

Patrick Henry Community College: Predicting Success

We’re indebted to Greg Hodges, dean of developmental education and transitional programs, and Kevin Shropshire, director of institutional research, at Patrick Henry Community College for their contributions to this post.

In 2009, at the beginning of their Developmental Education Initiative grant, Patrick Henry Community College planned two strategies for accelerating students' progress through developmental education: Fast Track math, which allowed students to complete two developmental math courses in one semester, and Accelerated Learning Program (ALP) courses, based on the model pioneered at the Community College of Baltimore County. PHCC knew these accelerated strategies weren't a perfect fit for every student and wanted a way for advisors to assess which students would be most likely to succeed in them. So the college created a computer-based predictor tool: an easy-to-use interface built on top of a database that calculated a student's likelihood of success from his or her responses to a series of questions asked during an advising visit.

Preliminary results with the tool were promising. Then the Virginia Community College System launched a system-wide redesign of developmental mathematics that essentially fast-tracked all students, requiring them to complete all developmental math in one year through a series of nine modules and a new diagnostic instrument. Here's the story of the early days of the advising tool and of the revisions PHCC is planning so advisors can continue using it to place students in the appropriate type of acceleration.

Version One
PHCC first deployed the predictor tool in fall 2010. Assessing advising at the college had been difficult because different advisors focused on different characteristics when guiding students to specific courses, so sorting out which student characteristics and situations affected success in a particular instructional method was nearly impossible. The advising tool, and the database on which it is built, brings consistency to some key variables and documents the decision-making of different advisors. The variables included:
  • expected absences
  • expected use of tutorial services
  • placement test scores, including math, reading, and writing
  • previous developmental math success.
The database returned a statistical likelihood (high, medium, or low) of a student's success in an accelerated course. (The key characteristics and probabilities were based on a risk model of prior students' behaviors and success rates.) Here's a screenshot of the interface, created by the college's webmaster:

An advisor weighed this computer-driven recommendation along with his or her own professional judgment and the student's perspective to make a proactive decision about acceleration options in the student's best interest.
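The post doesn't publish the model itself, but the mechanics described above (advisor-entered responses scored against a risk model fit to prior students' outcomes, then bucketed high/medium/low) can be sketched. The weights, intercept, and band cutoffs below are invented for illustration only; PHCC's actual model was built from its own student data.

```python
import math

# Hypothetical weights -- the post does not publish the actual model,
# which was fit to prior PHCC students' behaviors and outcomes.
WEIGHTS = {
    "expected_absences": -0.45,   # more absences -> lower odds of success
    "expects_tutoring": 0.30,     # planned use of tutorial services
    "placement_score": 0.04,      # placement test performance
    "prior_dev_math_pass": 0.80,  # previously passed a dev math course
}
INTERCEPT = -1.0

def success_probability(student: dict) -> float:
    """Logistic-style score from advisor-entered responses."""
    z = INTERCEPT + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def likelihood_band(p: float) -> str:
    """Map a probability to the high/medium/low band shown to advisors.
    Cutoffs are invented for this sketch."""
    if p >= 0.70:
        return "high"
    if p >= 0.40:
        return "medium"
    return "low"

student = {
    "expected_absences": 2,
    "expects_tutoring": 1,
    "placement_score": 55,
    "prior_dev_math_pass": 1,
}
p = success_probability(student)
print(likelihood_band(p), round(p, 2))  # prints: high 0.8
```

The point of the design, as the post describes it, is that the band is a conversation starter for the advising visit, not an automatic placement.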

Preliminary data suggested promise, along with the need for some refinements. In fall 2010, the pass rate for Fast Track students with a record in the database (meaning their advisors used the advising tool) was 82 percent. The rate for students with no record in the database (those who enrolled on their own or whose advisors did not use the tool) was 65 percent. This suggests that the advising system may have helped advisors identify students who were more likely to succeed in this accelerated course.
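The post doesn't report the cohort sizes behind those pass rates, so the 82-versus-65-percent gap can't be formally tested here. As an illustration only, with hypothetical groups of 100 students each, a two-proportion z-test would look like this:

```python
import math

# Hypothetical cohort sizes -- the post reports only the pass rates
# (82% with an advising record vs. 65% without), not the n's.
n_advised, rate_advised = 100, 0.82
n_other, rate_other = 100, 0.65

# Pooled proportion and standard error for a two-proportion z-test.
x1, x2 = rate_advised * n_advised, rate_other * n_other
p_pool = (x1 + x2) / (n_advised + n_other)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_advised + 1 / n_other))
z = (rate_advised - rate_other) / se
print(round(z, 2))  # prints: 2.72
```

At those invented sample sizes the z statistic would exceed the conventional 1.96 cutoff, but the real conclusion depends entirely on the actual counts, which only the college has.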

The predictor model was 80-85 percent accurate, with student absences the strongest driver. The predictor can also be applied to all students at the end of a semester, though it must be acknowledged that some of its inputs, like math lab visits and expected absences, are student-driven. Those end-of-semester estimates can then be correlated with student success measures such as course completion and retention.

Redesign the Curriculum – Redesign the Predictor
The folks at Patrick Henry weren't the only Virginians doing DEI work. In 2009-2010, the Virginia Community College System convened a statewide task force and redesigned the entire developmental mathematics curriculum. (You can read about the redesign here.) When the final curriculum launched at all VCCS colleges in January 2012, content once delivered in 3-4 credit developmental courses was captured in nine one-credit modules, with the aim that all developmental math coursework could be completed in one year. That meant acceleration became the norm at Patrick Henry, not just one option among many. And that meant a predictor that might recommend a traditional, non-accelerated course for a student was no longer the right tool for advisors.
However, PHCC was confident there was still great value in giving advisors more information on which to base recommendations. Since VCCS let colleges decide on the delivery mechanisms for the new developmental math modules, PHCC offered three delivery options: face-to-face instruction, computer-based instruction, and ALP (combining developmental with college-level coursework). This summer and fall they will be hard at work creating and implementing a revised predictor that will help advisors counsel students about their best instructional option. They have already identified some key characteristics that affect student success in the different course models.

We’re looking forward to seeing how the next iteration of the predictor shapes up. We’ll be sure to update our blog readers. If you’re interested in more specifics about how the college built their model, please contact Greg Hodges at ghodges@patrickhenry.edu or Kevin Shropshire at kshropshire@patrickhenry.edu. (Achieving the Dream colleges: they’re even willing to share SAS code that’s already set up for ATD data!)

Wednesday, April 18, 2012

Replicating Impact

Today, we're returning to our "SCALERS: Round 2" series. Originally created by Paul Bloom, the SCALERS model identifies seven organizational capacities that support the successful scaling of a social enterprise: Staffing, Communicating, Alliance-building, Lobbying, Earnings Generation, Replicating Impact, and Stimulating Market Forces. (You can read an introduction to each driver in our first SCALERS series.) Now, we're asking DEI colleges how particular SCALERS drivers have contributed to their scaling efforts. We've looked at staffing, communicating, alliance-building, lobbying, and earnings generation. In the post below, Becky Ament, associate dean for developmental education at Zane State College, discusses how Zane State went about replicating the impact of its intrusive advising approach for developmental education students.

When Zane State College joined Achieving the Dream in 2005, initial data analysis suggested that interventions with two groups of students had the greatest potential to improve year-to-year retention rates:
  • Students who tested two levels below college-level math but failed to complete at least one developmental class during their first year in college; nearly 100 percent of these students were not retained from one fall to the next.
  • Students most at risk for dropping out as measured by the Noel-Levitz College Student Inventory (CSI); students scoring highest (7, 8, or 9) in “dropout proneness” on the CSI were significantly less likely to be retained from one fall to the next.
The resulting interventions were:
  • Developmental math advising: Developmental student outcome data showed strong course retention and completion rates, as well as strong success rates in targeted gatekeeper courses. Confident that the curriculum was well aligned and meeting students' needs, new intervention strategies focused on intrusive advising. Academic advisors developed an unmet-prerequisite intervention process that monitors students' participation in developmental mathematics from enrollment through completion of a college-level math course. Any student who stopped attending or dropped a developmental mathematics course was targeted for intervention advising and required to continue the appropriate sequence of developmental and college-level math courses. In two years this intervention produced a 5 percent increase in students successfully completing developmental math courses within their first year; by 2008, the increase had grown to 10 percent.
  • Advising for students most at risk: This program provides personal contact and individual support to students scoring in the high-risk range for "dropout proneness." Advisors interpret CSI results with students immediately upon completion, during the placement test session, and discuss support options for counseling, tutoring, and communicating with a contact person who cares about the student's entire experience and success. Student Success Center personnel maintain contact within the first three weeks of the student's first term and then at least quarterly throughout the first year, assisting students with support plans as needed. Analysis of the 2007 cohort showed a 16 percent increase in fall-to-fall retention of the high-risk group as a result of this intervention.
Coupled with both of these approaches are early alert referrals from faculty to initiate intervention advising during the course of a term.

The Developmental Education Initiative afforded Zane State the opportunity to build on these initial successes and scale the intrusive advising program. The comprehensive intervention program touches all developmental students in some way, from the initial group placement test and CSI interpretation sessions to the intrusive interventions. Despite significant enrollment growth, the CSI case management style intervention has been maintained by employing three part-time paraprofessional advisors in the Success Center to make the personal student contacts and refer students to professional support services as needed. Their work frees time for the professional advisors to focus on the other interventions. To ensure quality service delivery, advisors and paraprofessional advisors participated in orientation sessions with the director of the Student Success Center. The academic advisors in the Success Center who had been working with the various interventions then trained the new advisors. Additionally, the new academic advisor attended a National Academic Advising Association event for further professional development. The unmet prerequisite intervention for math has been expanded to developmental reading and English with the addition of another academic advisor.

Collectively, these initiatives are contributing to the goal of improved first-year fall-to-fall retention: data show that students who began one fall and returned the following fall had a three-year graduation rate of 87 percent.