
Wednesday, May 23, 2012

Patrick Henry Community College: Predicting Success

We’re indebted to Greg Hodges, dean of developmental education and transitional programs, and Kevin Shropshire, director of institutional research, at Patrick Henry Community College for their contributions to this post.

In 2009, at the beginning of their Developmental Education Initiative grant, Patrick Henry Community College planned two different strategies for accelerating students’ progress through developmental education: Fast Track math, which allowed students to complete two dev math courses in one semester, and Accelerated Learning Program (ALP) courses, based on the model pioneered at the Community College of Baltimore County. PHCC knew that these accelerated strategies weren’t a perfect fit for every student and wanted a way for advisors to assess which students would be most likely to succeed in these courses. They created a computer-based predictor tool with an easy-to-use interface, built on top of a database that calculated a student’s likelihood of success from his/her responses to a series of questions asked during an advising visit. Preliminary results with the tool were promising.

And then the Virginia Community College System launched a system-wide redesign of developmental mathematics that essentially fast-tracked all students, requiring them to complete all dev math in one year using a series of nine modules and a new diagnostic instrument. Here’s the story of the early days of the advising tool and how PHCC plans revisions that will enable them to continue using it to place students in the appropriate type of acceleration.

Version One
PHCC first deployed the predictor tool in fall 2010. In the past, assessing advising at the college was difficult because different advisors focused on different characteristics to guide students to specific courses; sorting out which student characteristics and situations would affect success in a particular instructional method was nearly impossible. The advising tool, and the database on which it is built, provides consistency across some key variables and documents the decision-making of different advisors. The variables included:
  • expected absences
  • expected use of tutorial services
  • placement test scores, including math, reading, and writing
  • previous developmental math success
The database returned a statistical likelihood (high, medium, low) of a student’s success in an accelerated course. (The key characteristics and probabilities were based on a risk model of prior students’ behaviors and rates of success.) Here’s a screen shot of the interface created by the college's webmaster:

An advisor considered this computer-driven recommendation, along with his/her own professional judgment and the student’s perspective, to make a proactive decision about acceleration possibilities in the best interest of the student. 

Preliminary data suggested promise, and the need for some refinements. In fall 2010, the pass rate for Fast Track students with a record in the database (meaning their advisors used the advising database) was 82 percent. The rate for students with no record in the database (those who enrolled on their own or whose advisors did not use it) was 65 percent. This suggests the advising system may have helped advisors identify students who were more likely to be successful in this accelerated course.

The input predictor model was 80-85 percent accurate, driven mainly by student absences. The predictor can also be applied to all students at the end of a semester, though it must be acknowledged that some of the inputs, like math lab visits and expected absences, are student-driven. These estimates can then be correlated with student success measures such as course success and retention.
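The post doesn’t publish PHCC’s actual model (it was built in SAS from prior students’ records), but the behavior described above, a handful of advising inputs mapped to a high/medium/low likelihood, can be sketched as a simple logistic model. Every coefficient, threshold, and variable name below is invented for illustration only:

```python
import math

# Hypothetical coefficients for illustration only -- the real model was
# estimated from prior PHCC students' behaviors and success rates.
COEFFS = {
    "intercept": 1.5,
    "expected_absences": -0.45,    # more expected absences -> lower odds
    "expects_tutoring": 0.30,      # plans to use tutorial services (0/1)
    "placement_score": 0.02,       # combined placement test score
    "prior_dev_math_pass": 0.80,   # passed a previous dev math course (0/1)
}

def success_probability(student):
    """Logistic model mapping advising answers to a probability of success."""
    z = COEFFS["intercept"] + sum(
        coef * student.get(name, 0)
        for name, coef in COEFFS.items() if name != "intercept"
    )
    return 1.0 / (1.0 + math.exp(-z))

def likelihood_band(p):
    """Bucket the probability into the high/medium/low bands advisors saw."""
    if p >= 0.75:
        return "high"
    if p >= 0.50:
        return "medium"
    return "low"

student = {
    "expected_absences": 2,
    "expects_tutoring": 1,
    "placement_score": 60,
    "prior_dev_math_pass": 1,
}
print(likelihood_band(success_probability(student)))  # high
```

The banding step matters for advising: advisors saw a simple high/medium/low label rather than a raw probability, which kept the tool a conversation aid rather than a gatekeeper.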

Redesign the Curriculum – Redesign the Predictor
The folks at Patrick Henry weren’t the only Virginians doing DEI work. In 2009-2010, the Virginia Community College System convened a statewide task force and redesigned the entire developmental mathematics curriculum. (You can read about the redesign here.) When the final curriculum launched at all VCCS colleges in January 2012, content once delivered in 3-4 credit developmental courses was captured in nine one-credit modules, with the aim that all developmental math coursework could be completed in one year. That meant acceleration became the norm, not just one of many options, at Patrick Henry. And that meant a predictor that might recommend a traditional, non-accelerated course was no longer the right tool for advisors.

However, PHCC was confident there was still great value in advisors having more information on which to base recommendations. Since VCCS allowed colleges to decide on the delivery mechanisms for the new developmental math modules, PHCC undertook three delivery options for the new VA math: face-to-face instruction, computer-based instruction, and ALP (combining developmental with on-level coursework). This summer and fall they will be hard at work creating and implementing a revised predictor that will help advisors counsel students about their best instructional option. They have already identified some key characteristics that affect student success in the different course models.

We’re looking forward to seeing how the next iteration of the predictor shapes up. We’ll be sure to update our blog readers. If you’re interested in more specifics about how the college built their model, please contact Greg Hodges at ghodges@patrickhenry.edu or Kevin Shropshire at kshropshire@patrickhenry.edu. (Achieving the Dream colleges: they’re even willing to share SAS code that’s already set up for ATD data!)

Wednesday, April 25, 2012

Crowdsourcing the Placement Test Dilemma

On Monday, the Inside Higher Ed blog Confessions of a Community College Dean was all about how you know a student needs remediation. Blogger Dean Dad gives a succinct overview of the often frustrating process: cutoff scores, preparing (or not preparing) students for a high-stakes test, and mandated tests that have little predictive value for a student’s performance. And then there’s “thousands of new students showing up in a compressed timeframe, ranging in age from fresh out of high school to retirement, and you need to place them all quickly.” He includes a few possible responses—using high school GPA and other diagnostics, embedded remediation, or the “let them fail” approach.

He then tosses the ball to his readers, requesting examples of efficient methods for placing a lot of students in the right place in a relatively short time. While it’s often dangerous to read comments on blog posts, there are some interesting suggestions in the mix—from software solutions to diagnostic tests that are directly tied to instructional modules. We’ve covered some of these approaches happening in DEI colleges and states here on Accelerating Achievement:
What’s working on your campus? Obviously, there are a lot of inquiring minds that want to know!

Thursday, March 29, 2012

For the Reading List

It’s a newsy week on Accelerating Achievement. Today, we’re highlighting recent releases from the Community College Research Center. And while we’ve not figured out how to provide a link that downloads directly to your brain, we hope that you’ll be able to find time to read the pieces that are relevant to your work.


Wednesday, October 19, 2011

Linky, Linky!

  • Jay Matthews in the WaPo returns to community college placement and remediation with a follow-up to his commentary on Sarah Headden’s call for an entirely new approach.
  • We know some of you were on the debate team. Relive your glory days and hone your arguments with this NPR coverage of a debate about whether too many high school grads go to college.
  • Get Ready! A new MDRC study, Getting Ready for College, looks at the early impacts of developmental summer bridge programs in Texas. The final report on these programs won’t be released until next year, but preliminary results show promise.
  • Joanne Jacobs links to a John Locke Foundation study which shows that “as North Carolina’s high school graduation rate rose by 2.3 percent from 2006 to 2009, the community college remediation rate increased by 7 percent.” The report goes on to call out “low academic standards and expectations” as “one of a number of factors that provide marginal students an easier path to graduation.” We’d like to point out that the increase in remediation is also due to an influx of workers dislocated by the recession who haven’t been in school for years. The issue is more about alignment of standards between high school exit and college entrance than it is about lowering standards. North Carolina is already hard at work on this issue; the state was an early adopter of the Common Core Standards, and one of the N.C. DEI State Policy team’s policy priorities is the alignment of standards for high school graduation, aiming to reduce the need for developmental education for recent high school graduates.
  • There are a lot of different ways to approach the readiness question, of course. Check out this EdWeek article about a pilot program seeking to restructure high schools for college readiness.

Tuesday, July 19, 2011

Guest Post: Florida Remakes Placement Test

Today’s guest post comes from John Hughes, associate vice chancellor for evaluation at the Division of Florida Colleges. John is here to tell us about Florida’s new placement mechanism, the Postsecondary Education Readiness Test.

The Florida College System (FCS) comprises 28 colleges with 887,000 students enrolled in 2009-10. More than half of the colleges offer baccalaureate programs, though upper division enrollments represent only about 1 percent of the total. This reflects the system’s historic and continuing mission to serve lower division or two-year students. As part of that mission, Florida has long had well-established readiness testing and placement policies. All students are required to take a common placement test for reading, writing, and math, and are required to enroll in developmental education if they do not meet the minimum cut scores. For years Florida used the Accuplacer as the primary placement tool.

In 2008, working with the assistance of Achieve’s American Diploma Project, Florida began working toward a common definition of college readiness that would include specific expectations of what students need to know and be able to do in order to succeed in their first college-level English and math classes. During the same time period, the state’s contract for the Accuplacer expired and had to be re-bid. The Division of Florida Colleges recognized the opportunity and released an invitation to negotiate (ITN) for a test that would reflect the definition of college readiness already under development.

The requirements and expectations for the new test were established by English and math faculty. Statute and rule require a test that covers three subjects – reading, writing, and math. Faculty determined the content by identifying the competencies necessary to succeed in college credit courses and also developing example questions. The competencies were then used to create a test blueprint that identifies how many questions each student will be asked for each competency. Faculty approved the blueprint as well as the alignment of each item to the competencies and the content of every test item. Thus, Florida college faculty guided and shaped the entire test development process.
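A blueprint like the one described above amounts to a mapping from each faculty-approved competency to the number of items a student will see, which assembled test forms must then honor. A minimal sketch, with competency names and item counts invented for illustration (the actual PERT blueprint is not reproduced in this post):

```python
from collections import Counter

# Competency names and item counts below are invented for illustration;
# the real blueprint was defined and approved by Florida faculty.
math_blueprint = {
    "linear equations": 6,
    "exponents and polynomials": 5,
    "fractions and proportions": 5,
    "coordinate graphing": 4,
}

def total_items(blueprint):
    """Every assembled test form must contain exactly this many items."""
    return sum(blueprint.values())

def form_matches_blueprint(form_items, blueprint):
    """Check that a form delivers the required item count per competency."""
    counts = Counter(item["competency"] for item in form_items)
    return counts == Counter(blueprint)

# Assemble a form that draws the blueprinted number of items per competency.
form = [{"competency": c, "item_id": f"{c}-{i}"}
        for c, n in math_blueprint.items() for i in range(n)]
print(total_items(math_blueprint))            # 20
print(form_matches_blueprint(form, math_blueprint))  # True
```

Representing the blueprint as data makes the faculty-approval step concrete: approving the blueprint means approving these counts, and every generated form can be validated against them automatically.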

In October 2010, the Florida Department of Education’s Division of Florida Colleges (division) and McCann Associates rolled out the Florida Postsecondary Education Readiness Test (PERT). The test went live with interim cut-scores designed to replicate the placement rates observed with the Accuplacer. Once students have been placed using the PERT and the state has received their course outcomes, the division and McCann will conduct a detailed analysis to determine if the cut-scores are appropriate or need to be revised.
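Replicating the old test’s placement rates is, in essence, a percentile-matching exercise: choose the new test’s cut score so that the same share of students places into developmental education as under the Accuplacer. A sketch of that idea, with scores and the placement rate invented for illustration (this is not the division’s or McCann’s actual procedure):

```python
# Illustration only: scores and the 40 percent rate are invented. The idea:
# pick the new test's cut so the same share of students places into
# developmental education as under the old test.
def interim_cut_score(new_test_scores, old_placement_rate):
    """Return the score below which old_placement_rate of students fall."""
    ranked = sorted(new_test_scores)
    k = round(old_placement_rate * len(ranked))  # how many should place below
    return ranked[k] if k < len(ranked) else ranked[-1] + 1

pert_scores = [42, 55, 61, 48, 70, 66, 53, 59, 75, 80]
cut = interim_cut_score(pert_scores, 0.40)  # 59: four of ten score below it
print(cut)
```

The follow-up analysis the division describes—checking course outcomes once placed students finish their classes—is what turns these interim, rate-matched cuts into validated ones.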

With a new placement test now in use, Florida is moving ahead with the development of diagnostic tests. Once again faculty members were asked to identify the competencies for the tests; in this case the competencies focused on the skills necessary to move into college credit courses. The diagnostic exams will be given to students who place into either the upper or lower level developmental courses. Data from the exams can be used to guide instruction or even to direct students into a modularized curriculum that focuses on just those areas in which a student is deficient. The lower level diagnostic tests are scheduled to go live in August 2011.

In addition to its use for entering college students, in 2011-12 the PERT will be used to assess the readiness of all 11th grade students whose high school assessments indicate they are at risk of not being college ready. Those who do not meet the cut-scores will be given remediation in 12th grade. As many as 150,000 Florida 11th graders may be eligible for readiness testing.


John Hughes is associate vice chancellor for evaluation at the Division of Florida Colleges.