The three-year Developmental Education Initiative (DEI) is drawing to a close. While the participating colleges and states are moving ahead with many of their expanded developmental education efforts, this also is a time to reflect on what we’ve learned over the last three years. We wanted to alert you to four upcoming publications from DEI partners that will delve into questions about success, challenges, and insights into where college, state, and funder priorities ought to be going forward. These publications, summarized below, will be released over the next three months. We’ll alert you when they hit the streets! We hope you’ll read them, share them, and make connections to your own work and learning.
Bringing Developmental Education to Scale: Lessons from the Developmental Education Initiative
Janet C. Quint, Shanna S. Jaggars, D. Crystal Byndloss, and Asya Magazinnik, MDRC and Community College Research Center
This second and final report from the official evaluation of the DEI colleges examines the degree to which the institutions scaled up their chosen developmental education reforms to serve more students, the factors that affected their ability to expand these programs and practices, and the extent to which these strategies were associated with improved student outcomes. It also considers ways that participation in DEI influenced the colleges more broadly. For these reasons, it may be of interest to other colleges looking to scale up reforms, especially those related to instruction and the provision of student supports, as well as to funders concerned about how best to help community colleges bring promising ideas to scale. The evaluation, conducted by MDRC and its partner, the Community College Research Center at Teachers College, Columbia University, draws on both qualitative data (primarily interviews with key personnel at all 15 institutions) and quantitative data (information on participation and on student outcomes that the colleges regularly collected).
Ahead of the Curve: State Success in the Developmental Education Initiative
David Altstadt for Jobs for the Future
Building on their work through Achieving the Dream, six states and 15 community colleges joined the Developmental Education Initiative in 2009 to take on the daunting challenge of improving the success of students who enter the community college academically underprepared. Teams from the six DEI states (Connecticut, Florida, North Carolina, Ohio, Texas, and Virginia), working with Jobs for the Future, which has managed the DEI state policy effort, co-developed an ambitious, evidence-based state policy framework to guide large-scale, multi-faceted reforms in how community colleges provide underprepared students with the skills they need to succeed in college courses. Three years later, with the initiative winding down, these states have made significant progress in adopting the DEI policy recommendations and, as a result, they have augmented, accelerated, and spread developmental education systems change across their community colleges. Ahead of the Curve spotlights the major policy accomplishments of the Developmental Education Initiative by profiling specific innovations in each of the six states and by documenting the degree to which these states have pursued common strategies and policy levers contained in the initiative’s systems-change framework.
Presidential Reflections: What DEI Taught Us
Edited by Madeline Patton for MDC
The Developmental Education Initiative asked 15 college leaders to take what they’d learned in early Achieving the Dream efforts and apply that to the challenge of scaling up: what resources, policies, and practices are essential to scaling up effective developmental education efforts? Finding ways to move more students through developmental education more quickly, or bypass it altogether, while maintaining successful student outcomes required leadership and commitment from every level of the organization. In this essay collection, the presidents of the 15 DEI colleges reflect on what they learned about building, embedding, and maintaining systemic change in their institutions, particularly in the difficult field of developmental education, through work with their trustees, students, faculty, staff, and community. They discuss how they and their colleges took on identifying successful innovations and scaling them up in the midst of leadership transitions, serious reductions in financial resources, and major changes in organizational structure.
What We Know: A Synthesis of DEI College Learning
Abby Parcell, MDC, and Cynthia Ferrell, Community College Leadership Program
In February 2012, MDC convened DEI college teams composed of faculty, administrators, and presidents. We mixed them up—different colleges, different states, different roles—and asked them to create the ideal path for underprepared students to get from college entry to credential completion. Drawing on their collective knowledge, particularly what they’d learned during DEI, the teams considered four points of interaction with students or potential students: early intervention and access, advising and support services, developmental education instruction, and alignment with credential and degree programs. Six teams and six hours later, we had six designs that displayed a remarkable amount of consensus about the programs, policies, and institutional supports needed to help any student be successful on the path from college enrollment to credential completion. What We Know is a synthesis of our DEI experts’ recommended best program bets, and related critical institutional policies for helping all students succeed at what they set out to accomplish in community college.
Thursday, November 1, 2012
Thursday, March 29, 2012
For the Reading List
It’s a newsy week on Accelerating Achievement. Today, we’re highlighting recent releases from the Community College Research Center. And while we’ve not figured out how to provide a link that downloads directly to your brain, we hope that you’ll be able to find time to read the pieces that are relevant to your work.
- Predicting Success in College: The Importance of Placement Tests and High School Transcripts examines placement test and transcript validity in predicting course grades and college performance. Findings: while the placement tests aren’t very predictive, high school GPA is strongly associated with college GPA. You can read some additional commentary on this work on the JFF blog, in the New York Times, and in the Hartford Courant.
- Student Success Courses and Educational Outcomes at Virginia Community Colleges looks at the relationship between student success course enrollment and short-term student outcomes. Findings: students enrolled in a success course in the first semester were more likely to earn college-level credit in the first year and more likely to persist to the second year. You might also like to check out these D.R.E.A.M. 2012 presentations on student success courses in Maryland, North Carolina and Texas.
- A National Center for Postsecondary Research paper, Learning Communities for Students in Developmental English: Impact Studies at Merced College and the Community College of Baltimore County, looks at the impact of semester-long learning communities linking developmental English with a variety of other courses. Findings at the two colleges were mixed.
Tuesday, December 20, 2011
Guest Post: Understanding and Reconciling the Opposing Forces that Shape Developmental Programming
Much developmental education research today focuses on which cog needs the most grease; in other words, how do we fix a system that everyone seems to agree is broken? In a developmental education working paper published by the Community College Research Center in November 2011, Shanna Smith Jaggars and Michelle Hodara take a different tack. In this description of a case study of the City University of New York’s six community colleges, they ask why the system is broken and propose a framework that can help institutions answer that fundamental question. Below, Shanna Smith Jaggars introduces the study and the new framework.
Those who attempt innovation in developmental education often find our efforts thwarted by administration or faculty who seem dead-set against change. Too often, we dismiss our detractors’ objections as springing from short-sightedness, or worse, sheer obstinacy. Yet if we do not make the effort to understand and validate the real (and often positive) motivations of the opposite camp, we are unlikely to make any progress. Based on a recent case study of a large urban community college system, Michelle Hodara and I have developed an “opposing forces framework” that may help innovators understand the conflicting motivations that shape developmental education. In today’s post, I want to focus on one key set of opposing forces: support of student progression versus enforcement of academic standards.
Nationwide, faculty and administrators all want to support students to succeed. Evidence is mounting that accelerated strategies (such as shortening sequences, or mainstreaming developmental students with additional supports) can help do this. While accelerated strategies vary, many of them are based on the fact that placement exams are notoriously imprecise in their assessments of students’ capabilities; such acceleration strategies work by allowing students to “place upward” -- tackling more difficult work than the placement exam would suggest that they should. And indeed, while some students will falter and fail in this more difficult environment, on the whole, upward-placement methods allow more students to enter and successfully complete gatekeeper math and English courses than would be possible under the traditional sequence.
Why would anyone oppose such a strategy? It allows far more students to succeed in the long term, and at a lower cost to both the institution and the student. To put the problem into perspective, however, consider the fact that upward-placement methods are fundamentally equivalent to lowering your institution’s placement cut score -- and then imagine how people would feel about that. In our case study, although all faculty were passionate about student success, they were also universally uncomfortable with the notion of cut score decreases, based on three types of worries. The first worry is that the school would be perceived as having poor academic quality. The second is that introductory college-level courses would be harder to teach due to wide variation in student preparedness. Strongly related to that is the third worry: that a flood of less-prepared students would give faculty the uncomfortable choice of either failing more students or relaxing their standards. All three worries reflect the generalized fear that teaching quality, grading rigor, and academic standards would decline at the college -- in ways that would fail students, exhaust faculty, and disappoint the community. Viewed in that light, acceleration strategies could be understood as an existential threat to hardworking and committed faculty across the campus. Who can blame them, then, for opposing your work?
Overall, our case study illustrated that everyone involved in developmental education is passionately committed to the greater good, but they tend to fall on opposite ends of the spectrum in terms of what they think should be done to advance the greater good. I think this study has convinced us that it may be impossible to fix developmental education unless administrators and faculty sit down and candidly talk with one another, in a context that allows people to bring some of these fears out into the open and work through them. If these conversations happen, then colleges can work out strategies to support progression while at the same time enforcing standards. For example, to ensure high standards in accelerated developmental courses and introductory college-level courses, faculty could work together to develop common learning outcomes across sections of each course, collaboratively creating standards that are meaningful, clearly defined, and maintained at a high level. If faculty are having trouble getting their students to meet the defined learning outcomes, there would be clearer information about exactly where students are struggling, and which teachers have materials and techniques that seem more helpful in certain areas. And rather than punishing faculty who have low pass rates or pressuring them to increase their pass rates, departments could support faculty to experiment and learn together about strategies that seem to be effective with struggling students.
In our case study, we discuss the tension between progression and standards in more detail, as well as two additional tensions (centralization vs. autonomy, and efficient vs. effective assessment). For more details and recommendations, I encourage you to take a look at our report: The Opposing Forces that Shape Developmental Education: Assessment, Placement, and Progression at CUNY Community Colleges.
Friday, July 15, 2011
Guest Post: Four Principles to Guide Reform
In today’s post, Shanna Smith Jaggars, senior research associate at the Community College Research Center (CCRC), Teachers College, Columbia University, hits the high points of CCRC’s Assessment of Evidence series. We’ve referenced the series on the blog before, but we thought you deserved a more comprehensive introduction.
This spring, the Community College Research Center released the Assessment of Evidence, a series of reports that present research-based recommendations to improve the success of community college students. In this post, I’ll briefly introduce the four key recommendations that arose from that work, and how they apply to developmental education reform.
1. Simplify the structures and bureaucracies that students must navigate.
This recommendation rests upon the finding that overly complex environments tend to cause people (all people, not just students) to make poor decisions. Accordingly, colleges should take a step back and look at their developmental education policies and practices to ensure they are not inadvertently creating unnecessary barriers, confusion, and frustration. Where possible, the developmental education sequence should be streamlined. Good examples include the Statway program and Virginia’s planned developmental math redesign, both of which aim to rationalize the developmental curriculum and improve its alignment with college-level material.
2. Broad engagement of all faculty should become the foundation for policies and practices to increase student success.
Reforms that are defined at the top and then imposed on faculty will not be lasting or effective. Reform should begin by engaging faculty in defining metrics and goals that they feel are meaningful – that is, by encouraging faculty to develop concrete student learning outcomes for their courses. Regular examination of their own students’ learning outcomes will help engage faculty in the process of experimentation and innovation necessary to improve those outcomes.
In developmental classes, faculty should consider incorporating learning outcomes related to academic behaviors, such as study skills, that help students be more successful in college. Incorporating such goals will lay the foundation for integrating supports to develop such skills into the everyday curriculum (see our report on non-academic supports).
3. Define common learning outcomes and assessments, and set high standards for those outcomes.
In K-12, schools that are successful with disadvantaged populations provide faculty with the time and support to work together to create coherent programs, with clear outcomes, common assessments, and integrated supports. Thus, our third recommendation builds on the second: engage faculty in working together to craft learning outcomes and assessments, with common measurement of outcomes across all sections of a course. That doesn’t mean all assignments have to be the same; it can mean a common final exam, or a final course project or portfolio that is graded according to a common rubric across sections. Faculty should collaborate not just on developmental courses, but also on learning outcomes for key introductory college-level courses, thus creating stronger alignment between developmental and college-level course material. Setting high standards for course outcomes -- which, typically, will not initially be met -- will challenge the department to innovate. For example, we uncovered very promising evidence for developmental pedagogies such as contextualization and structured group collaboration (see our contextualization and math pedagogy reports), but these instructional tactics are not widespread, perhaps primarily because they require intensive and focused faculty development. Colleges, departments, and individual faculty will be more motivated to systematically pursue such strategies if they can clearly see the gaps between their own goals and the reality of their students’ current learning.
4. Colleges should collect and use data to inform a continuous improvement process.
Achieving the Dream and Developmental Education Initiative colleges are already very familiar with the notion of using data and measurement as part of a continuous improvement cycle. For this process to have impact, faculty and mid-level administrators must be involved in defining and shaping it. To help support faculty involvement, colleges can rethink incentives, committee structures, and professional development. In particular, professional development resources might be redirected toward supporting the faculty teams described in the third recommendation.
For more on the eight strategies and four recommendations, you can download the reports from the CCRC website – and feel free to leave your suggestions (or objections!) from a practitioner’s perspective in the comments below.
Shanna Smith Jaggars is a senior research associate at the Community College Research Center (CCRC), Teachers College, Columbia University.
Thursday, June 30, 2011
Guest Post: Good, Better, I-BEST
Today, we welcome John Wachen of the Community College Research Center to Accelerating Achievement. Below, John summarizes findings from a recent CCRC study of Washington State’s I-BEST model for integrating basic skills and workforce training.
To meet the ambitious goals set forth by the federal government and private foundations to substantially increase the number of students with high-quality postsecondary credentials, the higher education system must focus on retaining students and accelerating completion, particularly among underrepresented populations. One promising program model that works with basic skills students is the Integrated Basic Education and Skills Training (I-BEST) model in Washington State’s two-year colleges.
I-BEST was developed to increase the rate at which ABE and ESL students advance to college-level coursework and completion by integrating basic skills and career-technical instruction. In the model, basic skills instructors and career-technical instructors jointly teach college-level occupational courses that admit basic skills students. An I-BEST program is a series of these integrated courses in a career-technical field that leads to a credential. The I-BEST model has received a significant amount of attention in recent years as policy makers and practitioners in other states look for effective strategies to help low-skilled students move farther and faster along educational pathways.
Researchers at the Community College Research Center (CCRC) conducted several studies of the I-BEST model over the past few years, including a field study of implemented programs in Washington State’s colleges. Findings from our field study include information about program structure and management, integrated instruction and support services, and program costs and sustainability.
Below are some highlights from our findings:
- Structured pathways. All I-BEST programs are part of long-term educational pathways that can yield increasingly valuable credentials. Our research suggests that it is important to provide structured, coherent pathways for basic skills students, who might otherwise find it difficult to navigate a broad array of choices.
- Integrated instruction. For each I-BEST course, the basic skills instructor and career-technical instructor must jointly teach in the same classroom with at least a 50 percent overlap of the instructional time. The degree to which basic skills and career-technical instruction is integrated in the I-BEST classroom varies considerably across programs. Fully integrated instruction is uncommon and difficult to achieve, but we did find several examples of highly collaborative team-teaching.
- Faculty selection and collaboration. The team-teaching model is challenging for instructors, and facility with it often takes time to develop. The relationship between the instructors is critical, and it is therefore important to identify and select instructors with the willingness and ability to work with a co-instructor.
- Funding and sustaining I-BEST. Approved programs are funded at 1.75 times the normal rate per full-time equivalent student (FTE) to help cover the higher program costs. At many colleges, however, the expense of running the programs was a primary concern. Colleges identified several factors needed to sustain I-BEST programs, including maintaining strong enrollments, solid commitment from senior administrators, and continued financial support through enhanced FTE funding.
Our report in Community College Review contains additional discussion of these and other findings, including a profile of I-BEST students, information on student financial support, and lessons for other states and colleges.
Friday, April 29, 2011
Survey says: Contextualization works!
We’ve covered a lot of territory this week, from South Texas to South Africa. As promised, we’re going to cap off the week by digging into the latest research on contextualization.
This month, the Community College Research Center (CCRC) at Columbia University's Teachers College released a brief titled "Facilitating Student Learning Through Contextualization." In the brief, CCRC reviews existing literature for evidence on the effectiveness of contextualized basic skills instruction. While there is promising evidence that contextualization improves students' basic skills mastery, the results are mixed on whether these practices improve content learning outcomes. The authors also cite several studies that tie contextualization to positive influences on developmental education course completion and college-level credit accumulation.
CCRC identifies some practical applications of their findings, noting that "considerable effort…[is] needed to implement contextualization because instructors need to learn from each other and collaborate across disciplines, a practice that is not common in college settings." We heard about the importance of interdisciplinary collaboration yesterday from Stevan Schiefelbein, who told us about faculty collaboration between South Texas College's departments of History, Sociology, Developmental Reading, Developmental Math, and Developmental English.
For more CCRC analysis of contextualization models, check out their report on I-BEST. Developed in Washington State, Integrated Basic Education and Skills Training (I-BEST) integrates basic skills instruction with college-level occupational classes. Guess what? Faculty collaboration is vital for I-BEST, too. Has your college taken the contextualization challenge? What approaches ease the path to collaboration? What gets in the way?
Friday, April 1, 2011
Hook ‘Em, Horns!
Last week, we had a guest post from Cynthia Ferrell, director of the Texas Developmental Education Initiative state policy team, on integrating state policy and institutional change. Today, we got a dispatch from Cynthia that shows this work in action. This week, Cynthia was asked to testify before the Texas House Higher Education Committee about a couple of bills that involve developmental education. Here’s what went down at the statehouse:
- HB 1244: This bill proposes requiring colleges to offer developmental education online. Cynthia shared findings from a recent Community College Research Center (CCRC) literature review that provides evidence of the ineffectiveness of online education for low-income and underprepared students. Cynthia says, "The committee was very interested in the findings that showed not only the lack of research support for this action, but also the research-based reasons these students are not likely to succeed in online courses. I suggested that instead, we should support statewide scaling of promising hybrid innovations being piloted and scaled at ATD and DEI colleges." Cynthia expects that the legislative language will be amended.
- HB 3468: This bill addresses assessment and placement policies. CCRC's research on assessment (which we've posted about previously) was provided to a representative during bill development. According to Cynthia: "The bill, which was very well received by the committee, includes the report's recommendation for the rigorous evaluation of college readiness assessments and a placement model that targets alternative treatments." And it looks like this bill will be passed!
Thursday, February 10, 2011
Read All About It!
We can always count on our colleagues at Jobs for the Future for the latest news. At yesterday's ATD/DEI State Policy Meeting, I learned about a few recent publications that might interest you:
- Michael Collins, Program Director at JFF, contributed to the Fall 2010 issue of the Journal of Developmental Education. Michael's piece, "Bridging the Evidence Gap in Developmental Education," looks at conflicting perspectives on current developmental education research, examines the rigor of that research, and proposes an expanded research agenda. You can download the issue table of contents and the full abstract. The Journal of Developmental Education is published by the National Center for Developmental Education at Appalachian State University and is the official publication of the National Association of Developmental Education.
- The Community College Research Center released two new pieces in their Assessment of Evidence Series: "Assessing Developmental Assessment in Community Colleges" by Katherine Hughes and Judith Scott-Clayton and "Reforming Mathematics Pedagogy: Evidence-Based Findings and Recommendations for the Developmental Math Classroom" by Michelle Hodara. You can read an introduction to this series here.
Abby Parcell is MDC's Program Manager for the Developmental Education Initiative.