
Tuesday, August 21, 2012

Is it worth it?

In March, we released More to Most: Scaling Up Effective Community College Practices, a guidebook that lays out a process for identifying promising practices and developing an expansion plan so that good programs reach as many students as possible. Central to the process is the SCALERS framework, developed at Duke University and discussed extensively on this blog. Other steps lay the foundation for applying SCALERS, and we’ll tell you about some of them in our next few posts. To download a copy of More to Most, visit www.more2most.org.

Before you create a plan for scaling up, you need to decide if the program or practice you want to scale is actually effective. The last thing you need is to do more of something that isn’t even worthwhile. (Such logic only applies to things like eating Doritos or watching episodes of Chopped.) Chapter 2 of More to Most, Determining Program Value, lays out a process for assessing the value of a program.

The first stage of this process is defining the problem you are trying to address with your program and identifying your desired outcome. An example of a well-defined problem:
Too many students who test into three developmental education courses never successfully complete a college-level math or English course.
An example of a concrete outcome:
Currently, X percent of students successfully complete gateway math in one year. We will increase this by X percentage points by [YEAR].
You’ve got a program that’s designed to address this problem and help you reach this outcome. During the next stage, you collect evidence that will help you decide how well the program is achieving the desired outcome. What evidence of the program’s impact, both quantitative and qualitative, is available? Possible sources include:
  • Basic demographic profile of student body, a target group, and/or program participants
  • Course completion data: by course; by student cohort
  • Focus group data from students, staff, and/or faculty
  • Data from national surveys like CCSSE, SENSE, Noel-Levitz, etc.
Now, you can determine whether the program or practice meets criteria for effectiveness. We focus on two important criteria:
  • Does the evidence show that the program delivers the desired outcome?
  • Would scaling up the program align with institutional objectives?
It is important to consider both of these questions when determining the value of scaling up the program. Many institutions analyze data disaggregated by race, income, and other demographic factors and identify achievement gaps among student populations. If closing these gaps is an institutional priority for your college and one of the desired outcomes of your program, then make sure you analyze the evidence for how effectively the program accomplishes this goal. When comparing the evidence for several programs, keep in mind that if the program has positive outcomes for a designated population that is generally less successful, it may show a lesser impact on overall student success outcomes, at least in the short term. This does not make the program less valuable, however. The value comes from how well the program matches your desired outcomes and the institution’s priorities — or makes a case for changing those priorities.

Once you’ve collected your evidence, it’s time to make the case for expansion and develop a scaling strategy. We’ll discuss those steps in upcoming posts.

Friday, May 25, 2012

You Can't Handle the Links

  • College strategic plans for increasing student success are by nature long-term efforts, so concrete measures of progress often take years to appear. What do you do when your work to “move the needle” is slow going, or when initiative fatigue sets in at your institution? According to Inside Higher Ed, Monroe Community College has “started a series of modest but tangible 100-day projects to improve the college.” These projects are intended as small steps toward larger goals, but they also foster broad engagement and keep motivation high. The first project is to “streamline the application and enrollment process so that prospective students have to create one password instead of three.” As blogger Dean Dad points out, this idea requires widespread institutional buy-in. If people don’t take it seriously, it won’t work. Wondering how to get that buy-in? Nick Bekas of Valencia College offers his advice on building alliances in an Accelerating Achievement post earlier this year.
  • A substantial portion of our nation’s workforce is unemployed or underemployed, but many companies can’t find the workers they need to fill high-skill jobs. Why are we struggling to train workers for existing positions when so many are in need of work? Maureen Conway of the Aspen Institute says that workforce training and education programs don’t do enough to address the real-world challenges adult students face. She sees a need for increased funding and budgetary flexibility for integrated student support services. Check out Colin Austin’s guest post on an approach that weaves together education and training, income supports, and financial services. Another reason that we can’t fill those open positions? It isn’t readily apparent to students or colleges what employers are looking for. Jobs for the Future’s Credentials that Work initiative uses real-time labor market information to help students choose credentials that will get them jobs, and to help institutions craft programs with local labor market value.  
  • Last week in The Chronicle of Higher Ed, developmental English professor Brian Hall of Cuyahoga Community College (a DEI Institution) shared some insight into “What Really Matters to Working Students.” Frustrated by students’ seemingly constant absence and inattention, Hall asked one of his developmental English classes to explain why so few students are successful. The biggest reason his students gave: the difficulty of balancing academics with life. Between work schedules and family responsibilities, many students feel that their motivation to do well in class is eclipsed by unforeseen hurdles. While developmental educators can’t eliminate these hurdles (see the previous bullet for what colleges can do), Hall and his students recommend ways professors can keep students on track, and caution against behavior that could knock them off course permanently. They propose that professors should make expectations and rules apparent from the start, treat students with respect and expect the same in return, make class work relevant and engaging, and show students that it is OK to make mistakes if you learn from them.
  • The College Board has released a new “web based tool that provides quick and easy access to national, state and initiative-level data that describe the progress and success of community college students.” The Completion Arch establishes indicators for five areas: enrollment, developmental education placement, progress, transfer and completion, and workforce preparation and employment outcomes. You can filter the indicators by data source, state, and student characteristics. The site is easy to navigate, so check it out for yourself.
  • The Hartford Courant ran an op-ed from the Community College Research Center at Columbia University last week on Connecticut’s recent developmental education legislation. Tom Bailey, Katherine Hughes, and Shanna Smith Jaggars expressed their concerns over the potential negative impact that the legislation could have on students in need of significant skill development before they are ready for college-level coursework. They also noted their concerns about buy-in from college faculty and staff: “A policy that gives community college practitioners flexibility and support to try out new models — and that includes accountability measures to accelerate real change — would make them far more likely to embrace reforms on an institutional and state level.”

Wednesday, November 23, 2011

Gobble Gobble

This Thanksgiving, we're thankful for evidence-based practices:


Enjoy the holiday!

Thursday, November 3, 2011

Guest Post: Making Space for Good Decisions

Rob Johnstone made a presentation at the July ATD/DEI State Policy meeting in Florida; we thought Rob’s work with the RP Group, and specifically a project called BRIC (Bridging Research, Information & Culture), would be of interest to the Accelerating Achievement crowd, so we asked him to guest post today. Below, Rob reflects on the importance of carving out time for crucial exploratory conversations—and introduces some tools to help you do the same.

Greetings from sunny California, where I sit in my office at Skyline College just south of San Francisco, with what the hotel industry would term a “partial ocean view.” This means, on the 45 days a year it’s not shrouded in fog, if I squint, I can see a sliver of blue between some trees.

The RP Group is a nonprofit organization that strengthens community colleges’ ability to gather, analyze, and act on information to improve student success. We provide research, evaluation, professional development, and technical assistance services that support evidence-based decision-making and inquiry. Because our work is defined and conducted by community college practitioners, the RP Group provides a unique, on-the-ground perspective on complex issues within the California community college system, and, through our work on Completion by Design and the Aspen Prize, across the country.

The unifying thread is a desire to engage all of the groups working on a campus—faculty, staff, administrators, and IR professionals—in developing deep cultures of inquiry that serve as the foundation for improving student outcomes. We define “culture of inquiry” as the institutional capacity to support open, honest, and collaborative dialogue that strengthens the institution and student outcomes. In practice, this means we encourage a variety of people across the campus to ask a wider collection of questions, and then use the evidence gathered to inform campus decision-making. 

As I reflect on these projects, I’m struck both by the similarities in the issues we face and the vast array of structures and approaches colleges have put in place to address these issues. The problems we wrestle with in the community college sector are complex and challenging, and there are no silver bullets (at least none that we’ve found!). The analyses we conduct don’t speak for themselves, and there are no self-evident answers. Given this, the strongest take-away from our work is how critical it is to structurally create the time and space for deep conversations, to explore the available data and evidence, and to extract insight and meaning that forms the foundation for action. These conversations work best when a wide range of participants from different parts of the college are brought to the table to explore the data.

As important as it is, creating that time and space for critical inquiry and data exploration conversations is challenging. Colleges are already strapped for their most precious resource—time!—and creating yet another committee for this work can be difficult. We’ve seen colleges be more successful when they re-purpose existing venues for these types of discussions, such as department and division meetings; committees devoted to developmental education, college success, first-year experience, or institutional planning; academic senates; and even the president’s cabinet.

In each of these venues, we’ve found that many participants actually welcome the shift, as these meetings tend to be bureaucratic and process-driven; re-focusing on research, evidence of student success, and outcomes is often energizing and engages the participants’ passion. This also has the benefit of weaving the work into the fiber of the college, rather than encapsulating it in one committee. Clearly, college leadership has to prioritize the importance of such a change in these meetings or committees, but we have seen it work in practice.

For ideas on the kinds of questions to ask and ways to translate the answers into decisions, take a look at some of the resources we’ve put together on our website. I’d start with the BRIC Inquiry Guides or the “Evidence-based Decision-making” section of our resources page. Colleges have used the discussion questions from the Inquiry Guides in department meetings and other campus committees to start conversations that open the door to future inquiry into practices, structures, and student success data.

In the end, we have to translate the results of our collective inquiry into action. The type of research and inquiry in which we engage is directional. At some point, in the words of Elphaba from the musical Wicked, we have to “trust our instincts, close our eyes, and leap.” In our world, “instinct” refers to the collective and deep subject-matter expertise that we have amassed at the colleges, married with a deep process of inquiry into the data and evidence. In taking the leap, we add to our collective knowledge base—and quite often, we learn more when our initiatives fall short than when they succeed.

Please feel free to contact me directly at rjohnstone@rpgroup.org with any comments or questions.

Rob Johnstone is senior research fellow at the Research & Planning Group for California Community Colleges and dean of research and planning at Skyline College.

Friday, September 23, 2011

Guest Post: Alternative Forms of Data

Today, Nick Bekas, professor of English at Valencia College, questions how far quantitative data can take us when we’re trying to understand what developmental education students need. He also suggests some qualitative methods that can make the picture more complete and thus improve decisions that affect individual students and entire institutions.

Every time I see a chart with a statistic, I immediately try to put it in context. I try to wrap it in details and identify its origin. And I don’t mean from a statistical perspective. What I think about are the people behind the stat:
•    Who are they? 
•    What do they think? 
•    What do they feel? 
A number cannot capture these things.

Yet, for the most part, statistics are the primary source for decision making. This stems from tradition and an ingrained suspicion of qualitative data prevalent amongst researchers. Also, numerical data are easy. They are readily digestible and “clean.”  How do you argue with 35 percent, beyond its method of collection? And when more sophisticated statistical analyses are shared, most people’s eyes glaze over because such studies are either beyond comprehension or simply not readable (meaning: not interesting).

The larger issue with such statistical studies is that they are rarely questioned to any great degree and are readily accepted as “truth” by policymakers. Alternative forms of data have developed as a reaction to the dominance of statistics, and out of a belief that statistics flatten the complexity of things. Statistics should not be the only data point. To balance rigor with assurance, we have to present more than just statistical data. No one form should be the only lens through which we view the world. For this reason, at Valencia College, we use alternative forms of data to change the angle of repose on what we seek to understand. We incorporate methods such as:
•    focus groups
•    interviews
•    archival analysis
•    transcript analysis
•    journaling
•    video
•    film
•    photography
•    audio recordings 
We have to move away from viewing the apparent—a statistic—as sufficient evidence of the problematic. When alternative methods are applied with rigor and consistency, they can provide a valid and reliable picture of the things we evaluate. For example, having students keep a video journal for a term and then analyzing the content of those videos can provide a deeper understanding of why so many developmental education students are not successful. 

Thirty-two percent can tell us there is a problem, but it cannot tell us why.

Nick Bekas is DEI project director and professor of English at Valencia College.


For another example of employing multiple data sources, check out this recent MDRC study on performance-based scholarships at the University of New Mexico. Behind the Study rounds out early quantitative findings on credit completion and financial aid uptake with student perspectives gathered in focus groups.

Friday, July 15, 2011

Guest Post: Four Principles to Guide Reform

In today’s post, Shanna Smith Jaggars, senior research associate at the Community College Research Center (CCRC), Teachers College, Columbia University, hits the high points of CCRC’s Assessment of Evidence series. We’ve referenced the series on the blog before, but we thought you deserved a more comprehensive introduction.

This spring, the Community College Research Center released the Assessment of Evidence, a series of reports that present research-based recommendations to improve the success of community college students. In this post, I’ll briefly introduce the four key recommendations that arose from that work, and how they apply to developmental education reform.

1.     Simplify the structures and bureaucracies that students must navigate.
This recommendation rests upon the finding that overly complex environments tend to cause people (all people, not just students) to make poor decisions. Accordingly, colleges should take a step back and look at their developmental education policies and practices to ensure they are not inadvertently creating unnecessary barriers, confusion, and frustration. Where possible, the developmental education sequence should be streamlined. Good examples include the Statway program and Virginia’s planned developmental math redesign, both of which aim to rationalize the developmental curriculum and improve its alignment with college-level material.

2.    Broad engagement of all faculty should become the foundation for policies and practices to increase student success.
Reforms that are defined at the top and then imposed on faculty will not be lasting or effective. Reform should begin by engaging faculty in defining metrics and goals that they feel are meaningful – that is, by encouraging faculty to develop concrete student learning outcomes for their courses. Regular examination of their own students’ learning outcomes will help engage faculty in the process of experimentation and innovation necessary to improve those outcomes.

In developmental classes, faculty should consider incorporating learning outcomes related to academic behaviors, such as study skills, that help students be more successful in college. Incorporating such goals will lay the foundation for integrating supports to develop such skills into the everyday curriculum (see our report on non-academic supports).

3.    Define common learning outcomes and assessments, and set high standards for those outcomes.
In K-12, schools that are successful with disadvantaged populations provide faculty with the time and support to work together to create coherent programs, with clear outcomes, common assessments, and integrated supports. Thus, our third recommendation builds on the second: engage faculty in working together to craft learning outcomes and assessments, with common measurement of outcomes across all sections of a course. That doesn’t mean all assignments have to be the same; it can mean a common final exam, or a final course project or portfolio that is graded according to a common rubric across sections. Faculty should collaborate not just on developmental courses, but also on learning outcomes for key introductory college-level courses, thus creating stronger alignment between developmental and college-level course material. Setting high standards for course outcomes -- which, typically, will not initially be met -- will challenge the department to innovate. For example, we uncovered very promising evidence for developmental pedagogies such as contextualization and structured group collaboration (see our contextualization and math pedagogy reports), but these instructional tactics are not widespread, perhaps primarily because they require intensive and focused faculty development. Colleges, departments, and individual faculty will be more motivated to systematically pursue such strategies if they can clearly see the gaps between their own goals and the reality of their students’ current learning.

4.    Colleges should collect and use data to inform a continuous improvement process.
Achieving the Dream and Developmental Education Initiative colleges are already very familiar with the notion of using data and measurement as part of a continuous improvement cycle. For this process to have impact, faculty and mid-level administrators must be involved in defining and shaping it. To help support faculty involvement, colleges can rethink incentives, committee structures, and professional development. In particular, professional development resources might be redirected toward supporting the faculty teams described in the third recommendation.

For more on the eight strategies and four recommendations, you can download the reports from the CCRC website – and feel free to leave your suggestions (or objections!) from a practitioner’s perspective in the comments below.

Shanna Smith Jaggars is a senior research associate at the Community College Research Center (CCRC), Teachers College, Columbia University.

Tuesday, June 14, 2011

Guest Post: You’ve Got a Friend—Supportive State Policy for Dev Ed Innovation

Today’s post comes from Michael Collins, associate vice president of postsecondary state policy at Jobs for the Future. JFF leads DEI’s state policy initiative by supporting policy teams in CT, FL, NC, OH, TX, and VA, who are implementing the three-pronged Developmental Education Initiative State Policy Framework. In the first of a three-part series about the framework, Michael shows how collecting the right data can inform state policy that can accelerate dev ed innovation across a system—and introduces a new JFF publication that details how states are doing just that.

There is a lot of talk about fixing developmental education these days. In most people’s minds, this looks like a teacher in a classroom who is teaching better—and thus, students are learning better. That’s not a bad thing. Better teaching is an important part of the solution. For all the energy devoted to technological innovation in instructional delivery, the classroom continues to hold its own as the most widely used venue to serve academically underprepared students. Teaching and learning are central to improving outcomes in developmental education.

State policy, on the other hand, does not come to mind when thinking about improving outcomes in developmental education. But policy can make a big difference in identifying and spreading effective teaching practices, which can contribute to deeper and more sustained learning.

Now, teaching and learning is a somewhat private enterprise between a teacher and a student—behind closed classroom doors. Smartly designed state policy can open these doors. For example, states—particularly those with community college systems—can use their data and performance measurement systems to identify institutions that consistently do the best job of accelerating students through developmental education, through math and English gatekeeper courses, and on to programs that are tied to credentials and degrees. The hypothesis is that analyzing student outcomes will lead to conversations with high performing colleges, which will lead to a better understanding about effective practices—including good teaching.

States can go further to document and disseminate effective teaching practices once identified. This might include the development and maintenance of a central repository of teaching resources or an innovators network where faculty are convened on a systematic basis to share proven practices and serve as resources to each other. Optimally, it might be both.

State policy can go a long way to support better teaching and learning. Currently, there are few examples of states using their data and performance measurement systems in a robust way to identify and spread successful practices. The six states participating in the Developmental Education Initiative (DEI) are changing this. They are implementing a three-part change strategy featuring data-driven improvement, investment in innovation, and policy supports.

Under the first component of the strategy—data-driven improvement—the states are actively fashioning their data and performance measurement systems so that they can identify high performing colleges and begin to pinpoint effective strategies as described above. You can find out more about their progress by checking out Driving Innovation: How Six States Are Organizing to Improve Outcomes in Developmental Education here.

We’ll take up the second strategy, investment in innovation, next time! Until then…

Michael Collins is associate vice president for postsecondary state policy at Jobs for the Future.

Thursday, April 7, 2011

Duly Noted: ATD Data on Developmental Education Time to Completion

Every two months, JBL Associates publishes Data Notes: Keeping Informed About Achieving the Dream Data. These briefs dive into the ATD database to examine patterns occurring across the entire ATD network. Since ATD colleges provide extensive student-level data on first-time, full- and part-time cohorts, there are plenty of interesting discoveries to be made. The January/February 2011 edition, “Developmental Education: Time to Completion,” written by Amy Topper, delves into how many attempts students make to complete all of the developmental education courses to which they are referred. It also looks at persistence and subsequent gateway course completion. Based on three-year outcomes in English and math, the analysis finds that:
  • Students referred to developmental math were more likely to attempt the class than were students referred to developmental English
  • The higher the number of course attempts, the less likely students were to have successfully completed a gateway course. (Sadly, it will not surprise you to learn that the same was true of students’ referral level: the more levels down in developmental math, the less likely students were to successfully complete a gateway course.)
  • Students who completed their developmental education coursework were about twice as likely to persist as those who did not complete the referred sequence
Topper concludes that “students with the greatest developmental needs are at the greatest risk of leaving college; however, if the college can keep the developmental student enrolled, his or her chances of success improve.”

One great thing about the Data Notes briefs, aside from opening a window into ATD data, is that they always end with intriguing questions for colleges that are tackling these particular issues. Given these data on the outcome differences between students who make multiple attempts and those who complete their dev ed coursework on the first attempt, Topper suggests colleges consider the following:
  • Why is there a substantial number of students who are still enrolled after three years, but have not even attempted to complete their developmental coursework?
  • What distinguishes students who make multiple attempts at developmental education from those who do not?
  • Why is it that some students who complete their developmental coursework successfully within three years, or even earlier, do not persist, complete, or transfer?
  • Do unsuccessful developmental education attempts vary by subject area or referral level at your institution?
  • What distinguishes students who experience higher first-time success in developmental education courses to which they were referred from students whose outcomes are not as successful?
Has your college taken on such questions? How would you begin to answer them and what might you change about your current practice in order to achieve different outcomes?


Abby Parcell is MDC's Program Manager for the Developmental Education Initiative.

Thursday, March 24, 2011

Guest Post: Texas is Taking Developmental Education by the Horns

The theory of change for both Achieving the Dream and the Developmental Education Initiative relies on the integration of institutional change and policy change. Today’s post comes from Cynthia Ferrell, director of the Texas Developmental Education Initiative state policy team. Cynthia introduces the Texas approach to this college/state integration. For more on Texas developmental education policy and to see some legislative action, check out Bruce Vandal’s post over at Getting Past Go. Now, let’s get ready to rodeo!

We Texans pride ourselves on doing things big. And like our famed and fearless rodeo bull riders, we have strapped ourselves, with full intentions of victory, to the back of the massive and stubborn challenge of improving developmental student success. Four brave Texas DEI colleges—Coastal Bend College, El Paso Community College, Houston Community College and South Texas College—have sparred with the beast and have found innovations that are making a dramatic impact.

These institutions, and other Texas ATD colleges like them, collaborated with the DEI state policy team to co-construct a strategic state policy plan founded on what the institutions had learned about improving developmental student success and the state policy supports needed for further improvements and scaling. The new Texas Developmental Education Initiative State Policy Strategy placed their promising innovations at the center of a statewide cycle of continuous improvement and policy planning aimed at closing student success gaps. The following graphic illustrates the four core priorities of the plan.



  1. Cultivating Broad Engagement: The colleges found that getting everyone involved was an important key to developmental student success. So, this state policy priority is about getting lots of Texans involved in scaling up effective practices. Much of this work is being done through collaborations between the policy team, lead institutions, and established state associations. One exciting new development is a grass-roots initiative of enthusiastic ATD faculty who want to build a statewide network for all Texas community college developmental education faculty. These faculty leaders, with DEI state policy team support and coordination, are building face-to-face and web-based opportunities to encourage statewide scaling of developmental innovations that work.
  2. Building a Statewide Culture of Evidence: Colleges are building institutional cultures of evidence and we want to do the same for the state. Texas has great statewide data on community college developmental education outcomes. This priority is dedicated to raising awareness of the state of developmental education, increasing commitments to improving developmental student outcomes, and informing policy and decision-making by sharing state data in more meaningful ways. We (the DEI team and the DEI colleges) are in the midst of planning the Texas Community College Developmental Education Data Summit for core teams from each community college.  
  3. Scaling Successes: While the 26 ATD college districts and DEI colleges are busy piloting and scaling successes funded by state and national foundations, Texas legislators committed state funding to finding sustainable solutions to the developmental dilemma. In addition to the state formula funding of developmental courses in the last biennial budget, the state appropriated $15 million for developmental and adult basic education innovations, pilots, and demonstration projects.
  4. Providing Policy Supports: All of the colleges revised their institutional developmental policies to support student success. Likewise, this priority was designed to inform state-level policy and advocate for innovation funding to support colleges’ efforts. Although it is too early in the legislative session to predict the outcome, several developmental education bills and riders have been filed and are currently being debated, including legislation regarding statewide college readiness assessment and placement, statewide planning that would require offerings of technology-based developmental coursework and faculty development, success-based outcomes funding (momentum points), and non-semester length developmental education funding.
Together, the Lone Star State’s ATD and DEI colleges, and the DEI state policy team, are truly taking developmental education by the horns.

Tuesday, March 8, 2011

Guest Post: Applying Promising K-12 Models

In December 2010, the Institute for the Study of Knowledge Management in Education (ISKME) hosted a Big Ideas Fest, an “inspirational extravaganza” where participants shared “ideas and real world results that prove when you put the learner in the center of all systems, anything is possible.” Big ideas, indeed! Thanks to support from the Bill & Melinda Gates Foundation, three DEI colleges were able to send representatives. Dr. Kathleen Cleary, of Sinclair Community College, attended the event; she was particularly struck by a presentation from the New York City Public Schools’ School of One program. Today’s post is an introduction to the School of One approach from Mickey Muldoon, School of One’s manager of external affairs. Tomorrow, we’ll feature a post from Dr. Cleary, exploring how such an approach might be applied in developmental education at the community college level.

School of One was inspired by the simple insight that students in New York City classrooms – and across the country – have incredibly variable skills, knowledge, abilities, and challenges. Treating all the students in any classroom as identical cogs doesn’t do justice to their differences – and can be inefficient and taxing on teachers. So just like Amazon.com and Pandora.com respond to the unique preferences of their users, we are building a classroom that adapts to every student, with the help of sophisticated technology behind the scenes.

Below are some of the key principles of our philosophy and design:
  • Multiple modalities enable personalization. Our classrooms are large (~2,000 square feet) and divided into learning stations in different modalities: large and small group instruction, small group collaboration, software-based instruction, live remote tutoring, and independent practice. Because students are distributed across the stations at any given time, faster students can work on advanced material while slower students are not left behind in the back of the class. Moreover, students who struggle in traditional classrooms often thrive in small groups, one-on-one, or with software or online tutors.
  • Data-driven scheduling. All School of One students take a short, custom online quiz at the end of each math period. Then at the end of each day, School of One’s learning algorithm processes all the quiz results and generates a unique plan for every student for the following day. Students who pass their quizzes are automatically moved on to new material; students who don’t will continue on the same material on the following day, often with extra help from the teaching staff.
  • Classroom tools should empower teachers to do what they do best. Our technology and data systems streamline administrative tasks, assessments, grading, and data analysis. This means that School of One teachers can easily access the key information they need, and spend more time planning and delivering great instruction and working directly with students.
  • Constant performance evaluation. In addition to daily online quizzes, School of One students participate in mandatory and voluntary evaluation and testing. So far, the results are promising: in the most recent in-school and after-school pilots, School of One students significantly outperformed their peers.
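The data-driven scheduling loop described above can be sketched in a few lines of Python. This is an illustrative assumption of how the pass/advance logic might work, not School of One's actual algorithm; the names (`Student`, `schedule_next_day`, the 80% mastery cutoff) are hypothetical.

```python
from dataclasses import dataclass

PASS_THRESHOLD = 0.8  # assumed mastery cutoff on the daily quiz

@dataclass
class Student:
    name: str
    skill_index: int = 0          # position in an ordered skill sequence
    needs_extra_help: bool = False

def schedule_next_day(students, quiz_scores, skills):
    """Build each student's plan for tomorrow from today's quiz results."""
    plans = {}
    for s in students:
        if quiz_scores[s.name] >= PASS_THRESHOLD:
            # Passing students are automatically moved on to new material.
            s.skill_index = min(s.skill_index + 1, len(skills) - 1)
            s.needs_extra_help = False
        else:
            # Others repeat the same material, flagged for teacher support.
            s.needs_extra_help = True
        plans[s.name] = (skills[s.skill_index], s.needs_extra_help)
    return plans

students = [Student("Ana"), Student("Ben")]
skills = ["fractions", "decimals", "percents"]
plans = schedule_next_day(students, {"Ana": 0.9, "Ben": 0.5}, skills)
# Ana advances to "decimals"; Ben repeats "fractions" with extra help
```

The real system presumably weighs much richer signals (modality preferences, tutor availability, item-level results), but the core cycle is the same: quiz, score, regenerate every student's plan overnight.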
To learn more about School of One, please visit www.schoolofone.org.

Thanks to DEI and Accelerating Achievement for the opportunity to share our work.


Mickey Muldoon is School of One’s manager of external affairs.

Monday, February 14, 2011

What Time is it? It's Valentimes!

xkcd

Just a reminder on this holiday to always remember the human side of data.

If you've had experiences like this one, check out these 7 Basic Rules for Making Charts and Graphs from FlowingData. 

Happy Valentine's Day from the DEI Team at MDC!