Community College Reform Movement Has Failed Students

The Community College Reform Movement Has Failed Students: A Critical Review of the Literature on the Most Common Reforms of the Past Decade

Alexandros M. Goudas    (Working Paper No. 18)    October 2024

Abstract

In 2009, President Obama prioritized college completion at community colleges by creating the American Graduation Initiative, known as the College Completion Agenda. Experts and interest groups such as the Community College Research Center and Complete College America began promoting reforms focused on bypassing or eliminating remediation and developmental education, using multiple measures for placement, offering more accelerated models, and increasing pathway options. The result was a unified community college reform movement that quickly swept the nation, and its primary goal was to increase graduation rates. Ten years after the first widescale implementations of reforms took place in the fall semester of 2014, a significant corpus of literature has now been amassed with which to study whether the movement has been successful. Results indicate that piecemeal reforms have been successful in increasing first-year gateway pass rates for some students. However, overall graduation rates have not increased. Most importantly, full implementation of such popular reforms as corequisites has caused the most underserved students to fail out of college at higher rates. For these same students, notably, graduation rates also declined because of the corequisite model. Additionally, numerous studies over the past 2 decades that have shown positive results for remediation and developmental education have not been highlighted in the ongoing national debate. In this critical review of the literature, I provide analyses of the most rigorous research on the most popular and widespread interventions in the community college reform movement. The net outcome is that graduation rates have not increased, and results now indicate that some reforms are harming the most underrepresented students. Therefore, a reasonable policymaker could conclude that remediation and developmental education coursework and programs should be reinstated as options for public 2-year students, and more comprehensive reforms targeting actual barriers should be implemented instead.

Keywords: community college, reforms, multiple measures, corequisites, pathways, remediation, developmental education

The Community College Reform Movement Has Failed Students: A Critical Review of the Literature on the Most Common Reforms of the Past Decade

Beginning in the fall semester of 2014, a set of reforms started to change the face of public 2-year colleges. Laws passed in 2012 and 2013 drastically changed placement and gateway course offerings at community colleges in states such as Connecticut and Florida, and this movement quickly spread to other state systems and institutions (Fain, 2012a, 2012b; Smith, 2019). The most typical reforms have involved using multiple measures for placement to bypass remedial courses, accelerating remedial English and mathematics courses, and changing the pathways or choices students have while navigating courses and programs at community colleges.

Thus began a community college reform movement that has expanded exponentially for over a decade and continues today (Center for the Analysis, 2024b). The set of reforms comprising this movement—multiple measures for placement, corequisites and other accelerated models, and pathways—is arguably one of the most rapid and radical changes to policies at public 2-year colleges (Litschwartz et al., 2023; Zachry Rutschow & Mayer, 2018; Zachry Rutschow et al., 2019). Scholars who have documented the history of community colleges since their expansion in the early 1960s have rarely noted such swift, uniform, and widespread changes to a typically disconnected network of approximately 1,000 public 2-year colleges in various states (Cohen et al., 2014). The net result has been that most community colleges now bypass, eliminate, or severely restrict stand-alone prerequisite remedial and developmental education courses, effectively ending a trend of learning support policies that began in the 1960s and 1970s and that was codified at most public 2-year colleges in the 1980s and 1990s (Boylan, 2002; Merisotis & Phipps, 2000; Roueche, 1968).

The impetus for the community college reform movement was President Obama’s laudable American Graduation Initiative (AGI) of 2009, also known as the College Completion Agenda (Ochoa, 2011). One of the AGI’s goals was to increase the number of community college graduates by an additional 5 million by the year 2020, over and above the number of students obtaining associate’s degrees and certificates annually at the time (Brandon, 2009). Numerous experts and institutions were invited to participate in a White House summit, led by Jill Biden, with the goal of beginning work on the AGI’s completion agenda (White House, 2010). Because of its focus on public 2-year colleges, Columbia University’s Community College Research Center (CCRC) was one of the institutions invited to the White House in 2010 (Bailey & Cho, 2010). Since then, researchers associated with the CCRC have been the primary source of scholarly work cited as evidence in support of the widespread reforms that have swept the nation (Bailey et al., 2010; Center for the Analysis, 2024b).

At the time that the CCRC was invited to the White House in 2010, CCRC director Thomas R. Bailey had been working on ways to improve outcomes at community colleges. In 2008, for example, Bailey was lead author on two working papers that concluded that remediation was ineffective (Bailey, 2008; Bailey et al., 2008). As a result, the authors specifically targeted remediation and developmental education for certain reforms designed to bypass or accelerate students through these courses (Bailey et al., 2008, 2009). In a 2010 White House brief, Bailey and Cho summarized their findings by making such claims as “developmental education is costly and not very effective” (p. 4) and “the picture of past and current developmental education appears bleak” (p. 7). They concluded, “Finding better ways to address the needs of underprepared students is a necessity for meeting the Obama administration’s goal of increasing the number of community college graduates by 5 million by 2020” (p. 7).

With over 2,000 citations on Google Scholar, Bailey et al.’s 2010 paper on remediation and developmental education, the published version of the 2008 and 2009 working papers, established the seminal claim that these courses and programs were ineffective and essentially functioned as barriers. Bailey et al. used this argument to promote a plan to reduce student placement into remediation and developmental education, fast-track these courses with accelerated models and corequisites, and restrict course choices with pathways. Shortly after Bailey et al.’s paper was published, experts in the field of developmental education noted that the CCRC’s definition of success for remediation and developmental education, as well as the CCRC’s interpretation of the data, differed from established definitions and extensive research in the field (Boylan, 2002; Goudas & Boylan, 2012).

Nevertheless, well-funded interest groups such as Complete College America, the Education Commission of the States, and MDRC, along with philanthropies such as the Bill & Melinda Gates Foundation, cited Bailey et al. (2010) as their primary scholarly evidence and began working under the assumption that removing remedial barriers would increase graduation rates (Bill & Melinda Gates Foundation, 2010; Complete College America, 2012; Education Commission of the States, 2012; Fain, 2012). Nongovernmental organizations, philanthropies, news organizations, and interest groups have since used CCRC claims to argue for the elimination or reduction of remedial courses and developmental programs based on the established narrative that these courses are ineffective and function as barriers (Barshay, 2018; Complete College America, 2022; Ganga, 2018; Gordon, 2016; Logue, 2021; Palmer, 2016).

The 10-year nationwide campaign against remediation and developmental education continues today, with researchers and interest groups pressing state legislators, policymakers, and presidents to enact laws and policy reforms that have fundamentally transformed the community college landscape. The reforms continue to change how millions of students enter and progress through public 2-year colleges (Center for the Analysis, 2024a, 2024b; Kim, 2024; Litschwartz et al., 2023).

A decade is a sufficient timeframe to assess the efficacy of any national reform. Therefore, in this paper, I provide a critical review of the literature on whether the mass implementation of multiple measures, corequisites, and pathways nationwide has in fact increased graduation rates—the original and primary goal set forth by the AGI, the CCRC, and the interest groups that have been intimately involved in the community college reform movement. The determination of whether the movement has been effective will, and should, shape the decisions public 2-year college policymakers make now and in the future.

Community College Graduation Rates Have Not Increased

Halfway through the AGI, Field (2015) noted there had been only a minor increase in overall graduation rates. In 2020, Kelderman provided an extensive analysis of the results of the AGI efforts: In spite of an overall 10 percentage point jump in postsecondary attainment for Americans ages 25–64 from 2008 to 2017, one of the AGI’s original goals, surpassing other advanced nations’ postsecondary attainment rates, had not been realized. Additionally, the Lumina Foundation (n.d.) provides updated figures on the proportion of Americans who have attained any postsecondary degree, and from 2011 to 2021, that proportion increased from 38.7% to 54.3%. This suggests at least modest progress on the overall goal of increasing attainment. In terms of public 2-year colleges, however, Lumina data reveal that the proportion of Americans ages 25–64 who obtained certificates (and certifications) or associate’s degrees budged only a few percentage points, from 14% to 17% (Lumina Foundation, n.d.). More importantly, the National Student Clearinghouse Research Center’s completion data show that the 6-year graduation rate for students starting at community colleges rose only from 41% to 43% between the cohorts entering from 2013 to 2016 and tracked through 2019 to 2022 (Causey et al., 2022).

The $12 billion dedicated to increasing community college completion in the proposed federal legislation was eliminated during the legislative process in 2010 (Marcus, 2019), which clearly hampered the effort. It is also likely that the Great Recession impeded progress on the AGI’s attainment goals (Kelderman, 2020; Nellum & Hartle, 2016). Nonetheless, the government and interest groups invested significant taxpayer and philanthropic dollars in increasing completion (Complete College America, 2022; Institute of Education Sciences, n.d.). The bottom line, however, is that regardless of whether attainment is analyzed by population proportion or cohort percentage, graduation rates at public 2-year colleges have not significantly increased since the community college reform movement began.

The Most Rigorous CCRC-Approved Research Has Shown Weak Evidence of Improvements

Another way to analyze the efficacy of reforms in terms of completion is to report outcomes for individual interventions at institutions or state systems. After 10 years of small-scale reforms and research, CCRC and MDRC researchers Bickerstaff et al. (2022) selected 17 rigorous studies conducted between 2010 and 2022 and used their results to argue that the most popular reforms to remediation and developmental education have been effective. Because the CCRC’s research has been the primary evidence base for reform efforts, it is important to summarize and critique the actual findings of these studies and reports within the framework of the original goal of the community college reform movement. In this section, I critique the 10 studies featured in Bickerstaff et al. that focus on the three most common reforms: multiple measures, acceleration (including corequisites), and pathways.

Multiple Measures Assessment in Bickerstaff et al. (2022)

Arguably the most common reform that states and institutions have implemented as part of the community college reform movement is multiple measures assessment (MMA). Litschwartz et al. (2023) showed that 73% of institutions currently use high school GPA (HSGPA) as part of their placement measures. Often, these institutions allow incoming students to self-report their HSGPA rather than collecting it from transcripts (Cullinan et al., 2018). The push to use high school GPA instead of, or more rarely in addition to, placement tests began to scale nationally with CCRC research starting in 2012 (Scott-Clayton, 2012; Scott-Clayton et al., 2014). The CCRC began promoting the use of multiple single measures “to have various ways to get placed out of developmental education” (Smith, 2016).

Two studies on MMA were highlighted in Bickerstaff et al. (2022). First, the pinnacle of CCRC research analyzing the effects of MMA was a randomized controlled trial (RCT)[1] conducted in seven community colleges in Upstate New York (Barnett et al., 2020). The MMA RCT placed a far higher proportion of incoming students into college-level English and mathematics courses than did the business-as-usual process (i.e., how students had been placed prior to the intervention). Barnett et al. (2020) released findings from five semesters of data, and Kopko et al. (2023) followed up with nine semesters of data. Both reports on the Upstate NY public 2-year college RCT showed no increase in graduation rates and only a small increase in gateway course completion in English after nine semesters, with no impact on gateway mathematics completion. Second, a similar RCT was conducted in the Midwest, with outcomes reported for three semesters (Cullinan & Biedzio, 2021). That RCT’s results were consistent with those of Barnett et al. (2020) and Kopko et al. (2023).

Researchers in these three papers dedicated considerable space to conducting subgroup analyses to argue that students bumped up around a cutoff performed better than students bumped down, and the authors interpreted the results of these analyses as a basis upon which to claim that more students should be placed into college-level courses automatically (see Principle 1 in Bickerstaff et al., 2022). However, it is unclear why the bump-up and bump-down subgroup results demonstrate anything other than the fact that a more complex, costly, and systematized placement process results in slightly improved placement. Nonetheless, the final outcomes of two gold standard RCTs on MMA were the following: weak effects on gateway English pass rates, no effect on gateway mathematics pass rates, and no increase in graduation rates. The total cost for the Upstate NY RCT was over $1.4 million for nearly 13,000 students.

One significant confounding factor in this RCT was that the placement process itself was treated as the intervention. In other words, a student who took the placement assessment was included in the final analytic sample even if that student never enrolled in college, regardless of whether the student was randomly assigned to the business-as-usual group (taking the Accuplacer) or to the program group (undergoing the more involved MMA process). Barnett et al. (2020) noted that “14 percent of students who were randomly assigned to the business-as-usual or program group and who received a placement later decided not to enroll in any course in the first term after testing” (p. 14). Since a higher proportion of students placed into remediation under business as usual ended up not going to college at all, this methodological choice depressed the measured outcomes of the business-as-usual group. Thus, while it appears that the traditional placement method produced slightly lower outcomes, part of this result stems from the fact that a significant proportion of business-as-usual students never enrolled in college at all yet were still counted among the students who did not pass their gateway English or mathematics courses.
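To make the arithmetic concrete, the following sketch uses entirely hypothetical numbers (not figures from Barnett et al., 2020, or Kopko et al., 2023) to show how keeping never-enrolled students in the denominator can manufacture an apparent placement effect even when enrolled students in both groups pass at identical rates.

    # Hypothetical illustration: how counting never-enrolled students in the
    # analytic sample deflates measured gateway pass rates.
    # All numbers below are invented for illustration only.

    def measured_pass_rate(n_assigned, share_never_enrolled, pass_rate_if_enrolled):
        """Pass rate when non-enrollees stay in the denominator as non-passers."""
        enrolled = n_assigned * (1 - share_never_enrolled)
        passers = enrolled * pass_rate_if_enrolled
        return passers / n_assigned

    # Suppose both groups pass the gateway course at the same rate once enrolled...
    pass_rate_if_enrolled = 0.60

    # ...but a larger share of the business-as-usual group never enrolls at all.
    bau = measured_pass_rate(1000, share_never_enrolled=0.18,
                             pass_rate_if_enrolled=pass_rate_if_enrolled)
    mma = measured_pass_rate(1000, share_never_enrolled=0.10,
                             pass_rate_if_enrolled=pass_rate_if_enrolled)

    print(f"Business-as-usual measured pass rate: {bau:.1%}")  # 49.2%
    print(f"MMA program measured pass rate:       {mma:.1%}")  # 54.0%
    print(f"Apparent 'effect': {mma - bau:+.1%}, with no classroom difference at all")

Under these invented assumptions, the entire measured gap comes from differential enrollment rather than from anything that happened in the courses themselves.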

There is another common confounding factor found in all studies involving cohort comparisons between stand-alone remedial and nonremedial sequences: Students who take remediation have one or more fewer semesters of opportunity to attain any gateway course outcomes. This timeframe inconsistency was not considered in the CCRC MMA analyses. Research has shown that given enough time, remedial students attain similar completion outcomes compared to nonremedial students when the timeframe is lengthened or controlled for (Douglas et al., 2022; Noble & Sawyer, 2013). In fact, nearly all the completion figures in Kopko et al. (2023) demonstrate a trend of narrowing gaps between the MMA intervention groups and business-as-usual control groups (with a higher proportion of remedial students) as time goes on. It is likely that after 6 or 8 years, the two most common standard timeframes for analyzing graduation rates (Causey et al., 2022), these relatively small gaps in the MMA RCTs would disappear entirely.

Corequisites and Other Accelerated Models in Bickerstaff et al. (2022)

Corequisite Models

The second most popular reform that state systems and institutions have implemented as a result of the push to reform community colleges is the corequisite model. Because the CCRC’s goal was to accelerate remediation (Bailey, 2008; Bailey et al., 2008, 2009, 2010), it selected the Accelerated Learning Program (ALP) as a model of corequisites to scale nationally. In contrast to a stand-alone prerequisite sequence, which could require one or more semesters before a remedial student completed a gateway college-level course, the original ALP corequisite model that was studied placed volunteer remedial students who scored just beneath the college-level cutoff into a gateway course and its corresponding remedial companion course simultaneously (Cho et al., 2012). Most current corequisite models are variations of this original ALP design.

Litschwartz et al. (2023) found that nearly 80% of institutions currently offer corequisites, up significantly from data collected from states in 2016 (Zachry Rutschow et al., 2019a). Since the community college reform movement began, many state systems have required all their students to enroll in corequisites (Education Commission of the States, 2021). Bickerstaff et al. (2022) identified three RCTs on corequisites, two on mathematics and one on English, as evidence that the model is effective in improving outcomes (Douglas et al., 2020a; Logue et al., 2019; Miller et al., 2022).

Douglas et al. (2020a) found that students who were invited to participate in the tutor-based corequisite model and who were randomly assigned to receive the treatment passed any gateway mathematics course within 3 years at a rate 11.4 percentage points higher than that of the stand-alone prerequisite remedial group who were not selected (38.9% compared to 27.5%). While this initial progress was encouraging, 3-year graduation rates did not differ between the two groups.

Miller et al. (2022) compared similar students who were randomly assigned either to a corequisite English course model or to the highest level stand-alone integrated reading and writing developmental education course. The authors reported that the corequisite group passed the gateway college-level English course (college composition I) within 2 years at a rate 18.4 percentage points higher than that of the remedial group. However, when the analysis was extended to college composition II, a course required of most students, that advantage shrank to just over 6 percentage points, a finding significant only at the p < 0.05 level. As shown in the MMA results (Kopko et al., 2023) and other studies (Douglas et al., 2022; Noble & Sawyer, 2013), given enough time, students in stand-alone prerequisite remedial courses tend to catch up with their corequisite or nonremedial counterparts. The fact that college composition II pass rates differed by only 6 percentage points after 2 years indicates that this gap would likely shrink significantly, perhaps entirely, after 6 or 8 years. Miller et al. did not find that the corequisite model had any positive effects on persistence or graduation after 2 years.

Finally, Logue et al. (2016) randomized students scoring just beneath the cutoff on a mathematics assessment into either an elementary algebra prerequisite remedial course or a college-level statistics course with workshop assistance, that is, a corequisite mathematics model. Logue et al. (2019) followed up on the initial study with results characterized in Bickerstaff et al. (2022) as increases in pass rates of “19.2 percentage points, credit accumulation by 4.4 credits, and graduation rate by 8.1 percentage points” (p. 6). The problem is that Bickerstaff et al. omitted a critical finding in the Logue et al. study: Table 1 included another completion column that combined graduation and transfer, and there was only a 4.8 percentage point difference between the two groups, a finding that was not statistically significant. Therefore, the two groups’ combined graduation and transfer rates were statistically equivalent after 3 years.

In 2022, Douglas et al. followed up on the Logue et al. (2016, 2019) studies with 7 years of data and found that the graduation rate was 41% for the stand-alone prerequisite remedial prealgebra group and 44% for the corequisite statistics group. Subgroup analyses showed that students in the prerequisite remedial algebra group graduated with an associate’s degree at a rate of about 24%, compared to nearly 25% for the corequisite statistics group. Bachelor’s degree attainment rates were 16.5% (remedial algebra) and 19.5% (corequisite statistics). It is important to note that nearly 8% of the remedial algebra group and just over 5% of the corequisite statistics group were still enrolled in either a community college or a university after 7 years. This exemplifies the timeframe confounding factor noted in Noble and Sawyer (2013).

The most encouraging result from the Douglas et al. (2022) RCT on corequisites was a slight increase in labor market outcomes for the corequisite statistics group, largely because these students were able to start working sooner. Confounding this finding, however, is the fact that the intervention’s shift in curriculum (to statistics) may have led those students into different programs later in their coursework, while those who took prealgebra may have chosen lengthier STEM pathway programs simply because the prerequisite remedial course they were placed into qualified them to take college algebra, a prerequisite for math-intensive programs. This may explain the nearly 3 percentage point gap in students still enrolled in college after 7 years. Again, given that prerequisite remedial students require a slightly longer timeframe to complete their programs, it is likely that the remedial algebra group would catch up to their nonremedial statistics counterparts after 8 or 10 years.

The net results from these small-scale corequisite interventions show little to no effect on graduation rates over time. The increases in first-year gateway course pass rates for corequisite students are somewhat large but also temporary, and the studies do not demonstrate that stand-alone prerequisite remedial courses are in fact a barrier to any completion outcome. Moreover, scaling individual models such as these is well recognized as particularly difficult and often weakens positive effects (Bailey et al., 2020). Finally, other research now indicates that full-scale corequisite implementation produces different outcomes for students at different remedial levels, including harmful results for the most underprepared students (Ran & Lee, 2024).

Accelerated Models

Two other studies of accelerated models included in Bickerstaff et al. (2022) found promising results, including slightly higher graduation rates. First, a 2014 study by Hodara and Jaggars in the City University of New York system found that students placed in accelerated writing courses passed their gateway English courses at a rate 6.1 percentage points higher than that of nonaccelerated students. After 5 years, students in the accelerated model graduated with a degree (associate’s and bachelor’s degrees combined) at a rate 2.2 percentage points higher than the control group. When only associate’s degree attainment was analyzed, there was no practical difference between the two groups after 3 years (1.1 percentage points), and that difference was significant only at the p < 0.10 level. Research has shown that even at p = 0.05, the rate of false positives is nearly 30% (Colquhoun, 2014). Therefore, any results with p-values higher than 0.01 should be interpreted with caution.
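A simplified back-of-the-envelope calculation, in the spirit of Colquhoun’s (2014) argument, illustrates why p-values near 0.05 or 0.10 are weak evidence. The prior probability of a real effect and the statistical power used below are illustrative assumptions, not values drawn from Colquhoun or from any study reviewed here, and this simplified formula does not reproduce Colquhoun’s exact figures; it only shows the same logic.

    # False positive (false discovery) risk among "significant" results,
    # using the standard tree calculation. Prior and power values are
    # illustrative assumptions only.

    def false_positive_risk(alpha, power, prior_real_effect):
        """Share of p < alpha results that are false positives."""
        false_pos = alpha * (1 - prior_real_effect)  # null effects clearing the bar
        true_pos = power * prior_real_effect         # real effects detected
        return false_pos / (false_pos + true_pos)

    # If only 1 in 10 tested interventions truly works and studies have 80% power,
    # a p < .05 result is wrong more than a third of the time.
    print(f"{false_positive_risk(alpha=0.05, power=0.80, prior_real_effect=0.10):.0%}")  # 36%

    # Even with a 50/50 prior, relaxing the threshold from p < .05 to p < .10
    # roughly doubles the risk.
    print(f"{false_positive_risk(alpha=0.05, power=0.80, prior_real_effect=0.50):.0%}")  # 6%
    print(f"{false_positive_risk(alpha=0.10, power=0.80, prior_real_effect=0.50):.0%}")  # 11%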

Second, Douglas et al. (2020b) randomly assigned students to an accelerated mathematics model, either one that included a bridge program or an accelerated one-semester mathematics course, or to a three-semester sequence of stand-alone prerequisite remediation followed by college-level mathematics. After 3 years, students in the accelerated model passed the gateway mathematics course at a rate 33 percentage points higher. Bickerstaff et al. (2022) highlighted the finding that the accelerated group had a 9 percentage point increase in 3-year graduation rates. According to Douglas et al. (2020b), the actual difference was 8.3 percentage points without controls, and the 9 percentage point regression-adjusted difference was statistically significant only at the p < 0.10 level, which, as with Hodara and Jaggars (2014), is typically interpreted as very weak or unreliable evidence (Colquhoun, 2014).

Similar to the corequisite models, the two featured noncorequisite accelerated models in Bickerstaff et al. (2022) showing increases in graduation rates suffer from the confounding factor of not allowing extra time for the stand-alone prerequisite remedial control groups to make up for the semester or semesters they spent in remedial courses. Noble and Sawyer (2013) found that 5-year graduation rates for nonremedial students were equivalent to 6-year graduation rates for remedial students. This delayed effect is particularly relevant when interpreting results in Douglas et al. (2020b) because of the study’s relatively short analysis timeframe and the extensive sequence of mathematics courses required of those remedial students. It is also plausible that Hodara and Jaggars (2014) would find that the 2.2 percentage point difference in overall graduation rates diminishes or disappears if the stand-alone prerequisite remedial students were given another semester’s opportunity to close this minor gap. Since most community college students enroll part-time, researchers have determined that an 8-year timeframe is more appropriate for analyzing transfer and graduation rates (Causey et al., 2022). As shown in the Douglas et al. (2022) corequisite study after 7 years, remedial students typically catch up to nonremedial students given enough time. If an 8-year analysis were provided for these individual studies on corequisite and other accelerated models, it is even more likely that the remedial students would match the nonremedial groups’ outcomes.

Mathematics Pathways in Bickerstaff et al. (2022)

The primary pathway structure highlighted by the CCRC in Bickerstaff et al. (2022) was an accelerated mathematics model created by the Charles A. Dana Center called Dana Center Mathematics Pathways (DCMP). Three studies on DCMP were featured by the CCRC, one RCT and two quasi-experimental studies (Biedzio & Sepanik, 2022; Schudde & Keisler, 2019; Schudde & Meiselman, 2019). All three found small increases in gateway mathematics pass rates overall, little to no increases in college-level credit accumulation, and no increase in graduation rates. This means that the students in the stand-alone prerequisite remedial course comparison groups, in spite of having lower initial gateway college-level mathematics course pass rates, eventually caught up to the DCMP students in the most important of all metrics, degree attainment, which was the original goal of the reforms.

Similar to the design flaw in Logue et al. (2016, 2019), a significant confounding factor in these DCMP studies is that researchers changed the intervention group’s course type, so control groups took stand-alone prerequisite prealgebra remediation while intervention groups took nonalgebra courses such as statistics and quantitative reasoning. This shift in curricula is based on the controversial idea that non-STEM pathway students should be exempt from the requirement to pass college algebra as part of their programs. Therefore, the DCMP essentially conducts two interventions simultaneously: a change in curriculum for the intervention group combined with an accelerated model. Biedzio and Sepanik (2022) described the pathway model’s effects this way:

Students in the program [DCMP] entered into a math pathway, and their first college math course was more likely statistics or quantitative reasoning, while their counterparts not in the program were more likely to take college algebra (although many did take standard statistics or quantitative reasoning courses). (p. 3)

Furthermore, comparing a nonalgebra accelerated model pathway with a stand-alone remedial prealgebra pathway has biased the results positively for DCMP students and negatively for prealgebra students. For instance, Schudde and Keisler (2019) stated, “Enrolling in and passing nonalgebra college math coursework explained the majority of the increase in college math completion” (p. 16). The authors also admitted that “participating in DCMP has a small negative relationship with taking and passing college algebra, lowering the probability of each by about 1 percentage point” (p. 16). Schudde and Meiselman (2019) concluded, “The observed increase in college-level math course enrollment and completion among DCMP students may partially stem from students taking non-algebra college-level math, a key component of the DCMP model” (p. 11).

Studies on other corequisite models have corroborated the finding that the pathway shift from stand-alone prerequisite remedial prealgebra to statistics and other nonalgebra courses was the primary cause of increased outcomes, and these improved outcomes were not directly attributable to the accelerated or corequisite model per se. Ran and Lin (2022) studied the full corequisite model in the Tennessee state system of community colleges and concluded, “The positive impacts of placing into corequisite math were largely driven by the colleges’ math pathways reforms, instead of the structural change to mainstream students in college-level courses right after enrollment” (p. 480).

Finally, Sepanik (2023) reported updated 5-year results for the DCMP model, following Zachry Rutschow et al.’s (2019b) initial findings and Biedzio and Sepanik’s (2022) 3-year results. The most positive outcome was a relatively small increase in first gateway mathematics pass rates for the DCMP students of 5.6 percentage points, down from the 6 percentage points reported for the 3-year timeframe in Biedzio and Sepanik (2022). Unfortunately, Sepanik (2023) also found several disappointing results: no effect on subsequent college mathematics course pass rates, no effect on overall college-level credits, minimal impact on persistence, and no difference in graduation rates after 5 years. It is important to note that the creators of the DCMP model hypothesized that “changes in math completion would help students persist in college longer and accumulate more overall college credits, and ultimately be more likely to earn a certificate or degree” (p. 3). None of these goals were achieved.

Other pathway models that include mathematics pathways have been promoted by the CCRC. Since the publication of the CCRC book Redesigning America’s Community Colleges: A Clearer Path to Student Success (Bailey et al., 2015), the CCRC has focused on implementing a holistic yet piecemeal intervention it termed guided pathways. After repeated analyses of selected institutions that have implemented parts of the guided pathways model, CCRC authors have acknowledged that the model has yet to produce any meaningful or sustained increases in outcomes (Jenkins et al., 2018, 2021, 2024). Bickerstaff et al. (2022) do not include any research on guided pathways among their 17 most rigorous studies, yet they mention the model numerous times elsewhere in the paper and promote it as both popular and effective.

A Shift in the Goal Line From Graduation to Gateway Pass Rates: The Theory of Early Momentum

CCRC researchers claimed as early as 2008 that remediation and developmental education served as barriers (Bailey, 2008; Bailey et al., 2008, 2009, 2010). In 2012, CCRC researchers Zeidenberg et al. studied other coursework that they argued serves as a barrier for incoming public 2-year college students, apart from the first-year English and mathematics remedial courses that had been the primary focus of reforms since 2009. Zeidenberg et al. found that “many introductory college-level courses in other subjects also served as obstacles to completion for many students, and these latter courses posed obstacles just as great as college math and English” (p. 28). By 2017, CCRC researchers had recognized that the most popular reforms to remedial English and mathematics would not significantly increase graduation rates (Jaggars & Bickerstaff, 2018). In spite of their history of advocacy for these popular reforms, top CCRC authors came to a remarkable conclusion in a chapter on developmental education reform in 2018:

Research suggests that the most popular reform models (including multiple measures assessment and placement, math pathways, and the co-requisite approach) will indeed improve students’ rate of success in college-level math and English, but they are unlikely to substantially improve graduation rates. (p. 496)

Nevertheless, the CCRC and other interest groups persisted in promoting the most common reforms primarily to English and mathematics remedial courses. More recently, Bickerstaff et al. (2022) argued that these reforms have been successful, despite the fact that other common first-year courses are still presumably serving as barriers, and despite the fact that graduation rates have not meaningfully increased as a result of the reforms.

One reason CCRC researchers and other reform proponents may claim that the three most common reforms are successful is that they subscribe to the theory of early momentum. The theory of early momentum was introduced and developed by Adelman (1999) and others (Attewell et al., 2012; Wang, 2017). Both Adelman and Attewell et al. documented strong correlations between early outcomes, such as credits earned in high school or in the first year of college, and subsequent graduation rates. As one illustration, students who take more credits in their first year of college typically have higher graduation rates.

Based on these correlations, CCRC researchers Belfield et al. (2019) proposed a focus on improving early momentum metrics (EMMs) as a way for institutions to gauge whether reforms could be considered successful. Belfield et al. concluded that “college outcomes would be substantially higher if more students met EMMs” (p. 1). The problem with this intuitive assumption is that correlation is not causation: Moving underprepared students into college-level courses or increasing the number of college-level credits that part-time students take in the first year of their community college program may not in fact cause those students to graduate at higher rates. Many factors beyond college-level credits contribute to student success, not least socioeconomic status, parental education, income, and high school quality (Adelman, 1999). Causation and correlation are difficult to disentangle when analyzing fundamentally different groups of students, especially when there are so many confounding variables.
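A small simulation with invented parameters illustrates the point: when an unobserved factor (labeled “preparation” here purely for illustration) drives both first-year credits and graduation, the two are strongly correlated even though raising credits by fiat changes nothing.

    # Illustrative simulation (invented parameters): an unobserved factor drives
    # both early credit accumulation and graduation, producing the familiar
    # correlation even though credits have no causal effect in this toy model.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Unobserved "preparation" (academic, financial, and family resources).
    prep = rng.normal(size=n)

    # First-year credits and graduation both depend on preparation; by
    # construction, credits themselves have zero direct effect on graduation.
    credits = np.clip(12 + 6 * prep + rng.normal(scale=4, size=n), 0, 30)
    grad = rng.random(n) < 1 / (1 + np.exp(-(-0.5 + 1.5 * prep)))

    # The correlational pattern behind early momentum metrics: students with
    # more first-year credits graduate at much higher rates.
    high_credits = credits >= np.median(credits)
    print(f"Graduation, above-median credits: {grad[high_credits].mean():.1%}")
    print(f"Graduation, below-median credits: {grad[~high_credits].mean():.1%}")

    # But "intervening" on credits alone (add 6 credits to everyone's first year)
    # changes nothing here, because preparation, not credits, drives completion.
    credits_boosted = credits + 6
    print(f"Graduation rate after the credit boost: {grad.mean():.1%} (unchanged)")

The gap between the high- and low-credit groups in this toy model is entirely an artifact of the shared driver, which is exactly the possibility that the EMM framework does not rule out.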

Early Momentum Does Not In Fact Lead to Increased Graduation Rates: A Corequisite Reform Actually Caused Lower Completion Outcomes

The theory of early momentum posits that if students pass their gateway courses at higher rates, they will be more likely to graduate. Working under this assumption, and under the argument that remediation and developmental education had been barriers, the Tennessee Board of Regents became one of the earliest adopters of the corequisite model. Decisionmakers there replaced stand-alone prerequisite remediation with full-scale corequisites for all students in the fall semester of 2015 in almost all community colleges in the system. Ran and Lee (2024) conducted the most comprehensive research to date on the full implementation of corequisites in Tennessee. The authors showed that overall graduation rates did not increase as a result of the model. More importantly, the most underserved students failed out of college at a higher rate, and, surprisingly, the corequisite model caused graduation rates to decline for this subgroup of students.

The specifics of this difference-in-differences (DID) design are important for practitioners and policymakers who wish to see what happens when reforms that were largely theoretical during the first few years of the movement are implemented at full scale. First, unlike Ran and Lin’s (2022) initial study of the Tennessee reforms, which covered only students beneath the cutoff for college-level courses, Ran and Lee (2024) expanded the sample to include students at all levels. Second, the reform was originally implemented to increase graduation rates, though in 2017 the stated goal shifted to increasing first-year metrics associated with early momentum. First-year gateway course pass rates did indeed increase by several percentage points. However, Ran and Lee found that the reform “had null effects on the number of college-level credits earned or the likelihood of transfer to a public four-year university by the end of the third year” (p. 4). More importantly, the authors reported that “the proportion of students earning degrees at the associate level or higher did not change significantly before and after the corequisite reform” (p. 24).
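For readers unfamiliar with the design, the sketch below shows the generic logic of a two-group, two-period DID estimate on simulated data. It is not Ran and Lee’s (2024) specification, sample, or data; the variable names and effect sizes are invented for illustration.

    # Generic difference-in-differences (DID) sketch on simulated data.
    # This is NOT Ran and Lee's (2024) model or data; all values are invented.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 20_000

    df = pd.DataFrame({
        "treated_college": rng.integers(0, 2, n),  # 1 = college adopted the reform
        "post": rng.integers(0, 2, n),             # 1 = cohort enrolled after the mandate
    })

    # Simulated outcome: gateway pass rates trend upward everywhere in the post
    # period, and the reform adds a further 5-point bump at treated colleges.
    base, trend, effect = 0.45, 0.03, 0.05
    p = base + trend * df["post"] + effect * df["treated_college"] * df["post"]
    df["passed_gateway"] = (rng.random(n) < p).astype(float)

    # The DID estimate is the coefficient on the interaction term, which nets out
    # both the pre-existing group difference and the shared time trend.
    model = smf.ols("passed_gateway ~ treated_college * post", data=df).fit(
        cov_type="HC1"  # heteroskedasticity-robust standard errors
    )
    print(model.params["treated_college:post"])  # about 0.05, the simulated effect

The strength of the design is that a common time trend (here, the 3-point improvement everywhere) does not get attributed to the reform; its weakness is that subgroup harms, like those Ran and Lee document for the lowest scoring students, can be invisible in the overall coefficient unless the analysis is run separately for those students.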

Perhaps it is unsurprising that a widescale reform did not increase graduation rates as anticipated, given the herculean effort required to increase completion overall. In addition to this finding, however, Ran and Lee (2024) discovered a rather startling outcome: “Remedial students, particularly those with lower placement scores, were more likely to drop out and were less likely to earn short-term certificates” (p. 29). This means that students who typically would have taken stand-alone prerequisite remedial and developmental education courses dropped out of college at a higher rate after the change. These students also had lower graduation rates as a result of the corequisite model.

In other words, if the Tennessee Board of Regents had not implemented any reforms in the fall of 2015, their graduation rates would actually now be higher for the most underserved students. It is critical to note that students of color constituted 55% of this lowest scoring group, and they were also more likely than higher scoring students to be of low socioeconomic status. Therefore, the full implementation of the corequisite model had a disproportionately negative effect on students of color and students of low income in public 2-year colleges.

Ran and Lee (2024) concluded that reforms to the structure of first-year courses were inadequate for overcoming the challenges that the most underprepared students face in community colleges. In fact, they called into question “whether remediation reform alone is expected to improve college completion” (p. 29). The authors combined their results with those of several other studies and stated, “The problems with traditional remediation models were not the primary drivers of low college completion rates” (p. 30). Perhaps the following statement sums up the most important finding in Ran and Lee’s critical DID study covering 10 years of data:

Our results suggest that colleges need to think beyond what throughput rates measure and place DE reforms in the context of the broader sets of institutional support to help students complete a postsecondary degree, as replacing prerequisite remediation with corequisite models alone was not enough to solve the completion problem. (p. 25)

Numerous Studies Showing Positive Results for Remediation and Developmental Education Have Been Excluded From the Debate

The primary narrative of the community college reform movement that began in 2010 was that remediation and developmental education were a barrier and that bypassing them with multiple measures, removing them with legislation, or accelerating them with corequisites and other pathway models would improve graduation rates. Then, in approximately 2017, that narrative shifted to a focus on increasing first-year gateway pass rates, based on the theory of early momentum. Since the beginning of the reform movement, experts have almost never publicly expressed the idea that some remediation and developmental education courses and programs should be retained, especially for the most underserved students. The only explicit statement of that idea appears in a paper by Bailey et al. (2013):

While our research does conclude that the current system of developmental education does not work very well for many students, we do not advocate—nor do we believe that the results of our research support—the elimination of developmental education, the placing of all students into college courses, or the wholesale conversion of developmental education into a co-requisite model. (p. 2)

In part, this statement was a response to the laws passed in Connecticut and Florida in 2012 and 2013, which eliminated remediation and developmental education or made it optional (Fain, 2012a, 2012b; Smith, 2019). Since then, it has been difficult to find statements by experts supporting that idea.

In addition to the lack of support among current and recent CCRC researchers for Bailey et al.’s (2013) statement, surprisingly absent from the discussion over the 15 years since the start of the community college reform movement is the fact that numerous studies, several in top higher education journals, have shown that prerequisite remedial and developmental education courses actually benefit students. For example, Sanabria et al. (2020) stated, “Taking remediation is associated with a nearly nine percentage-point increase in bachelor’s degree completion for 2-year college students after accounting for demographic, familial, and academic background characteristics” (p. 474). Saw (2019) reported, “For 2-year college students, remediation enrollment in both mathematics and English improved the likelihood of transferring to a 4-year college and earning a bachelor’s degree” (p. 298). Turk (2019) found that “when two groups of statistically similar students were compared, developmental education generally improved the chances of earning an associate degree” (p. 1090). These three studies were published in Research in Higher Education and the Journal of Higher Education.

Several other studies over the past 2 decades have shown positive outcomes for students in remediation and developmental education. In 2005, Cabrera et al. found that for the lowest socioeconomic group in their sample, “taking remedial reading actually increases their likelihood of transferring by 24%” (p. 174). Fike and Fike (2008) showed that students who enrolled in remedial mathematics but did not pass still achieved higher retention rates than students who never enrolled in remedial mathematics. Bettinger and Long (2005, 2009) used a quasi-experimental approach with English remedial students and reported that these students were “about 12 percentage points less likely to drop out and 11 percentage points more likely to graduate within six years” (p. 755).

Since 2008, even top CCRC researchers have repeatedly cited studies showing positive results for remediation and developmental education. For instance, at the CCRC-affiliated National Center for Postsecondary Research (NCPR), Calcagno and Long (2008) found an increase in persistence as a result of remediation in community colleges. Boatman and Long (2010, 2018), also publishing through the NCPR, found that “placement into the lowest level of Remedial Writing appears to have positive effects on student persistence, college-level credit accumulation, and degree completion” (p. 20).

As cited in Bailey’s (2008) working paper, Lesik (2006) used a regression discontinuity design (RDD) and concluded that students in remedial mathematics programs were more likely to pass their gateway courses on the first attempt. Bailey’s working paper also cited work by Moss and Yeaton, whose RDD study reported that developmental education “students experienced higher achievement at the cut score when compared to [nondevelopmental education] students” (Moss & Yeaton, 2013, p. 393). Finally, in 2018, CCRC researchers released a primer for developmental education policymakers arguing in favor of the most common reforms (Ganga et al., 2018). The authors cited research on developmental education by Chen (2016) multiple times, yet they did not highlight one of the most significant findings of that study: 49% of remedial coursetakers passed all their remedial courses and went on to graduate at higher rates than nonremedial students.

This research does not prove that stand-alone prerequisite remediation and developmental education would be effective for many or most incoming community college students. However, these studies, including the early statements by top CCRC researchers, contradict the narrative that most experts at the CCRC and interest groups have been advancing based on their interpretation of the research they have created and cited over the past 15 years. Clearly, remediation and developmental education can and does play a beneficial role in student success. If more weight had been placed on these studies, perhaps researchers, policymakers, and legislators might not have chosen to eliminate and restrict most stand-alone prerequisite remediation and developmental education nationwide.

Effective Developmental Education as It Was Originally Envisioned: The City University of New York Accelerated Study in Associate Programs

In addition to the numerous studies showing that remediation and developmental education can be effective in improving outcomes, there is a comprehensive model that resembles developmental education as it was originally envisioned by experts in the field (Boylan, 2002). According to Boylan et al. (1999), developmental education is a system of supports for at-risk college students that includes counseling and advising, tutoring, learning supports, supplemental instruction, and remedial courses, among other components, all guided by the principles of adult education theory.

The framework of developmental education was embodied in a remarkably effective reform that was not originally part of the CCRC plan but was nonetheless featured in Bickerstaff et al. (2022): a comprehensive model called the City University of New York Accelerated Study in Associate Programs (CUNY ASAP). The program nearly doubles graduation rates for underserved students, includes prerequisite developmental education courses combined with holistic services and other components, and has been replicated in several states (Azurdia & Galkin, 2020; Miller & Weiss, 2021; Scrivener et al., 2015). The cost per student per year for one of the replicated versions of CUNY ASAP is as low as $1,800 (Miller & Weiss, 2021). Therefore, contrary to the narrative promoted by the CCRC and interest groups, rigorous data show that developmental education, including stand-alone prerequisite remedial courses, can in fact increase graduation rates significantly.

Conclusion

After a decade of the CCRC’s and other interest groups’ well-funded attempts to reform community colleges to improve graduation rates, there has been no meaningful increase in completion overall for public 2-year students. More critically, as a result of the full implementation of corequisites in at least one state system, there have been higher stopout rates for the most at-risk students, and these students also now have lower graduation rates. It is likely that widescale blunt reforms, such as mandates requiring students at all levels to participate in corequisites or be placed into college-level courses with multiple measures, will result in equally disappointing outcomes.

Additionally, individual studies on piecemeal reforms involving multiple measures, corequisites and other accelerated models, and mathematics pathways, particularly the 10 studies featured in this critical review, have shown only temporary and limited improvements. The theory of early momentum, which postulates that early increases in pass rates will lead to later increases in graduation rates, has not proven to be an accurate or reliable framework. For corequisites, most of the positive results are due not to the structure itself but to a shift in curriculum, especially in mathematics. The corequisite model’s key component, placing all underprepared students into college-level courses with varying degrees of support, is harming many underserved students, as evidenced now by lower graduation rates. Combining these findings with the numerous studies showing positive results for stand-alone prerequisite remediation and developmental education over the past 2 decades, a policymaker could conclude that these reforms have not served students well overall.

The community college reform movement has therefore failed to achieve the College Completion Agenda’s goals. In fact, the widespread policy changes to eliminate and severely restrict stand-alone prerequisite remediation and developmental education may have been a serious mistake that is now creating a new barrier for the most underrepresented public 2-year students, particularly students of color and students of low income. On the other hand, it is clear that the community college reform movement has improved first-year outcomes for some students. This is laudable.

However, the preponderance of evidence in this literature review demonstrates that to improve outcomes for the most underserved students, community college systems and institutions that participated in the reform movement may need to reevaluate their choices and begin offering beneficial and necessary prerequisite remedial and developmental education courses as options to students who need them, along with a system of sustained and well-funded supports that continue beyond the first year. This system of supports, including stand-alone prerequisite remedial and developmental education coursework, was the original design and intent of the term developmental education (Boylan et al., 1999; Boylan, 2002). CUNY ASAP’s success in increasing graduation rates is proof that developmental education is not a barrier. According to the research, the framework of developmental education, including remedial coursework, should be recognized as evidence-based and beneficial. Therefore, the thoughtful implementation of comprehensive and well-funded support programs such as developmental education should be considered as a necessary part of student success at community colleges going forward.

[1] An RCT is the gold standard of research designs because it is best able to demonstrate the effects of an intervention. It is difficult to execute and thus relatively rare in higher education research.

 

References

Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns, and bachelor’s degree attainment. National Institute on Postsecondary Education, Libraries, and Lifelong Learning. http://files.eric.ed.gov/fulltext/ED431363.pdf

Attewell, P., Heil, S., & Reisel, L. (2012). What is academic momentum? And does it matter? Educational Evaluation and Policy Analysis, 34(1), 27–44. https://journals.sagepub.com/doi/10.3102/0162373711421958

Azurdia, G., & Galkin, K. (2020). An eight-year cost analysis from a randomized controlled trial of CUNY’s Accelerated Study in Associate Programs (Working Paper). MDRC. https://www.mdrc.org/sites/default/files/ASAP_Cost_Working_Paper_final.pdf

Bailey, T. R. (2008). Challenge and opportunity: Rethinking the role and function of developmental education in community college (CCRC Working Paper No. 14). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/challenge-and-opportunity.pdf

Bailey, D. H., Duncan, G. J., Cunha, F., Foorman, B. R., & Yeager, D. S. (2020). Persistence and fade-out of educational-intervention effects: Mechanisms and potential solutions. Psychological Science in the Public Interest: A Journal of the American Psychological Society, 21(2), 55–97. https://doi.org/10.1177/1529100620915848

Bailey, T. R., & Cho, S. W. (2010). Developmental education in community colleges: The White House Summit on Community College [Issue brief]. Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/developmental-education-community-colleges.pdf

Bailey, T. R., Jaggars, S. S., & Jenkins, D. (2015). Redesigning America’s community colleges: A clearer path to student success. Harvard University Press.

Bailey, T. R., Jaggars, S. S., & Scott-Clayton, J. (2013). Characterizing the effectiveness of developmental education: A response to recent criticism. Community College Research Center, Teachers College, Columbia University. http://files.eric.ed.gov/fulltext/ED542142.pdf

Bailey, T. R., Jeong, D. W., & Cho, S. W. (2008). Referral, enrollment, and completion in developmental education sequences in community colleges (CCRC Working Paper No. 15, Version 1). Community College Research Center, Teachers College, Columbia University.

Bailey, T. R., Jeong, D. W., & Cho, S. W. (2009). Referral, enrollment, and completion in developmental education sequences in community colleges (CCRC Working Paper No. 15). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/referral-enrollment-completion-developmental_V2.pdf

Bailey, T. R., Jeong, D. W., & Cho, S. W. (2010). Referral, enrollment, and completion in developmental education sequences in community colleges. Economics of Education Review, 29(2), 255–270. https://doi.org/10.1016/j.econedurev.2009.09.002

Barnett, E., Kopko, E., Cullinan, D., & Belfield, C. (2020). Who should take college-level courses? Impact findings from an evaluation of a multiple measures assessment strategy. Center for the Analysis of Postsecondary Readiness and Community College Research Center, Teachers College, Columbia University, and MDRC. https://postsecondaryreadiness.org/wp-content/uploads/2020/10/multiple-measures-assessment-impact-findings.pdf

Barshay, J. (2018, February 19). How to help students avoid the remedial ed trap. The Hechinger Report. https://hechingerreport.org/help-students-avoid-remedial-ed-trap/

Belfield, C. R., Jenkins, D., & Fink, J. (2019). Early momentum metrics: Leading indicators for community college improvement [CCRC research brief]. Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/early-momentum-metrics-leading-indicators.pdf

Bettinger, E. P., & Long, B. T. (2009). Addressing the needs of underprepared students in higher education: Does college remediation work? Journal of Human Resources, 44(3), 736–771. https://www.jstor.org/stable/20648915

Bickerstaff, S., Beal, K., Raufman, J., Lewy, E. B., & Slaughter, A. (2022). Five principles for reforming developmental education: A review of the evidence [Report]. Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University, and MDRC. https://ccrc.tc.columbia.edu/media/k2/attachments/capr-synthesis-report-final.pdf

Biedzio, D., & Sepanik, S. (2022). Interim findings from the Dana Center Mathematics Pathways long-term follow-up study. Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University. https://files.eric.ed.gov/fulltext/ED620146.pdf

Bill & Melinda Gates Foundation. (2010, April 20). Foundation giving $110 million to transform remedial education. https://www.gatesfoundation.org/ideas/media-center/press-releases/2010/04/foundation-giving-$110-million-to-transform-remedial-education

Boatman, A., & Long, B. T. (2010). Does remediation work for all students? How the effects of postsecondary remedial and developmental courses vary by level of academic preparation [NCPR working paper]. National Center for Postsecondary Research, Teachers College, Columbia University.

Boatman, A., & Long, B. T. (2018). Does remediation work for all students? How the effects of postsecondary remedial and developmental courses vary by level of academic preparation. Educational Evaluation and Policy Analysis, 40(1), 29–58. https://doi.org/10.3102/0162373717715708

Boylan, H. R. (2002). What works: Research-based best practices in developmental education. Continuous Quality Improvement Network and the National Center for Developmental Education, Appalachian State University. https://books.google.com/books/about/What_Works.html?id=R4DWAAAAMAAJ

Boylan, H. R., Bonham, B. S., & White, S. R. (1999). Developmental and remedial education in postsecondary education. New Directions for Higher Education, 1999(108), 87–101. https://doi.org/10.1002/he.10806

Cabrera, A. F., Burkum, K. R., & La Nasa, S. M. (2005). Pathways to a four-year degree: Determinants of transfer and degree completion. In A. Seidman (Ed.), College student retention: A formula for success (pp. 155–214). ACE/Praeger Series on Higher Education. https://www.researchgate.net/publication/261699379_Pathways_to_a_four_year_degree_Determinants_of_transfer_and_degree_completion

Calcagno, J. C., & Long, B. T. (2008). The impact of postsecondary remediation using a regression discontinuity approach: Addressing endogenous sorting and noncompliance [NCPR working paper]. National Center for Postsecondary Research, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/impact-remediation-regression-discontinuity.pdf

Causey, J., Lee, S., Ryu, M., Scheetz, A., & Shapiro, D. (2022). Completing college: National and state report with longitudinal data dashboard on six- and eight-year completion rates (Signature Report 21). National Student Clearinghouse Research Center. https://nscresearchcenter.org/wp-content/uploads/Completions_Report_2022.pdf

Center for the Analysis of Postsecondary Readiness. (2024a, August 8). How research informed policy in Arkansas’s adoption of multiple measures. Community College Research Center, Teachers College, Columbia University. https://postsecondaryreadiness.org/arkansas-adoption-of-multiple-measures/

Center for the Analysis of Postsecondary Readiness. (2024b, August 19). Ten years of CAPR. Community College Research Center, Teachers College, Columbia University. https://postsecondaryreadiness.org/ten-years-of-capr/

Cho, S. W., Kopko, E., Jenkins, D., & Jaggars, S. S. (2012). New evidence of success for community college remedial English students: Tracking the outcomes of students in the Accelerated Learning Program (ALP) (CCRC Working Paper No. 53). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/ccbc-alp-student-outcomes-follow-up.pdf

Cohen, A. M., Brawer, F. B., & Kisker, C. B. (2014). The American community college (6th ed.). Jossey-Bass.

Colquhoun, D. (2014). An investigation of the false discovery rate and the misinterpretation of p-values. Royal Society Open Science, 1(3), 140216. https://doi.org/10.1098/rsos.140216

Complete College America. (2012). Remediation: Higher education’s bridge to nowhere. Bill & Melinda Gates Foundation. https://eric.ed.gov/?id=ED536825

Complete College America. (2022, September 20). Complete College America awarded five-year grant to increase equitable student outcomes in higher education. https://completecollege.org/resource/complete-college-america-awarded-five-year-grant-to-increase-equitable-student-outcomes-in-higher-education/

Cullinan, D., Barnett, E., Ratledge, A., Welbeck, R., Belfield, C., & Lopez, A. (2018). Toward better college course placement: A guide to launching a multiple measures placement system. MDRC and Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/2018_Multiple_Measures_Guide_1.pdf

Cullinan, D., & Biedzio, D. (2021). Increasing gatekeeper course completion: Three-semester findings from an experimental study of multiple measures assessment and placement. MDRC and Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/publications/increasing-gatekeeper-course-completion.html

Douglas, D., Edwards, R., & McKay, H. (2020a). First in the world—AMP-UP, Union County College: Final evaluation report. Rutgers School of Management and Labor Relations, Education and Employment Research Center. https://files.eric.ed.gov/fulltext/ED608872.pdf

Douglas, D., Logue, A. W., & Watanabe-Rose, M. (2022). The long-term impacts of corequisite mathematics remediation with statistics: Degree completion and wage outcomes. Educational Researcher, 52(1), 7–15. https://journals.sagepub.com/doi/full/10.3102/0013189X221138848

Douglas, D., McKay, H., & Edwards, R. (2020b). Accelerating mathematics: Findings from the AMP-UP program at Bergen Community College. Rutgers School of Management and Labor Relations, Education and Employment Research Center. https://files.eric.ed.gov/fulltext/ED608780.pdf

Education Commission of the States. (2012). Core principles for transforming remedial education: A joint statement. Charles A. Dana Center, Complete College America, Inc., Education Commission of the States, and Jobs for the Future. https://www.ecs.org/docs/STATEMENTCorePrinciples.pdf

Education Commission of the States. (2015, November 10). Core principles for transforming remediation within a comprehensive student success strategy: A joint statement. https://www.insidehighered.com/sites/default/server_files/files/core_principles_nov5.pdf

Education Commission of the States. (2021, December 10). 50-state comparison: Developmental education policies. https://www.ecs.org/50-state-comparison-developmental-education-policies/

Fain, P. (2012a, April 3). How to end remediation. Inside Higher Ed. https://www.insidehighered.com/news/2012/04/04/connecticut-legislature-mulls-elimination-remedial-courses

Fain, P. (2012b, May 6). Connecticut legislature passes remedial education overhaul. Inside Higher Ed. https://www.insidehighered.com/quicktakes/2012/05/07/connecticut-legislature-passes-remedial-education-overhaul

Field, K. (2015, January 20). 6 years in and 6 to go, only modest progress on Obama’s college-completion goal. The Chronicle of Higher Education. https://www.chronicle.com/article/6-Years-in6-to-Go-Only/151303

Fike, D. S., & Fike, R. (2008). Predictors of first-year student retention in the community college. Community College Review, 36(2), 68–88. https://journals.sagepub.com/doi/abs/10.1177/0091552108320222

Ganga, E., Mazzariello, A., & Edgecombe, N. (2018). Developmental education: An introduction for policymakers. Education Commission of the States, and the Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University. https://www.ecs.org/wp-content/uploads/Developmental-Education_An-Introduction-for-Policymakers.pdf

Gordon, L. (2016, November 9). Remedial courses are ‘barriers’ for many community college students, report says. EdSource. https://edsource.org/2016/remedial-courses-are-barriers-for-many-community-college-students-report-says/572483

Goudas, A. M., & Boylan, H. R. (2012). Addressing flawed research in developmental education. Journal of Developmental Education, 36(1), 2–13. http://files.eric.ed.gov/fulltext/EJ1035669.pdf

Hodara, M., & Xu, D. (2016). Does developmental education improve labor market outcomes? Evidence from two states. American Educational Research Journal, 53(3), 781–813. https://doi.org/10.3102/0002831216647790

Institute of Education Sciences. (n.d.). Community College Research Center and related organizations’ IES grant monies search results. https://ies.ed.gov/funding/grantsearch/index.asp?mode=2&sort=1&order=1&all=1&search=PrincipalAffiliationName&slctAffiliation=304&GrantsPageNum=1

Jaggars, S. S., & Bickerstaff, S. (2018). Developmental education: The evolution of research and reform. In M. B. Paulsen (Ed.), Higher education: Handbook of theory and research (pp. 469–503). Springer International Publishing. https://doi.org/10.1007/978-3-319-72490-4_10

Jenkins, D., & Bailey, T. (2017). Early momentum metrics: Why they matter for college improvement (CCRC Brief No. 65). Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/early-momentum-metrics-college-improvement.pdf

Jenkins, D., Brown, A. E., Fink, J., & Lahr, H. (2018). Building guided pathways to community college student success: Promising practices and early evidence from Tennessee [Report]. Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/building-guided-pathways-community-college-student-success.pdf

Jenkins, D., Lahr, H., & Brock, T. (2024). Lessons from two major evaluations of guided pathways [Research brief]. Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/lessons-two-major-evaluations-guided-pathways.pdf

Jenkins, D., Lahr, H., & Mazzariello, A. (2021). How to achieve more equitable community college outcomes: Lessons from six years of CCRC research on guided pathways [Report]. Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/equitable-community-college-student-outcomes-guided-pathways.pdf

Kelderman, E. (2020, January 7). Happy new year, higher ed: You’ve missed your completion goal. The Chronicle of Higher Education. https://www.chronicle.com/article/happy-new-year-higher-ed-youve-missed-your-completion-goal/

Kim, A. (2024). Incomplete: The unfinished revolution in college remedial education. FutureEd. https://www.future-ed.org/wp-content/uploads/2024/06/FutureEd-Dev-Ed-Reform-Report.pdf

Kopko, E., Daniels, H., & Cullinan, D. (2023). The long-term effectiveness of multiple measures assessment: Evidence from a randomized controlled trial [CAPR working paper]. MDRC and Community College Research Center, Teachers College, Columbia University. https://www.mdrc.org/sites/default/files/long-term-effectiveness-multiple-measures-assessment_0.pdf

Lesik, S. A. (2006). Applying the regression-discontinuity design to infer causality with non-random assignment. The Review of Higher Education, 30(1), 1–19. https://muse.jhu.edu/article/203468/pdf

Litschwartz, S., Cullinan, D., & Plancarte, V. (2023). Multiple measures assessment and corequisite courses: Alternate ways to place and prepare new college students [CAPR research brief]. Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University. https://postsecondaryreadiness.org/multiple-measures-assessment-corequisite-alternate-place/

Logue, A. W. (2021, February 3). Who needs remediation? Inside Higher Ed. https://www.insidehighered.com/views/2021/02/03/academe-should-get-rid-remediation-and-conceptual-framework-surrounds-it-opinion

Logue, A. W., Douglas, D., & Watanabe-Rose, M. (2019). Corequisite mathematics remediation: Results over time and in different contexts. Educational Evaluation and Policy Analysis, 41(3), 294–315. https://doi.org/10.3102/0162373719848777

Logue, A. W., Watanabe-Rose, M., & Douglas, D. (2016). Should students assessed as needing remedial mathematics take college-level quantitative courses instead? A randomized controlled trial. Educational Evaluation and Policy Analysis, 38(3), 578–598. http://journals.sagepub.com/doi/pdf/10.3102/0162373716649056

Lumina Foundation. (n.d.). A stronger nation: Learning beyond high school builds American talent. https://www.luminafoundation.org/stronger-nation/report/#/progress

Marcus, J. (2019, January 14). 10 years later, goal of getting more Americans through college is way behind schedule. The Hechinger Report. https://hechingerreport.org/10-years-later-goal-of-getting-more-americans-through-college-is-way-behind-schedule/

Merisotis, J., & Phipps, R. (2000). Remedial education in colleges and universities: What’s really going on? The Review of Higher Education, 24(1), 67–86. https://muse.jhu.edu/article/30114/pdf

Miller, C., & Weiss, M. (2021). Increasing community college graduation rates: A synthesis of findings on the ASAP model from six colleges across two states. MDRC. https://files.eric.ed.gov/fulltext/ED611732.pdf

Miller, T., Daugherty, L., Martorell, P., & Gerber, R. (2022). Assessing the effect of corequisite English instruction using a randomized controlled trial. Journal of Research on Educational Effectiveness, 15(1), 78–102. https://doi.org/10.1080/19345747.2021.1932000

Moss, B. G., & Yeaton, W. H. (2013). Evaluating effects of developmental education for college students using a regression discontinuity design. Evaluation Review, 37(5), 370–404. https://doi.org/10.1177/0193841X14523620

Nellum, C. J., & Hartle, T. W. (2016). Where have all the low-income students gone? American Council on Education. http://static.politico.com/49/73/624217574fc19fbe6d4d3a0100d0/aces-membership-magazine-article-on-low-income-students.pdf

Noble, J. P., & Sawyer, R. (2013). A study of the effectiveness of developmental courses for improving success in college. ACT Research Report Series. http://www.act.org/content/dam/act/unsecured/documents/ACT_RR2013-1.pdf

Ochoa, E. M. (2011, October 5). Renewing the American dream: The college completion agenda. The White House: President Barack Obama. https://obamawhitehouse.archives.gov/blog/2011/10/05/renewing-american-dream-college-completion-agenda

Palmer, I. (2016, March). How to fix remediation at scale. New America. https://www.luminafoundation.org/wp-content/uploads/2017/08/how-to-fix-remediation-at-scale.pdf

Ran, F. X., & Lee, H. (2024). Does corequisite remediation work for everyone? An exploration of heterogeneous effects and mechanisms (EdWorkingPaper: 24-928). Annenberg Institute at Brown University. https://doi.org/10.26300/h26j-2484

Ran, F. X., & Lin, Y. (2022). The effects of corequisite remediation: Evidence from a statewide reform in Tennessee. Educational Evaluation and Policy Analysis, 44(3), 458–484. https://doi.org/10.3102/01623737211070836

Roueche, J. E. (1968). Salvage, redirection, or custody: Remedial education in the community junior college. ERIC Clearinghouse for Junior College Information, American Association of Junior Colleges. https://eric.ed.gov/?id=ED019077

Sanabria, T., Penner, A., & Domina, T. (2020). Failing at remediation? College remedial coursetaking, failure and long-term student outcomes. Research in Higher Education, 61, 459–484. https://doi.org/10.1007/s11162-020-09590-z

Saw, G. K. (2019). Remedial enrollment during the 1st year of college, institutional transfer, and degree attainment. Journal of Higher Education, 90(2), 298–321. https://www.tandfonline.com/doi/full/10.1080/00221546.2018.1493668

Schudde, L., & Keisler, K. (2019). The relationship between accelerated dev-ed coursework and early college milestones: Examining college momentum in a reformed mathematics pathway. AERA Open, 5(1), 1–22. https://doi.org/10.1177%2F2332858419829435

Schudde, L., & Meiselman, A. Y. (2019). Early outcomes of Texas community college students enrolled in Dana Center Mathematics Pathways prerequisite developmental courses [CAPR research brief]. Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University, and MDRC. https://ccrc.tc.columbia.edu/media/k2/attachments/early-outcomes-math-pathways-developmental-courses.pdf

Scott-Clayton, J. (2012). Do high-stakes placement exams predict college success? (CCRC Working Paper No. 41). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/high-stakes-predict-success.pdf

Scott-Clayton, J., Crosta, P. M., & Belfield, C. R. (2014). Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36(3), 371–393. https://doi.org/10.3102/0162373713517935

Scrivener, S., Weiss, M. J., Ratledge, A., Rudd, T., Sommo, C., & Fresques, H. (2015). Doubling graduation rates: Three-year effects of CUNY’s Accelerated Study in Associate Programs (ASAP) for developmental education students. MDRC. http://www.mdrc.org/sites/default/files/doubling_graduation_rates_fr.pdf

Sepanik, S. (2023). Impact findings from the Dana Center Mathematics Pathways long-term follow-up study [CAPR research brief]. Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University. https://postsecondaryreadiness.org/impact-findings-dana-center-mathematics-pathways/

Smith, A. A. (2016, May 26). Determining a student’s place. Inside Higher Ed. https://www.insidehighered.com/news/2016/05/26/growing-number-community-colleges-use-multiple-measures-place-students

Smith, A. A. (2019, March 29). Mixed results on Florida remedial education gamble. Inside Higher Ed. https://www.insidehighered.com/news/2019/03/29/remedial-education-progress-florida-still-leaves-unanswered-questions

Turk, J. M. (2019). Estimating the impact of developmental education on associate degree completion: A dose–response approach. Research in Higher Education, 60, 1090–1112. https://doi.org/10.1007/s11162-019-09549-9

Wang, X. (2017). Toward a holistic theoretical model of momentum for community college student success. In M. B. Paulsen (Ed.), Higher education: Handbook of theory and research (pp. 259–308). Springer. https://link.springer.com/chapter/10.1007/978-3-319-48983-4_6

The White House. (2011, June). The White House summit on community colleges [Summit report]. https://obamawhitehouse.archives.gov/sites/default/files/uploads/community_college_summit_report.pdf

Zachry Rutschow, E., & Mayer, A. K. (2018). Early findings from a national survey of developmental education practices [CAPR research brief]. Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University, and MDRC. https://postsecondaryreadiness.org/wp-content/uploads/2018/02/early-findings-national-survey-developmental-education.pdf

Zachry Rutschow, E., Cormier, M. S., Dukes, D., & Cruz Zamora, D. E. (2019a). The changing landscape of developmental education practices: Findings from a national survey and interviews with postsecondary institutions. Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University, and MDRC. https://ccrc.tc.columbia.edu/media/k2/attachments/changing-landscape-developmental-education-practices.pdf

Zachry Rutschow, E., Sepanik, S., Deitch, V., Raufman, J., Dukes, D., & Moussa, A. (2019b). Gaining ground: Findings from the Dana Center Mathematics Pathways impact study. Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University, and MDRC. https://postsecondaryreadiness.org/wp-content/uploads/2019/11/gaining-ground-dana-center-mathematics-pathways.pdf

Zeidenberg, M., Jenkins, D., & Scott, M. A. (2012). Not just math and English: Courses that pose obstacles to community college completion (CCRC Working Paper No. 52). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/not-just-math-and-english.pdf