A Brief History of the Community College Model and How Recent Reforms Have Changed It
Alexandros M. Goudas (Working Paper No. 14) December 2020
Traditionally, the primary mission of the community college model had been to increase access to higher education for underserved populations (Mellow, 2000). Though the first junior college was founded at the start of the 20th century, the share of two-year college enrollment in higher education increased significantly after the G.I. Bill of Rights of 1944, which allowed millions of working-class veterans admission to an education beyond high school and thus access to the middle class. The post-WWII expansion of the two-year college model was originally an “effort to make a college education accessible and affordable for everyone” (p. 4). Though the term social justice only came into use during this era, an argument could be made that the expansion of the two-year college model in the 1960s was an attempt to address social justice issues arising during the civil rights movement. In fact, overall Black enrollment in college rose significantly in the early 1970s (Reeves & Guyot, 2017) even though detailed demographic data from two-year colleges were not available until the 1990s (Bailey et al., 2015).
Geller (2001) outlined five generations of the two-year college model’s development and progression, in which the first three morphed from being an extension of high school into a junior college generation, and the last two of which are termed the “comprehensive community college” and the “learning community college” (p. 3). These designations suggest a shift in the role of two-year colleges, from their beginnings as a more limited and practical addition to a high school education, to the more all-inclusive nature of their current roles—symbolized by the shift in terminology from junior colleges to community colleges in the 1960s and 70s. These roles now include not only workforce training but also transfer to four-year institutions, dual enrollment, certificate and degree attainment, lifelong learning coursework, and remediation.
The new range of diversity and complexity in its missions has also caused the community college model to come under fire for low success rates, especially for the at-risk populations (i.e., students of color and of low socioeconomic status [SES]) its creation and expansion attempted to serve (Bailey et al., 2015). This is especially concerning because two-year colleges serve a disproportionate percentage of underserved populations (Espinosa et al., 2019). The struggle to accommodate the increase in underperforming and underserved students began in the 1960s and 70s, shortly after most two-year colleges opened, and led to a mandatory remediation movement in the 1980s designed to protect students from failing (Roueche & Roueche, 1993). From that point until recently, students who tested below college level were required to take remedial coursework (Hadden, 2000), yet at-risk students are still characterized by high rates of stopping out (Bailey et al., 2015). Even with more recent increases in Pell Grant monies after the Great Recession (Baum et al., 2018), at-risk students have continued to show only limited increases in success rates (Shapiro et al., 2019). Further complicating the two-year college model, a lack of funding has contributed to institutions’ inability to achieve their missions’ goals; funding has not kept pace with the expansion of two-year college roles and student enrollment (Beach, 2011).
Over a decade ago, as a part of the economic stimulus plan to combat the effects of the Great Recession, newly elected President Obama declared that one of the country’s goals would be to increase postsecondary completion. Termed the American Graduation Initiative, the completion agenda set its sights on addressing unmet needs in the workforce. One of the centerpieces of this declaration involved community colleges. Notably, it had been several decades since a sitting President called attention to the important role they play in higher education (Lester, 2014). However, Obama’s proposal narrowed the traditional two-year public college mission from access and equity to workforce training to fill job vacancies: “We will not fill those jobs, or even keep those jobs here in America, without the training offered by community colleges” (Liasson, 2009, para. 9).
As a result of the American Graduation Initiative’s focus on completion, the field of remediation and developmental education (R/DE) drew increased scrutiny from scholars, research centers, and interest groups that saw an opportunity for reforms that could result in higher completion rates, especially in two-year public colleges (Bailey, 2008; Bailey et al., 2009, 2010; Bailey et al., 2015; Bailey et al., 2016; Complete College America, 2012; Education Commission of the States, 2015; Jaggars & Bickerstaff, 2018; Vandal, 2014, 2015). Policymakers highlighted three areas in R/DE that they believed could be improved at the nation’s two-year public colleges in order to increase completion: multiple measures for assessment and placement, accelerated coursework, and guided pathways. The goal of the recent reform movement has been to increase completion metrics (Bailey et al., 2015; Complete College America, 2012; Goudas & Boylan, 2012). As a result, the pendulum has swung back toward optional remediation, which has now become more prevalent (Jaggars & Stacey, 2014; Rutschow & Mayer, 2018; Scott-Clayton, 2018). This has complicated an already variegated and strained system of two-year public postsecondary education models, especially in the field of R/DE, which is underfunded and struggles to fulfill the copious types of services students and communities expect of it.
This paper addresses the most popular recent reforms, their thematic groupings, how they have changed the community college model, and how practitioners, policymakers, and institutions should employ these reforms for the most effective outcomes. First, I provide an overview of the background and development of the three primary reforms; second, I organize their various iterations in institutional, system-wide, and statewide policies into two themes; finally, I offer recommendations for improving the efficacy of current and future reform in light of these reforms’ effects on the community college model.
Background and Development of the Three Primary Reforms
Based on early research into Achieving the Dream colleges and the progression and outcomes of a sample of students in R/DE, Bailey (2008) and Bailey et al. (2009, 2010) produced seminal papers that set the groundwork for future research and reform at community colleges. Even before the completion agenda had been set, Bailey (2008) had outlined a plan for important areas of research into various approaches to increase completion rates at two-year public colleges, including such initiatives as more comprehensive assessment and placement, accelerating or streamlining coursework, and changing coursework to include a pathways approach to pedagogy, all of which foreshadowed the three primary reform approaches most institutions and state systems adopted several years later. Bailey summarized his recommendations in this overview:
I suggest a broad developmental education reform agenda based on a comprehensive approach to assessment, more rigorous research that explicitly tracks students with weak academic skills through their early experiences at community colleges, a blurring of the distinction between developmental and “college-level” students that could improve pedagogy for both groups of students, and strategies to streamline developmental programs and accelerate students’ progress toward engagement in college-level work. (abstract)
The three primary reforms in this outline have been honed over time as researchers began to study and apply models that included these approaches. Eventually they became popularized in the field under these headings: multiple measures for placement; models of acceleration such as corequisites; and a type of guided pathways called mathematics pathways (Dougherty et al., 2017; Ganga et al., 2018). Each reform progressed through stages that have included an initial theory, a process of development with studied models, and a more widespread implementation.
Multiple Measures for Placement
After the Bailey (2008) outline was formed and expanded (Bailey et al., 2009, 2010), researchers conducted initial investigations into community college placement practices. These papers revealed distinct flaws in common placement methods, particularly institutions’ overreliance on a single, high-stakes test for placement into or out of R/DE. As a result, the authors recommended that multiple measures, i.e., a mixed measure of several assessments, should be employed instead (Belfield, 2014, 2015; Belfield & Crosta, 2012; Scott-Clayton, 2012a, 2012b; Scott-Clayton et al., 2014). A relatively new metric researchers identified was a student’s cumulative high school grade point average (HSGPA) because they determined that it predicted success in first-year gateway courses slightly better than a one-time, high-stakes placement test that most institutions were using, e.g., ACT’s Compass® and College Board’s Accuplacer®. According to the authors of this foundational research, combining a standardized placement assessment with HSGPA improved placement accuracy. They even theorized that HSGPA could be used alone to place students due to its relatively higher accuracy rates.
Several years after the initial research into multiple measures, researchers realized that using HSGPA alone harmed the placement of certain racial subgroups. For instance, Scott-Clayton and Stacey (2015) found that a standardized test placed 31% of Black students into college-level English, whereas HSGPA alone placed only 15% (p. 3). The authors concluded that instead of employing one metric, an either-or approach would obviate any potential negative effects: “One way to avoid differential impacts on subgroups would be to allow students to test out of remediation based on either test scores or high school achievement” (p. 3). To further mitigate harmful effects, and regarding the actual cutoff number, Belfield (2015) recommended lowering the HSGPA threshold, advising institutions “to assign students to developmental education if their high school grade point average (GPA) is below a threshold (such as 2.7 or 3.0)” (p. 2).
The recommendation of mixed measures originally promoted and researched in the multiple measures reform movement thus morphed into a de facto recommendation to use any one of many different measures to place students into college-level gatekeeper courses (Barnett & Reddy, 2017; Ganga & Mazzariello, 2019). Other types of assessments that had not been studied rigorously, such as noncognitive assessments (Kafka, 2016) and waiver systems, were included in the plethora of options, any one of which could place students out of R/DE.
Institutions and state systems began implementing the use of HSGPA alone, as well as offering other options to place students, shortly after the initial research was released (Smith, 2016). Rutschow and Mayer (2018a) reported longitudinal survey data on institutions and their assessment types and found that in 2011, a fifth to a quarter of institutions used multiple measures, whereas in 2016, over 50% employed more than one measure to place students in mathematics and English (p. 2). More recent research found widespread national implementation of multiple measures in state legislation or systems, with nearly three-fourths of states having state- or system-wide policies addressing placement procedures (Hodges et al., 2020).
Researchers also recently conducted a randomized controlled trial (RCT) of a combined multiple measures system in seven community colleges in Upstate New York (Barnett et al., 2020; Cullinan et al., 2019). They found that in spite of substantial increases in start-up and maintenance costs, the intervention could modestly increase placement rates and hold steady or slightly increase pass rates for first-year college courses without harming subgroup performance. Cullinan et al. (2018) provided recommendations for institutions based on this research. However, RCTs and comprehensive placement reform remain limited nationwide.
Models of Acceleration
Other initial reform theory and scholarship focused on three successful incipient models of acceleration, providing institutions with examples of how to accelerate at-risk students without decreasing outcomes: the Accelerated Learning Project (ALP) and other corequisite models; fast-start models such as the Community College of Denver’s (CCD) FastStart and the City University of New York’s (CUNY) Start; and general acceleration models, as shown in the California Acceleration Project (CAP) and the Charles A. Dana Center’s Mathematics Pathways (Charles A. Dana Center, 2020; Cho et al., 2012; Edgecombe, 2011; Edgecombe et al., 2013a, 2013b; Edgecombe et al., 2014; Hodara & Jaggars, 2014; Jaggars et al., 2014; Jenkins et al., 2010; Rutschow & Mayer, 2018a, 2018b; Rutschow et al., 2019; Scrivener et al., 2018).
In what later became known as corequisites, ALP was the first accelerated model in English composition to be studied using more advanced quasi-experimental research designs (Cho et al., 2012; Jenkins et al., 2010). The initial program at the Community College of Baltimore County took volunteers from students just beneath the cutoff for college-level composition and allowed them to take two three-credit courses simultaneously from the same instructor: a college-level English composition course and a corequisite R/DE companion course (as opposed to the traditional prerequisite remedial course first) (Adams et al., 2009).
Results demonstrated that initial pass rates in the first and second college composition courses were higher in the intervention groups, but weaknesses in ALP’s design and flaws in the research, such as confounding from volunteer participation, imputed data, and non-random placement (Cho et al., 2012; Jenkins et al., 2010), have cast doubt on the generalizability of these outcomes (Bailey et al., 2016). Nevertheless, well-funded interest groups such as Complete College America (2012) began promoting the wholesale conversion of R/DE into corequisites shortly after research displayed promising results, and they pushed to implement statewide and nationwide corequisite reform in several state systems (Vandal, 2014, 2015).
Aside from the corequisite model, some institutions have experimented with a compressed model of acceleration in which students take all their R/DE courses in one semester prior to taking any college-level courses. As one of the earliest examples, the CCD implemented a module-based compressed semester of R/DE courses called FastStart, and researchers showed modest improvement in subsequent metrics (Edgecombe, 2011; Edgecombe et al., 2013a). In New York, the CUNY Start model similarly offers all R/DE courses in one low-cost accelerated semester, and research has found slightly higher rates of enrollment and passing in subsequent college courses (Scrivener et al., 2018).
Finally, general accelerated models, including the CAP in California and the Charles A. Dana Center’s Mathematics Pathways project in Texas, implemented types of accelerated mathematics models with limited results. Descriptive outcomes for CAP showed that students in both R/DE English and mathematics courses who took two classes compressed into one semester outperformed students who took the traditional two-semester time frame to complete these courses (Edgecombe, 2011). Rutschow and Mayer (2018b) and Rutschow et al. (2019) found that students availing themselves of the accelerated mathematics R/DE option had higher success rates in later semesters when compared to more traditional models.
Aside from the increased utilization of HSGPA as a placement measure, which is relatively easy for an institution to accomplish, acceleration in its various forms has been the most common approach and design in two-year college reform, though it is more time- and labor-intensive to implement. Recent research has shown that over half of states and systems now have policies mandating or promoting acceleration models (Hodges et al., 2020).
Guided Pathways and Mathematics Pathways
Bailey et al. (2015) proffered a comprehensive theoretical model to increase community college completion rates called guided pathways. One theory undergirding the model is the practice of limiting options and designating predetermined pathways for students to take, thus potentially making it simpler for students to complete college. This is not only applied to programs of study, which have been restricted in some institutions to four general programs (Bailey et al., 2015), but is also applied to coursework. In other words, course-level changes have been proposed to modify existing requirements for R/DE prerequisites and college-level courses, especially in mathematics. In short, reformers have recommended that statistics or quantitative reasoning replace algebra for most program requirements, and that accelerated versions of these courses should also prevail as the primary design for R/DE and college-level mathematics.
The leading proponent of mathematics reform efforts centered on content and delivery modifications has been the Charles A. Dana Center at the University of Texas at Austin. Researchers there conducted RCTs and observational studies involving large numbers of students taking various iterations of specially designed mathematics course frameworks based on statistics and quantitative reasoning content (Charles A. Dana Center, 2020; Rutschow & Mayer, 2018b; Rutschow et al., 2019). Results from an RCT at four Texas colleges showed modest increases in intervention group outcomes. For instance, after three semesters, students in the intervention group completed a college-level mathematics course at a rate 6.8 percentage points higher than the control group (Rutschow et al., 2019, p. 57). Other outcomes were less favorable.
Logue et al. (2016) conducted another RCT involving the switch from algebra to statistics, this time including a corequisite component. Students who placed into R/DE algebra were randomly assigned to one of three groups: traditional remedial algebra, remedial algebra with corequisite support, or college-level statistics with corequisite support. Positive outcomes in college-level statistics were found only for those just beneath the college-level cutoff, as prior evidence has also shown (Cho et al., 2012; Jenkins et al., 2010).
The simplest version of the mathematics pathways reform is for institutions or state systems to change their program requirements from algebra to statistics for students who are not enrolling in STEM programs. For example, the Oklahoma State Regents for Higher Education implemented their own variation of mathematics pathways as a replacement for algebra (Hodges et al., 2020). As another example, the Tennessee Board of Regents mandated that most community colleges replace algebra with statistics as the default placement for incoming students in the fall of 2015, but they also imposed an R/DE requirement of corequisites simultaneously (Belfield et al., 2016). After five years of data, results of this dual intervention showed modest increases in college-level pass rates, but most of the gains could be attributed only to the switch to statistics and not to the corequisite model (Kane et al., 2020; Ran & Lin, 2019). Notably, after controlling for the effects of a third program the state implemented, moving more college mathematics R/DE to high school, Harvard scholars found that “prerequisite remediation is neither the major cause of low completion (as many of its critics have argued) nor a major solution for students with weak math skills” (Kane et al., 2020, para. 9).
Several other states are implementing similar mathematics pathways changes with varying degrees of comprehensiveness (Hodges et al., 2020). Despite growing momentum among scholars and interest groups promoting it as one method to improve outcomes, implementation of the mathematics pathways model remains quite limited in scope nationwide (Ganga et al., 2018; Hodges et al., 2020; Rutschow & Mayer, 2018a).
Themes in Remediation and Developmental Education Reforms
The three primary R/DE reforms and the innumerable variations of modeling types being implemented nationwide can be broken down into two themes: the simple implementation, characterized by uncomplicated, expedient, and inexpensive changes in institutional practices at the state or system levels; and the complex implementation, characterized by more comprehensive, intricate, and cost-prohibitive models, most of which are found sporadically.
Theme 1: Simple Implementation
Predictably, the most common theme for reform in R/DE is the simple implementation, or the easiest types of changes that institutions can make, known among policymakers as low-hanging fruit. This includes institutions merely adding HSGPA as another metric by which students can bypass R/DE and enter college-level courses. It also includes institutions and states shifting R/DE students wholesale into statistics or quantitative reasoning courses, with or without an accompanying support component, i.e., the corequisite model. The prominent reason most institutions, systems, and states follow the simple implementation route is the decrease in state funding for two-year colleges and for higher education in general over the past several decades (Beach, 2011; Phelan, 2014).
Theme 2: Complex Implementation
The theme of the complex implementation is characterized by increased resources, time, and staffing. This includes models studied with RCTs conducted in Texas (Rutschow et al., 2019), New York’s CUNY (Logue et al., 2016; Scrivener et al., 2018), Upstate New York (Barnett et al., 2018; Barnett et al., 2020), and Colorado (Edgecombe, 2011). Institutions that participated in these studies often benefited from external funding from the research groups that created these models. These complex models do not represent most reforms (Hodges et al., 2020).
The national response to initial R/DE reform scholarship and recommendations by research organizations and interest groups resulted in several states removing R/DE requirements almost immediately. For example, only a few years after the completion agenda was proclaimed and researchers offered reform recommendations, the state legislatures of Connecticut and Florida, respectively, eliminated R/DE and made it optional (Bailey et al., 2012; Pain, 2016). These statewide decisions reflected the theme of the simple implementation. In response, some of the scholars who were prominently involved in the creation and promotion of these reforms were compelled to speak out against the wholesale effort to remove or restrict R/DE: Bailey et al. (2012) and Bailey et al. (2013) warned of the dangers of states and institutions moving too swiftly to remove or severely restrict R/DE, since doing so might harm underserved students.
Avoid the Simple Implementation Approach
Therefore, the first recommendation I offer regarding reforms in R/DE is to caution researchers, scholars, policymakers, and legislators against choosing the simple implementation approach, which often results in short-term gains, decreased access, and stagnating completion rates. In fact, preliminary research into the simple change the State of Florida enacted to make R/DE optional has shown that it may have harmed students of color (Pain, 2016). Overall pass rates and first-year metrics may rise initially, but the risks of reducing access and equity outweigh these short-lived gains. More importantly, I would argue that popular and cosmetic reforms are ineffective. In fact, Jaggars and Bickerstaff (2018) noted, “The most popular reform models (including multiple measures assessment and placement, math pathways, and the co-requisite approach) will indeed improve students’ rate of success in college-level math and English, but they are unlikely to substantially improve graduation rates” (p. 496).
Invest in the Complex Implementation: Proven Holistic Reforms
The only comprehensive reform that has shown proven results in dramatically increasing completion rates at two-year public colleges is the CUNY Accelerated Study in Associate Programs (ASAP). It is a comprehensive model that incorporates all three primary reforms and expands and extends supports beyond their more limited variations. ASAP incorporates R/DE in the first semester, much like CUNY Start, but it also includes free books and transportation, increased tutoring and advising, a learning community approach, and many other components within a multifaceted and sustained assistance framework that lasts three years (Scrivener & Weiss, 2013). CUNY initially conducted two RCTs, with most participants of low SES and of color: one with R/DE students and one with non-R/DE students. Both programs doubled three-year graduation rates (City University of New York, 2016; Scrivener & Weiss, 2013; Scrivener et al., 2015). A cost-benefit analysis of ASAP showed returns on investment of four to 10 times the cost (Levin & Garcia, 2013), and replications of the RCT in Ohio revealed that similar results could be attained for approximately $2,000 per student per year above normal operating costs (Azurdia & Galkin, 2020).
Using ASAP as an idealized model for comprehensive reform, therefore, I recommend that states and institutions move toward full implementation of this model in incremental steps. In other words, policymakers should think about their reforms using a spectrum approach: Institutions could identify which of ASAP’s components they currently utilize, place those on a spectrum of least-to-most integrated, and then begin to connect and integrate more features as funding, staffing, and time allow. At the very least, however, I recommend policymakers come to terms with the evidence showing that simple changes do not move the needle. Holistic problems require holistic solutions, and funding must accompany any comprehensive reform.
Unfortunately, the completion agenda and all its accompanying reforms have done little to increase two-year college metrics and improve the community college model (Field, 2015; Jaggars & Bickerstaff, 2018). In an article discussing the unintended consequences of the American Graduation Initiative on community colleges, Lester (2014) argued that the completion agenda’s overemphasis on changing their mission from equity and access to workforce training has reduced student access and SES mobility:
Continuing to discuss completion without an emphasis on its relation to access places community colleges in an impossible bind where they are forced to tip the scale in favor of inappropriate completion measures gained by reducing access to those students who are most likely to complete. (p. 459)
Simply put, the overemphasis on increasing completion metrics forces institutions to cut quality and reduce access at the expense of those who most need support. The theme of simple implementation is all too prevalent in postsecondary reform circles nationwide, most likely due to funding cuts and constraints on time and staffing. Completion rates have not budged, and the community college model is at risk of morphing into an inaccessible institution designed only for those who can afford it and who are predestined to be successful due to their more privileged backgrounds.
This is most clearly evidenced in the past decade’s most common reform themes for R/DE. Too often, the simple, cheap, and easy approach has defined reform movements. The complex, costly, and sustained reform models are few and far between. CUNY ASAP’s results, especially the model’s successful replication at several institutions, have shown legislators, policymakers, and practitioners a model that is scalable, sustainable, and fiscally reasonable. The only remaining barrier to doubling graduation rates for the nation’s most at-risk students is the collective social and political will to scale this proven holistic reform. Obama originated the nationwide completion agenda with a narrower workforce focus in mind; after a decade of piecemeal reform that has not moved the needle, current leaders in higher education should now broaden the focus to be more holistic, thoughtful, and well-funded.
Adams, P., Gearhart, S., Miller, R., & Roberts, A. (2009). The accelerated learning program: Throwing open the gates. Journal of Basic Writing, 28(2), 50–69. http://files.eric.ed.gov/fulltext/EJ877255.pdf
Azurdia, G., & Galkin, K. (2020). An eight-year cost analysis from a randomized controlled trial of CUNY’s Accelerated Study in Associate Programs (Working Paper). MDRC. https://www.mdrc.org/sites/default/files/ASAP_Cost_Working_Paper_final.pdf
Bailey, T. R. (2008). Challenge and opportunity: Rethinking the role and function of developmental education in community college (CCRC Working Paper No. 14). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/challenge-and-opportunity.pdf
Bailey, T. R., Bashford, J., Boatman, A., Squires, J., Weiss, M., Doyle, W., Valentine, J. C., LaSota, R., Polanin, J. R., Spinney, E., Wilson, W., Yeide, M., & Young, S. H. (2016). Strategies for postsecondary students in developmental education – A practice guide for college and university administrators, advisors, and faculty. Institute of Education Sciences, What Works Clearinghouse. http://ies.ed.gov/ncee/wwc/Docs/PracticeGuide/wwc_dev_ed_112916.pdf
Bailey, T. R., Hughes, K., & Jaggars, S. S. (2012, May 18). Law hamstrings college remedial programs. Hartford Courant. https://www.courant.com/opinion/hc-xpm-2012-05-18-hc-op-bailey-college-remedial-education-bill-too-r-20120518-story.html
Bailey, T. R., Jaggars, S. S., & Jenkins, D. (2015). Redesigning America’s community colleges: A clearer path to student success. Harvard University Press.
Bailey, T. R., Jaggars, S. S., & Scott-Clayton, J. (2013). Characterizing the effectiveness of developmental education: A response to recent criticism. Community College Research Center, Teachers College, Columbia University. http://files.eric.ed.gov/fulltext/ED542142.pdf
Bailey, T. R., Jeong, D. W., & Cho, S. W. (2009). Referral, enrollment and completion in developmental education sequences in community colleges (CCRC Working Paper No. 15). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/referral-enrollment-completion-developmental_V2.pdf
Bailey, T. R., Jeong, D. W., & Cho, S. W. (2010). Referral, enrollment and completion in developmental education sequences in community colleges. Economics of Education Review, 29(2), 255–270. https://doi.org/10.1016/j.econedurev.2009.09.002
Barnett, E., Bergman, P., Kopko, E., Reddy, V., Belfield, C., Roy, S., & Cullinan, D. (2018). Multiple measures placement using data analytics: An implementation and early impacts report. Center for the Analysis of Postsecondary Readiness and MDRC. https://ccrc.tc.columbia.edu/media/k2/attachments/multiple-measures-placement-using-data-analytics.pdf
Barnett, E., & Kopko, E. (2020). What really works in student success? (CCRC Working Paper No. 121). Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/what-works-student-success.pdf
Barnett, E., Kopko, E., Cullinan, D., & Belfield, C. (2020). Who should take college-level courses? Impact findings from an evaluation of a multiple measures assessment strategy. Center for the Analysis of Postsecondary Readiness and MDRC. https://postsecondaryreadiness.org/wp-content/uploads/2020/10/multiple-measures-assessment-impact-findings.pdf
Barnett, E., & Reddy, V. (2017). College placement strategies: Evolving considerations and practices (CAPR Working Paper). Center for the Analysis of Postsecondary Readiness. https://ccrc.tc.columbia.edu/media/k2/attachments/college-placement-strategies-evolving-considerations-practices.pdf
Baum, S., Ma, J., Pender, M., & Libassi, C. J. (2018). Trends in student aid 2018. College Board. https://research.collegeboard.org/pdf/trends-student-aid-2018-full-report.pdf
Beach, J. M. (2011). Gateway to opportunity: A history of the community college in the United States. Stylus Publishing.
Belfield, C. R. (2014). Improving assessment and placement at your college: A tool for institutional researchers. Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/improving-assessment-placement-institutional-research.pdf
Belfield, C. R., & Crosta, P. M. (2012). Predicting success in college: The importance of placement tests and high school transcripts (CCRC Working Paper No. 42). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/predicting-success-placement-tests-transcripts.pdf
Belfield, C. R., Jenkins, D., & Lahr, H. (2016). Is corequisite remediation cost effective? Early findings from Tennessee (CCRC Research Brief No. 62). Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/corequisite-remediation-cost-effective-tennessee.pdf
Charles A. Dana Center at the University of Texas at Austin. (2020). Launch Years: A new vision for the transition from high school to postsecondary mathematics. https://utdanacenter.org/launchyears
Cho, S. W., Kopko, E., Jenkins, D., & Jaggars, S. S. (2012). New evidence of success for community college remedial English students: Tracking the outcomes of students in the Accelerated Learning Program (ALP) (CCRC Working Paper No. 53). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/ccbc-alp-student-outcomes-follow-up.pdf
City University of New York. (2016). Significant increases in associate degree graduation rates: CUNY Accelerated Study in Associate Programs (ASAP). http://www1.cuny.edu/sites/asap/wp-content/uploads/sites/8/2016/06/ASAP_Program_Overview_Web.pdf
Complete College America and the Bill & Melinda Gates Foundation. (2012). Remediation: Higher education’s bridge to nowhere. https://postsecondary.gatesfoundation.org/wp-content/uploads/2014/10/CCA-Remediation-Bridge-to-No-Where.pdf
Cullinan, D., Barnett, E., Kopko, E., Lopez, A., & Morton, T. (2019). Expanding access to college-level courses: Early findings from an experimental study on multiple measures assessment and placement. MDRC and Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/expanding-access-college-level-courses.pdf
Cullinan, D., Barnett, E., Ratledge, A., Welbeck, R., Belfield, C., & Lopez, A. (2018). Toward better college course placement: A guide to launching a multiple measures placement system. MDRC and Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/2018_Multiple_Measures_Guide_1.pdf
Dougherty, K., Lahr, H., & Morest, V. S. (2017). Reforming the American community college: Promising changes and their challenges (CCRC Working Paper No. 98). Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/reforming-american-community-college-promising-changes-challenges.pdf
Edgecombe, N. (2011). Accelerating the academic achievement of students referred to developmental education (CCRC Working Paper No. 30). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/accelerating-academic-achievement-students.pdf
Edgecombe, N., Baker, E. D., & Bailey, T. R. (2013a). Acceleration through a holistic support model: An implementation and outcomes analysis of FastStart@CCD. Community College Research Center, Teachers College, Columbia University. http://files.eric.ed.gov/fulltext/ED539910.pdf
Edgecombe, N., Cormier, M. S., Bickerstaff, S., & Barragan, M. (2013b). Strengthening developmental education reforms: Evidence on implementation efforts from the Scaling Innovation Project (CCRC Working Paper No. 61). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/strengthening-developmental-education-reforms.pdf
Edgecombe, N., Jaggars, S. S., Xu, D., & Barragan, M. (2014). Accelerating the integrated instruction of developmental reading and writing at Chabot College (CCRC Working Paper No. 71). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/accelerating-integrated-developmental-reading-and-writing-at-chabot.pdf
Education Commission of the States. (2015, November 10). Core principles for transforming remediation within a comprehensive student success strategy: A joint statement. https://www.insidehighered.com/sites/default/server_files/files/core_principles_nov5.pdf
Espinosa, L. L., Turk, J. M., Taylor, M., & Chessman, H. M. (2019). Race and ethnicity in higher education: A status report. American Council on Education. https://1xfsu31b52d33idlp13twtos-wpengine.netdna-ssl.com/wp-content/uploads/2019/02/Race-and-Ethnicity-in-Higher-Education.pdf
Field, K. (2015, January 20). 6 years in and 6 to go, only modest progress on Obama’s college completion goal. The Chronicle of Higher Education. https://www.chronicle.com/article/6-Years-in6-to-Go-Only/151303
Ganga, E., Mazzariello, A., & Edgecombe, N. (2018). Developmental education: An introduction for policymakers. Education Commission of the States, Center for the Analysis of Postsecondary Readiness. https://www.ecs.org/wp-content/uploads/Developmental-Education_An-Introduction-for-Policymakers.pdf
Ganga, E., & Mazzariello, A. (2019). Modernizing college course placement by using multiple measures. Center for the Analysis of Postsecondary Readiness, Teachers College, Columbia University and the Education Commission of the States. https://www.ecs.org/wp-content/uploads/Modernizing-College-Course-Placement-by-Using-Multiple-Measures.pdf
Geller, H. A. (2001). A brief history of community colleges and a personal view of some issues (open admissions, occupational training and leadership). George Mason University. https://files.eric.ed.gov/fulltext/ED459881.pdf
Goudas, A. M., & Boylan, H. R. (2012). Addressing flawed research in developmental education. Journal of Developmental Education, 36(1), 2–13. http://files.eric.ed.gov/fulltext/EJ1035669.pdf
Hadden, C. (2000). The ironies of mandatory placement. Community College Journal of Research and Practice, 24, 823–838. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.580.4344&rep=rep1&type=pdf
Hodara, M., & Jaggars, S. S. (2014). An examination of the impact of accelerating community college students’ progression through developmental education. Journal of Higher Education, 85(2), 246–276. https://ccrc.tc.columbia.edu/publications/impact-accelerating-students-progression.html
Hodges, R., Payne, E. M., McConnell, M. C., Lollar, J., Guckert, D. A., Owens, S., Gonzales, C., Hoff, M. A., Lussier, K. O., Wu, N., & Shinn, H. B. (2020). Developmental education policy and reforms: A 50-state snapshot. Journal of Developmental Education, 44(1), 2–17.
Jaggars, S. S., & Bickerstaff, S. (2018). Developmental education: The evolution of research and reform. In M. B. Paulsen (Ed.), Higher education: Handbook of theory and research (pp. 469–503). Springer International Publishing. https://doi.org/10.1007/978-3-319-72490-4_10
Jaggars, S. S., Edgecombe, N., & Stacey, G. W. (2014). What we know about accelerated developmental education. Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/accelerated-developmental-education.pdf
Jaggars, S. S., Hodara, M., Cho, S. W., & Xu, D. (2015). Three accelerated developmental education programs: Features, student outcomes, and implications. Community College Review, 43(1), 3–26. http://journals.sagepub.com/doi/pdf/10.1177/0091552114551752
Jaggars, S. S., & Stacey, G. W. (2014). What we know about developmental education outcomes [Research overview]. Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/what-we-know-about-developmental-education-outcomes.pdf
Jenkins, D., Speroni, C., Belfield, C., Jaggars, S. S., & Edgecombe, N. (2010). A model for accelerating academic success of community college remedial English students: Is the Accelerated Learning Program (ALP) effective and affordable? (CCRC Working Paper No. 21). Community College Research Center, Teachers College, Columbia University. https://files.eric.ed.gov/fulltext/ED512398.pdf
Kafka, T. (2016). A list of non-cognitive assessment instruments. Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/images/a-list-of-non-cognitive-assessment-instruments.pdf
Kane, T. J., Boatman, A., Kozakowski, W., Bennett, C., Hitch, R., & Weisenfeld, D. (2020). Is college remediation a barrier or a boost? Education Next, 20(2). https://www.educationnext.org/college-remediation-barrier-boost-evidence-from-tennessee/
Lester, J. (2014). The completion agenda: The unintended consequences for equity in community colleges. In M. B. Paulsen (Ed.), Higher education: Handbook of theory and research (pp. 423–466). Springer. https://doi.org/10.1007/978-94-017-8005-6_10
Levin, H. M., & Garcia, E. (2013). Benefit-cost analysis of accelerated study in associate programs (ASAP) of the City University of New York (CUNY). Center for Benefit-Cost Studies in Education, Teachers College, Columbia University. https://www1.nyc.gov/assets/opportunity/pdf/Levin_ASAP_Benefit_Cost_Report_FINAL_05212013.pdf
Liasson, M. (2009, July 15). Obama: Education a key to economic rebound. National Public Radio. https://www.npr.org/templates/story/story.php?storyId=106632638
Logue, A. W., Watanabe-Rose, M., & Douglas, D. (2016). Should students assessed as needing remedial mathematics take college-level quantitative courses instead? A randomized controlled trial. Educational Evaluation and Policy Analysis, 38(3), 578–598. https://journals.sagepub.com/doi/pdf/10.3102/0162373716649056
Mellow, G. (2000). The history and development of community colleges in the United States [Paper presentation]. Inter-American Development Bank and the Harvard Graduate School of Education. https://files.eric.ed.gov/fulltext/ED455883.pdf
Pain, K. D. (2016). Voluntary remediation in Florida: Will it blaze a new trail or stop student pathways? Community College Journal of Research and Practice, 40(11), 927–941. https://www.tandfonline.com/doi/full/10.1080/10668926.2015.1134361
Phelan, D. J. (2014). The clear and present funding crisis in community colleges. New Directions for Community Colleges, 2014(168), 5–16. https://doi.org/10.1002/cc.20116
Ran, F. X., & Lin, Y. (2019). The effects of corequisite remediation: Evidence from a statewide reform in Tennessee (CCRC Working Paper No. 115). Community College Research Center, Teachers College, Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/effects-corequisite-remediation-tennessee.pdf
Reeves, R. V., & Guyot, K. (2017, December 4). Black women are earning more college degrees, but that alone won’t close race gaps. Brookings Institution. https://www.brookings.edu/blog/social-mobility-memos/2017/12/04/black-women-are-earning-more-college-degrees-but-that-alone-wont-close-race-gaps/
Roueche, J. E., & Roueche, S. D. (1993). Between a rock and a hard place: The at-risk student in the open-door college. American Association of Community Colleges.
Rutschow, E. Z., & Mayer, A. K. (2018a). Early findings from a national survey of developmental education practices (CAPR Research Brief). Center for the Analysis of Postsecondary Readiness, Teachers College, Columbia University, Community College Research Center, Teachers College, Columbia University, and MDRC. https://postsecondaryreadiness.org/wp-content/uploads/2018/02/early-findings-national-survey-developmental-education.pdf
Rutschow, E. Z., & Mayer, A. K. (2018b). Making it through: Interim findings on developmental students’ progress to college math with the Dana Center mathematics pathways (CAPR Research Brief). Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University, and MDRC. https://www.mdrc.org/sites/default/files/DCMP-InterimFindings.pdf
Rutschow, E. Z., Sepanik, S., Deitch, V., Rauman, J., Dukes, D., & Moussa, A. (2019). Gaining ground: Findings from the Dana Center Mathematics Pathways impact study. Center for the Analysis of Postsecondary Readiness, Community College Research Center, Teachers College, Columbia University, and MDRC. https://postsecondaryreadiness.org/wp-content/uploads/2019/11/gaining-ground-dana-center-mathematics-pathways.pdf
Scott-Clayton, J. (2012a, April 20). Are college entrants overdiagnosed as underprepared? The New York Times Economix Blog. http://economix.blogs.nytimes.com/2012/04/20/are-college-entrants-overdiagnosed-as-underprepared/
Scott-Clayton, J. (2012b). Do high-stakes placement exams predict college success? (CCRC Working Paper No. 41). Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/high-stakes-predict-success.pdf
Scott-Clayton, J. (2018, March 29). Evidence-based reforms in college remediation are gaining steam – and so far living up to the hype (Evidence Speaks Series). Brookings Institution. https://www.brookings.edu/research/evidence-based-reforms-in-college-remediation-are-gaining-steam-and-so-far-living-up-to-the-hype/
Scott-Clayton, J., Crosta, P. M., & Belfield, C. R. (2014). Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36(3), 371–393. https://doi.org/10.3102/0162373713517935
Scott-Clayton, J., & Stacey, G. W. (2015). Improving the accuracy of remedial placement. Community College Research Center, Teachers College, Columbia University. http://ccrc.tc.columbia.edu/media/k2/attachments/improving-accuracy-remedial-placement.pdf
Scrivener, S., Gupta, H., Weiss, M. J., Cohen, B., Cormier, M. S., & Brathwaite, J. (2018). Becoming college ready: Early findings from a CUNY Start evaluation. MDRC. https://www.mdrc.org/sites/default/files/CUNY_START_Interim_Report_FINAL_0.pdf
Scrivener, S., & Weiss, M. J. (2013). More graduates: Two-year results from an evaluation of Accelerated Study in Associate Programs (ASAP) for developmental education students (MDRC Policy Brief). MDRC. http://www.mdrc.org/sites/default/files/More_Graduates.pdf
Scrivener, S., Weiss, M. J., Ratledge, A., Rudd, T., Sommo, C., & Fresques, H. (2015). Doubling graduation rates: Three-year effects of CUNY’s Accelerated Study in Associate Programs (ASAP) for developmental education students. MDRC. http://www.mdrc.org/sites/default/files/doubling_graduation_rates_fr.pdf
Shapiro, D., Dundar, A., Huie, F., Wakhungu, P. K., Bhimdiwala, A., & Wilson, S. E. (2019). Completing college: Eight-year completion outcomes for the fall 2010 cohort (Signature Report No. 12c). National Student Clearinghouse Research Center. https://nscresearchcenter.org/wp-content/uploads/NSC_Signature-Report_12_Update.pdf
Smith, A. A. (2016, May 26). Determining a student’s place. Inside Higher Ed. https://www.insidehighered.com/news/2016/05/26/growing-number-community-colleges-use-multiple-measures-place-students
Vandal, B. (2014). Promoting gateway course success: Scaling corequisite academic support. Complete College America. https://eric.ed.gov/?id=ED558791
Vandal, B. (2015). Core principles for transforming remediation. Complete College America. https://ccrscenter.org/sites/default/files/August-15-Bruce-Vandal.pdf