Dr. Ahrash N. Bissell
Published October 2021
*This paper, published in 2021, inspired the subsequent video campaign, “Do Placement Differently.” (The original text appears below.)
There are numerous studies and publications {1,2,3} that address the philosophical and practical problems with placement testing as practiced by the majority of postsecondary institutions today. Pressure continues to rise locally and nationally for institutions to change their placement practices, and state-level policies are being adopted that are forcing many of these changes to occur. However, just because the existing practices are ineffective doesn’t mean that the problem they are supposed to address doesn’t exist. Despite many years of reform efforts at multiple levels, it remains true that many students who are seeking postsecondary credentials are not fully prepared to succeed in their studies, and the deficiencies are most glaring in math and English and for underserved populations {4,5,6}.
To address the problem, we need a framework for better describing the situation. Ideally, this framework would also suggest some specific tools and practices that could be deployed to manage the logistics and integration with institutional departments and policies. NROC’s work in this area has given us the opportunity to develop this framework, and much of it is best understood from the context of the figure below.
This figure is a simplified illustration of the administrative perspective for students matriculating to any given institution. The data acquired from a placement test—or any other metric—allow us to group students into three rough categories, indicated in the figure by different colors.
Students who are very well prepared (green) are relatively easy to discern since they’ll produce strong evidence of preparation and/or will perform well on any diagnostic we might administer. Similarly, students who are very weak (red) should also be easy to identify.
If a student falls into either the green or red categories, equivalent to occupying one extreme or the other in the figure shown here (above), then we can act with some confidence on those data. The problem is that most students are not so clear-cut and fall into a middle zone—shown in yellow—where their level of preparedness is unclear.
It’s essentially impossible to establish a rigorous placement standard (often called a “cut score”) for the students in the middle zone. Moreover, the boundaries among these three categories are blurry: we can’t be confident in our “placement” unless the student is really close to one extreme or the other.
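To make the three-zone view concrete, here is a minimal sketch in Python. The cut values and the 0–100 scoring scale are hypothetical, chosen only to illustrate the grouping; the argument above is precisely that no hard boundary can be trusted for students inside the middle band.

```python
# Illustrative sketch of the three rough categories described in the figure.
# LOW_CUT and HIGH_CUT are assumed values for illustration only; they are not
# recommended cut scores.

LOW_CUT = 40    # below this, evidence of underpreparation is fairly clear (assumed)
HIGH_CUT = 75   # above this, evidence of preparation is fairly clear (assumed)

def readiness_zone(score: float) -> str:
    """Group a diagnostic score into the rough categories from the figure."""
    if score >= HIGH_CUT:
        return "green"   # well prepared: the data can be acted on with confidence
    if score <= LOW_CUT:
        return "red"     # clearly underprepared: also actionable with confidence
    return "yellow"      # middle zone: preparedness is unclear; avoid hard sorting

# Most students land in the ambiguous middle zone.
for s in (92, 68, 55, 31):
    print(s, readiness_zone(s))
```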
It’s our contention that this figure does not vary substantially depending on the form of the measurement. In other words, the problem we face with “placement” can’t be entirely fixed by using different tests or testing against different expectations (e.g., statistics versus algebra).
The fact is that we should not be using this information to sort and separate students in that middle zone; instead, we need to provide a way for those students to clarify their readiness status without holding them up.
Historically, it was hard to know how to handle this situation. Students in the middle zone can be wildly variable in both their strengths and weaknesses. Additionally, there are many other factors besides academic preparation that can affect their potential to succeed in college-level courses. It seemed an impossible task to try to sort out all of the variables and deal with each student individually.
This logistical challenge inevitably led to what is now considered a “high-stakes” approach to placement, in which faculty and administrators are forced to discriminate between “prepared” and “unprepared” students using hard cut scores. The consequence of being deemed “unprepared” is a requirement to take one or more terms of “remedial” or “developmental” classes, usually for no credit and often at substantial cost in both time and money. This approach is logistically simple: every student who falls below the cut score receives the same treatment, and, in theory, every student in a remedial class is at least exposed to material in need of review, even if only a small subset of the overall curriculum.
However, as has been extensively documented elsewhere {7,8}, this approach imposes significant costs on students. Many students who end up in these remedial classes do not finish them, and even if they do, their success in the follow-on, college-level courses is often marginal. Because many of these students struggle with much more than academic preparation, the “high-stakes” failure right at the outset of their postsecondary studies simply reinforces self-perceptions that college may be too difficult for them.
While some baseline level of academic preparation is necessary for success in any college-level program of study, there is also ample evidence {9,10} that students who appeared hopelessly underprepared can achieve readiness and even excel in college. This strongly suggests that we shouldn’t be sorting students into “preparedness groups,” but rather applying appropriate and effective supports for student success in a manner that adapts to each student’s needs. No high-stakes examination approach can perform this sort of task.
Since most current approaches are motivated primarily by logistical concerns rather than by evidence-based practice for improving student outcomes, there is a clear opportunity to improve the logistics with appropriate technologies. EdReady, a platform for personalizing learning, is one such technology. Other tools and technologies could also be used to address these placement issues, but we’ll focus on EdReady for the rest of this report to illustrate how it changes placement practices and can substantially improve intended outcomes.
EdReady’s design was largely motivated by the issues described in this report; however, those issues also transcend the “placement” space. Students face math and English readiness issues in any situation where they are transitioning from one learning experience to another and there is some expectation that knowledge will transfer from the first experience to the second. This situation applies to any hierarchical course sequence (e.g., moving from Algebra 1 to Algebra 2), to any course of study with prerequisites, and to transitions from one institution to another (e.g., from an adult education school to a community college). Thus, EdReady was purpose-built to be generally applicable to any such situation, and also to be easily customized and adapted to meet the specific expectations of the affected programs or institutions.
After a student completes EdReady’s Initial Diagnostic assessment, the application builds each student an individualized study path that empowers them to skip the math or English concepts that they already understand and accelerate their mastery of the concepts they need to know.
EdReady constructs an academic program that zeroes in on the identified weaknesses and provides all of the supporting materials needed for each student to improve. These materials include video lessons, online textbooks, practice problems, and a growing library of other interventions, and the student can spend as much time as needed to achieve competency in the identified topical areas. Through an iterative process of review and competency testing, the student will gradually fill in identified gaps and achieve the readiness threshold needed to be properly prepared for follow-on success. EdReady adapts to each student’s needs, is fully accessible (WCAG 2.1 AA compliant), and is usable with any Internet-connected device.
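The diagnose, study, and retest cycle described above can be pictured as a simple loop. The following Python is an illustrative model only, with invented names, data structures, and a toy scoring rule; it is not EdReady’s actual implementation or API.

```python
# Schematic sketch of the diagnose -> study -> retest loop described above.
# All names and the scoring rule are hypothetical illustrations.

from dataclasses import dataclass, field

@dataclass
class StudyPath:
    target_score: int                      # readiness threshold set by the program
    mastered: set = field(default_factory=set)
    to_review: set = field(default_factory=set)

def build_study_path(diagnostic_results: dict, target_score: int) -> StudyPath:
    """Skip topics the student already knows; queue the rest for review."""
    path = StudyPath(target_score=target_score)
    for topic, already_knows in diagnostic_results.items():
        (path.mastered if already_knows else path.to_review).add(topic)
    return path

def current_score(path: StudyPath) -> int:
    """Toy scoring rule: percent of topics mastered (real scoring is richer)."""
    total = len(path.mastered) + len(path.to_review)
    return round(100 * len(path.mastered) / total) if total else 0

def study_until_ready(path: StudyPath) -> StudyPath:
    """Iteratively review and re-test topics until the readiness target is met."""
    while current_score(path) < path.target_score and path.to_review:
        topic = path.to_review.pop()   # student works lessons/practice for this topic
        path.mastered.add(topic)       # competency test passed; gap filled
    return path

# Example: a student who already knows fractions and decimals.
diagnostic = {"fractions": True, "decimals": True,
              "linear equations": False, "exponents": False}
path = study_until_ready(build_study_path(diagnostic, target_score=75))
print(current_score(path), "target met:", current_score(path) >= path.target_score)
```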
Used in this way, EdReady changes placement practices in several important ways. First, it shifts the institutional emphasis from “placement” to “readiness.” This change occurs as a natural by-product of the fact that EdReady is not a high-stakes test; instead, EdReady gives students (and supporting faculty and staff) the information they need to make informed decisions about whether it would be beneficial to spend time getting better prepared and whether additional support (beyond the EdReady platform itself) might be needed.
Second, EdReady puts the responsibility for these decisions, and the subsequent effort to improve, where it belongs: on the student. Instead of arbitrarily segregating a subset of “at-risk” students for this type of intervention, EdReady can be used by every student to determine whether their preparedness levels are where they should be. Rather than being a barrier to certain matriculating students, EdReady becomes one of the first steps to postsecondary success for every student. In many cases, students use EdReady to expand their options by achieving readiness thresholds for more advanced courses, opening up new degree possibilities and reducing the time to a degree.
Third, EdReady helps to reduce the overall size of the underprepared population and can direct scarce institutional resources where they are needed most. Because EdReady is low-stakes and accessible at any time and from any place, institutions can direct matriculating students to use EdReady before they show up on campus. Students can then “self-remediate” as needed and may be able to get sufficiently prepared prior to the start of classes, thereby reducing the total size of the pool of students who need more attention. Institutional staff can also verify which students are already well prepared and can reach out proactively with more intensive support programs to those who are least prepared.
Fourth, the data acquired from the EdReady diagnostic can provide the basis for new readiness practices, such as co-requisite models, peer-led study groups, tutoring support, and more. EdReady can also be tweaked and adjusted (e.g., in scope or target scores) to respond to the outcomes that are seen over time. The ultimate metric for validating “readiness” is the extent to which students are able to pass the courses or programs of study that they originally worked to be ready for. If adjustments need to be made, that’s easily done in EdReady.
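One way to picture that validation idea is to compare follow-on course pass rates for students who did and did not meet the readiness target, and adjust the target or scope if the comparison warrants it. The sketch below is hypothetical: the records, threshold, and function names are invented for illustration and are not drawn from EdReady data.

```python
# Illustrative check of a readiness target against follow-on course outcomes.
# READINESS_TARGET and the records are assumed values for illustration only.

READINESS_TARGET = 75

# (readiness score when the course began, passed the follow-on course?)
records = [
    (82, True), (78, True), (74, True), (71, False),
    (88, True), (65, False), (69, True), (91, True),
]

def pass_rate(rows):
    """Fraction of students in this group who passed the follow-on course."""
    return sum(passed for _, passed in rows) / len(rows) if rows else float("nan")

met = [r for r in records if r[0] >= READINESS_TARGET]
not_met = [r for r in records if r[0] < READINESS_TARGET]

print(f"met target:   {pass_rate(met):.0%} passed ({len(met)} students)")
print(f"below target: {pass_rate(not_met):.0%} passed ({len(not_met)} students)")
# If the two groups' pass rates barely differ, or many below-target students
# succeed anyway, the scope or target score can be adjusted and the check rerun.
```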
There are other benefits as well, but those mentioned above capture some of the key improvements specific to the “placement” issue. Furthermore, these benefits accrue in tandem with substantial logistical improvements over current protocols. Here is a case where the technology makes a substantive and positive difference across the board. By reframing the problem from “placement” to “readiness” and employing this low-stakes approach, we’re poised to radically transform this particularly vexing issue nationwide into a process that helps students, rather than derailing their aspirations.
Many postsecondary institutions—especially selective colleges and universities—expend substantial effort to identify and recruit promising students from underrepresented groups, and will often tout the apparent diversity of their matriculating classes (across any number of metrics). Yet many of these same institutions have found that they need to invest more time and resources than expected to foster the success of these students after they’ve entered the front gates.
Colleges are increasingly trying to step up, but with so many variables in play it can be difficult to determine which investments and changes to make. Given that many students, especially those from less privileged backgrounds, are financing their postsecondary studies with publicly backed loans, there is increasing public pressure for institutions to move more quickly to give students the tools they need for success. Even at selective institutions, far too many students are marginalized and poorly served by current placement-focused practices, and solutions are needed now.
We have the means to overhaul this aspect of the postsecondary experience. A low-stakes, readiness-oriented approach can work in tandem with other initiatives in this domain, such as co-requisite models, different competency (prerequisite) expectations, and programs that accelerate progress toward a degree.
While there are many other variables that are also worthy of ongoing improvement and investment, the academic basis for readiness is one area where we already understand why current practices don’t work, and we have ready-to-go solutions in hand.
1 Bailey, T., Jeong, D. W., & Cho, S. (2010). “Referral, enrollment, and completion in developmental education sequences in community colleges.” Economics of Education Review, Volume 29, Issue 2. doi.org/10.1016/j.econedurev.2009.09.002. Accessed 19 October 2021.
2 Numerous publications on this topic are available at CAPR - The Center for the Analysis of Postsecondary Readiness, 2021, postsecondaryreadiness.org. Accessed 19 October 2021.
3 Numerous publications on this topic are available at CCRC - The Community College Research Center, 2021, ccrc.tc.columbia.edu. Accessed 19 October 2021.
4 ACT (2019). “Decline in College Readiness Continues Among US High School Grads, New ACT Report Finds.” leadershipblog.act.org/2019/10/decline-in-college-readiness-continues.html. Accessed 19 October 2021.
5 Burdman, P. (2017). “CSU needs more effective way to assess students’ math readiness.” edsource.org/2017/csu-needs-more-effective-way-to-assess-students-math-readiness/576864. Accessed 19 October 2021.
6 Colvin, R. L. (2020). “The Math Problem: Removing the Math Barrier to College Completion.” National Laboratory for Education Transformation (NLET). nlet.org/the-math-problem-report. Accessed 19 October 2021.
7 Burdman, P. (2017). “Placement tests land many students in a math maze instead of on pathways to success.” edsource.org/2017/placement-tests-put-many-students-into-math-maze-instead-of-pathways-to-success/576859. Accessed 19 October 2021.
8 Scott-Clayton, J., Crosta, P. M., & Belfield, C. R. (2014). “Improving the Targeting of Treatment: Evidence from College Remediation.” Educational Evaluation and Policy Analysis, Vol. 36, Issue 3. doi.org/10.3102/0162373713517935. Accessed via doi.org/10.3386/w18457, 19 October 2021.
9 Scott-Clayton, J. (2018). “Evidence-based reforms in college remediation are gaining steam—and so far living up to the hype.” The Brookings Institution. brookings.edu/research/evidence-based-reforms-in-college-remediation-are-gaining-steam-and-so-far-living-up-to-the-hype. Accessed 19 October 2021.
10 Scott-Clayton, J., & Stacey, G. W. (2015). “Improving the accuracy of remedial placement.” Community College Research Center. ccrc.tc.columbia.edu/media/k2/attachments/improving-accuracy-remedial-placement.pdf. Accessed 19 October 2021.