Publications and Research
Document Type
Book Chapter or Section
Publication Date
2021
Abstract
Any placement decision is a gamble on the validity of the mechanism used. The better the placement mechanism matches the actual proficiencies required for success in a future, real-life context, the more accurately it will place students into the best classes for them and the more valid it will prove to be. But what happens if the most obvious, commonsensical approaches to placement that would appear to have the strongest validity—writing tests for placement into writing classes—prove unreliable? Rather than accurately placing students into the “right” class for them, we now know that writing placement tests frequently result in the underplacement of students into developmental courses that are not truly necessary for their success as college writers. Further, writing assessments used for the purposes of incoming college writing placement are part of this pattern and have produced racially inequitable placement outcomes for countless students in higher education, including at two-year colleges (TYCs). This chapter presents an analysis of racially disaggregated placement data for Kingsborough Community College, part of the City University of New York (CUNY) system, which recently revised its protocol for English placement in an attempt to increase accuracy and racial equity in placement into credit-bearing first-year composition (FYC). The CUNY system shifted from a practice of writing placement via a locally designed and scored timed writing test to an algorithmic placement mechanism—the “Proficiency Index”—that relies heavily on high school GPA. Given the complexities of multiple measures placement for BIPOC (Black, Indigenous, and People of Color) students, we’re encouraged to see that the new CUNY policy has resulted in a greater percentage of BIPOC students placing directly into our FYC courses.