The Curious Evolution and Disruptive Future of the Collegiate Learning Assessment
This week the Wall Street Journal carried a front-page article entitled “Colleges Set to Offer Exit Tests” (the online title is different, and misleading in a different way). The prominent placement was not surprising given President Obama’s recent tour of colleges promising a more explicit look at the outcomes of higher education than rankings currently provide (more on outcomes and rankings in two upcoming articles…). Still, the title caught me a little off-guard given that I work in higher ed and had not yet received the memo that we were all administering exit tests this year.
As it turns out, the article is a glancing look at the administration of the Collegiate Learning Assessment Plus (formerly the CLA, now abbreviated CLA+ rather than the less fortunate acronym “the CLAP”) at about 200 colleges this fall. It’s important to note up-front that the CLA+ is neither an exit test nor a “Post-college SAT” as suggested by the online headline- and I’ll explain why, shortly. Yet, while the article is problematic in lots of ways (starting with its title), it is the first CLA+ mention of any length that I have seen in the popular press, and seemed worthy of a follow-up. That’s because while the limited administration of the CLA+ this fall is far from revolutionary, it’s very possibly the first step towards something much bigger.
[Note: since writing, I’ve noticed that the Chicago Tribune has posted a similar article and borrows the “exit exam” language for its title, as have Fox Business, Business Insider, and that beacon of higher education investigative reporting: Cosmopolitan Magazine. The CLA marketing team must be on the move!]
Most folks who work in or around higher education know that the CLA didn’t spring fully-formed from the head of Zeus this past week. Developed in the late 90s and released in the year 2000 with funding from the Council for Aid to Education (a major tax-exempt charity started by a collective of businesses), it has become a sort of darling of accreditors and assessment groups like the Wabash Study while managing to gain the trust of a wide cross-section of traditional public and private institutions.
Unlike similar tools developed by the ACT (the CAAP) and ETS (the Proficiency Profile), which have been more widely adopted by public 2-year institutions and a subset of public 4-year institutions participating in the VSA, the CLA has seen penetration in many “elite” privates and publics- last year it was administered by Amherst and UT-Austin, and dozens of highly selective schools that wouldn’t touch something like the ACT’s CAAP with a ten-foot pole have used the CLA at least once.
Part of the very reason these schools have found the CLA appealing is that it is not an “exit test”- a term typically reserved for an exam that one must take, and often must pass, to graduate. In fact, using it as an exit test was impossible- the CAE has noted from the outset that, unlike tools like the SAT, the CLA was intended only for measurement at the institutional level, primarily for internal assessment. Further, the CLA has always been (and thus far still is) entirely voluntary- institutions can’t require it of their students. Instead, it is administered to a sample of both first-year students and seniors who often receive some sort of incentive in exchange for their roughly two hours spent completing the computer-based exam, which consists of a “performance task” requiring students to sift through a digital document library to answer a series of questions in a process intended to replicate “real-world” decision-making. The written responses are scored by a computer program, which the CAE argues has reliability levels similar to two trained human graders (but which some critics have suggested can be gamed with nonsensical answers).
The CLA uses a regression model for its institutional reports (one that has become at least less questionable since the switch to HLM in 2010) to control for student and campus characteristics and show change in students at the group level over time. After the administration, schools get a student-level summary file and an institutional report showing whether their students improved more or less than predicted. Following pushback when institution names were released along with their scores after an early administration, subsequent administrations simply provided reference to a “comparison group” and explicitly discouraged institutions from publicizing their own results.
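The value-added logic behind those institutional reports can be illustrated with a toy sketch. To be clear, this is not CAE’s actual model (which uses hierarchical linear modeling and many covariates); it just shows the core idea: predict senior performance from entering-student characteristics across institutions, then report each school’s residual- actual minus predicted. All institutions and numbers below are hypothetical.

```python
# Toy value-added sketch (NOT CAE's actual HLM model).
# Each hypothetical institution is summarized by its mean entering-student
# score and its mean senior score; "value added" is the residual from a
# simple least-squares fit across institutions.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical schools: (mean entering score, mean senior score)
schools = {
    "A": (1000, 1130),
    "B": (1100, 1180),
    "C": (1200, 1300),
    "D": (1300, 1350),
}

xs = [entering for entering, senior in schools.values()]
ys = [senior for entering, senior in schools.values()]
a, b = fit_line(xs, ys)

# Residual = actual senior score minus the score predicted from intake;
# a positive residual reads as "more growth than predicted."
value_added = {name: senior - (a + b * entering)
               for name, (entering, senior) in schools.items()}
```

In this sketch, schools A and C land above the regression line (more growth than predicted) while B and D land below it, even though D has the highest raw senior score- which is precisely the point of a value-added report.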
The important take-home here is that the form and function of the CLA up until 2012 were targeted at a core user base that bought into the concept of the CLA because it provided a form of direct assessment that could be used internally and reported to accreditors without risking a challenge to their reputation. I would suggest that the vast majority of institutions using the CLA this fall will be using it in the same way that they have for the past dozen years- as voluntary administrations to gather internal institutional metrics and satisfy accreditors. But this year they, and even those institutions that took it years ago, will be complicit in something larger that the Journal article correctly alludes to- setting up the CLA as a potential individual-level credential.
Assessment professionals received the first glimpse of this change as part of an email from the CAE in the fall of 2012. Here’s an excerpt:
“…it is with tremendous excitement that we share our next enhancement, CLA+, a version of the CLA that is designed to provide reliable information at the student (in addition to the institutional) level.
Launching in beta this spring and more formally next fall, CLA+ will, among other things, allow faculty to share formative feedback directly with students and open use of the assessment to the unique needs of each campus. The development of this enhanced version of the CLA will also allow the reporting of even more subscores (like scientific and quantitative reasoning, and critical reading and evaluation)”
Did you catch it? The opening phrase is marketing language targeted at the CLA’s traditional core audience- faculty committees and assessment contacts at regular ol’ universities and liberal arts colleges- and that’s certainly how the second paragraph reads, quite intentionally. But hear-you-me, the claim buried in that first sentence- that this new version can now provide “reliable information at the student…level”- marks the opening gambit to become a whole ‘nother kind of heavy-hitter in higher education and beyond.
To do this, the CLA is taking advantage of an unspoken but widely understood bargain made by these traditional institutions- they were willing to administer the CLA and to suggest that it was at least the “most accurate metric available” for measuring what college is “really supposed to do”, as long as this satisfied accreditors’ demands for direct assessment without posing any risk to their reputation. In the old model, institutions remain the keeper of the keys for certifying whether students have completed a college education, and the CLA is a tool they use privately to improve.
Now, for the first time, the CLA is able to say to for-profits and third-party providers (who may in turn target the test to adults entirely outside of the traditional system) “here is a metric that some of the country’s most elite colleges have said is the best tool for assessing their students’ progress. If your students take this and do well, they must be as well prepared as students in those colleges.” I predict it won’t be long at all before we see those exact claims coming from these spaces- for-profit StraighterLine is already advertising the CLA+ at a heavily marked-up price for use as an additional credential to offer employers.
Read in this light, the other changes from the CLA to the CLA+ take on new implications:
- The addition of a new explicit quantitative metric (mimicking the verbal/math component of the SAT/ACT and establishing itself as a “complete” assessment of workforce skills)
- A shift in the scoring to the “more recognizable” 1600 point scale (the metric used by the SAT from almost its inception until a few years back when the new and still inconsistently-used writing section bumped the top score up to 2400- nearly every employer would still be more familiar with the 1600 scale)
- No longer date-dependent (the old CLA required you to administer the exam to students within a limited testing “window” in the fall and spring- Now, if you want to take the CLA on Christmas Eve or the 4th of July, go for it!)
- Now openly and explicitly about assessing individual students
It is certainly true that if you believe the CLA is a reliable and valid tool, then it continues to have some real new value for traditional colleges- as potential placement tests, assessments of subgroups of students (like a remedial pre-college program that meets in the summer), or possibly even as a service for interested students seeking formative feedback or a supplemental piece of evidence. Yet all of this pales in comparison to the new potential for the CLA to be used by non-traditional institutions and the for-profit third party education space- and you can see that in the shift in its marketing.
There have been efforts to use the CLA in a more public and comparative way before- Academically Adrift, 2011’s higher-education-is-failing beach-read, purported to show that the old version of the CLA was reliable and valid at the individual level (a claim that CAE took up after the publication of the book despite continued questions about its methodology) and that most students improved very little over the course of their college careers. In recent years, community colleges and for-profits, having little to lose and potentially much to gain in the way of reputation, have pushed for the ability to publicize CLA scores- all to little avail.
This Time Could Be Different
The CLA+ though, represents the first concerted push from CAE itself to become a major player in the individual-level assessment business (a multi-billion dollar industry, unlike the small-change niche of institutional assessment) and the timing has never been more ripe. Let’s consider the higher education ecosystem the CLA+ is stepping into:
- There is increasing public distrust of the value-added by higher education compared to its cost, fueled in part by rising tuition costs and student debt; institutions with more to lose than they have to gain (largely elite institutions where enrollment streams are built on “reputation”) have led pushback against public and standardized metrics, but they represent a declining percentage of higher education space in terms of both enrollment and lobbying dollars. Meanwhile…
- The percentage and number of students attending “traditional” for-profit colleges (where the emphasis is on a student receiving a degree from that single institution) has exploded since the early 90’s. They have been pressured more than any other sector to provide evidence of outcomes, and, with little to lose in the way of reputation, increasingly see value-added metrics as a way to set themselves apart from public and non-profit counterparts. The same could be said of community and technical colleges, which research indeed shows can be the best bang for your education buck but provide little in the way of reputational capital.
- We may be seeing the first stages of an impending disruption eruption from unbundled, largely for-profit educational spaces that provide ways for students to abandon the traditional model by picking up credits, certifications, and experiences from multiple spaces. These include not only MOOCs, but a growing number of other stand-alone, largely for-profit spaces. Expert after expert has said that the key missing element is a viable credentialing option- and there’s money to be made for whoever figures it out.
- Colleges serve many purposes, but most businesses will acknowledge that at least one of those purposes is filtering- a way to narrow the resume pile in an employer’s market. Yet as the higher education landscape has become more expansive and more students have entered the pipeline, that filter has grown more porous. We want access and completion to be a conversation about skills, but for employers with limited spots, it can be a conversation about numbers- assessments like the CLA, with their now “familiar” 1600 scale, could provide a new, more standardized filtering metric (in the same way that colleges use tests like the SAT)
For these reasons alone, the likelihood of something like this happening was high, and just as with college and graduate school admissions tests, there will be a great deal of money to be made in the tests themselves, services around their administration (like verification and testing centers), and preparation for them. These are the comparisons worth drawing to tests like the SAT- except there’s one HUGE difference that all of these articles have missed and the CLA hasn’t acknowledged.
You’ll note that earlier I referred to claims of individual-level reliability (how consistently a test measures what it measures) and validity (how well-aligned a test is with whatever real-world skill it is trying to measure). Both of these aspects of measurement can be incredibly sensitive to the motivation of the students completing the test (essentially, how seriously they take it), and here is the big thing we don’t know yet:
What would happen if students started studying for the CLA+ or something like it? Currently the CLA model is very explicit: students aren’t supposed to study. That makes sense when it is being used to capture institution-level changes in students over four years- students have no incentive to study for their own sake, and institutions probably get a rough sense of how students stand “as-is.” But just ask any high school junior and you’ll learn that the era of students taking the SAT without studying has gone the way of the dodo. Similarly, if the CLA starts to be used to judge even a subset of institutions in any real way, institutions will start operating under a very different set of incentives. When some institutions and some students are using the test in a high-stakes way and others are using it in a very low-stakes way, the test becomes a less reliable comparative measure of actual institution-level growth and increasingly becomes a test of short-term preparation for that particular test. That criticism is regularly lobbed at the SAT and ACT, but they at least have decades of experience norming the test as a high-stakes instrument, and it is almost exclusively used in that way. What we have also seen is that when tests actually matter at the individual level, when students will study for it because it matters to them (and their parents), a for-profit industry will arise to game it.
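The motivation worry above can be made concrete with a toy simulation (all numbers hypothetical; this is an illustration of score attenuation in general, not an analysis of CLA data): when some test-takers don’t try, their observed scores mostly reflect noise rather than skill, and the correlation between true ability and measured score- a rough proxy for how much the test can tell you- drops for the pooled group.

```python
import random

random.seed(42)  # fixed seed so the toy example is reproducible

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

true_skill = [random.gauss(0, 1) for _ in range(1000)]

# Fully motivated cohort: observed score = skill + measurement noise.
motivated = [s + random.gauss(0, 0.5) for s in true_skill]

# Mixed cohort: half the test-takers don't take it seriously, so their
# score only weakly reflects their actual skill.
mixed = [s + random.gauss(0, 0.5) if i % 2 == 0
         else 0.2 * s + random.gauss(0, 0.5)
         for i, s in enumerate(true_skill)]

r_motivated = pearson(true_skill, motivated)
r_mixed = pearson(true_skill, mixed)
```

Under these made-up assumptions, the mixed cohort’s skill-score correlation comes out noticeably lower than the motivated cohort’s- which is the sense in which uneven stakes make the same instrument a weaker comparative measure.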
How could this play out?
- I expect that we may already start to see a bit of this in for-profit spaces where it is being hawked as an alternative credential. The problem right now (and, really, for any CLA claim of validity) is that there isn’t any firm evidence that a CLA score, good or bad, is in any way predictive of on-the-job success or of success in the job market. Still, for-profits will latch onto the CLA’s language that it assesses the “sorts of skills that employers identify as most valuable in X survey,” and I won’t be surprised to see a gold seal reading “As featured in the Wall Street Journal” spring up on the websites of for-profits offering it directly.
- Watch for whether any college or university starts to use the CLA as a required, binding exit test. As noted earlier, the CLA is currently voluntary, and even if it were made mandatory, its potential for reducing graduation rates even slightly would serve as a huge disincentive for an institution of any type to require some level of “passing” score for graduation. Still, it’s not impossible that a for-profit, community college, or non-selective 4-year institution might make a gambit- higher education is becoming a competitive space, and the potential reputation-boosting upside might be worth it as an experiment.
- It’s unlikely, but not impossible, that a state government or the federal Department of Ed will offer either incentives or exemptions to institutions requiring something like the CLA.
- More likely, we’ll see for-profit college guides and ranking lists start to request institutional CLA scores- first voluntarily, then, possibly, as a requirement.
- Perhaps the most potentially disruptive possibility is that colleges will not require something like the CLA, but Amazon, McKinsey, Microsoft, or another high-prestige employer starts accepting an alternative credential and makes the argument that it has worked for them as well as or better than college as a predictor of workplace success. This, like the use of the tool as a binding exit test, could flip the motivational switch for students and change the way the CLA works both on the ground and in the marketplace.
Some in higher education will see the CLA’s move as a bait-and-switch, although a dozen years of gaining credibility with the traditional higher ed sector before diving head-first into the big-money world of the non-traditional sector seems like a particularly ambitious long-con. But whether institutions took part in the CLA ten years ago or are thinking about it next year, they need to understand these changes and the motivations they represent. We’re almost certainly going to see something happen in the next few years around credible 3rd-party credentialing, whether via the CLA+ or something else, and it will change the way that potential students and employers consider college- if they consider it at all.