Measuring college knowledge
Written by Andrea Aron-Schiavone
March 22, 2012
What are we supposed to be learning in college?
Considering the wide array of classes students take and the varied interests they pursue, it seems that students could generate an infinitely diverse pool of knowledge. Such variety contributes to highly individualized college experiences that are meaningful and fulfilling to each student.
If each student’s learning experience in college can vary so greatly from person to person, is it possible to develop a test that measures an increase in knowledge from freshman to senior year?
The Collegiate Learning Assessment attempts to do just that. The CLA is a written test administered to college freshmen and seniors and was developed for academic institutions to gain a sense of their own strengths and weaknesses. It strives to measure the “knowledge progression” of students by comparing a sample of freshman and senior scores.
There are three components to the CLA. In the 90-minute “Performance Task” section, students write a response weighing conflicting arguments, hypotheses or courses of action, while referencing related documents. The 45-minute “Make-an-Argument” section requires students to defend a position using well-reasoned support. Finally, the 30-minute “Critique-an-Argument” section asks students to assess the merit of a given argument.
Developing an internal measure for schools to assess areas for self-improvement in curricula is a noble goal, but to view the CLA as a measure of learning progression is, to paraphrase Jiminy Cricket, “a lovely thought, but not at all practical.”
The main problem is that the test creates a rather narrow operational definition of what “learning” is: It essentially equates a progression in learning with improvement in analytical thinking as reflected through persuasive writing. While these skills are important to develop in college, the CLA does a disservice to many students by not reflecting the diverse array of other skills and knowledge that they acquire.
Comparing scores of freshmen and seniors in this domain is also not an accurate measure of student progression. College freshmen emerge from a rigid high school curriculum, where critical writing is emphasized in subjects required for graduation, such as English and history. Throughout college, students take specialized classes in their areas of interest. Seniors majoring in art, engineering, physics or chemistry may not score as well on the CLA as freshmen do simply because they may be “out of practice” with persuasive writing tasks, which were not as integral to their majors as other skills. On the other hand, seniors majoring in philosophy, English, government or history, who may have more recent experience writing persuasively, could earn better scores, not because they have learned more in college than their aforementioned peers, but because their areas of study are more conducive to the CLA’s tasks.
This also presents a problem in administering the CLA across a variety of colleges. Seniors from the College of William and Mary, many of whom have taken predominantly liberal-arts courses that promote analytical writing skills, may fare better than seniors from schools that focus on engineering or the performing arts, thanks to the nature of their specialty. Also, with regard to the “Writing Mechanics” component of all three measures, international students who are less familiar with the nuances of English grammar and syntax may earn lower scores, even if they have learned just as much as their native-speaking peers.
While in theory the CLA would be a useful tool for self-assessment, we cannot claim that it measures how much a student learns from freshman to senior year, given the diversity of both students’ educations and their schools. Yet this frustrating incongruence is the very quality that makes the college experience so rewarding — the variety of available knowledge one can gain is in many ways immeasurable.