Friday 18 April 2014

Are college students learning?


Students walk on the campus of University of California, Los Angeles.



  • Ben Wildavsky: You can find lots of information on U.S. schoolchildren's performance

  • But no comparable national assessment exists to show how students are doing in higher education, he says

  • Wildavsky: We need evidence to be able to see how reforms affect learning at college

  • It also would be helpful to find out how undergrads stack up academically, he says




Editor's note: Ben Wildavsky is director of higher education studies at the Rockefeller Institute of Government, State University of New York, and policy professor at SUNY-Albany. The opinions expressed in this commentary are solely those of the author.


(CNN) -- If you want to know how U.S. schoolchildren are performing, you don't have to look far: A wealth of information is available, thanks to the National Assessment of Educational Progress.


Go online and see, for instance, that Massachusetts children outperform those in Texas, that average math scores have gone up nationally over the past 20 years and that the District of Columbia was the only urban district to improve in math and reading in grades 4 and 8 last year.


But what if you want to know how much students are learning in college? Here, the trail grows cold.


The Obama administration's proposed college ratings would measure access, affordability and outcomes such as graduation rates -- all of which are well worth tracking. But there's no proposal to find a way to measure student learning.





This failure to examine systematically what is, after all, the core mission of colleges is a big problem for U.S. higher education. We're awash in efforts to improve the quality and cost-effectiveness of our colleges. But without a better base of comparative evidence, we won't really know how these reforms affect learning.


That's why it's time for NAEP to go to college.


Tests of what college students know are nothing new, of course, within individual classrooms and institutions. Comparative, standardized measures are rarer, though -- and much more controversial among college leaders.


There's the Collegiate Learning Assessment, or CLA+, which examines critical thinking, reading and writing skills and is administered to small samples of undergraduates at about 200 colleges each year. But many wary administrators contend the CLA should be used for self-assessment and career placement rather than college-to-college comparisons. And some won't make their data public. That may be no surprise: A 2011 book drawing on CLA data found dismayingly low levels of student learning across the nation.






It's the same with the National Survey of Student Engagement. A growing number of colleges use this survey, which doesn't measure learning directly but asks students about things such as how often they write long papers or talk with professors outside class. However, many still don't release the results off-campus, preferring to use them for internal improvement.


An intriguing new initiative led by Massachusetts will compare student learning across nine states by evaluating work such as student papers and lab reports. But because it will consciously avoid standardized tests, its ability to make credible cross-state comparisons may be limited.


The enormous advantages of an integrated, independent national test are clear. Like the NAEP used in elementary and secondary schools, a college NAEP could be administered to representative samples of students around the country. It would provide broad national and state-level results, including a breakdown of learning outcomes by race and socioeconomic status, and trends over time. It wouldn't assess individual colleges, which should alleviate the (overblown) anxieties of educators worried about curriculum-narrowing or one-size-fits-all rankings.


Still, the idea has long been contentious. It last received serious consideration more than 20 years ago, when the National Education Goals Panel recommended a voluntary national assessment to gauge the number of college graduates with advanced critical thinking and problem-solving skills. But higher education groups torpedoed the proposal. They argued, among other things, that a single measure would be unfair to institutions with large proportions of disadvantaged students, including community colleges.


Today, a collegiate NAEP would undoubtedly face similar objections. In fact, even as an idea, it already has -- in rather dramatic terms.


Clifford Adelman, senior associate at the Institute for Higher Education Policy, contended recently that the "shadow of a government test," taken by very different kinds of students, studying a variety of subjects at a wide range of institutions, "is enough to freeze the soul, let alone elementary statistical sanity."


But objections based on the diverse nature of American higher ed don't carry a lot of weight when lobbed at a sample-based test of vital academic skills that any student should be learning at any college.


Indeed, testing expert Gary Phillips of the American Institutes for Research, who once ran NAEP, has sketched a persuasive vision of a new experimental test that could make cross-state comparisons and also provide achievement data for different groups of institutions, from community colleges and liberal arts schools to Ivy League universities.


The next time a state boasts about the prowess of its universities, wouldn't it be nice to have an objective yardstick -- a reality check showing how those undergrads stack up academically against their peers in the rest of the country?


It's an appealing prospect. Why not give it a try?




