Greg Boone

College Scorecard: The value of data in higher ed

A few weeks ago the Department of Education released a trove of data about colleges and universities around the country. The site itself does a pretty great job of showing some high-level information about how much a school costs, along with a few statistics about retention, graduation rates, and financial success after leaving.

Earlier this week, New York Times columnist James Stewart wrote the latest hit piece on the project, with a familiar refrain.

While Scorecard adds potentially valuable information to the dizzying array that is already available, it suffers from many of the same flaws that afflict nearly every other college ranking system: There is no way to know what, if any, impact a particular college has on its graduates’ earnings, or life for that matter.

That’s a fair critique. Other critics have called it “absurd,” and “a classic example of confusing causation and correlation.” But they’re missing the point.

There probably is some desire for a tool that will give one clear answer about what school to attend. That’s probably what some of the commercial rankings, like U.S. News & World Report, are trying to do, but that’s not what Scorecard or the CFPB’s Paying for College tool are about. As Stewart says, “pay, of course, says nothing about the relative quality of different colleges.”

What’s valuable about the College Scorecard is the scale and openness of the data provided. This is a research tool, not a ranking system, and with greater access to research tools, students might be able to make better decisions as consumers.

The New York Times, for example, found major gaps between male and female graduates’ earnings at prominent universities. It found a $58,100 earnings gap between men and women who graduated from MIT, and that “women who enrolled at Harvard are making as much as men who enrolled at Tufts.”

A woman might rank MIT lower on her list with this knowledge. Or maybe not: despite the gap, women from MIT still make more than graduates of most of its competitors. Dismissing pay data ignores the fact that every graduate, man or woman, privileged or not, eventually has to pay for their education.

It’s also been interesting to see and think about how other people use these data. Journalists have drawn some tremendous insights out of them already. But what if a high school computer science class took on a project of figuring out which schools are the “best fit” for students from their area? Based purely on financial aid and income data, you should be able to make a good guess at how much debt a Minneapolis public school graduate would have after attending nearby colleges and universities. Building a small application on top of a public data API seems like the start of a good lesson.
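A minimal sketch of what that classroom project might start with: a toy debt estimate from annual net price and grant aid. The function, the school names, and all dollar figures here are hypothetical placeholders, not real Scorecard values; a real version would pull these numbers per school from the Scorecard data API instead of hard-coding them.

```python
def estimate_debt(annual_net_price, annual_grant_aid, years=4):
    """Rough debt estimate: unmet annual cost after grants,
    floored at zero, multiplied over a typical enrollment length.
    This is a deliberately naive model for a classroom exercise."""
    return max(0, annual_net_price - annual_grant_aid) * years

# Hypothetical nearby schools with made-up annual figures.
schools = {
    "State U": {"net_price": 14_000, "grant_aid": 6_000},
    "Private College": {"net_price": 32_000, "grant_aid": 18_000},
}

for name, s in schools.items():
    debt = estimate_debt(s["net_price"], s["grant_aid"])
    print(f"{name}: estimated four-year debt ${debt:,}")
```

The interesting part of the exercise isn’t the arithmetic, it’s arguing about the model: should it account for income-based aid, interest, or dropout rates? Each refinement sends students back to the data.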

What about colleges? Scorecard’s diversity numbers tell a huge story. At my alma mater, 25% of students receive a Pell Grant and 83% of the student body is white. My college has struggled to attract a more diverse student body, on just about every level, for a long time.

If deans of students can identify areas where minority students are being underserved, these data can expose where student resources may need to be increased. Maybe MIT should strengthen its career services and do a better job preparing women for salary negotiations. Maybe the history and women’s studies programs should get more attention so more students can study the historical roots of this problem.

All too often, though, students are bizarrely not considered a college’s primary user base. University presidents typically listen to alumni and foundation donors over students about how the college is doing and what its priorities should be. With these data, though, a student body representative or college president could go to the board and say: we know that more than half of our graduates could be earning far more than they are. We need to increase student services and academic services, and design programs that put students first, or we risk a donor base that can’t support us in the future.

Colleges around the country are talking about data-driven decision making. Often that turns into prioritizing support for academic programs based on enrollment, or mining donor data to decide where to move money. Maybe with these data, plus all the data these schools are already generating but not sharing, we will see smarter, more student-focused decisions.