
Understand College Rankings

By Eliot Applestein
Special to The Washington Post
August 24, 2000

A few days after U.S. News & World Report's 2000 College Rankings edition listed Caltech as the top university, Charlene Liebau's phone began to ring. Calls to the director of admissions came from students who had committed elsewhere but suddenly had a change of heart. They wanted No. 1.

When Cornell went from sixth in 1999 to 11th in 2000, Ronald Ehrenberg, a former vice president of the university, got the reverse reaction. "I actually had a freshman student come up to me because Cornell came up very high and now that it wasn't ranked as high, he was going to transfer to Georgetown because it wasn't as good a school in his view. This is symptomatic of how students in these competitive institutions want to be at the most highly ranked schools."

No one actually changed schools. But the perceptions that this "swimsuit edition" fosters remain a potent force. The college ranking issue is "the envy of the magazine industry," says Michael Brannick, president and CEO of Peterson's, the largest publisher of college guides in this country.

With the college selection process so baffling, it's tempting to use a magazine ranking as an easy guide. The savvy consumer, however, needs to know that the rankings influence the admissions process--both for students and colleges. They give the false impression that there really is a difference between the No. 1 and No. 6 institutions, that all of their hand-picked criteria are the right criteria for every student, and that colleges change quickly from year to year.

Ranking's Impact on Admissions

To improve their standings in the rankings, more colleges are accepting a higher percentage of their incoming freshman classes by early decision (ED). Penn, Columbia, Yale and Dartmouth each filled more than 40 percent of this year's incoming freshman classes with early-decision candidates. This lowers the acceptance rates for the larger pool of applicants who are applying for regular decision. With a higher rejection rate, colleges appear more selective and this helps ratings. Early decision also improves the college's yield because just about all of those accepted early will come.
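The arithmetic behind this is straightforward. A toy calculation (all numbers invented for illustration, not drawn from any actual college's figures) shows why filling more seats by early decision makes the regular-decision admit rate fall:

```python
# Toy arithmetic (invented numbers) showing why filling more of the class
# by early decision (ED) makes a college look more selective.
class_size = 1000      # seats in the incoming freshman class
regular_apps = 10000   # applications in the regular-decision pool

def regular_admit_rate(ed_fill_fraction):
    """Fraction of regular applicants admitted, assuming every ED admit
    enrolls (yield near 100%) and regular-decision yield is 50%."""
    seats_left = class_size * (1 - ed_fill_fraction)
    regular_yield = 0.5
    admits_needed = seats_left / regular_yield  # admits required to fill seats
    return admits_needed / regular_apps

# Filling 10% of the class early vs. 40% early:
print(f"10% ED: {regular_admit_rate(0.10):.1%} of regular applicants admitted")
print(f"40% ED: {regular_admit_rate(0.40):.1%} of regular applicants admitted")
```

With these hypothetical numbers, moving from 10 percent to 40 percent ED cuts the regular admit rate from 18 percent to 12 percent, even though the college itself has not changed at all.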

Ronald Ehrenberg, author of the upcoming book "Tuition Rising: Why College Costs So Much" (Harvard University Press), and James Monks of the Consortium on Financing Higher Education have analyzed the impact of the rankings on admissions policies at 30 private institutions that consistently have been ranked as top national universities and liberal arts colleges. Their findings suggest that changes in the yearly ranking have measurable effects on admission outcomes and pricing policies at institutions.

"If your ranking improved from five to one," says Ehrenberg, "the number of students who applied went up and the number of students you accepted went down. Of the number who you admitted and who actually went, SAT scores went up. In addition, these institutions didn't have to use as much financial aid to attract these students."

Conversely, Ehrenberg's research shows for the private institutions studied that a drop of one place in the rankings increases a school's admission rate by almost half a percentage point.

These schools had to admit more students because their applicant pool diminished as did the number of students willing to attend. With an increased admission rate, the SAT scores for the incoming freshmen class also dropped.

"It used to be," says Ehrenberg, "that institutions would make financial offers to students and if students came back and said, 'I don't think you understand my needs, perhaps you can reevaluate this,' they would make a change. But now the market has become so competitive that students come back and say that they have received an offer from one institution and then go to another and share this and want them to top it.

"Financial aid offices don't want to talk about this, but it is going on. Students have become so preoccupied with going to the best schools, and if you fall in the rankings, you have to provide more aid to the students to get them to come. So the rankings influence how much the institutions have to spend on aid."

A Love-Hate Relationship

"Even though colleges and universities constantly criticize the rankings and urge potential students and their parents to ignore them," says Ehrenberg, "every institution pays very close attention to the ratings and tries to take actions to improve its ranking."

U.S. News relies on data that colleges submit on 16 variables, including the percentage of alumni who contribute, acceptance rate, retention rate and class size.

A subjective academic "reputational survey" about the school, completed by other colleges, is worth 25 percent of the ranking. But 38 percent of the colleges surveyed, including Stanford and Cornell, have refused to submit these ratings, either because they didn't know enough about a particular school or because they didn't feel that subjective data was helpful.

In 1996, Alma College in Michigan unsuccessfully tried to organize a boycott of the reputational survey by encouraging other college presidents, provosts and admissions directors to withhold reputational rankings. According to a spokesman for the college, however, "While a number of colleges supported our idea, several colleges that were ranked high on the list didn't want to participate because they felt they benefited from the rankings."

Forced to play the rankings game, some colleges may manipulate the data.

Says Caltech's Liebau: "Some schools do not include the SAT scores of their athletes who report to school earlier than other incoming freshmen to keep scores high. Or they may not include scores of international students."

The more applications a college receives, the more selective it appears. Some colleges may count as an applicant anyone who has merely requested an application.

According to William Conley, dean of admissions at Case Western Reserve: "I think there are still some schools who can say a Part 1 application, even though it never is completed, is an application. Some institutions double dip. If a student applies to several different schools in the university, they will count all of these as if they are different applications."

Another variable looks at retention rates of students, but some majors are notorious for weeding out weak students.

"Forty percent of our students at Case Western are engineering majors," says Conley. "You will have a greater attrition level with engineering than liberal arts. We have thought that if we changed the number of students who we admitted into engineering from 40 percent to 35 percent, our retention rate would be higher. But we haven't done this."

Rankings give the impression that colleges change quickly. They do not.

For example, one important component of the formula is how much a university spends on its faculty. Half of Cornell's employees are paid by New York state, notes Ehrenberg, but one year their employment benefits didn't show up in Cornell's accounts. When Cornell explained this to U.S. News & World Report, says Ehrenberg, the magazine allowed the university to include these benefits, and that made Cornell look better.

"Not very much changes at these institutions," says Ehrenberg. "Colleges change very slowly. So it is very hard to believe that the education students have been getting at Cornell was changed over the past several years."

According to Ehrenberg, "The factors that USNWR uses in its formula, and the arbitrary weight that it assigns to each to compute the overall ranking, have changed over time. This leads to the possibility that Institution A may be ranked higher than Institution B one year, but lower the next year, even if nothing has changed at either institution, simply because the weights assigned to different factors have changed."
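Ehrenberg's point can be illustrated with a toy weighted-score calculation. The factor scores and weights below are invented for illustration; they are not USNWR's actual data or weights:

```python
# Toy illustration of Ehrenberg's point: two institutions whose scores
# never change can still swap rank positions when the weights change.
# All scores (0-100 scale) and weights here are invented.

def overall_score(scores, weights):
    """Weighted sum of per-factor scores, with factors keyed by name."""
    return sum(scores[f] * weights[f] for f in weights)

# Hypothetical per-factor scores for two institutions.
inst_a = {"reputation": 90, "selectivity": 70, "retention": 80}
inst_b = {"reputation": 70, "selectivity": 95, "retention": 85}

# Year 1: reputation dominates the formula.
weights_y1 = {"reputation": 0.50, "selectivity": 0.30, "retention": 0.20}
# Year 2: selectivity is weighted more heavily; nothing else changed.
weights_y2 = {"reputation": 0.30, "selectivity": 0.50, "retention": 0.20}

for label, w in [("Year 1", weights_y1), ("Year 2", weights_y2)]:
    a, b = overall_score(inst_a, w), overall_score(inst_b, w)
    leader = "A" if a > b else "B"
    print(f"{label}: A={a:.1f}  B={b:.1f}  higher-ranked: Institution {leader}")
```

With these made-up numbers, Institution A leads in Year 1 (82.0 to 80.5) but trails in Year 2 (78.0 to 85.5), even though neither institution's underlying scores moved at all.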

What do the rankings measure? "The rankings legitimize the kind of perceptions we have of what the medallion institutions are," says Conley.

"They have name recognition and so people play a word association. They will say, 'Stanford, Harvard, Penn.' The rankings have captured all of the informal assumptions about which institutions are of the highest quality because they have the highest name recognition. USNWR has not done anything to broaden an understanding of what makes a great university."

Adds Conley, "I think Americans want ranking, whether it's the top 10 luxury sedans or top 10 areas to live. It allows us an easy way to rate things. Families use it as one of many barometers to determine the college choice list. It provides a framework for some people to explore a better map. The danger is that the student may think that USNWR has done the kind of subjective, emotional homework for them."

SIDEBAR: Do-It-Yourself Ratings

A number of publications can help you form your own ranking system.

Peterson's and Barron's college guides group schools by cost. Peterson's provides a set of criteria that the student scores twice: once for how well the university meets each criterion, and once for how important that criterion is to the student. Barron's provides an extensive list of questions.

Both publishing houses have chosen not to rank colleges on academic merit because of the inherent problems. Notes Peterson's CEO Michael Brannick, "I wouldn't base the selection of a college for myself or my children or anyone I know based on the rankings themselves."

Some schools, including Stanford, have established Web sites (www.stanford.edu/home/statistics) that offer information based on the Common Data Set (CDS), which may be helpful to prospective students.

The CDS (www.commondataset.org) relies on a set of standards and definitions for comparing colleges.

Such hard data, however, can only go so far. You wouldn't purchase a car recommended by Consumer Reports without going for a test drive. The same holds for college. There are too many variables beyond what you will find in guidebooks to know whether a college is right for you without looking. For example, rankings do not specify which majors are strong or weak at a college. You'll need to get this information from college counselors and by seeing the facilities.

Contact the admissions office to schedule a tour and group information session.

The tour is especially important. Be sure to ask lots of questions of your student tour guide. Here are some starters:

1. What other schools did you apply to and why did you choose this one?

2. What do you like most about this school?

3. What would you change if you could?

4. What's your major and what do you think are the strongest and weakest programs on campus?

5. How hard is registering for courses and have you been able to get the ones you wanted?

6. How's the food?

7. Are you in a fraternity (sorority) and does it make a difference if you are an independent and want to go to parties?

8. How much drinking/drugs are there on campus?

9. Is it necessary to have a car on campus?

10. Do many students travel abroad?

11. Are there many opportunities for internships/co-ops?

12. How wired is the campus? (Yahoo has rankings for this, but again, beware of the rankings.)

Ask the admissions office for the e-mail addresses of students you can question. Sit in on a class when you visit. If possible, spend a night in a dorm.