From Metafilter: top 500 world universities.

Their methodology seems to be centered around mathematics and the sciences: a large fraction of their weight goes to Nobel Prizes, Fields Medals, papers in Nature, Science, and other publications counted by Science Citation Index. No factors such as SAT or GRE scores concerning the student body, nor anything about grant funding; I guess those would be too US-specific.


None: Nitpicks
The ranking methodology is highly backward looking, as recognitions such as Nobel Prizes are often given many years after the relevant work took place. The Economist, for one, doesn't believe Cambridge fared that well after the Thatcher cuts. This will only be reflected thirty years hence, when work done today is being recognized by the Nobel committee. The Philosophical Gourmet has a ranking of the top research universities in the USA. Within the USA the two rankings tend to agree (the position in one is usually within ±3 of the other), but there are some notable differences: Michigan–Ann Arbor is #4 in one versus #18 in the other, Caltech is #13 vs #5, U Texas Austin #14 vs #29, Chapel Hill #22 vs #41, UCSD #23 vs #11, Brown #25 vs #52.
11011110: Re: Nitpicks
The Philosophical Gourmet appears to be rehashing other people's beauty-contest style rankings: "Because the 1995 National Research Council study (based on 1992-93 surveys) is out-of-date, I rely exclusively on the most current U.S. News academic reputation survey". I thought the Chinese thing was refreshing for its internationality — why should I restrict the places I compare myself to only those within the US? I don't similarly restrict the papers I read or the conferences I go to. I'm sure what you say about the Nobel Prize criterion being backwards looking is on target. The Fields Medal at least is restricted to younger researchers, so doesn't have quite the same issue associated with it, nor do the SCI pub counts. But I dislike most rankings' inclusion of opinion poll numbers — they can reflect a deeper knowledge of the institutions than the numbers show, but they can also be similarly backward looking and reflect less on the quality of the institution and more on the famousness of its name. This Chinese ranking, at least, appears to be using only objective quantifiable metrics. Whether they're the right metrics, or are combined in a way that makes any sense, are different questions, of course.
None: Re: Nitpicks
You make good points. Still, the general agreement of the two rankings, in spite of their substantially different measures (and the somewhat undesirable reputational survey), suggests that either one is overall a good proxy for quality (within a margin of error of ±3), barring extraordinary circumstances such as the Thatcher cuts.
None: rankings
There do seem to be other oddities in the rankings, although part of that may be my perception as a computer scientist (note that CS will not fare well under the ranking criteria, as Nobel Prizes, Fields Medals, and Nature/Science publications are irrelevant to computer scientists...) It still seemed odd to me that Hebrew University was ranked significantly higher than both the Technion and the Weizmann Institute...
None: Re: rankings
I agree - there are quite a few rankings that are rather bogus. I'm rather familiar with Canadian universities, and it seems rather odd that the University of Saskatchewan is ranked higher than the University of Western Ontario (for example). Pretty much a joke, in fact.
11011110: Re: rankings
I'm not so familiar with them down to that level, but it seems to me that, at least in Computer Science, Waterloo should be a lot higher than the 200-300 range.