Is Webometrics an academic ranking?
The ranking is compiled from reliable, prestigious open data sources. The editors have long research experience and expertise in several quantitative scientific disciplines, as can be checked from the main editor's profile in GS Citations.
The CSIC is the Spanish National Research Council, a public organization devoted to scientific and technical research that is universally recognized as one of the top European R&D institutions. Unlike other ranking editors, it is a non-profit institution without any commercial links.
We strongly discourage the use or promotion of any non-academic rankings, especially those published by anonymous editors, those not institutionally backed, and/or those with strong commercial or political interests. A lack of careful explanation of the methods used is always suspicious.
Is Webometrics a ranking of university websites?
No. Web presence and visibility are used as indicators of the global performance of a university. These indicators take into account the university's teaching commitment, research results, perceived international prestige, and links with the community, including the industrial and economic sectors.
Web design is totally irrelevant, as all the measurements relate to the contents published (quantity and quality), especially the open access documents of academic interest or relevance.
Usage statistics based on the number of visits and visitors are weak indicators. The motivation for and meaning of a visit are not correlated with the quality or usefulness of the contents. In most cases the visitors are local, usually the university's own students, as shown by the sudden drops in the temporal distribution of visits during academic vacations. This indicator is also strongly linked to the size of the institution (number of students).
What is the reliability of the Webometrics Ranking?
The Ranking Web correlates highly with other rankings, especially those based on research results. It has far larger coverage, ranking more than 20,000 universities worldwide, and it also takes into consideration more university missions (not only research).
The main discrepancies concern individual universities with mistaken web policies (mostly web naming). Several institutions have two or more web domains; others have recently changed their main domain but still maintain a lot of information under the older one; and a few universities even share their domain with external organizations.
In other cases, if the web performance of an institution is below the position expected from its academic excellence, university authorities should reconsider their web policy, promoting substantial increases in the volume and quality of their electronic publications.
But is academic quality not strongly related to research performance?
This is true for research-intensive universities, probably no more than a few hundred worldwide. It is unfair to many other universities with limited emphasis on research activities, such as open universities or those focused on business, technology or languages, which traditionally attract many foreign students.
For the Webometrics Ranking, research results are also very important. We collect the number of publications from the open citation bibliographic database Google Scholar, which has good coverage of academic institutional repositories worldwide.
Is Webometrics Ranking produced by a truly independent organization?
The Ranking is produced by the Cybermetrics Lab, a research group of the CSIC, the Spanish public organization devoted to scientific research. We are not linked to any company, newspaper or non-governmental organization. Our mission is to apply new quantitative methods to the analysis of Higher Education, and our declared objective is to promote Open Access initiatives. As a non-profit initiative, we do not accept payment for inclusion in the Ranking, nor do we offer advertisement banners to any university.
Are there known biases in this Ranking?
Every ranking system has its shortcomings. Surveys are inadequate for large world rankings, correct figures are difficult to obtain for many bibliometric variables, the weighting of indicators is frequently controversial, and the interpretation of report cards is genuinely difficult. The Web Ranking combines all the university missions in a single figure, but it does not provide enough detail to understand the relative contribution of each one. Good ranks are probably correlated with a higher number of potential web authors (scholars + postgraduates), Open Access mandates, open or off-campus teaching, and the technological focus (or level of investment in telecom and computer systems) of the university. Obviously, providing large quantities of quality English-language content is also a great advantage.
What is your policy about data sharing?
Only the published data are available, and only for non-profit purposes. We do not supply additional information in response to anonymous or commercial requests. For academic purposes we are open to sharing our results on the basis of joint academic projects.
Why are previous editions of the Ranking not available from the website?
There are two reasons. Firstly, as the methodology has changed (sometimes slightly, but for some editions considerably), the results are not comparable. We are trying to re-rank the older data with the new methodology, but it is not an easy task and will not be ready soon. Secondly, and more importantly, the reason is policy related: improving positions in the rank is not necessarily correlated with performance improvements of the institutions, so only current data are provided.
Only the ranks are provided; why not the raw data?
The Web is very dynamic and search engines' results change continuously, so the numbers are only valid for the moment they are collected. Moreover, these numbers are obtained by combining several sources to decrease the engines' biases, using a very simple mathematical procedure (log-normalization and median). The resulting values are not useful for individual comparisons and can even be misleading or prone to manipulation.
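To make the idea of "log-normalization and median" concrete, here is a minimal sketch in Python. The exact procedure used by the Ranking is not published, so the function name, the log1p transform, the max-based normalization and the sample figures below are all illustrative assumptions, not the actual Webometrics computation.

```python
import math
from statistics import median

def combine_counts(source_counts):
    """Hypothetical sketch: combine raw page counts for the same set of
    universities reported by several search engines into one robust score
    per university, via log-normalization followed by a median."""
    # Log-transform to damp the huge spread of raw web counts
    # (log1p maps a count of 0 to 0 instead of failing).
    logged = [[math.log1p(c) for c in source] for source in source_counts]
    # Normalize each source against its own maximum, so the engine
    # reporting the biggest absolute numbers does not dominate.
    normalized = []
    for source in logged:
        top = max(source) or 1.0
        normalized.append([value / top for value in source])
    # For each university (column), take the median across sources,
    # which discards outlier figures from any single engine.
    return [median(values) for values in zip(*normalized)]

# Three hypothetical engines reporting page counts for four universities.
scores = combine_counts([
    [120000, 30000, 5000, 800],
    [100000, 45000, 4000, 1200],
    [90000, 20000, 7000, 600],
])
```

Because each source is normalized independently before the median is taken, a single engine that suddenly doubles or halves its reported counts barely moves the combined score, which is the point of using a median over a mean here.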
Case studies: Why is the Imperial College rank so much lower than expected?
Surprisingly, a large number of universities maintain two or more main web domains, which clearly penalizes not only their Webometrics Ranking but, even more importantly, their position in search engines and their global internet visibility. Imperial College changed its domain several years ago from ic.ac.uk to imperial.ac.uk, a good move as the new domain includes the name of the institution. Unfortunately, the university still maintains many servers under the old domain, to the point that it is even more popular and visible than the new one. Most of the universities marked with (1) are in similar situations.
Case studies: Are Korean university websites underperforming?
The problem of duplicate domains affects a large number of Korean universities. Most of the top institutions in the country have two web domains, which partly explains their poor performance in the Webometrics Ranking. Other important aspects to take into account are a content design with a clear tendency to target students, and the low ratio of academic pages in English. The absence of large scientific repositories also has a catastrophic impact.