2017. Ranking Web of Universities. January New Edition

         The January edition (2017.1.1) is built with the indicators collected during this month, so that the data of the Ranking of Universities are as current and updated as possible. This version is final.

         This text is the most up-to-date information about the ranking's methodology and always supersedes the contents of the general Methodology. The information provided in the Notes section is also relevant.

Contact Info

         If you wish to contact the editor, my email address is [email protected]. You can check my experience and expertise on these topics by consulting the scientific papers I have authored, as listed in my Google Scholar Citations profile.

         Please do NOT send anonymous requests from email accounts such as gmail.com, yahoo.com or hotmail.com. We will not answer any of these messages. We warmly welcome information about bad practices, but we do not discuss specific ranks or comment on comparative results with other HEIs. Previous editions are not available. We have devoted time to preparing the contents of this site explaining the ranking and its methodology. As a courtesy, please read this text before asking trivial questions that are already answered here.

Info provided here supersedes the contents of the general Methodology

          Please take into account the following information:

          - The Spanish edition is no longer updated. Please refer to the English version.

          - Exclusion of universities is reserved for diploma mills and fake or doubtful institutions. Our decisions regarding non-inclusion are final and we do not engage in email exchanges about these issues. However, the absence of an institution could be due to an error, so please send us information about gaps or mistakes.

          - Universities without full 24/7 web server availability are excluded. Our criterion is not to rank universities whose servers fail to answer ping requests sent from our facilities during two consecutive monthly rounds; a minimal sketch of such a check is shown after this list.

          - As previously informed, we have also changed our policy regarding universities with two or more central web domains, a bad practice that is going to be penalized even more heavily than before. Until now, all the main web domains of a university were ranked, but only the one with the better web indicators was published, even if this was the old or non-preferred domain on the central homepage. This procedure still applies when both domains remain independent, but if the old domain redirects to the new one, the new domain is the one ranked and published. As expected, this is having a strong (negative) impact on a few universities.
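
As an illustration only, the following Python sketch shows one way such a two-consecutive-months availability check could be implemented; the host name, the use of the system ping command and its Linux flags are assumptions made for illustration, not a description of the actual Webometrics tooling.

```python
import subprocess

# Hypothetical host; a real check would cover every ranked institution.
HOST = "www.example-university.edu"

def answers_ping(host, timeout_s=5):
    """Return True if the host answers a single ICMP echo request (Linux ping flags)."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def excluded_for_unavailability(monthly_results):
    """Exclude when the server failed the check in two consecutive monthly rounds."""
    return any(not a and not b for a, b in zip(monthly_results, monthly_results[1:]))

# Example: the server failed two consecutive monthly rounds -> excluded.
print(excluded_for_unavailability([True, False, False, True]))  # True
print(answers_ping(HOST))  # depends on network availability
```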

Changes in the calculation of indicators

         Published figures are RANKS (lower is better), intended to show individual performance, but they are not the values used in the calculations. Due to technical issues several key changes have been made, so the following table describes the current methodology:

INDICATORS, DESCRIPTION, SOURCE AND WEIGHT

PRESENCE
  Description: Size (number of pages) of the main web domain of the institution. It includes all the subdomains sharing the web domain and all file types, including rich files such as PDF documents.
  Source: Google
  Weight: 10%

VISIBILITY
  Description: Number of external networks (subnets) originating backlinks to the institution's webpages. After normalization, the maximum value between the two sources is selected.
  Sources: Ahrefs, Majestic
  Weight: 50%

TRANSPARENCY (or OPENNESS)
  Description: Number of citations from top authors according to the source. See the Transparent Ranking for additional info.
  Source: Google Scholar Citations
  Weight: 10%

EXCELLENCE (or SCHOLAR)
  Description: Number of papers amongst the top 10% most cited in 26 disciplines. Data for the five-year period 2010-2014.
  Source: Scimago
  Weight: 30%
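
As a purely illustrative aid, the sketch below shows how the weights in the table above could combine normalized indicator values into a single score. The min-max normalization, the merging of the Ahrefs and Majestic values, and all variable names are assumptions, since the exact computation is not published; under these assumptions, institutions would then be sorted by such a score and the published figures would be the resulting ranks (lower is better).

```python
# Indicator weights taken from the table above.
WEIGHTS = {"presence": 0.10, "visibility": 0.50, "transparency": 0.10, "excellence": 0.30}

def normalize(values):
    """Hypothetical min-max normalization of raw indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(v - lo) / span for v in values]

def visibility_value(ahrefs_norm, majestic_norm):
    """Per the table: after normalization, the maximum of the two link sources is used."""
    return max(ahrefs_norm, majestic_norm)

def composite_score(presence, visibility, transparency, excellence):
    """Weighted combination of the four normalized indicators (higher is better here)."""
    return (WEIGHTS["presence"] * presence
            + WEIGHTS["visibility"] * visibility
            + WEIGHTS["transparency"] * transparency
            + WEIGHTS["excellence"] * excellence)

# Example with made-up normalized values for a single institution.
score = composite_score(presence=0.8,
                        visibility=visibility_value(0.6, 0.7),
                        transparency=0.5,
                        excellence=0.9)
print(round(score, 3))  # 0.75
```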

A few relevant facts about the Ranking

          Since 2004, the Ranking Web (or Webometrics Ranking) has been published twice a year (data is collected during the first weeks of January and July and made public at the end of those months), covering more than 24,000 Higher Education Institutions worldwide. We intend to motivate both institutions and scholars to have a web presence that accurately reflects their activities. If the web performance of an institution is below the position expected from its academic excellence, university authorities should reconsider their web, open access and transparency policies, promoting substantial increases in the volume and quality of their electronic publications.

        Data is collected between the 1st and the 20th of January or July, depending on the edition. Each variable is obtained at least twice during that period and the maximum value is chosen in order to discard mistakes or errors; a minimal sketch of this rule is shown below. The volatility of search engines is very high, so figures can differ and are not easily replicated if the search is performed days later. Google results are strongly geographically biased, so for our purposes the data are collected using the google.com domain, with English as the interface language and Madrid (Spain) as the location.
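
A minimal sketch of the "take the maximum of repeated measurements" rule, assuming a generic measurement function; the numbers are made up for illustration.

```python
from itertools import cycle

def collect_indicator(measure, repetitions=2):
    """Query the same variable at least twice and keep the maximum value,
    so that a transient error (e.g. an empty result set) is discarded."""
    readings = [measure() for _ in range(max(2, repetitions))]
    return max(readings)

# Example: a fake measurement where one of the two queries fails and returns 0.
fake_readings = cycle([0, 12500])
print(collect_indicator(lambda: next(fake_readings)))  # -> 12500
```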

        Final publication takes place in late January or July, usually not before the 28th. We reserve the right to correct minor errors, mainly those related to the names of institutions, but also specific problems with the data. As a general rule we do not discuss any figure or provide the raw values supporting specific ranks.

Bad practices

          In recent years we have discovered that most unethical practices are carried out by individuals or groups not really representing the institution where they work. Until now a flagrant violation of the ethical code was penalized with the exclusion of the university, but this was misleading, as external visitors could assume the absence was due to a mistake. To avoid this problem, and to point academic authorities to serious misbehavior by somebody in charge of the websites, we have decided to maintain the entry but to mark grossly manipulated indicators with a rank of 99999: the raw value of such an indicator is set to zero.
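
A small, hypothetical sketch of this penalization rule; the dictionary layout is an assumption made for illustration, not the actual Webometrics code.

```python
PENALTY_RANK = 99999  # rank published for a grossly manipulated indicator

def penalize(indicator):
    """Apply the bad-practice penalty: the raw value becomes zero and the
    published rank for that indicator becomes 99999; the entry itself is kept."""
    return {**indicator, "raw_value": 0, "rank": PENALTY_RANK}

# Example: a Presence indicator flagged as manipulated.
presence = {"name": "presence", "raw_value": 250_000, "rank": 312}
print(penalize(presence))  # {'name': 'presence', 'raw_value': 0, 'rank': 99999}
```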

          The Webometrics Rank of a university is strongly linked to the volume and quality of the contents it publishes on the Web. Such contents should originate from the faculty and other members of the university, or from external authors by special agreement. It is not fair to use external contents to improve the rank of the university. It is not correct to artificially increase the number of files on the website, especially in the repository, by duplicating the same material in different file formats or by splitting a document into many different files (for example, a PDF file for every page of a monograph). This is not only unethical behavior; under international law it is also illegal, as using the documents of third parties without permission and violating the copyright of other authors, institutions and publishers is a crime that can involve fines for damages in the order of millions of dollars and prison penalties. European Union law even punishes linking to websites that provide access to pirated contents.

          The Visibility indicator is intended to measure the impact of the contents of the websites, using external inlinks as a proxy. However, we have discovered that certain CIOs are setting up external student forums specifically designed to produce large numbers of links to the university webpages, even using systems that promote piracy, illegal drugs, pornography, terrorism or pederasty (crimes punished with capital punishment in some countries). In other cases the university is spending huge amounts of money buying links from link farms to artificially increase the number of links. In one case a university contracted 1,000 domains with 30 million links altogether, which is certainly very costly, very unprofessional and clearly a crime if public money is involved.

Additional exclusion criteria

          A few institutions, mostly religiously affiliated "Colleges" in the Philippines and Latin America, publish web portals that cover all their education activities, including those of their Schools (Basic Education) and High Schools (Intermediate Education). It would be unfair to include these institutions in the Ranking for comparative purposes, even if most of the web contents were related to their Higher Education departments. We strongly advise these organizations to segregate their university-level activities under an independent domain if they wish to be included in future editions.