The Moscow International University Ranking 'The Three University Missions' (MosIUR) was first published in 2017. In five years it has become a recognised tool for assessing the performance of universities. Ivan Grigorev, Advisor to the Rector, spoke about the development and credibility of global rankings, and about how universities can use their data.
There are a number of university rankings – Times Higher Education, Quacquarelli Symonds,
Academic Ranking of World Universities and others, each of which uses its own criteria to determine the leading universities. What are the specifics of the Moscow International University Ranking 'The Three University Missions'?
First of all, it should be noted that it was the first ranking that systematically took into account the social mission of higher education institutions, traditionally referred to as the ‘third’. The first two missions are educational and scientific. By the time of the first publication of the Moscow ranking in 2017, the social mission of universities had been discussed in the academic literature for at least ten years. Our colleagues managed to create a system of indicators that reflect this social mission. The compilers of the ranking have also succeeded in developing a system that takes into account measurable indicators in all three areas of university activities, rather than expert opinions. These are what can be called the two main features of the MosIUR ranking.
Why is the social component so important?
The training of highly qualified specialists is, in my view, an important component in the advancement of society. Furthermore, it is important to understand that for students who come to university immediately after finishing school, and this is the majority of first-year students, socialisation and learning about living in an 'adult' society is an important aspect of their university experience. If you look at the Charter of our university, you will read that St Petersburg University serves society. I think that similar things are written in the charters or by-laws of other higher education institutions, including foreign ones. The social mission of universities has existed since their emergence. Although it is quite difficult to measure these initiatives in quantitative terms, the Three University Missions ranking has largely succeeded in doing so.
‘The goals, objectives and activities of St Petersburg University are outlined as follows:
- serving the needs of individuals and society in terms of intellectual, cultural, moral and spiritual development;
- preserving, developing and amplifying the ethical and cultural traditions of St Petersburg University, educating young people in the spirit of these traditions, enhancing the role of the humanities in the educational process;
- spreading humanistic values and knowledge’.
How does the Moscow ranking measure the contribution of universities to the social development of the country?
The list of criteria includes, to name just a few: the number of public online courses posted on the largest global platforms (reflecting the accessibility of quality education to citizens); the university's share in the total volume of publications in the country (indicating the efficiency of public investment in the university's activities); and the total number of university website pages indexed by the leading search engines (showing informational transparency for the public). The ranking also takes into account the number of alumni who are successful in various fields and have a personal page on Wikipedia, thus evaluating the impact of the university on the development of society.
A university could probably post more pages of its alumni on Wikipedia to boost its ranking, couldn't it?
It may try, but such activity is easily tracked. The Moscow ranking methodology takes into account both the year a graduate was born (no earlier than 1949) and the number of hits: pages that are not sought after by users are not considered in the calculations. Even if some university creates entries about everyone who has ever studied there, the relative number of hits will be negligible.
In this regard, it should be noted that data verification, both direct and indirect, is an important part of any ranking agency's activities, which they engage in systematically. For example, the compilers of Webometrics are constantly reviewing their methodology to ensure that it is as consistent and independent of manipulation as possible. So do the teams at Times Higher Education, Quacquarelli Symonds and the Moscow International University Ranking 'The Three University Missions'.
At a meeting of the working group of the Russian Rectors' Union it was decided to create a family of sector and subject rankings on the basis of MosIUR. What purpose would they serve? What kind of rankings would these be?
All leading rankings have gone through this stage: they first appear as global rankings and then, a few years later, when all the inconsistencies in methodology have been identified and corrected, are split up according to individual branches of knowledge. The fact is that any global ranking is, to a certain extent, a ‘crooked mirror’, which arouses irritation and criticism from the academic community. Experts argue that it is impossible to measure and compare universities that specialise in such diverse fields as mathematics, arts, and engineering using a single metric. This observation is quite valid and substantive, since all classical universities are, indeed, multidisciplinary, and some areas of knowledge are simply incommensurable, including in measurable terms. For example, the QS ranking for Arts and Humanities does not use the publication citation score because it does not characterise the field. The same is true for Physical Education and Sports: there are certainly scientific publications in this field, but it is the number of Olympic champions and other achievements valued in this field that should be taken into account. Therefore, when the ranking agencies gather a sufficient database, they look at whether it can be applied to individual areas. If the result is unsatisfactory, more information has to be collected or the criteria have to be reformulated.
The Moscow International University Ranking follows the same track. For five years, the team has worked on the methodology of the principal ranking, and now they believe that they are ready for more detailed field-based assessments. It is likely that they will proceed as the teams of other rankings, such as QS, THE and ARWU, did: by starting with the broader fields.
When a university improves its position by a few points in the QS rankings, this is regarded as an achievement to be proud of. How prestigious is the Moscow ranking today?
There are several answers to this question. First, this ranking has been accredited by the relevant professional communities and audited by PwC. A positive appraisal at this level is one of the elements of recognition.
Secondly, there are studies examining which universities receive de facto preferential treatment in the rankings of a particular country, and in that sense they all differ from each other. For example, French universities are poorly represented in the Times Higher Education World University Rankings, which is not surprising for a ranking that is published in the UK. Similarly, Russian universities were practically absent from the rankings until the authors increased the sample. Japanese universities are well represented in the Shanghai ranking, but not so well in others, and so on. Thus, we can expect Russian universities to be represented in more detail in the Moscow ranking than in the others. This is natural, since the Unified State Exam score, as one of the quantitative indicators, applies only to Russia, and not all countries have analogues of this exam. The opportunity to see our higher education institutions in the global context allows us to evaluate and compare them not only within the national system, which has been done systematically for many years, but also against higher education institutions in other countries. Nevertheless, the objectivity of the MosIUR is quite obvious, as the top positions in the ranking are held by non-Russian higher education institutions.
Thirdly, academic publications on the subject have been referencing The Three University Missions ranking, and some universities have posted ranking information on their websites. This, too, is an element of recognition. Of course, no ranking will be instantly accepted by the global community. The Shanghai ranking began in 2003, and it was initially designed as an international listing of universities for those nationals of China who were considering study abroad. It was a tool designed for internal reference, so it was met with considerable scepticism from the global community when it was released. However, now, after almost 20 years, it has some weight in the eyes of the world.
In 2019, St Petersburg University was in the top 50 Russian universities according to the MosIUR ranking, while last year it was in the top 40. Would you please give some examples of how the results of MosIUR can be applied in the work of the University?
There is an order at St Petersburg University according to which interaction with any university in the top 300 of international global rankings is facilitated (for example, concerning the procedures for the recognition of the results of online courses or documents). The logic behind this is as follows: if a university ranks high, it means that it is considered good by the global community. The Three University Missions ranking gained such recognition in 2019, when it established a reputation as a world-class ranking. It then turned out that some Russian universities are not at the top of the QS or ARWU rankings, but rank high in MosIUR. This guides us in choosing these universities and using their achievements to improve the quality of our work. However, it should be noted that decisions are not made automatically on the basis that a university appears in a particular ranking. It only simplifies the formal procedures, and the choice is made depending on with whom we want to develop cooperation and what kind of specialists we need.
The Moscow ranking makes interaction with Russian universities easier to a certain extent, I would say. We often talk about the importance of international academic mobility and invest a lot in it, but compared to other countries, internal mobility in Russia is much less developed. The balance in our country may be described as being heavily biased towards competition rather than cooperation. This seems wrong, as we have many reputable universities, with which we should develop cooperation that is mutually beneficial. The need to develop internal academic mobility in Russia has been stressed at the highest level of government. I believe that the results of the Moscow ranking will contribute to this process.
How can government agencies use the ranking in their practice?
In my opinion, they could issue a similar order, for example, to take into account the position of a university in the Moscow ranking when making decisions about awarding grants or financial incentives. Many Russian universities are represented in MosIUR, so the data is quite substantial. Different ideas are being discussed. Information about a university's top position in the Three University Missions ranking can be used, for example, to monitor its achievements or to simplify the process of state accreditation, since it is an external, objective assessment of the university's activities. So far, information from the ranking is not widely used at the state level. However, it seems that, as time goes by, the Moscow ranking will become widely recognised, and then a position in this ranking will be considered a ‘traditionally’ significant indicator.