National Coordination for Reforming Research Assessment

CoARA (the Coalition for Advancing Research Assessment) is being adopted by a number of Swiss higher education institutions. The ten principles of CoARA imply a profound, systemic change in the criteria used to evaluate scientific research, at every level of that evaluation. For a successful implementation, university management will therefore need the support of researchers within their institutions, as well as of their supervisory authorities.

On the researchers’ side, and particularly at the start of their careers, the tension between the quest for reputation and the need for recognition by their institution and by research funding bodies poses a real dilemma (Nicholas et al., 2020). On the side of the supervisory authorities (the cantons and the Confederation), the subsidies granted to universities for basic research funding depend essentially on the number of FTEs (full-time equivalents) involved in research and on the third-party funds raised.

In such a context, it is essential that the actors responsible for implementing CoARA do so in a coordinated manner, so that, for example, a researcher is assessed in a similar way at all Swiss universities, with active support from swissuniversities.

This reform of research evaluation rightly requires abandoning the reductionism induced by single indices such as the h-index for individual researchers, or the even more questionable rankings of research institutions, whose biases have been clearly demonstrated (Moed, 2017; Sayed, 2019; Fauzi et al., 2020). A fair assessment of research activities and their impact requires a more comprehensive and more qualitative approach. Such approaches have been developed for academic institutions as well as for individual researchers (e.g. European Commission et al., 2017). To be successful, such a fair approach requires a collective effort from researchers across disciplines, higher education institutions and the supervisory authorities, a cooperation that is at the very heart of the CoARA principles.


European Commission, Directorate-General for Research and Innovation, Cabello Valdes, C., Rentier, B., & Kaunismaa, E. (2017). Evaluation of research careers fully acknowledging Open Science practices: rewards, incentives and/or recognition for researchers practicing Open Science, Publications Office.

Fauzi, M. A., Tan, C. N.-L., Daud, M., & Awalludin, M. M. N. (2020). University rankings: A review of methodological flaws. Issues in Educational Research, 30(1), 79–96.

Moed, H. F. (2017). A critical comparative analysis of five world university rankings. Scientometrics, 110, 967–990.

Nicholas, D., Watkinson, A., Abrizah, A., Rodríguez-Bravo, B., Boukacem-Zeghmouri, C., Xu, J., Świgoń, M., & Herman, E. (2020). Does the scholarly communication system satisfy the beliefs and aspirations of new researchers? Summarizing the Harbingers research. Learned Publishing, 33, 132-141.

Sayed, O. (2019). Critical Treatise on University Ranking Systems. Open Journal of Social Sciences, 7, 39-51.


Dr Patrick Furrer

After 10 years of research in biophysics, Patrick Furrer advised ICT researchers for more than 12 years within the European ICT programme. He then managed the research and innovation agenda of the HES-SO before joining swissuniversities as national open science coordinator. He is currently director of the Medical Library at the University of Lausanne.
