Exploring research quality: Insights from the 5th workshop of the Swiss Year of Scientometrics

The 5th Workshop of the Swiss Year of Scientometrics was held on November 14 at the University of Fribourg. The workshop focused on assessing research quality using bibliometric indicators. It was co-moderated by Simon Willemin and David Johann from the ETH Library of ETH Zurich.

This was the last event in the series of workshops held during the year-long project on scientometrics. The series, which started in June 2023, included three events at ETH Zurich and another at EPFL in Lausanne. This final meeting took place at the bilingual University of Fribourg. Both the University and the City of Fribourg are often regarded as a bridge between the two largest linguistic regions of Switzerland. This occasion was no exception, as the event welcomed participants from universities and research centres from Geneva to St. Gallen.

Discussion with the speakers of the lightning talks (ETH Zurich / Annette Guignard)

The workshop started with three online lightning talks, in which speakers presented the state of research quality assessment in Germany, England and Spain. Then, in the first collective session of the day, the attendees divided into groups and discussed the situation of research assessment at their respective institutions: whether an institutional definition of research quality existed, and whether bibliometric indicators were used to assess research quality. This discussion, which primarily concerned universities, revealed that an official definition of research quality was absent in nearly all the institutions represented. Only one participant reported a definition, which related to the impact of research; however, it did not specify what was meant by impact.

None of the attendees reported any knowledge of research funding or resources being allocated in their institutions directly based on performance measured by bibliometric indicators. However, bibliometric data is collected in some institutions for purposes such as monitoring, planning, and reporting, often in the form of annual reports. In other institutions, the research output of individual researchers, groups, or departments is not evaluated using traditional bibliometric indicators.

David Johann and Simon Willemin moderating the workshop (ETH Zurich / Annette Guignard)

The second session of the day centred on specific bibliometric indicators. A collective effort was made to answer whether the various indicators provide information about the quality of research, when it makes sense to use them, and whether new indicators or other types of information are lacking. Regarding quality, it was agreed that the most widely accepted definition of research quality is associated with indicators based on the number of citations a piece of research receives, normalized by domain and year. In general, attendees expressed that purely quantitative indicators, such as the number of publications, could not and should not be linked to the quality of research.

Answering these questions fully, however, requires making several assumptions. Notably, the production of bibliometric indicators relies mostly on privately owned databases that are known to have biases, particularly with respect to scientific domain and language. Even when these databases are accepted as a valid source, the indicators produced often fail to measure what they claim to. For instance, gender-related indicators do not address whether a piece of research deals with gender-related topics; instead, they measure the ratio of female authorship, relying on machine-learning methods to infer the gender of authors from their given names. This approach has significant flaws, as it may not account for linguistic and cultural nuances.

There was a consensus among most attendees that bibliometric indicators are useful for understanding how science operates in analytical terms, for monitoring, and for informing and implementing policy decisions. However, it was noted that information is lacking on several other aspects of research output, such as peer review or conference organisation. Preregistration counts were also highlighted as information worth measuring and tracking. Participants emphasized that research assessment is a relative concept, inherently momentary, and that its results are context-specific and domain-dependent.

The SYoS project will continue in 2025, with the next meeting scheduled for September 10th and 11th. Stay informed through this blog about registration and speaker announcements. We look forward to your participation and input.

Lightning talks
Dr Barbara S. Lancho Barrantes (University of Brighton, Chair of the LIS Bibliometrics Committee): presentation
Prof. José Luis Aznarte Mellado (National Agency for Quality Assessment and Accreditation of Spain ANECA)
Bianca Kramer (Sesame Open Science) and Jeroen Bosman (Utrecht University): presentation

Dr Efrain Ochoa Martinez

Efrain Ochoa is a physicist by training. He works as a Scientific Collaborator at the Research Promotion Service of the University of Fribourg, where he is in charge of data gathering and reporting for academic and sustainability rankings.
