Scholarly book evaluation often prioritises ‘prestige’, which leads to inconsistent and unfair outcomes. My previous research shows that such systems consider neither the intrinsic quality of the research nor the accessibility of the work itself.
Studies on transdisciplinary research often focus on how different forms of expertise are brought together to build robust knowledge. However, in policy and legal affairs, there are many situations in which it is not possible to use new transdisciplinary knowledge due to contextual factors, such as urgency, political expediency, or lack of resources.
Throughout a recent series of blog posts, we have been introducing the CWTS knowledge agenda for 2023-2028, which is divided into three new focal areas to organise our activities on specific themes.
A value-led perspective on Open Science

In 2021, UNESCO approved its Recommendation on Open Science (OS). By signing this recommendation, 193 countries made a commitment to support the development of OS with a vision of science as a global public good.
When the Shanghai Ranking, also known as the Academic Ranking of World Universities (ARWU), was launched in 2003, Ton van Raan, director of CWTS at the time, sounded the alarm about the problematic way in which the ranking uses bibliometric data, for instance in attributing publications to universities.
The need to increase the transparency of university rankings is widely recognized, for instance in the ten rules for ranking universities that we published in 2017, in the work done by the INORMS Research Evaluation Working Group, and also in a recent report by a Dutch expert group on university rankings (co-authored by one of us). It is therefore not surprising that the announcement of the Open Edition of the CWTS Leiden Ranking in 2023 got
Classifying research publications into research topics or research areas is crucial for many bibliometric analyses. While many approaches to classifying publications exist, most of them lack transparency.
How to find the most relevant scientific literature on topic X? How to evaluate the research carried out by department Y? And how to establish new strategic priorities for university Z? These are just a few examples of the many important decisions that researchers, research evaluators, and science policy makers need to make on a daily basis.
The past years have shown that science can play an important role in societal debates. Science was clearly pivotal in the development of COVID-19 vaccines. In addition, many of the interventions and policies, such as masking, school closures or even curfews, were presented as evidence-based solutions, motivated by scientific advances in our understanding of the virus.
Greetings from Peru, nestled in the heart of the Andes, where I find myself reflecting on two transformative events in my scientific journey: the CWTS Scientometrics Summer School (CS3) and the 27th International Conference on Science, Technology, and Innovation Indicators (STI 2023). As a bibliometrics enthusiast for the past two decades, I found this experience nothing short of a revelation, a journey that prompted me to question
Open science was one of the key topics at the Science, Technology and Innovation Indicators (STI) conference that CWTS organised in September 2023 in Leiden, the Netherlands. Open science was not only discussed at the conference but was also put into practice in the publication and peer review process of the conference. By way of experiment, all papers submitted to the conference were published as a preprint before they were peer reviewed.