1
Dugerdil A, Babington-Ashaye A, Bochud M, Chan M, Chiolero A, Gerber-Grote A, Künzli N, Paradis G, Puhan MA, Suggs LS, Van der Horst K, Escher G, Flahault A. A New Model for Ranking Schools of Public Health: The Public Health Academic Ranking. Int J Public Health 2024; 69:1606684. PMID: 38528851; PMCID: PMC10961396; DOI: 10.3389/ijph.2024.1606684.
Abstract
Objectives: As there is no ranking designed for schools of Public Health, the aim of this project was to create one. Methods: To design the Public Health Academic Ranking (PHAR), we used the InCites Benchmarking and Analytics™ software and the Web of Science™ Core Collection database. We collected bibliometric data on 26 schools of Public Health from every continent between August and September 2022. We included 11 research indicators/scores covering four criteria (productivity, quality, accessibility for readers, international collaboration) for the period 2017-2021. For the Swiss School of Public Health (SSPH+), a network gathering faculties across different universities, a specific methodology with member-specific research queries was used. Results: The top five schools in the PHAR were the London School of Hygiene and Tropical Medicine, the Public Health Foundation of India, the Harvard T.H. Chan School of Public Health, SSPH+, and the Johns Hopkins Bloomberg School of Public Health. Conclusion: The PHAR allows a worldwide bibliometric ordering of schools of Public Health. As this is a pilot project, the results must be interpreted with caution. This article critically discusses the ranking's methodology and future improvements.
Affiliation(s)
- Adeline Dugerdil
- Institut de Santé Globale, Faculté de Médecine, Université de Genève, Geneva, Switzerland
- Awa Babington-Ashaye
- Institut de Santé Globale, Faculté de Médecine, Université de Genève, Geneva, Switzerland
- Murielle Bochud
- Center for Primary Care and Public Health, University Center of General Medicine and Public Health, Lausanne, Vaud, Switzerland
- Margaret Chan
- Vanke School of Public Health, Tsinghua University, Beijing, China
- Arnaud Chiolero
- Population Health Laboratory, Fribourg, Switzerland
- Institute of Primary Healthcare, Bern, Switzerland
- Department of Epidemiology, Biostatistics and Occupational Health, School of Population and Global Health, Faculty of Medicine and Health Sciences, McGill University, Montreal, QC, Canada
- Andreas Gerber-Grote
- School of Health Sciences, Zurich University of Applied Sciences, Winterthur, Switzerland
- Nino Künzli
- Swiss Tropical and Public Health Institute (Swiss TPH), Basel, Switzerland
- University of Basel, Basel, Switzerland
- Swiss School of Public Health (SSPH+) Directorate, Zürich, Switzerland
- Gilles Paradis
- Department of Epidemiology, Biostatistics and Occupational Health, School of Population and Global Health, Faculty of Medicine and Health Sciences, McGill University, Montreal, QC, Canada
- Milo Alan Puhan
- Department of Epidemiology, Institute of Epidemiology, Biostatistics and Prevention, Faculty of Medicine, University of Zurich, Zurich, Switzerland
- L. Suzanne Suggs
- Institute of Public Health and Institute of Communication and Public Policy, Lugano, Ticino, Switzerland
- Klazine Van der Horst
- Department of Health Professions, Bern University of Applied Sciences, Bern, Switzerland
- Gérard Escher
- Geneva Science and Diplomacy Anticipator, Geneva, Switzerland
- Antoine Flahault
- Institut de Santé Globale, Faculté de Médecine, Université de Genève, Geneva, Switzerland
- Swiss School of Public Health (SSPH+) Directorate, Zürich, Switzerland
2
Boric S, Reichmann G, Schlögl C. Possibilities for ranking business schools and considerations concerning the stability of such rankings. PLoS One 2024; 19:e0295334. PMID: 38358966; PMCID: PMC10868868; DOI: 10.1371/journal.pone.0295334.
Abstract
In this article, we discuss possibilities for ranking business schools and analyse the stability of research rankings under different ranking methods. One focus is a comparison of publication-based rankings with citation-based rankings. Our considerations are based on a small case study covering all six business schools at public universities in Austria. The innovative aspect of our article is the chosen mix of methods and the explicit comparison of the results of a publication analysis with those of a citation analysis. In addition, we developed a new indicator to check the stability of the obtained ranking results for the individual business schools. The results show that the ranks of the individual business schools are quite stable. Nevertheless, we found some differences between publication-based and citation-based rankings. In both cases, however, the choice of data source and the switch from full to adjusted counting have only a small impact on the ranking results. The main contribution of our approach to research on university rankings is to show that focusing on a single overall indicator should be avoided, as it can easily lead to bias; instead, several partial indicators should be calculated side by side to provide a more complete picture.
Affiliation(s)
- Sandra Boric
- Department of Journals, Databases, and License Management, University Library Graz, University of Graz, Graz, Austria
- Gerhard Reichmann
- Institute of Operations and Information Systems, University of Graz, Graz, Austria
- Christian Schlögl
- Institute of Operations and Information Systems, University of Graz, Graz, Austria
3
Aroyewun TF, Olaleye SO, Adebisi YA, Perveen A. Bibliometric analysis of contributions to COVID-19 research in Malaysia. Ann Med Surg (Lond) 2022; 84:104823. PMCID: PMC9636597; DOI: 10.1016/j.amsu.2022.104823.
Affiliation(s)
- Sefiu Olalekan Olaleye
- Department of Pharmaceutical Chemistry, Faculty of Pharmacy, Federal University, Oye-Ekiti, Nigeria
- Asma Perveen
- Department of Psychology and Counselling, Universiti Pendidikan Sultan Idris, Perak, Malaysia
4
Dugerdil A, Sponagel L, Babington-Ashaye A, Flahault A. Rethinking International University Ranking Systems in the Context of Academic Public Health. Int J Public Health 2022; 67:1605252. PMID: 36105177; PMCID: PMC9464806; DOI: 10.3389/ijph.2022.1605252.
5
Adebisi YA. Nigeria's scientific contributions to COVID-19: A bibliometric analysis. Ann Med Surg (Lond) 2022; 80:104316. PMID: 35958287; PMCID: PMC9356628; DOI: 10.1016/j.amsu.2022.104316.
6
Schneijderberg C, Götze N, Müller L. A study of 25 years of publication outputs in the German academic profession. Scientometrics 2022. DOI: 10.1007/s11192-021-04216-2.
Abstract
In the weak evaluation state of Germany, full professors are involved in the traditional social governance partnership between the state, the self-governing higher education institutions (HEIs), and the disciplinary associations. The literature suggests that formal and informal governance could trigger changes in academics' publication behavior by valorizing certain publication outputs. In this article, secondary data from three surveys (1992, 2007 and 2018) are used for a multi-level study of the evolution of academics' publication behavior. We find a trend toward the "model" of natural-science publication behavior across all disciplines. At the organizational level, we observe that a strong HEI research performance orientation is positively correlated with journal articles, peer-reviewed publications, and co-publications with international co-authors. HEI performance-based funding is positively correlated only with the share of peer-reviewed publications. At the level of individual disciplines, humanities and social sciences scholars adapt to the peer-reviewed journal publication paradigm of the natural sciences at the expense of book publications. Considering how the academic profession is organized around reputation and status, it seems plausible that the academic profession and its institutional oligarchy are key contexts for the slow but steady change in academics' publication behavior. This trend is partly related to HEI valorization of performance and, to a lesser extent, to HEI performance-based funding schemes, which are set by the strong academic profession in the weak evaluation state of Germany.
7
Moskovkin VM, Zhang H, Sadovski MV, Serkina OV. Comprehensive quantitative analysis of TOP-100s of ARWU, QS and THE World University Rankings for 2014-2018. Education for Information 2021. DOI: 10.3233/efi-211539.
Abstract
The article examines the global university reputation race launched in 2003. Between 2003 and 2010, a cluster of publications on the qualitative comparative analysis of ranking methodologies appeared, and since 2010 a cluster of publications on the quantitative comparative analysis of university rankings has been forming. A review of this work identified a number of unsolved problems concerning the stability of university rankings and the aggregation, by country, of the number of universities and their Overall Scores (Total Scores) across rankings. Our study addressed these problems for the TOP-100s of the ARWU, QS, and THE rankings. By calculating the fluctuation range of university ranks, we identified the twenty most stable and the twenty most unstable universities in the rankings under study. The best values of the aggregated indicators, both the number of universities and the Overall Scores, were found for the USA and the UK.
8
Fine-grained academic rankings: mapping affiliation of the influential researchers with the top ranked HEIs. Scientometrics 2021. DOI: 10.1007/s11192-021-04138-z.
9
Huang CKK, Wilson K, Neylon C, Ozaygen A, Montgomery L, Hosking R. Mapping open knowledge institutions: an exploratory analysis of Australian universities. PeerJ 2021; 9:e11391. PMID: 34026359; PMCID: PMC8121066; DOI: 10.7717/peerj.11391.
Abstract
While the movement for open research has gained momentum in recent years, there remain concerns about the broader commitment to openness in knowledge production and dissemination. Increasingly, universities are under pressure to transform themselves to engage with the wider community and to be more inclusive. Open knowledge institutions (OKIs) provide a framework that encourages universities to act with the principles of openness at their centre; not only should universities embrace digital open access (OA), but they should also lead actions in cultivating diversity, equity, transparency and positive changes in society. This raises the questions of whether we can evaluate the progress of OKIs and what the potential indicators for OKIs are. As an exploratory study, this article reports on the collection and analysis of a list of potential OKI indicators. Data for these indicators are gathered for 43 Australian universities. The indicators provide high-dimensional and complex signals about university performance. They show evidence of large disparities in characteristics such as Indigenous employment and gender equity, and a preference for repository-mediated OA across Australian universities. We demonstrate use of the OKI evaluation framework to categorise these indicators into three platforms of diversity, communication and coordination. The analysis provides new insights into the Australian open knowledge landscape and ways of mapping different paths of OKIs.
Affiliation(s)
- Chun-Kai Karl Huang
- Centre for Culture and Technology, Curtin University, Bentley, Western Australia, Australia
- Katie Wilson
- Centre for Culture and Technology, Curtin University, Bentley, Western Australia, Australia
- Cameron Neylon
- Centre for Culture and Technology, Curtin University, Bentley, Western Australia, Australia; Curtin Institute for Computation, Curtin University, Bentley, Western Australia, Australia
- Alkim Ozaygen
- Centre for Culture and Technology, Curtin University, Bentley, Western Australia, Australia
- Lucy Montgomery
- Centre for Culture and Technology, Curtin University, Bentley, Western Australia, Australia; Curtin Institute for Computation, Curtin University, Bentley, Western Australia, Australia
- Richard Hosking
- Centre for Culture and Technology, Curtin University, Bentley, Western Australia, Australia; Curtin Institute for Computation, Curtin University, Bentley, Western Australia, Australia
10
Chen W, Zhu Z, Jia T. The rank boost by inconsistency in university rankings: Evidence from 14 rankings of Chinese universities. Quantitative Science Studies 2021. DOI: 10.1162/qss_a_00101.
Abstract
University ranking has become an important indicator for prospective students, job recruiters, and government administrators. The fact that a university rarely has the same position in different rankings motivates us to ask: To what extent could a university’s best rank deviate from its “true” position? Here we focus on 14 rankings of Chinese universities. We find that a university’s rank in different rankings is not consistent. However, the relative positions for a particular set of universities are more similar. The increased similarity is not distributed uniformly among all rankings. Instead, the 14 rankings demonstrate four clusters where rankings are more similar inside the cluster than outside. We find that a university’s best rank strongly correlates with its consensus rank, which is, on average, 38% higher (towards the top). Therefore, the best rank usually advertised by a university adequately reflects the collective opinion of experts. We can trust it, but with a discount. With the best rank and proportionality relationship, a university’s consensus rank can be estimated with reasonable accuracy. Our work not only reveals previously unknown patterns in university rankings but also introduces a set of tools that can be readily applied to future studies.
Affiliation(s)
- Wenyu Chen
- College of Computer and Information Science, Southwest University, Chongqing, 400715, P. R. China
| | - Zhangqian Zhu
- Department of National Defense Economy, Army Logistics University of Chinese People’s Liberation Army, Chongqing, 500106, P. R. China
| | - Tao Jia
- College of Computer and Information Science, Southwest University, Chongqing, 400715, P. R. China
| |
11
Selten F, Neylon C, Huang CK, Groth P. A longitudinal analysis of university rankings. Quantitative Science Studies 2020. DOI: 10.1162/qss_a_00052.
Abstract
Pressured by globalization and by demand for public organizations to be accountable, efficient, and transparent, university rankings have become an important tool for assessing the quality of higher education institutions. It is therefore important to assess exactly what these rankings measure. Here, the three major global university rankings (the Academic Ranking of World Universities, the Times Higher Education ranking, and the Quacquarelli Symonds World University Rankings) are studied. After a description of the ranking methodologies, it is shown that university rankings are stable over time but that there is variation between the three rankings. Furthermore, using principal component analysis and exploratory factor analysis, we demonstrate that the variables used to construct the rankings primarily measure two underlying factors: a university's reputation and its research performance. By correlating these factors and plotting regional aggregates of universities on them, differences between the rankings are made visible. Lastly, we discuss how the results of these analyses relate to often-voiced critiques of the ranking process, which indicates that the variables used by the rankings might not capture the concepts they claim to measure. The study provides evidence of the ambiguous nature of university rankings' quantification of university performance.
Affiliation(s)
- Friso Selten
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands
- Cameron Neylon
- Centre for Culture and Technology, Curtin University, Perth, Australia
- Chun-Kai Huang
- Centre for Culture and Technology, Curtin University, Perth, Australia
- Paul Groth
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands