1. Jin H, Wang Q, Yang YF, Zhang H, Gao M, Jin S, Chen Y, Xu T, Zheng YR, Chen J, Xiao Q, Yang J, Wang X, Geng H, Ge J, Wang WW, Chen X, Zhang L, Zuo XN, Chuan-Peng H. The Chinese Open Science Network (COSN): Building an Open Science Community From Scratch. Advances in Methods and Practices in Psychological Science. 2023. [DOI: 10.1177/25152459221144986]
Abstract
Open Science is becoming a mainstream scientific ideology in psychology and related fields. However, researchers, especially early-career researchers (ECRs) in developing countries, face significant hurdles in engaging in Open Science and moving it forward. In China, various societal and cultural factors discourage ECRs from participating in Open Science, such as the lack of dedicated communication channels and the norm of modesty. To make the voice of Open Science heard by Chinese-speaking ECRs and scholars at large, the Chinese Open Science Network (COSN) was initiated in 2016. With its core values of grassroots orientation, diversity, and inclusivity, COSN has grown from a small Open Science interest group into a network recognized in both the Chinese-speaking research community and the international Open Science community. So far, COSN has organized three in-person workshops, 12 tutorials, 48 talks, and 55 journal club sessions and has translated 15 Open Science-related articles and blogs from English to Chinese. Currently, the main social media account of COSN (i.e., the WeChat Official Account) has more than 23,000 subscribers, and more than 1,000 researchers/students actively participate in the discussions on Open Science. In this article, we share our experience in building such a network to encourage ECRs in developing countries to start their own Open Science initiatives and engage in the global Open Science movement. We foresee great collaborative efforts of COSN together with all other local and international networks to further accelerate the Open Science movement.

2. Krüger AK, Petersohn S. From Research Evaluation to Research Analytics: The digitization of academic performance measurement. Valuation Studies. 2022. [DOI: 10.3384/vs.2001-5992.2022.9.1.11-46]
Abstract
One could think that bibliometric measurement of academic performance has always been digital, since the computer-assisted invention of the Science Citation Index. Yet since the 2000s, the digitization of bibliometric infrastructure has accelerated at a rapid pace. Citation databases are indexing an increasing variety of publication types. Altmetric data aggregators are producing data on the reception of research outcomes. Machine-readable persistent identifiers are created to unambiguously identify researchers, research organizations, and research objects; and evaluative software tools and current research information systems are constantly enlarging their functionalities to make use of these data and extract meaning from them. In this article, we analyse how these developments in evaluative bibliometrics have contributed to an extension of indicator-based research evaluation towards data-driven research analytics. Drawing on empirical material from blogs and websites as well as from research and policy papers, we discuss how interoperability, scalability, and flexibility, as material specificities of digital infrastructures, generate new modes of data production and assessment, which in turn shape how academic performance can be understood and (e)valuated.
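The machine-readable persistent identifiers the authors describe are directly usable in practice. The following is a hedged Python sketch, not taken from the article, of resolving one such identifier (a DOI) against the public Crossref REST API; the response fields shown are standard Crossref keys, and the snippet illustrates the kind of structured record that research-analytics tooling builds on, assuming the DOI is registered with Crossref.

```python
# Minimal illustration of machine-readable bibliometric infrastructure:
# resolve a persistent identifier (a DOI) to structured metadata via Crossref.
import requests

def fetch_crossref_record(doi: str) -> dict:
    """Return the Crossref metadata record for a given DOI."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    return resp.json()["message"]

record = fetch_crossref_record("10.3384/vs.2001-5992.2022.9.1.11-46")
print(record.get("type"))                    # publication type, e.g. "journal-article"
print(record.get("title"))                   # list of title strings
print(record.get("is-referenced-by-count"))  # citation count as held by Crossref
```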

3. Nishikawa-Pacher A, Heck T, Schoch K. Open Editors: A dataset of scholarly journals’ editorial board positions. Research Evaluation. 2022. [DOI: 10.1093/reseval/rvac037]
Abstract
Editormetrics analyses the role of editors of academic journals and their impact on the scientific publication system. Such analyses would best rely on open, structured, and machine-readable data about editors and editorial boards, which remain rare. To address this shortcoming, the Open Editors project collects data about academic journal editors on a large scale and structures them into a single dataset. It does so by scraping the websites of 7,352 journals from 26 publishers (including predatory ones), thereby structuring publicly available information (names, affiliations, editorial roles, ORCID iDs, etc.) about 594,580 researchers. The dataset shows that journals and publishers are immensely heterogeneous in terms of editorial board sizes, regional diversity, and editorial role labels. All code and data are made available at Zenodo, and the result is browsable at a dedicated website (https://openeditors.ooir.org). The dataset carries implications both for the practice of research evaluation and for meta-scientific investigations into the landscape of scholarly publications, and it allows for critical inquiries into the representation of diversity and inclusivity across academia.
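To make the dataset's structure tangible, here is a hedged sketch of how one might load and summarize such a table with pandas. The file name and the column labels (publisher, journal, role) are assumptions for illustration only; the authoritative schema is the one documented in the Zenodo deposit.

```python
# Hedged sketch: exploring a local copy of the Open Editors dataset.
# File name and column names are assumed; consult the Zenodo deposit for the schema.
import pandas as pd

df = pd.read_csv("open_editors.csv")  # hypothetical local export of the dataset

# Editorial board size per journal, to probe the heterogeneity the abstract reports.
board_sizes = df.groupby(["publisher", "journal"]).size()
print(board_sizes.describe())  # spread of board sizes across all journals

# Distribution of editorial role labels, e.g. "Editor-in-Chief" vs. "Advisory Board".
print(df["role"].value_counts().head(20))
```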

Affiliation(s)
- Andreas Nishikawa-Pacher: Department of Constitutional and Legal History, University of Vienna, 1010 Vienna, Austria; TU Wien Bibliothek, Resselgasse 4, 1040 Vienna, Austria; Department of International Relations, Vienna School of International Studies, Austria
- Tamara Heck: DIPF | Leibniz Institute for Research and Information in Education, 60323 Frankfurt, Germany
- Kerstin Schoch: Pop-Up Institute, HKS—University of Applied Sciences and Arts, Germany; Department of Psychology and Psychotherapy, Witten/Herdecke University, 58455 Witten, Germany

4. Nishikawa-Pacher A. Measuring serendipity with altmetrics and randomness. Journal of Librarianship and Information Science. 2022. [DOI: 10.1177/09610006221124338]
Abstract
Many discussions on serendipitous research discovery stress its unfortunate immeasurability. This unobservability may be due to paradoxes that arise out of the usual conceptualizations of serendipity, such as “accidental” versus “goal-oriented” discovery, or “useful” versus “useless” finds. Departing from a different distinction drawn from information theory—bibliometric redundancy and bibliometric variety—this paper argues otherwise: Serendipity is measurable, namely with the help of altmetrics, but only if the condition of highest bibliometric variety, or randomness, obtains. Randomness means that the publication is recommended without any bias from citation counts, journal impact, publication year, author reputation, semantic proximity, etc. Thus, serendipity must be at play in a measurable way if a paper is recommended randomly and users react to that recommendation (observable via altmetrics). A possible design for a serendipity-measuring device would be a Twitter bot that regularly recommends a random scientific publication from a huge corpus and captures the user interactions via altmetrics. Beyond its implications for the concept of serendipity, the paper also contributes to a better understanding of altmetrics’ use cases: altmetrics serve not only the measurement of impact, the facilitation of impact, and the facilitation of serendipity, but also the measurement of serendipity.
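The proposed device is simple enough to sketch. Below is a hedged Python outline of such a bot; the corpus file and the post_recommendation stub are assumptions for illustration (a real bot would call a social-media API and later query an altmetrics provider for reactions). Only the uniform-random sampling reflects the paper's argument, since that is what guarantees "highest bibliometric variety."

```python
# Hedged sketch of the serendipity-measuring bot the abstract proposes (not the
# author's implementation): draw a publication uniformly at random from a corpus,
# so the recommendation carries no citation, journal, recency, or reputation bias,
# then post it for users to react to.
import json
import random

def load_corpus(path: str) -> list[dict]:
    """Load a corpus of publications, each with at least a DOI and a title."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def pick_random_publication(corpus: list[dict]) -> dict:
    # Uniform sampling is the point: every publication gets the same chance,
    # regardless of impact, author reputation, or semantic proximity.
    return random.choice(corpus)

def post_recommendation(pub: dict) -> None:
    """Hypothetical stub; replace with a real social-media API call."""
    print(f"Random paper of the day: {pub['title']} https://doi.org/{pub['doi']}")

corpus = load_corpus("corpus.json")  # hypothetical dump of candidate publications
post_recommendation(pick_random_publication(corpus))
```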

5. Feenstra RA, Delgado López-Cózar E. The footprint of a metrics-based research evaluation system on Spain’s philosophical scholarship: An analysis of researchers’ perceptions. Research Evaluation. 2022. [DOI: 10.1093/reseval/rvac020]
Abstract
The use of bibliometric indicators in research evaluation has a series of complex impacts on academic inquiry. These systems have gradually spread across a wide range of countries and disciplines, including the humanities. The aim of this study is to examine their effects as perceived by philosophy and ethics researchers in Spain, a country where bibliometric indicators have long been used to evaluate research. The study takes a mixed approach, combining quantitative and qualitative data from a self-administered questionnaire completed by 201 researchers and from 14 in-depth interviews with researchers selected according to their affiliation, professional category, gender, and area of knowledge. Results show that the evaluation system is widely perceived to affect university researchers in significant ways, particularly in their publication habits (document type and publication language), the transformation of research agendas, and the neglect of teaching work, as well as by increasing research misconduct and harming mental health. Other consequences, reported to a lesser extent, included increased research productivity and enhanced transparency and impartiality in academic selection processes.

Affiliation(s)
- Ramón A Feenstra: Departamento de Filosofía y Sociología, Facultad de Ciencias Humanas y Sociales, Universitat Jaume I de Castelló, Avd/Sos Baynat s/n, 12071 Castelló de la Plana, Spain
- Emilio Delgado López-Cózar: Departamento de Información y Comunicación, Facultad de Comunicación y Documentación, Universidad de Granada, Campus de Cartuja, s/n, 18011 Granada, Spain

6. Feenstra RA, Delgado López-Cózar E. Philosophers’ appraisals of bibliometric indicators and their use in evaluation: from recognition to knee-jerk rejection. Scientometrics. 2022. [DOI: 10.1007/s11192-022-04265-1]
Abstract
The knowledge and stance of researchers regarding bibliometric indicators is a topic of study that has gained prominence in recent decades. In this paper we address this issue for the little-explored areas of philosophy and ethics, in a context, Spain, where bibliometric indicators are widely used in evaluation processes. The study combines data from a self-administered questionnaire completed by 201 researchers and from 14 in-depth interviews with researchers selected according to their affiliation, professional category, gender, and area of knowledge. The survey data suggest that researchers do not consider bibliometric indicators a preferred criterion of quality, although self-perceived awareness of a number of indicators is fairly high. The qualitative data point to a generalised rejection of the specific use of indicators, with four main positions observed: (1) disqualification of the logic of metrics, (2) scepticism about the possibility of assessing quality with quantitative methods, (3) complaints about the incorporation of methods considered to belong to other disciplines, and (4) criticism of the consequences this generates in the discipline of philosophy.

7. Lemke S, Mazarakis A, Peters I. Conjoint analysis of researchers’ hidden preferences for bibliometrics, altmetrics, and usage metrics. Journal of the Association for Information Science and Technology. 2021. [DOI: 10.1002/asi.24445]

Affiliation(s)
- Steffen Lemke: Web Science Department, ZBW – Leibniz Information Centre for Economics, Kiel, Germany
- Athanasios Mazarakis: Web Science Department, ZBW – Leibniz Information Centre for Economics, Kiel, Germany; Web Science Department, Kiel University, Kiel, Germany
- Isabella Peters: Web Science Department, ZBW – Leibniz Information Centre for Economics, Kiel, Germany; Web Science Department, Kiel University, Kiel, Germany