Measure of Scientific Impact: How Altmetrics Can Innovate the Approach in a Multidimensional Model

Valeria Scotti1, Annalisa De Silvestri2, Luigia Scudeller3, Chiara Rebuffi1, Funda Topuz1 and Moreno Curti1 1 Center for Scientific Documentation, Scientific Direction, Fondazione I.R.C.C.S Policlinico San Matteo, Pavia, IT 2 Clinical Epidemiology and Biostatistics Unit, Scientific Direction, Fondazione I.R.C.C.S Policlinico San Matteo, Pavia, IT 3 ESCMID Medical Guidelines Director Scientific, Clinical Epidemiology and Biostatistics, Scientific Direction, IRCCS Ca’ Granda Ospedale Maggiore Policlinico Foundation, IT Corresponding author: Valeria Scotti (v.scotti@smatteo.pv.it)


Background
The problem of measuring the scientific and social impact of research publications has been of great interest to scientists and scholars, as the first source in the research waste chain (Macleod et al. 2014) is the limited relevance of many research questions to patients. Taking patients' opinions into account when selecting research priorities should improve research and reduce waste. Waste results when the needs of users of research evidence are ignored. If researchers do not meet the needs of the users of research, evidence will have less of an effect on clinical and public health practice than it should. The principal users of clinical and epidemiological research are clinicians and the patients who look to them for help. Both are often frustrated by mismatches between the uncertainties that they wish to see addressed in research and the questions that researchers choose to investigate (Liberati 2011).
Alternative metrics claim to measure research impact outside the academic community. As a consequence, the concept of research impact has been evolving rapidly in recent years, from a scenario that evaluates only the impact on the scientific community to one that also evaluates the impact on society at large: altmetrics take their place alongside well-known measures such as the H-index or the impact factor (IF). This opens a new scenario for the evaluation of science, in which the interaction between scholarly work and social networks and, more widely, society can be explored (Bornmann et al. 2018).
Librarians are interested in these themes, and their knowledge about this issue is rapidly growing (Gómez-Sánchez et al. 2019); their role is changing as they become more and more involved in research design.

Objectives
We seek to answer the following questions. How can we help our researchers with these new data? Through courses, training, help in completing a CV, or something new? How can we use these data for the institution (compared with traditional methods)? Which clinical units obtain the most citations and the highest altmetric scores? Which lines of research are most attractive (for example, for funds or grants)? What has been our hospital's citation trend over the years?

Methods/Description
In this monocentric study, we collected our hospital's scientific production from 2011 onwards (for a total of 3,176 articles).
With FileMaker 11 software (https://www.filemaker.com/), we created a database that collects citations and altmetrics for all research articles produced by our researchers at Foundation IRCCS Policlinico San Matteo. FileMaker is a cross-platform relational database application: data (field values) can be entered manually or imported from another application. Field values in a FileMaker file can be text, numbers, dates, times, timestamps, pictures, sounds, movies, enclosed files, calculated values, and summary values. Reports can be created to group or summarize data.
We retrieved citations for each article through the Web of Science and Scopus databases. Through the PMID and the DOI of each publication, we obtained each one's score on Altmetric.com (Figure 1). When the update was launched, the system connected to Web of Science, Scopus, and Altmetric.com (Figure 2). Data can then be broken down by year, department, or unit.
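Altmetric.com exposes a public REST endpoint keyed by DOI. As an illustration of the per-article lookup described above (the endpoint path is Altmetric's public API; the function names are our own and not part of the FileMaker workflow), a minimal Python sketch might look like this:

```python
import json
import urllib.error
import urllib.request

# Public Altmetric.com API root (no key required for basic lookups).
ALTMETRIC_API = "https://api.altmetric.com/v1"


def altmetric_url(doi):
    """Build the lookup URL for a given DOI."""
    return f"{ALTMETRIC_API}/doi/{doi}"


def parse_score(payload):
    """Extract the Altmetric Attention Score from an API JSON response.

    Returns None when the response carries no 'score' field.
    """
    data = json.loads(payload)
    return data.get("score")


def fetch_score(doi):
    """Query the API; returns None if the article has no altmetric record
    (the API answers HTTP 404 for unknown DOIs)."""
    try:
        with urllib.request.urlopen(altmetric_url(doi)) as resp:
            return parse_score(resp.read().decode("utf-8"))
    except urllib.error.HTTPError:
        return None
```

For an article with an Altmetric record, `fetch_score(doi)` returns its Attention Score; articles that have attracted no online attention return None, which is worth recording explicitly when summing scores per department.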
We assessed the correlation between altmetrics, citation counts, and traditional indexes using Spearman's rank correlation coefficient. The magnitude of the effect size for correlation coefficients was evaluated as described by Cohen: small for coefficients on the order of 0.1, medium for those on the order of 0.3, and large for those on the order of 0.5 (Cohen 1988). In our study, we considered a correlation coefficient greater than 0.3 meaningful, in line with many correlation coefficients reported in the literature (Hemphill 2003). Papers were then grouped across departments or research themes, and both WoS citations and altmetric scores were summed. We also analyzed trends over time in both altmetrics and traditional indexes using the trend test across ordered groups (Cuzick 1985).
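The correlation analysis above can be sketched in a self-contained way. The following Python snippet is an illustrative re-implementation (not the statistical software actually used): it computes Spearman's rho as the Pearson correlation of average ranks, and classifies its magnitude with Cohen's bands:

```python
from statistics import mean


def _ranks(xs):
    """1-based ranks, with tied values assigned their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # Extend j over the block of values tied with xs[order[i]].
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks


def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)


def cohen_magnitude(rho):
    """Cohen's rough effect-size bands for correlation coefficients."""
    r = abs(rho)
    if r >= 0.5:
        return "large"
    if r >= 0.3:
        return "medium"
    if r >= 0.1:
        return "small"
    return "negligible"
```

With citation counts in one list and altmetric scores in another (one entry per paper), `cohen_magnitude(spearman_rho(citations, altmetrics))` reproduces the small/medium/large classification applied in the analysis.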

Results
a) Every year our institution promotes a well-attended course on the use of social media for researchers. Some of our researchers are interested in including altmetric scores in their CVs. b) A good correlation between the Altmetric.com score and traditional metrics (WoS citations) is observed (Figure 3), both for the whole period and for each year considered separately (r ranging from 0.33 in 2012 to 0.45 in 2013).
Some research themes (defined using the lines of research designed by the Italian MoH for funding purposes) had an unexpectedly good altmetric score compared with traditional citations, such as chronic immunological diseases (possibly a sign of the particular interest of patients and patient organizations). In contrast, bone marrow transplantation and related diseases have a greater citation index than altmetric score (these themes may be much more interesting to the clinical and research community). Despite the good correlation (rho = 0.40) observed in the department-level analysis, some discrepancies emerged (for example, papers from the pain therapy unit score higher on altmetrics than on WoS citations). A high percentage of papers have their own altmetric score, and the total altmetric score increases every year (p for trend < 0.001).

Discussion
In our study, we confirmed the correlation of altmetric score with standard bibliometric indexes at the institutional level already shown in a previous work (Scotti et al. 2016) based on 2013 data.
As expected, comparisons between altmetric scores and more traditional indexes can discriminate between research themes that may have a greater impact on the general lay public than on the research community. This could be important for identifying research areas that need more consideration if we want to reduce the research waste that results when the needs of end-users of research are ignored. It is thus becoming increasingly evident that alternative metrics may play a crucial role in helping society, as well as patient communities, to retrieve reliable information on research needs. In summary, designing research based not only on systematic reviews of the available evidence (evidence-based research) but also on the papers and themes most discussed by the public could result in less wasteful research, especially in the medical field.
Researchers, together with knowledgeable scientific journalists, could contribute to spreading relevant scientific results for the scientific education of the public. In fundraising activities, they may also highlight the value of the most successful research programs (meaning those with a higher social impact, i.e., a higher altmetric score) to their institutions, because altmetrics measure impact in real time. Showing how research is relevant to the general public is useful, especially for institutions and foundations funded with public money, such as the one considered in this study.
Altmetrics could contribute to the 'creation of value' and give a more complete perspective on the important question of the democratization of evaluation, since, unlike citation metrics, altmetrics track impact outside the academy. Indeed, the wider use of quantitative indicators and the emergence of altmetrics can be seen as part of the transition to a more accountable and transparent research system. As the San Francisco Declaration on Research Assessment (DORA) states, 'The outputs from scientific research are many and varied […] It is thus imperative that scientific output is measured accurately and evaluated wisely.'

Limitations and strengths of the study
The data from our study are from a single institution, resulting in a smaller sample size compared to other studies. However, the in-depth analysis of various research themes or departments in a single institution reduces the heterogeneity inherent in data coming from different institutions, and allows for an analysis of a real-life situation and a pragmatic measure of the impact this new metric should have in addition to the traditional ones.

Conclusions and further work
Altmetrics are confirmed as an interesting complement to citations. We would like to further explore the possibility of combining altmetrics with traditional indicators in a more multidimensional model that could also include the single component of altmetric score, to assess the impact of scientific works over a given period of time, and to assess the reliability of such a complex model.