Research assessment is a broad concept: it covers the evaluation of research itself, the metrics applied to research contributions and results, and research impact. It embraces both qualitative and quantitative methodologies, including bibliometric indicators, mapping, and peer review.
It is important for researchers to be aware of their research impact, since it helps them to:
- Strengthen their position when they claim tenure or promotion.
- Support future funding requests by demonstrating the importance of their research.
- Understand their audience and learn how to address it.
- Recognize who uses their work and ensure that it is properly acknowledged.
- Identify potential partners, within or outside their subject area.
- Manage their academic reputation.
Indicators and metrics
Metrics and indicators, most of which come from commercial providers, are often misunderstood or misused. Even when an indicator is itself correct and accurate, the complexity of the science and research environment and the interests of the stakeholders involved make it difficult, and sometimes unreliable, to rely on indicators exclusively. In any case, they should be examined and applied with care.
Below is a review and description of some of these indicators. They can provide information at either the journal or the article level, and most are products of providers such as Thomson Reuters, Elsevier, and Google. While Web of Science (Thomson Reuters) and Scopus (Elsevier) are commercial tools accessible by subscription, Google Scholar is freely available but lacks quality control. It is advisable that data from Google Scholar be used only after meticulous checking and data cleaning.
| Indicator | Level | Provider | Description | Limitations |
| --- | --- | --- | --- | --- |
| Impact Factor | Journal | Thomson Reuters | Average citations over a 2-year period | Field-dependent; less relevant to Humanities & Arts |
| Eigenfactor Score | Journal | Thomson Reuters | 5-year average citations; self-citations excluded | Field-dependent; less relevant to Humanities & Arts |
| CiteScore | Journal | Elsevier | Average citations over a 5-year window, counting all document types | Field-dependent |
| Citations | Article | Thomson Reuters, Elsevier, Google Scholar | Citations to an individual item; reflects the actual use of the paper | Dependent on the field, the journal, the journal's age, publication length, self-citations, and negative or indirect citations |
| h-index | Article | Thomson Reuters, Elsevier, Google Scholar | Measures a scholar's productivity and citation impact: the largest number h such that h of the scholar's papers have at least h citations each (e.g. h-index 6 = the scholar has 6 papers with at least 6 citations each) | Field-dependent and career-length-dependent; flattens out positive outliers and neglects the long tail of lesser-cited papers |
| Usage data | Article | www (repositories, publishers, …) | Number of downloads and visits, total reads, total shares | More studies and research are required |
| Tweets, likes, comments | Article | www (Twitter, Facebook) | Social media engagement counts | More studies and research are required |
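The two calculated indicators in the table above, the Journal Impact Factor and the h-index, follow simple arithmetic definitions. The sketch below illustrates both; the function names and sample figures are illustrative, not drawn from any provider's actual data.

```python
def impact_factor(citations_this_year, citable_items_prev_two_years):
    """Journal Impact Factor for year Y: citations received in Y to items
    published in Y-1 and Y-2, divided by the number of citable items
    published in those two years."""
    return citations_this_year / citable_items_prev_two_years

def h_index(citations):
    """h-index: the largest h such that the author has at least h papers
    with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still has at least `rank` citations
        else:
            break
    return h

# A journal whose last two volumes (100 items) drew 200 citations this year:
print(impact_factor(200, 100))            # 2.0
# A scholar with these per-paper citation counts has 6 papers cited >= 6 times:
print(h_index([10, 8, 6, 6, 6, 6, 2, 1])) # 6
```

Note that the h-index depends only on the head of the citation distribution: the two lightly cited papers in the example contribute nothing, which is exactly the "long tail" limitation noted in the table.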
Principles of Research Assessment
For years, the assessment of research and researchers has been tied to metrics indicators, which has created many difficulties and disadvantages. It therefore became obvious that research assessment should focus on the actual content of the research and not, for example, on journal titles. All the parties involved, such as researchers, funders, publishers and institutions, have recognized the need to improve the means and methods of assessing research. This has resulted in good practices for research assessment, such as those described in the San Francisco Declaration on Research Assessment and the Leiden Manifesto. Moreover, in its 2019 report on the future of scholarly publication and communication, the European Commission addressed research assessment and the changes the system requires at length. It recommends an assessment system that is objective, neutral and austere, and calls on universities and research centres to develop a transparent and fair assessment system by adopting the principles outlined in the San Francisco Declaration and the Leiden Manifesto.
In 2012, the Declaration on Research Assessment (DORA) was drawn up during the Annual Meeting of the American Society for Cell Biology in San Francisco, USA. It covers every scientific field and is also available in Greek. The declaration focuses on the following subjects:
- the need to eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations
- the need to assess research on its own merits rather than on the basis of the journal in which the research is published; and
- the need to capitalize on the opportunities provided by online publication (such as relaxing unnecessary limits on the number of words, figures, and references in articles, and exploring new indicators of significance and impact),
while the general recommendation is “Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions”.
The Leiden Manifesto for Research Metrics was drawn up by five experts who proposed 10 principles for research metrics; it was published in Nature in 2015. The principles are roughly outlined below:
- Quantitative evaluation should support qualitative, expert assessment.
- Measure performance against the research missions of the institution, group or researcher.
- Protect excellence in locally relevant research.
- Keep data collection and analytical processes open, transparent and simple.
- Allow those evaluated to verify data and analysis.
- Account for variation by field in publication and citation practices.
- Base assessment of individual researchers on a qualitative judgement of their portfolio.
- Avoid misplaced concreteness and false precision.
- Recognize the systemic effects of assessment and indicators.
- Scrutinize indicators regularly and update them.