What are research metrics?
They are quantitative indicators or measures that provide some evidence of the
impact of a research output. A research output can be a journal, a journal article, a
book, a book chapter, or a researcher's overall productivity.
They fall into two categories: bibliometrics and altmetrics.
- Bibliometrics are the traditional citation-based metrics. They are based on
citation counts: how many times a publication has been cited in
other publications.
- Altmetrics are web-based metrics. They measure the attention
or interest a scholarly work attracts on online platforms,
including social media, research blogs, reference manager software,
educational sites such as Wikipedia, news outlets, online forums, and other
online resources.
Used together, both types of metrics contribute to evaluating and understanding the
impact of research.
Slide 4
Metrics for Journals
Journal Impact Factor
The most established metric for journals is the Journal Impact Factor, or JIF. It applies
to academic journals and was the very first metric designed for journal evaluation. It was
devised by Professor Eugene Garfield in 1955 as a measure to help librarians
select important journals for purchase.
The three elements of the JIF calculation are citations, papers, and a period of time:
it is the average number of citations received in a given year by the papers a journal
published in the two preceding years. When a journal has an impact factor of 5, it
means that articles published in the journal over the previous two years were cited,
on average, 5 times each.
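As a worked example, here is a minimal sketch of the standard two-year JIF arithmetic (all figures are hypothetical):

```python
# Minimal sketch of the standard two-year JIF arithmetic.
# All figures below are hypothetical, for illustration only.

citations_in_2023 = 1500   # citations received in 2023 by items published in 2021-2022
citable_items = 300        # articles and reviews the journal published in 2021-2022

jif_2023 = citations_in_2023 / citable_items
print(f"2023 JIF: {jif_2023}")  # 5.0 -> on average, 5 citations per article
```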
This metric is also described as an indicator of a journal's citedness.
Journals are ranked in order from high to low JIF within their relevant field
categories.
The JIF is calculated from the journals selected for indexing in a source; for the JIF, that
source is the major database Web of Science, which contains the science, social
sciences, and arts and humanities indexes.
Journal rankings are presented in quartiles and percentiles. Some journals
appear in more than one category.
This slide shows the JIF calculation for the journal Child Development and its position in
the ranking order for the category developmental psychology. It is also ranked in
another category, educational psychology, where its ranking is likely to be
different.
Slide 5
Other Journal Metrics
The JIF is not perfect. It has drawn much criticism, and other metrics with different
calculations have emerged to overcome its shortcomings. Some alternative metrics are:
- The SCImago Journal Rank, or SJR, which weights each citation in its formula
by the importance of the citing journal.
- The Immediacy Index, which identifies journals that publish cutting-edge
research by counting citations received in the year of publication.
- The Eigenfactor Score, which removes journal self-citations and counts
citations to a journal's articles over the past five years.
- The CiteScore, which includes citations to all documents, not only articles and
reviews, published in a journal in the past three years.
And there are other variations that attempt to provide a meaningful journal metric.
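To make the arithmetic behind these windowed metrics concrete, here is a minimal CiteScore-style sketch following the three-year description above (all figures are hypothetical; real values come from Scopus data):

```python
# Minimal sketch of a CiteScore-style calculation over a three-year window,
# as described above. All figures below are hypothetical.

docs_published_2021_2023 = 450   # all document types published in 2021-2023
citations_to_those_docs = 1800   # citations received by those documents in 2021-2023

citescore = citations_to_those_docs / docs_published_2021_2023
print(f"CiteScore: {citescore:.1f}")  # 4.0 -> 4 citations per document on average
```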
CiteScore and SJR are calculated from journals indexed in Scopus, the other major
multidisciplinary database and the main competitor to Web of Science. This slide shows
the alternative journal metrics CiteScore and SJR for the same journal, Child
Development, ranked in several categories different from those on the previous slide.
The impact here is shown as a percentile and a rank number.
Slide 6
Citation Counts
Citation counts apply to journal articles, books, and book chapters. They remain the
best means of tracing engagement between researchers. They are used as
evidence of research impact to support applications for promotion, tenure, and
funding.
They are also used for searching the literature and following the development of a
particular research topic. Hence the term citation tracking: forward tracking looks at
citations a paper has received since its publication, while backward tracking looks at
the works the paper itself cites, listed under its references or bibliography.
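To make the two directions concrete, here is a minimal sketch of forward and backward citation tracking over a toy citation graph (the paper IDs are hypothetical):

```python
# Toy citation graph: each key cites the papers in its list.
# Paper IDs are hypothetical, for illustration only.
cites = {
    "paper_A": ["paper_B", "paper_C"],
    "paper_B": ["paper_C"],
    "paper_C": [],
}

def backward_tracking(paper):
    """Works the paper itself cites (its references/bibliography)."""
    return cites.get(paper, [])

def forward_tracking(paper):
    """Papers that have cited this paper since its publication."""
    return [p for p, refs in cites.items() if paper in refs]

print(backward_tracking("paper_A"))  # ['paper_B', 'paper_C']
print(forward_tracking("paper_C"))   # ['paper_A', 'paper_B']
```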
Citation counts are indicated with terms such as "cited by" and "times cited". In
addition, Web of Science provides a usage count indicating the number of times
Web of Science users have accessed or saved the full text of a record, either in the
last 180 days or since 2013.
It also flags the Hot Papers – those that receive citations shortly after publication –
and the Highly Cited Papers – those papers with a high citation count over many
years.
Scopus provides additional citation count metrics. It displays the Snowball
Metrics, a selection of metrics agreed by major research universities,
including the University of Oxford. With Snowball Metrics, citations to papers are
analysed in context: the disciplines associated with them, the year of publication,
and the document type, whether it is a review or a research paper.
They indicate the level of community engagement around an article as well as its
relevance in the field. This is also illustrated with the citation benchmarking
metric, which shows the relevance of a paper relative to similar papers as a percentile.
For books, Web of Science contains Book Citation Indexes for Science and for the
Social Sciences & Humanities, covering the period 2005 to present. The selection
remains focused on science subjects. Scopus indexes books, mainly as book series,
and, as with articles, shows citation counts for those titles. You can run a book chapter
search or refine to book reviews on both databases.
Slide 7
Altmetrics
Altmetrics apply mainly to articles and books. They are supplementary indicators
of community engagement with research outputs, operating at a faster pace than
scholarly journal publishing. Altmetrics tell you who is talking about a piece of
research online – whether a member of the public, scientists, practitioners (doctors,
other healthcare professionals), or science communicators (journalists, bloggers,
editors) – which countries these individuals are from, how often the piece of
research is mentioned or discussed, and what is being said about it.
The original altmetric badge is the coloured doughnut designed by the company Altmetric,
which is available in the free version of the Dimensions database.
Other companies also provide altmetrics, such as Impactstory and PlumX Metrics;
PlumX Metrics is available in Scopus. They all indicate various activities around a
publication online: whether a piece of research has been noticed, rated, reviewed, read,
discussed, shared, and used across various online platforms (Twitter,
Facebook, research blogs, forums, Wikipedia, news outlets, policy documents,
patents).
Slide 8
H-index
The h-index is the metric that quantifies the impact of a researcher's output over the
course of his or her career. It was proposed by the physics professor Jorge E.
Hirsch in 2005. The h-index is calculated from the number of articles an author has
published to date and the number of citations each article has received: it is the
largest number h such that h papers have each been cited at least h times. If a
psychologist has published 30 articles to date, and 10 of those papers have each been
cited at least 10 times, her h-index is 10. If the papers with fewer than 10 citations
accumulate more citations in the future, the h-index will increase accordingly.
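A minimal sketch of this calculation, assuming we already have a list of per-paper citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# The psychologist from the example: 10 papers cited 10 times each,
# plus 20 papers with fewer than 10 citations (here, 3 citations each).
print(h_index([10] * 10 + [3] * 20))  # 10
```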
So the logic is that the more papers a researcher publishes, the greater the potential for a
higher index. However, it takes time – a whole career. Early-career
researchers are expected to have a single-digit index, and publishing many papers
does not automatically mean an increase in h-index.
Another issue is capturing all citations: no single source provides
100% of the data. As this slide illustrates, the h-index for the same researcher differs
across sources – 71, 68, and 88. This is because
it is calculated from a different set of documents covered in each source: 235 from
Web of Science, 222 from Scopus, and over 23 thousand from Google Scholar.
In addition to the h-index, Google Scholar displays the i10-index: the number of an
author's publications that have been cited at least 10 times.
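Under the same assumption of a list of per-paper citation counts, the i10-index is a one-liner:

```python
def i10_index(citations):
    """Number of papers with at least 10 citations each."""
    return sum(1 for c in citations if c >= 10)

print(i10_index([10] * 10 + [3] * 20))  # 10
```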
There are other issues reported about the h-index formula:
- citation weight: it gives no extra recognition to highly cited articles;
- citation inflation for research done by many collaborators with different
levels of involvement;
- hot topics attracting more citations than specialised ones;
- and variations in citation practices across disciplines.
It is encouraging to know that authors of breakthrough discoveries – for example, the
Nobel Prize winner Peter Higgs – did not produce many papers and had a low h-index.