Citation metrics across
disciplines
Google Scholar, Scopus and
the Web of Science:
A cross-disciplinary comparison
Anne-Wil Harzing, Professor of International Management, Middlesex
University, London
Satu Alakangas, Research Librarian, University of Melbourne, Australia
Presentation outline
1. Brief introduction of my background
2. Audit culture in academia and the “danger” of peer
review
3. Introduction of our bibliometric study of 146 senior
academics across five disciplines
4. Citation metrics across disciplines and data-sources
with a focus on Social Sciences and Humanities
5. Conclusions and further reading
Quick Intro:
Anne-Wil Harzing
 My name? Yes, Anne-Wil is one name, not part of my family name
 Started at Middlesex in July 2014
 2001-2014: Melbourne (PhD director 2004-2009, Associate Dean RHD, 2009-2010,
Associate Dean Research, 2010-2013)
 1991-2001: Bradford (UK), Maastricht, Tilburg & Heerlen (Netherlands)
 Productive and passionate researcher & research mentor
 79 international journal articles since 1995 (160+ publications in total)
 >12,000 Google Scholar citations, h-index 51, ISI citations: >4,000, top 1% most cited
world-wide in Economics & Business
 Service to the academic community
 Editorial board membership of a dozen journals
 Personal website since 1999, 1000-1500 visitors/day, many free resources
 Journal Quality List since 2000, 58th edition
 Publish or Perish since 2006, version 5 launched late October 2016
An “amateur” in bibliometrics (1):
Journal Quality
 1993: Conversation with Head of Department: “How do I know which journals
are the best journals, I have no clue?”
 Jan 2000: Bradford Management Centre, UK:
 “Why on earth are we using this “stupid” VSNU journal ranking list that ranks
my JIBS publication “C” and all other International Business journals “D” (just
like Brickworks, magazine for the building trade). I am sure there are better
journal rankings lists around”
 July 2000: The first incarnation of my JQL is published on www.harzing.com
 2017: The 58th edition of the JQL with 18 rankings, >100 ISI cites and 50,000 page
visits/year
 2009: AMLE Outstanding article of the year award for “When Knowledge
Wins: Transcending the Sense and Nonsense of Academic Rankings” [most
highly cited article in management in 2009]
 2016: AMLE “Disseminating knowledge: from potential to reality – New open-
access journals collide with convention”
 How predatory Open Access journals completely distorted Thomson Reuters
Highly Cited Academics ranking (see also
http://www.harzing.com/publications/white-papers/authoring-esi-highly-cited-papers)
An “amateur” in bibliometrics (2):
Citation analysis
 May 2006: University of Melbourne: Promotion
application to professor rejected: “you haven’t published
enough in A-journals”
 Oct 2006: Publish or Perish v1.0 released
 Jan 2007: Reapplied for promotion showing my work
had more citation impact than all but one of the other
professors, recent or longstanding
 2010: The Publish or Perish Book, self-published
through Amazon CreateSpace, reviewed in Nature,
Scientometrics and JASIST
 2017: 200th or so release of Publish or Perish, >270 ISI
cites, >2 million page visits to date
Increasing audit culture:
Metrics vs. peer review
 Increasing “audit culture” in academia, where universities,
departments and individuals are constantly monitored and
ranked
 National research assessment exercises, such as the ERA
(Australia) and the REF (UK), are becoming increasingly
important
 Publications in these national exercises are normally assessed
by peer review for Humanities and Social Sciences
 Citation metrics are used in the (Life) Sciences and
Engineering as additional input for decision-making
 The argument for not using citation metrics in SSH is that
coverage for these disciplines is deemed insufficient in WoS
and Scopus
The danger of peer review? (1)
 Peer review might lead to harsher verdicts than bibliometric
evidence, especially for disciplines that do not have unified
paradigms, such as the Social Sciences and Humanities
 In Australia (ERA 2010) the average rating for the Social Sciences
was only about 60% of that of the (Life) Sciences
 This is despite the fact that on a citations per paper basis
Australia’s worldwide rank is similar in all disciplines
 The low ERA-ranking led to widespread popular commentary
that government funding for the Social Sciences should be
reduced or removed altogether
 Similarly negative assessments of the credibility of SSH can be
found in the UK (and no doubt in many other countries)
The danger of peer review? (2)
 More generally, peer review might lead to what I have called
“promise over proof”
 Harzing, A.W.; Mijnhardt, W. (2015) Proof over promise: Towards a
more inclusive ranking of Dutch academics in Economics &
Business, Scientometrics, vol. 102, no. 1, pp. 727-749.
 Assessment of the quality of a publication might be
(subconsciously) influenced by the “promise” of:
 the journal in which it is published,
 the reputation of the author's affiliation,
 the sub-discipline (theoretical/modeling vs. applied, hard vs. soft)
 [Promise] Publication in a triple-A journal initially means that 3-4
academics thought your paper was a worthwhile contribution to the
field. But what if this paper is subsequently hardly ever cited?
 [Proof] Publication in a “C-journal” with 1,000+ citations means that
1,000 academics thought your paper was a worthwhile contribution
to the field
What can we do?
 Be critical about the increasing audit culture
 But be realistic: we are unlikely to see a reversal of this trend. Hence, in
order to “emancipate” the Social Sciences and Humanities, including
citation metrics might help. However, we need to:
 Raise awareness about:
 Alternative data sources for citation analysis that are more inclusive (e.g.
including books, local and regional journals, reports, working papers)
 Difficulty of comparing metrics across disciplines because of different
publication and citation practices
 Life Science and Science academics in particular write more (and shorter)
papers with more authors each; 10-15 authors not unusual, some >1000
authors
 Suggest alternative data sources and metrics
 Google Scholar or Scopus instead of WoS/ISI
 hIa (Individual annualised h-index), i.e. h-index corrected for career length and
number of co-authors
 measures the average number of single-author equivalent impactful
publications an academic publishes a year (usually well below 1.0)
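The hIa can be sketched in a few lines: compute the h-index on author-count-normalised citations (hI,norm), then divide by academic age. This is a minimal illustration of the metric as defined in Harzing, Alakangas & Adams (2014); the function names and the sample academic are hypothetical, not from the study.

```python
# Sketch of the hIa (individual annualised h-index):
# hI,norm = h-index over citations divided by number of co-authors,
# hIa = hI,norm / academic age (years since first publication).

def h_index(citation_counts):
    """Classic h-index: largest h such that h papers have >= h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
    return h

def hia_index(papers, current_year):
    """papers: list of (citations, n_authors, publication_year) tuples."""
    # Normalise each paper's citations by its number of co-authors
    normalised = [cites / n_authors for cites, n_authors, _ in papers]
    hi_norm = h_index(normalised)
    # Academic age: years since the earliest publication (at least 1)
    age = max(1, current_year - min(year for _, _, year in papers) + 1)
    return hi_norm / age

# Hypothetical academic: 4 papers over a 10-year career
papers = [(40, 2, 2007), (12, 3, 2010), (9, 1, 2013), (2, 4, 2015)]
print(round(hia_index(papers, 2016), 2))  # ~0.3 single-author-equivalent impactful papers/year
```

Note how co-author normalisation and the career-length divisor both pull the number down, which is why typical hIa values sit well below 1.0.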
Need for comprehensive
empirical work across data-
sources and metrics
 Dozens of studies have compared two or even three
databases. However, they:
 Focused on a single journal, a small group of journals, or a small
group of academics
 Covered only a small number of disciplines
 Hence our study provides:
 Cross-disciplinary comparison across all major disciplinary
areas
 Comparison of 4 different metrics:
 publications, citations, h-index
 hI,annual (h-index corrected for career length and
number of co-authors)
The bibliometric study (1):
The basics
 Sample of 146 Associate and Full Professors at the University of Melbourne
 All main disciplines (Humanities, Social Sciences, Engineering, Sciences, Life Sciences)
were represented, 37 sub-disciplines
 Two full professors (1 male, 1 female) and two associate professors (1 male, 1 female) in
each sub-discipline (e.g. management, marketing, accounting, economics)
 Citation metrics in WoS/ISI, Scopus and Google Scholar
 Collected citation data every 3 months for 2 years
 Google Scholar data collected with Publish or Perish
(http://www.harzing.com/resources/publish-or-perish)
 WoS/ISI and Scopus collected in the respective databases and imported into Publish or
Perish to calculate metrics
 The final conclusion: with appropriate metrics and data sources, citation metrics can
be applied in the Social Sciences and the Humanities
 ISI h-index: Life Sciences average lies 180% above Social Sciences average
 ISI h-index: Life Sciences average lies 670% above Humanities average
 GS hIa index: Life Sciences average lies 8% below Social Sciences average
 GS hIa index: Life Sciences average lies 80% above Humanities average
The bibliometric study (2):
Details on the sample
 Sample: 37 disciplines were grouped into five major disciplinary fields:
 Humanities: Architecture, Building & Planning; Culture & Communication;
History; Languages & Linguistics; Law (19 observations)
 Social Sciences: Accounting & Finance; Economics; Education; Management
& Marketing; Psychology; Social & Political Sciences (24 observations)
 Engineering: Chemical & Biomolecular Engineering; Computing & Information
Systems; Electrical & Electronic Engineering; Infrastructure Engineering;
Mechanical Engineering (20 observations)
 Sciences: Botany; Chemistry; Earth Sciences; Genetics; Land & Environment;
Mathematics; Optometry; Physics; Veterinary Sciences; Zoology (44
observations)
 Life Sciences: Anatomy & Neuroscience; Audiology; Biochemistry & Molecular
Biology; Dentistry; Obstetrics & Gynaecology; Ophthalmology; Microbiology;
Pathology; Physiology; Population Health (39 observations)
 Discipline structure followed Department/School structure at the
University of Melbourne
 Overrepresentation of the (Life) Sciences and underrepresentation of Social
Sciences beyond Business & Economics
 Overall, sufficiently varied coverage across the five major disciplinary fields
The bibliometric study (3):
Descriptive statistics
Different data-sources
between disciplines:
number of papers
Average number of papers:

                  Humanities   Social Sciences   Engineering   Sciences   Life Sciences
Web of Science        16             30               81           98          109
Scopus                21             34              103          101          123
Google Scholar        93            115              143          149          189
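The paper counts above make the coverage argument concrete: Google Scholar's advantage over the Web of Science is largest exactly where WoS coverage is weakest. A small sketch using the slide's own averages (the dictionary layout is just for illustration):

```python
# Coverage ratio Google Scholar / Web of Science, per discipline,
# using the average paper counts from the slide above.
wos = {"Humanities": 16, "Social Sciences": 30, "Engineering": 81,
       "Sciences": 98, "Life Sciences": 109}
gs = {"Humanities": 93, "Social Sciences": 115, "Engineering": 143,
      "Sciences": 149, "Life Sciences": 189}

for field in wos:
    ratio = gs[field] / wos[field]
    print(f"{field}: Google Scholar finds {ratio:.1f}x as many papers")
```

Running this shows roughly a 5.8x coverage gap in the Humanities against only about 1.7x in the Life Sciences, which is the core of the "alternative data sources" argument.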
Different data-sources
between disciplines:
number of citations
Average number of citations:

                  Humanities   Social Sciences   Engineering   Sciences   Life Sciences
Web of Science        61            591              897         2612         3139
Scopus               100            782             1132         2558         3313
Google Scholar       871           2604             1964         3984         4699
Different data-sources
between disciplines:
number of citations
Average number of citations (same data, grouped by data source):

                   Web of Science   Scopus   Google Scholar
Humanities                61           100          871
Social Sciences          591           782         2604
Engineering              897          1132         1964
Sciences                2612          2558         3984
Life Sciences           3139          3313         4699
Different data-sources
between disciplines:
h-index
Average h-index:

                   Web of Science   Scopus   Google Scholar
Humanities               3.5          4.3         12.3
Social Sciences          9.6         12.0         21.5
Engineering             13.5         15.6         20.8
Sciences                25.6         25.6         30.1
Life Sciences           27.1         28.3         33.4
Different data-sources
between disciplines:
hIa index
Average hIa index:

                   Web of Science   Scopus   Google Scholar
Humanities              0.14         0.18         0.36
Social Sciences         0.32         0.42         0.66
Engineering             0.33         0.41         0.53
Sciences                0.44         0.45         0.57
Life Sciences           0.43         0.46         0.65

hIa: h-index corrected for academic age (to accommodate differences in career length) and number of co-authors (to remove discipline bias)
Quick comparison across
disciplines
 H-index ISI data
 Life Sciences vs. Humanities: 27 vs. 3.5
 i.e. nearly 8 times as high
 Life Sciences vs. Social Sciences: 27 vs. 9.5
 i.e. nearly 3 times as high
 hIa-index GS data
 Life Sciences vs. Humanities: 0.61 vs. 0.34
 i.e. nearly 2 times as high
 Life Sciences vs. Social Sciences: 0.61 vs. 0.66
 i.e. 8% lower
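The ratios on this slide follow directly from the quoted averages; a quick arithmetic check using the slide's own figures (variable names are just for illustration):

```python
# Verify the "nearly X times as high" claims with the slide's own averages.
isi_h = {"Life Sciences": 27, "Humanities": 3.5, "Social Sciences": 9.5}
gs_hia = {"Life Sciences": 0.61, "Humanities": 0.34, "Social Sciences": 0.66}

print(isi_h["Life Sciences"] / isi_h["Humanities"])             # ~7.7: "nearly 8 times"
print(isi_h["Life Sciences"] / isi_h["Social Sciences"])        # ~2.8: "nearly 3 times"
print(gs_hia["Life Sciences"] / gs_hia["Humanities"])           # ~1.8: "nearly 2 times"
print(1 - gs_hia["Life Sciences"] / gs_hia["Social Sciences"])  # ~0.08: "8% lower"
```

The switch from ISI h-index to GS hIa collapses an 8-fold gap between the Life Sciences and the Humanities to less than 2-fold, and actually reverses the gap with the Social Sciences.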
Conclusion
 Will the use of citation metrics disadvantage the Social Sciences
and Humanities?
 Not if you use a database that includes publications important in those
disciplines (e.g. books, national journals)
 Not if you correct for differences in co-authorship
 Is peer review better than metrics in the Social Sciences and
Humanities?
 Yes, in a way…. The ideal version of peer review (informed, dedicated,
and unbiased experts) is better than a reductionist version of metrics
(ISI h-index or citations)
 However, the inclusive version of metrics (GS hIa) is probably better
than the likely reality of peer review (hurried semi-experts, potentially
influenced by journal outlet and affiliation)
 In research evaluation at any level use a combination of peer
review and metrics wherever possible, but:
 If reviewers are not experts, metrics might be a better alternative
 If metrics are used, use an inclusive database (GS or Scopus) and
career and discipline adjusted metrics
Want to know more?
 Harzing, A.W.; Alakangas, S. (2016) Google Scholar,
Scopus and the Web of Science: A longitudinal and
cross-disciplinary comparison, Scientometrics,
106(2): 787-804.
 For more details see:
http://www.harzing.com/research/quality-and-impact-of-academic-research
Any questions?
Further reading on Google
Scholar as a source for citation
data
 Harzing, A.W.; Wal, R. van der (2008) Google Scholar as a new
source for citation analysis?, Ethics in Science and Environmental
Politics, 8(1): 62-71.
 Harzing, A.W.; Wal, R. van der (2009) A Google Scholar h-index for
Journals: An alternative metric to measure journal impact in
Economics & Business?, Journal of the American Society for
Information Science and Technology, 60(1): 41-46.
 Harzing, A.W. (2013) A preliminary test of Google Scholar as a
source for citation data: A longitudinal study of Nobel Prize
winners, Scientometrics, 93(3): 1057-1075.
 Harzing, A.W. (2014) A longitudinal study of Google Scholar
coverage between 2012 and 2013, Scientometrics, 98(1): 565-575.
 Harzing, A.W.; Alakangas, S. (2016) Google Scholar, Scopus and
the Web of Science: A longitudinal and cross-disciplinary
comparison, Scientometrics, 106(2): 787-804.
Further reading on problems
with the Web of Science and
new metrics
 Harzing, A.W. (2013) Document categories in the ISI Web of
Knowledge: Misunderstanding the Social Sciences?,
Scientometrics, 93(1): 23-34.
 Harzing, A.W.; Alakangas, S.; Adams, D. (2014) hIa: An individual
annual h-index to accommodate disciplinary and career length
differences, Scientometrics, 99(3): 811-821.
 Harzing, A.W.; Mijnhardt, W. (2015) Proof over promise: Towards a
more inclusive ranking of Dutch academics in Economics &
Business, Scientometrics, 102(1): 727-749.
 Harzing, A.W. (2015) Health warning: Might contain multiple
personalities. The problem of homonyms in Thomson Reuters
Essential Science Indicators, Scientometrics, 105(3): 2259-2270.
 Harzing, A.W. (2017) Microsoft Academic (Search): a Phoenix
arisen from the ashes?, Scientometrics, 108(3): 1637-1647.