NISO Recommended Practices
to Support Adoption of
Altmetrics
Todd A. Carpenter
3:AM Conference
Bucharest, Romania
September 29, 2016
We’re just not as alternative
as we used to be
– Robert Smith, The Cure
About
• US-based, non-profit industry trade association accredited by the American National Standards Institute
• Mission of developing and maintaining technical standards related to information, documentation, discovery, and distribution of published materials and media
• Volunteer-driven organization: 400+ contributors spread out across the world
• Responsible (directly and indirectly) for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN
White Paper Released - June 2014
Community Feedback on Project Idea Themes
[Bar chart: importance ratings from Unimportant to Very important, n=118, for each theme below]
• Definitions and Use Cases
• Code of Conduct
• Data Metrics
• Output Types for Assessment
• Persistent Identifiers and Assessment
Definitions and Use Cases
Caveats
• Citations, usage data, and altmetrics are ALL
potentially important and potentially imperfect
• Please don’t use altmetrics (or any metrics) as an
uncritical proxy for scholarly impact – consider
quantitative and qualitative information too
• Data quality and indicator construction are key factors
in the evaluation of specific altmetrics
(read as: garbage in, garbage out!)
What is Altmetrics? A Definition
Altmetrics is a broad term that encapsulates the digital collection, creation,
and use of multiple forms of assessment that are derived from activity and
engagement among diverse stakeholders and scholarly outputs in the
research ecosystem.
The inclusion in the definition of altmetrics of many different outputs and
forms of engagement helps distinguish it from traditional citation-based
metrics, while at the same time, leaving open the possibility of their
complementary use, including for purposes of measuring scholarly impact.
However, the development of altmetrics in the context of alternative
assessment sets its measurements apart from traditional citation-based
scholarly metrics.
Use Cases
Developed eight personas, three themes:
Showcase achievement: Indicates stakeholder
interest in highlighting the positive achievements
garnered by one or more scholarly outputs.
Research evaluation: Indicates stakeholder interest
in assessing the impact or reach of research.
Discovery: Indicates stakeholder interest in
discovering or increasing the discoverability of
scholarly outputs and/or researchers.
Persona: academic/researcher
Persona: member of hiring
committee
Persona: publishing editor
Persona: librarian
Glossary
• Activity. Viewing, reading, saving, diffusing, mentioning, citing, reusing,
modifying, or otherwise interacting with scholarly outputs.
• Altmetric data aggregator. Tools and platforms that aggregate and offer
online events as well as derived metrics from altmetric data providers, for
example, Altmetric.com, Plum Analytics, PLOS ALM, ImpactStory, and
Crossref.
• Altmetric data provider. Platforms that function as sources of online events
used as altmetrics, for example, Twitter, Mendeley, Facebook,
F1000Prime, Github, SlideShare, and Figshare.
• Attention. Notice, interest, or awareness. In altmetrics, this term is
frequently used to describe what is captured by the set of activities and
engagements generated around a scholarly output.
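As a rough illustration of how an altmetric data aggregator can be queried for the online events behind a metric, the sketch below pulls events for a single DOI from the Crossref Event Data API. The endpoint, parameter names, response fields, and the example DOI are assumptions based on that service's public documentation, not part of the recommended practice; other aggregators expose different interfaces.

# Minimal sketch: fetch online events for one scholarly output from an aggregator.
# Endpoint, parameters, and response structure assume the public Crossref Event
# Data API; the DOI and contact address below are placeholders.
import requests
from collections import Counter

def fetch_events(doi, rows=100):
    resp = requests.get(
        "https://api.eventdata.crossref.org/v1/events",
        params={
            "obj-id": f"https://doi.org/{doi}",
            "rows": rows,
            "mailto": "you@example.org",  # polite contact address
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["message"]["events"]

events = fetch_events("10.1234/example-doi")
# A simple derived metric: events per source (e.g., twitter, wikipedia, reddit).
print(Counter(e.get("source_id", "unknown") for e in events))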
Glossary (more...)
• Engagement. The level or depth of interaction between users and scholarly
outputs, typically based upon the activities that can be tracked within an
online environment. See also Activity.
• Impact. The subjective range, depth, and degree of influence generated
by or around a person, output, or set of outputs. Interpretations of
impact vary depending on its placement in the research ecosystem.
• Metrics. A method or set of methods for purposes of measurement.
• Online event. A recorded entity of online activities related to scholarly
output, used to calculate metrics.
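The glossary terms above can be read as a small data model: online events record activities around a scholarly output, and metrics are computed over sets of those events. A minimal sketch follows; the field names and the count-by-activity metric are illustrative assumptions, not a schema defined by the recommended practice.

# Illustrative only: field names and the metric are assumptions mirroring the glossary.
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

@dataclass
class OnlineEvent:
    output_id: str       # persistent identifier of the scholarly output, e.g., a DOI
    activity: str        # "viewed", "saved", "mentioned", "cited", ...
    source: str          # altmetric data provider the event came from
    occurred_at: datetime

def activity_counts(events):
    """A simple metric: how many events of each activity type an output attracted."""
    return Counter(e.activity for e in events)

events = [
    OnlineEvent("10.1234/example", "mentioned", "twitter", datetime(2016, 9, 1)),
    OnlineEvent("10.1234/example", "saved", "mendeley", datetime(2016, 9, 2)),
]
print(activity_counts(events))  # Counter({'mentioned': 1, 'saved': 1})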
Glossary (and even more...)
• Scholarly output. A product created or executed by scholars and investigators in the
course of their academic and/or research efforts. Scholarly output may include but is
not limited to journal articles, conference proceedings, books and book chapters,
reports, theses and dissertations, edited volumes, working papers, scholarly
editions, oral presentations, performances, artifacts, exhibitions, online events,
software and multimedia, composition, designs, online publications, and other
forms of intellectual property. The term scholarly output is sometimes used
synonymously with research outputs.
• Traditional metrics. The set of metrics based upon the collection, calculation, and
manipulation of scholarly citations, often at the journal level. Specific examples include
raw and relative (field-normalized) citation counts and the Journal Impact Factor.
• Usage. A specific subset of activity based upon user access to one or more scholarly
outputs, often in an online environment. Common examples include HTML accesses
and PDF downloads.
Code of Conduct
Code of Conduct
• Why a Code of Conduct?
• Scope
• Altmetric Data Providers vs. Aggregators
Code of Conduct: Key Elements
• Transparency
• Replicability
• Accuracy
Code of Conduct: Transparency
• How data are generated, collected, and curated
• How data are aggregated, and derived data generated
• When and how often data are updated
• How data can be accessed
• How data quality is monitored
Code of Conduct: Replicability
• Provided data is generated using the same methods over time
• Changes in methods and their effects are documented
• Changes in the data following corrections of errors are documented
• Data provided to different users at the same time is identical or, if not, differences in access provided to different user groups are documented
• Information is provided on whether and how data can be independently verified
Code of Conduct: Accuracy
• The data represents what it purports to reflect
• Known errors are identified and corrected
• Any limitations of the provided data are communicated
Code of Conduct: Reporting
• List all available data and metrics (providers & aggregators) and the altmetric data providers from which data are collected (aggregators).
• Provide a clear definition of each metric provided.
• Describe the method(s) by which data is generated or collected and how this is maintained over time.
• Describe any and all known limitations of the data provided.
• Provide a documented audit trail of how and when data generation and collection methods change over time with any and all known effects of these changes, including whether changes were applied historically or only from the change date forward.
• Describe how data is aggregated.
• Detail how often data is updated.
• Provide the process of how data can be accessed.
• Confirm that data provided to different data aggregators and users at the same time is identical and, if not, how and why they differ.
• Confirm that all retrieval methods lead to the same data and, if not, how and why they differ.
• Describe the data quality monitoring process.
• Provide the process by which data can be independently verified (aggregators only).
• Provide a process for reporting and correcting suspected inaccurate data or metrics.
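One way a provider or aggregator could satisfy the reporting items above is to publish a machine-readable disclosure alongside its API documentation. The structure and field names below are purely illustrative assumptions, not a format defined by the Code of Conduct.

# Hypothetical self-disclosure document; keys and values are illustrative only.
import json

disclosure = {
    "metrics": [{
        "name": "tweet_count",
        "definition": "Number of tweets linking to the output's DOI or landing page",
        "collection_method": "Polled from the provider's search API every 24 hours",
        "known_limitations": ["Deleted tweets are not removed retroactively"],
    }],
    "data_providers": ["twitter", "mendeley"],   # sources collected from (aggregators)
    "update_frequency": "daily",
    "access": "https://example.org/api/docs",    # placeholder URL
    "change_log": [{
        "date": "2016-06-01",
        "change": "Switched tweet matching from URL-only to URL plus DOI",
        "applied_historically": False,
    }],
    "quality_monitoring": "Weekly sample audit against provider counts",
    "corrections_contact": "metrics-support@example.org",
}
print(json.dumps(disclosure, indent=2))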
Non-traditional Outputs
Alternative outputs
Recommendations for Data Metrics
• Metrics on research data should be made available as widely as possible
• Data citations should be implemented following the Force11 Joint Declaration of Data Citation Principles, in particular:
  – Use machine-actionable persistent identifiers (see the sketch after this list)
  – Provide metadata required for a citation
  – Provide a landing page
  – Data citations should go into the reference list or similar metadata.
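As a sketch of what "machine-actionable" can mean in practice, the snippet below resolves a dataset DOI through DOI content negotiation to retrieve citation metadata and a formatted reference for the reference list. The DOI is a placeholder, and content negotiation is only one possible mechanism; it is not mandated by these recommendations.

# Sketch: retrieve citation metadata for a dataset via its persistent identifier.
# The DOI is a placeholder; the Accept headers follow the crosscite.org
# content-negotiation conventions, which may differ for some registries.
import requests

doi_url = "https://doi.org/10.1234/example-dataset"

# Structured metadata (CSL JSON) suitable for building a reference-list entry
meta = requests.get(
    doi_url,
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
).json()
print(meta.get("title"), meta.get("publisher"))

# Or ask the resolver for a formatted citation directly
citation = requests.get(
    doi_url,
    headers={"Accept": "text/x-bibliography; style=apa"},
    timeout=30,
).text
print(citation)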
Recommendations for Data Metrics (continued)
• Standards for research-data-use statistics need to be developed.
  – Based on COUNTER; consider special aspects of research data
  – Two formulations for data download metrics: examine human and non-human downloads (see the sketch after this list)
• Research funders should provide mechanisms to support data repositories in implementing standards for interoperability and obtaining metrics.
• Data discovery and sharing platforms should support and monitor "streaming" access to data via API queries.
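A COUNTER-style pipeline for research-data-use statistics has to separate human downloads from robot and crawler traffic before reporting, as noted above. A minimal sketch follows, assuming a simple access-log record format and a tiny illustrative robot list; real COUNTER processing relies on the maintained COUNTER robots list.

# Illustrative only: the robot patterns and record format are assumptions.
import re

ROBOT_PATTERNS = re.compile(r"bot|crawler|spider|curl|wget", re.IGNORECASE)

def split_downloads(log_records):
    """Partition download records into (human, non_human) by user-agent string."""
    human, non_human = [], []
    for rec in log_records:
        bucket = non_human if ROBOT_PATTERNS.search(rec["user_agent"]) else human
        bucket.append(rec)
    return human, non_human

records = [
    {"dataset": "10.1234/example", "user_agent": "Mozilla/5.0 (Windows NT 10.0)"},
    {"dataset": "10.1234/example", "user_agent": "Googlebot/2.1"},
]
human, non_human = split_downloads(records)
print(len(human), "human download(s),", len(non_human), "non-human download(s)")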
Persistent Identifiers
Altmetrics for #NISOALMI
• 39 presentation slides have been downloaded 32,740 times (as of July 26, 2016)
• The Phase 1 report published in 2014 has been downloaded 9,636 times
• Pages hosting content related to this project were accessed 60,548 times
• More than 2,000 people attended the 22 in-person presentations about the project
• The final report has been downloaded 2,906 times in the 7 days since publication
• More than 50 articles/blogs/papers about the initiative
Where to next?
Maturity Model for Standards Adoption
(increasing trust and confidence in altmetrics from stage to stage)

Initial
• Metrics from provider
• Ad-hoc

Repeatable
• Common measurement criteria from provider
• Documented measurements and processes
• Comparable and consistent

Defined
• Measurements defined/confirmed as a standard for provider
• Made public
• Business processes followed consistently
• Transparent

Managed
• Standards applied
• Controls in place
• Checks and balances repeated over time
• Open for comment and feedback
• Accountable

Governed
• Independent verification or third-party audit
• Evolving common industry-defined standards
• Trust and confidence
Promote
Operationalize
Iterate
Community Feedback on Project Idea Themes
[Chart repeated from earlier: importance ratings from Unimportant to Very important, n=118]
Key Original Ideas Not Yet Done
• What is the role of alternative assessment metrics in research evaluation, and what gaps exist in data collection around evaluation scenarios?
• Identify best practices for grouping and aggregating multiple data sources.
• Identify best practices for grouping and aggregating by journal, author, institution, and funder.
Steering Committee
Thank you to the
dozens of people
on the working groups
and
the hundreds of people who
participated
in brainstorming and commenting
on this effort!
For more
Project Site:
www.niso.org/topics/tl/altmetrics_initiative/
Questions?
Todd Carpenter
Executive Director
tcarpenter@niso.org
@TAC_NISO
National Information Standards Organization (NISO)
3600 Clipper Mill Road, Suite 302
Baltimore, MD 21211 USA
+1 (301) 654-2512
www.niso.org
