The NISO Altmetrics Initiative
NISO Update
ALA Annual – June 28, 2015
Nettie Lagace (@abugseye)
NISO Associate Director for Programs
Why worth funding?
• To move out of the “pilot” and “proof-of-concept” phases …
• Altmetrics must coalesce around commonly understood definitions, calculations, and data sharing practices
• Altmetrics must be auditable
• Organizations that want to apply metrics need to understand them and ensure consistent application and meaning across the community
Phase 1: Brainstorming
October 9, 2013 - San Francisco, CA
December 11, 2013 - Washington, DC
January 23-24, 2014 - Philadelphia, PA
Round of 1-on-1 interviews – March/April 2014
Phase 1 report published in June 2014
[Chart: Community Feedback on Project Idea Themes (n=118) – respondents rated each theme on a five-point scale from “Unimportant” to “Very important”; results shown as 0–100% stacked percentages.]
Phase 2: Development
Presentations of Phase 1 report (June 2014)
Prioritization Effort (June - Nov, 2014)
Project approval (Nov - Dec 2014)
Working group formation (Jan - March 2015)
Consensus Development (March 2015 - Feb 2016)
Trial Use Period (Feb 2016 - May 2016)
Publication of final recommendations (Aug 2016)
Working Groups
• A: development of definitions and descriptions of use
• B: definitions of appropriate metrics and calculation methodologies for specific output types, plus promotion and facilitation of the use of persistent identifiers
• C: development of strategies to improve data quality through normalization of source data across providers (illustrated in the sketch after this list)
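
As a purely illustrative aside (not NISO work product), here is a minimal Python sketch of the idea behind Group C: mapping records from two invented providers, with invented field names, into one common schema keyed by DOI. Note how the two providers report slightly different counts for the same work – the kind of reproducibility question the working groups are chartered to address.

# Hypothetical illustration only: provider names, field names, and
# counts below are invented for this sketch.
RAW_EVENTS = [
    {"provider": "providerA", "doi": "10.1000/xyz123",
     "tweets": 42, "mendeley_readers": 17},
    {"provider": "providerB", "id": "doi:10.1000/xyz123",
     "counts": {"twitter": 40, "mendeley": 19}},
]

def normalize(record):
    """Map one provider-specific record to a common, DOI-keyed schema."""
    if record["provider"] == "providerA":
        return {"doi": record["doi"],
                "source_counts": {"twitter": record["tweets"],
                                  "mendeley": record["mendeley_readers"]}}
    if record["provider"] == "providerB":
        return {"doi": record["id"].removeprefix("doi:"),
                "source_counts": dict(record["counts"])}
    raise ValueError("unknown provider: " + record["provider"])

for raw in RAW_EVENTS:
    print(normalize(raw))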
Steering Committee
Calendar
• April 2015 – Group(s) start working
• October 2015 – Draft document(s)
• Fall 2015 – Comment period(s)
• November 2015 – NISO Report to Sloan Foundation
• Spring 2016 – Completion of final draft(s)
http://www.niso.org/topics/tl/altmetrics_initiative/
Thank you!
Questions?
@abugseye

Editor's Notes

• #6 (Community Feedback on Project Idea Themes chart) – project idea themes with the support percentages reported on the slide:
  – Develop specific definitions for alternative assessment metrics. (87.9%)
  – Promote and facilitate use of persistent identifiers in scholarly communications. (82.8%)
  – Develop strategies to improve data quality through normalization of source data across providers. (80.8%)
  – Identify research output types that are applicable to the use of metrics. (79.8%)
  – Define appropriate metrics and calculation methodologies for specific output types, such as software, datasets, or performances. (78.1%)
  – Explore creation of standardized APIs or download or exchange formats to facilitate data gathering. (72.5%)
  – Research issues surrounding the reproducibility of metrics across providers. (70.7%)
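
One theme above, standardized APIs or exchange formats, lends itself to a small illustration. The record below is a hypothetical sketch of what a standardized altmetrics exchange record could look like; every field name is invented for this example and reflects no NISO recommendation.

import json

# Hypothetical exchange record; all field names are invented for
# illustration and do not reflect any NISO recommendation.
record = {
    "work_id": "doi:10.1000/xyz123",    # persistent identifier for the output
    "output_type": "dataset",           # research output type
    "source": "twitter",                # where the events were observed
    "metric": "mentions",               # what is being counted
    "count": 42,
    "window": {"start": "2015-01-01", "end": "2015-06-28"},
    "provider": "example-provider.org"  # who supplied the data
}

print(json.dumps(record, indent=2))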