Data Collection & Research Design
Assessment Practices & Metrics for 21st Century Needs
Starr Hoffman, PhD, MLS, MA
Director, Planning & Assessment
UNLV Libraries
About my role
Assessment
Strategic planning
External reporting
University-wide initiatives
A Culture of Assessment
A culture of assessment seeks to:
● Emphasize user needs
● Facilitate evidence-based decision making
● Continuously improve
● Measure what’s meaningful
● Demonstrate value
Overview
This session will address the starting point of any assessment activity, including
an understanding of:
● Data that is commonly available in libraries,
● Discovering additional data and/or defining new metrics of success,
● Process of designing assessments,
● Relationship between “assessment” and “research,”
● Privacy concerns with collecting data,
● Challenges of collecting data from third-party systems.
What Data Do You Have?
...What Data Do You Need?
What data do you have?
● Use data
○ circulation, gate counts, downloads, database hits
● Collections data
○ holdings, acquisitions, weeding, special collections, formats, accessions
● Consortia
○ value/cost savings, other benefits
● Instruction
○ information literacy, workshops, etc.
● Transactions
○ reference, research consultations, technical assistance
● Safety & facilities
● Digital collections, emerging services…
What data do you have?
...How do you find it?
● Talk to department heads, managers about what they report
● Look at past department/division reports, library annual reports
● Ask front-line staff:
○ What they track,
○ What spreadsheets they update,
○ How they monitor progress or performance
● Interface with those in charge of third-party systems (databases, ILL, ILS)
What other data can you get?
What data can you get or should you get?
Start by asking stakeholders and library faculty/staff…
● What do you want to know?
● What problems are you trying to solve?
● What questions do you have about your unit and the work it does?
● What best showcases the value of the work you do?
These can prompt reconsidering what data (or other metrics) might answer those
questions.
Metrics: Considering Analysis
How will you analyze results?
How do you envision presenting or visualizing the results?
Consider scale of assessment & scale of need
Look for examples of similar assessments
What other data can you get?
Other metrics to consider...
● Measures for programs (co-curricular)
○ Define purpose of each program
○ Ideally tie to a learning outcome or strategic goal
○ Measure student/patron’s response (qualitative)
○ Map to:
■ institutional learning outcomes
■ strategic goals
○ What lies outside that map: unintended outcomes
Assessing Co-Curricular Programs

Event: Therapy Dogs
● Purpose: Study break; maintain student wellness during stressful exam periods. Encourage retention (an RPC goal) and academic performance.
● UNLV’s UULOs (undergraduate learning objectives): N/A
● Libraries’ Strategic Goals: “Help students stay healthy and well while pursuing their academic goals.”
● Data & Assessment: Whiteboard responses, ID swipes

Event: “Human Library”
● Purpose: An opportunity for students to learn from individuals whose beliefs and experiences differ from their own.
● UNLV’s UULOs: “Respond to diverse perspectives linked with identity.”
● Libraries’ Strategic Goals: UNLV Libraries “...is open to all people and affirms its commitment and dedication to diversity, inclusion, equity, and cultural awareness.”
● Data & Assessment: Video testimonials, whiteboard, and paper responses

Event: LGBTQ Panel
● Purpose: Expose students to LGBTQI materials in the Libraries’ Special Collections so they can use library resources in future curricular and co-curricular learning.
● UNLV’s UULOs: “Demonstrate lifelong learning skills, including the ability to place problems in personally meaningful contexts.” “Function effectively in diverse groups.”
● Libraries’ Strategic Goals: “Expand co-curricular programs ...in support of academic achievement, life skills and lifelong learning.”
● Data & Assessment: Attendance tracked; photos of event
Research (Assessment) Design
Assessment vs. Research
● Context: Assessment addresses a specific instance & context; research produces generalizable knowledge from a sample.
● Purpose: Assessment improves a service, space, or resource; research adds to a body of knowledge (usually through publication or presentation).
● Requirements: Assessment usually has no/few requirements (unless publication is anticipated); research surveys, interviews, or other data collection usually require IRB review.
Assessment vs. Research
ASSESSMENT: performed for a specific use (improvement of a service, resource, etc.), in a single location.
● Example: an informal student survey to inform the choice of new library furniture (unpublished; results used only locally).
RESEARCH: generalizable knowledge (uses a large sample and/or a common instrument); contributes to the body of research (published or presented publicly); IRB review required (may grant exempt status).
● Example: a survey of librarian experiences at 4-year institutions, resulting in a published article.
ACTION RESEARCH (the overlap of the two):
● Example: a formal satisfaction survey (LibQual+) of students on library services, which both informs changes to services and produces a published article on the findings; an IRB application is required (and may be granted exempt status).
What to consider...
What is the purpose of your study?
LEAN TOWARD ASSESSMENT
● To make improvements… locally
● To move quickly
● To continuously improve and adjust
● To make actionable discoveries
LEAN TOWARD RESEARCH
● To make improvements… across the field
● To find out why something happens
● To improve practices in a broad context
● To perform a rigorous investigation
ACTION RESEARCH (Both!)
● Performs research that is also used to make improvements
● Adds to generalizable knowledge
● Contributes knowledge and recommended practice to the profession
Assessment & Research Methods
● Quantitative:
○ Assumes the objectivity of social facts
○ Measures things that can be counted
○ May study large populations
○ Usually involves statistical analysis
○ Seeks rigor
● Qualitative:
○ Assumes social constructions of reality
○ Often involves observation or interviews
○ Uses smaller populations or case studies
○ Provides rich, nuanced non-numeric analysis
○ Seeks trustworthiness
● Mixed Methods:
○ Assumes multiple methods can triangulate richer conclusions
○ Draws on the strengths of both design types
Some Examples of Methods & Design
Research Design | Methodology | Data Collection | Data Analysis
Quant | Survey | Questionnaire | Cross-tabulation, correlations
Quant | Usability testing | Focus group | Descriptive statistics
Quant | Experimental research | Questionnaire (pre-test, post-test) | Hypothesis testing
Qual | Ethnography | Observation | Narrative analysis
Qual | Case study | Photo diaries | Coding & themes
Mixed | Surveys + ethnography | Questionnaire + observation | Comparison of conclusions (descriptive stats, narrative)
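To make the first row concrete: below is a minimal sketch of cross-tabulating questionnaire responses with pandas. The survey fields (status, satisfaction) are hypothetical stand-ins, not from any particular instrument.

```python
import pandas as pd

# Hypothetical questionnaire responses from a library satisfaction survey.
responses = pd.DataFrame({
    "status": ["undergrad", "grad", "undergrad", "faculty", "grad", "undergrad"],
    "satisfaction": ["high", "medium", "high", "low", "high", "medium"],
})

# Cross-tabulation: raw counts of each status/satisfaction combination.
print(pd.crosstab(responses["status"], responses["satisfaction"]))

# Row percentages are often easier to act on than raw counts.
print(pd.crosstab(responses["status"], responses["satisfaction"],
                  normalize="index"))
```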
Assessment & Research Methods
When choosing a design, method, and type of data collection, consider…
● How much time do you have to conduct the study?
● How do you want your results to look? (Anticipate the analysis.)
● What data is easy to gather?
● What data is meaningful to gather?
● What kinds of results would be “actionable” for assessment?
Criteria for Good Assessment Design
● Is it usable and actionable?
○ Don’t assess what you know you cannot change.
○ Don’t gather data/evidence that will be hard to act on or respond to.
○ Focus on what will lead to actionable improvement.
● Is it realistic?
○ Don’t turn a small-scale project into a large one (scale).
○ “Good enough” and “done” are better than an incomplete, rigorous project.
○ How much work is involved not only in collection, but also in analysis & interpretation?
● Is it sustainable?
Privacy Concerns
Privacy Concerns
● Student data
● Patron checkout history
● Various kinds of identifying information:
○ Student ID number
○ Address
○ Birthdate
○ Zip code
○ Race/ethnicity
○ Student major
○ Student level (graduate, undergraduate, freshman, etc.)
Privacy Concerns: types of identifying data
Direct Identifiers
(aka Personally Identifying Information or PII)
● Name
● Social security number
● Driver’s license
● School ID (or other unique ID numbers)
● Other account numbers (library account)
● Phone number
● Email address
● Street Address
● Credit card info (library fees, fines)
● Photographs
Indirect Identifiers…
Shared by multiple people:
● Birthdate
● Zip code (postal code)
● Gender
● Blood type
● Class enrolled in
Masked (may be unique):
● License plate
● IP address
Methods of Anonymization & Security
For direct identifiers:
● Suppression
○ Remove the data
○ Replace data with pseudonyms (still uniquely identified, but not by name)
● Aggregation
○ Group data instead of preserving individual cases/observations
○ Think of census data, which is reported in groups (by gender, race, or another grouping variable)
For indirect identifiers:
● Generalization
○ Replace specific ages with age ranges
○ Use only the first 3 digits of the zip code
● Aggregation (see above, under direct identifiers)
● Perturbation (advanced!)
○ Add statistical ‘noise’
○ Replace specific values with specific but randomized values
○ Preserves statistical properties such as mean and variance
In all cases: password protect or otherwise restrict permission and access to sensitive data.
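A minimal sketch of how suppression, generalization, and aggregation might look in practice, assuming patron records in a pandas DataFrame; every column name and value here is hypothetical.

```python
import pandas as pd

patrons = pd.DataFrame({
    "student_id": ["A123", "B456", "C789"],   # direct identifier
    "age": [19, 23, 31],                      # indirect identifier
    "zip": ["89154", "89119", "89052"],       # indirect identifier
    "visits": [12, 4, 7],
})

# Suppression: replace the direct identifier with a pseudonym, then drop it.
patrons["pseudonym"] = [f"P{i}" for i in range(len(patrons))]
patrons = patrons.drop(columns=["student_id"])

# Generalization: exact ages become ranges; keep only the 3-digit zip prefix.
patrons["age_range"] = pd.cut(patrons["age"], bins=[0, 20, 30, 120],
                              labels=["<=20", "21-30", "31+"])
patrons["zip3"] = patrons["zip"].str[:3]
patrons = patrons.drop(columns=["age", "zip"])

# Aggregation: report group totals rather than individual observations.
print(patrons.groupby("age_range", observed=True)["visits"].sum())
```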
De-Identifying Data (Anonymization)
[Figure: sample dataset annotated to show that fields beyond the direct identifiers are also potentially identifiable]
Data Protection Considerations
● What kinds of direct or indirect identifiers do the data contain?
● Are there any unique or rare observations in the data?
● Which data points could be used together to identify an individual?
● Can other sources be linked to the data, making identification possible?
● Which features of the data do you want to keep? What can you sacrifice?
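One way to act on the middle questions above: count how many records share each combination of indirect (quasi-)identifiers, since any combination that occurs only once can single out an individual (the idea behind k-anonymity). A minimal sketch with hypothetical columns:

```python
import pandas as pd

# Hypothetical de-identified records with three quasi-identifiers.
records = pd.DataFrame({
    "zip3": ["891", "891", "890", "891"],
    "age_range": ["21-30", "21-30", "31+", "<=20"],
    "major": ["Biology", "Biology", "History", "Music"],
})

# Size of each group of records sharing the same quasi-identifier values.
group_sizes = records.groupby(["zip3", "age_range", "major"]).size()

# Groups of size 1 are individuals who could be re-identified.
risky = group_sizes[group_sizes == 1]
print(f"{len(risky)} combination(s) identify exactly one record:")
print(risky)
```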
Getting Data From Other Systems
Data From Third-Party Systems
Some examples:
● Your ILS (Alma, III Millennium, etc.)
● Reference transactions (LibInsight, Desktracker, etc.)
● Database stats (COUNTER standard)
Challenges & Tips
● Formatting data (exports, forms, queries)
● Matching data points across systems
● Data cleaning
● Granularity
● Frequency
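A minimal sketch of what the matching, cleaning, and granularity challenges can look like, assuming an ILS checkout export and a campus roster that share a patron ID; all column names, values, and formats are hypothetical, not any specific vendor's export.

```python
import pandas as pd

# Checkout records exported from the ILS (one row per checkout).
checkouts = pd.DataFrame({
    "patron_id": [" a123 ", "B456", "a123", "C789"],
    "item": ["book", "laptop", "book", "book"],
})
# Student records exported from a campus system.
roster = pd.DataFrame({
    "patron_id": ["A123", "B456", "C789"],
    "level": ["undergrad", "grad", "undergrad"],
})

# Cleaning: IDs often differ in case and whitespace across systems.
checkouts["patron_id"] = checkouts["patron_id"].str.strip().str.upper()

# Granularity: roll checkouts up to one row per patron before merging.
per_patron = checkouts.groupby("patron_id").size().rename("checkout_count")

# Matching: left-join so patrons with zero checkouts are kept.
merged = roster.merge(per_patron, on="patron_id", how="left")
merged["checkout_count"] = merged["checkout_count"].fillna(0).astype(int)
print(merged.groupby("level")["checkout_count"].mean())
```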
Some Parting Thoughts
● Prompt library faculty/staff to think for themselves about:
○ What data they collect, why, and how they use it
○ What questions they have or problems they want to solve
○ What metrics would best answer those questions
● Prioritize assessment activities
○ Realize that you can’t measure everything
○ Prioritize:
■ Biggest needs?
■ Low-hanging fruit?
○ Grow a little each year
Any Questions?
Starr Hoffman, PhD, MLS, MA
Director, Planning & Assessment
UNLV Libraries
starr.hoffman@unlv.edu