Learning Analytics in
Higher Education
Learning Analytics in Higher Education provides a foundational understanding
of how learning analytics is defined, what barriers and opportunities exist,
and how it can be used to improve practice, including strategic planning,
course development, teaching pedagogy, and student assessment. Well-known
contributors provide empirical, theoretical, and practical perspectives on the
current use and future potential of learning analytics for student learning and
data-driven decision-making, ways to effectively evaluate and research learning
analytics, integration of learning analytics into practice, organizational barriers
and opportunities for harnessing Big Data to create and support use of these
tools, and ethical considerations related to privacy and consent. Designed to
give readers a practical and theoretical foundation in learning analytics and
how data can support student success in higher education, this book is a valu-
able resource for scholars and administrators.
Jaime Lester is Professor of Higher Education at George Mason University,
USA.
Carrie Klein is a PhD Candidate, and Research and Teaching Assistant in the
Higher Education Program at George Mason University, USA.
Aditya Johri is Associate Professor of Information Sciences and Technology at
George Mason University, USA.
Huzefa Rangwala is Associate Professor of Computer Science at George Mason University, USA.
Learning Analytics
in Higher Education
Current Innovations, Future
Potential, and Practical Applications
Edited by Jaime Lester, Carrie Klein, Aditya Johri
and Huzefa Rangwala
First published 2019
by Routledge
711 Third Avenue, New York, NY 10017
and by Routledge
2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2019 Taylor & Francis
The right of Jaime Lester, Carrie Klein, Aditya Johri, and Huzefa
Rangwala to be identified as the authors of the editorial material,
and of the authors for their individual chapters, has been asserted in
accordance with sections 77 and 78 of the Copyright, Designs and
Patents Act 1988.
All rights reserved. No part of this book may be reprinted or
reproduced or utilised in any form or by any electronic, mechanical,
or other means, now known or hereafter invented, including
photocopying and recording, or in any information storage or retrieval
system, without permission in writing from the publishers.
Trademark notice: Product or corporate names may be trademarks
or registered trademarks, and are used only for identification and
explanation without intent to infringe.
Library of Congress Cataloging-in-Publication Data
A catalog record for this title has been requested
ISBN: 978-1-138-30213-6 (hbk)
ISBN: 978-1-138-30217-4 (pbk)
ISBN: 978-0-203-73186-4 (ebk)
Typeset in Bembo
by codeMantra
Contents
List of Tables vii
List of Figures viii
Preface ix
Acknowledgements xvi
1		Absorptive Capacity and Routines: Understanding
Barriers to Learning Analytics Adoption in Higher Education 1
Aditya Johri
2		Analytics in the Field: Why Locally Grown Continuous
Improvement Systems are Essential for Effective
Data-Driven Decision-Making 20
Matthew T. Hora
3		Big Data, Small Data, and Data Shepherds 45
Jennifer DeBoer and Lori Breslow
4		Evaluating Scholarly Teaching: A Model and Call for an
Evidence-Based Approach 69
Daniel L. Reinholz, Joel C. Corbo, Daniel J. Bernstein, and
Noah D. Finkelstein
5		Discipline-Focused Learning Analytics Approaches with
Users Instead of for Users 93
David B. Knight, Cory Brozina, Timothy J. Kinoshita,
Brian J. Novoselich, Glenda D. Young, and Jacob R. Grohs
6		Student Consent in Learning Analytics: The Devil in the
Details? 118
Paul Prinsloo and Sharon Slade
7		Using Learning Analytics to Improve Student Learning
Outcomes Assessment: Benefits, Constraints, & Possibilities 140
Carrie Klein and Richard M. Hess
8  Data, Data Everywhere: Implications and Considerations 160
Matthew D. Pistilli
Editors 187
Contributors 188
Index 193
Tables
2.1 The six repositories where organizational information can be
stored and accessed 29
4.1 Rubric of components of scholarly teaching 76
4.2 Summary of six perspectives of institutional change theories,
from Kezar (2013) 82
4.3 A summary of roles of three key layers for enacting scholarly
and discipline-grounded teaching evaluations 83
5.1 Selected themes emerging from the focus group with students 96
5.2 Summary of themes related to data instructors would find useful 99
5.3 Change in how time is spent engaging with course content
reported in hours per week between high-stakes tests 1 and 2 110
5.4 ID card use by student over the semester by college 113
6.1 Simple consent versus informed consent (Whitney et al.,
2004, p. 55) 121
Figures
1.1 Theoretical framework (adapted and modified from Martin et al.,
2003, itself adapted from Zahra & George, 2002, and others) 12
2.1 The processes of decision chain data use 32
3.1 Model for the integrated Multiple Perspective Insights framework 48
3.2 Reasons students cited for positive response to the CAF (Graph
courtesy of Dr. Saif Rayyan and The MIT Faculty Newsletter) 56
3.3 Our integrated set of methods/methodologies, aligned with
data sources and resultant findings (modified from Chen, 2015) 58
3.4 Spearman correlations between CAF behaviors and student
outcomes (adapted from Chen, 2015) 62
3.5 Comparison of the centroids of the two student clusters for
their behaviors on online homework problems 63
5.1 Overall unique sessions by week grouped by final course grade 106
5.2 LMS usage by day of the week grouped by final course grade 107
5.3 Performance pathways in statics based on four distinct tests 109
5.4 Average final daily ID usage for a high variation and low
variation student (one semester broken into quarters) 112
5.5 Average student ID usage by day of the semester 112
5.6 Daily ID usage over the semester by GPA quintile (5th is highest) 114
6.1 A conceptual overview of the generative mechanisms for
considering consent 131
6.2 Typology for consent in learning analytics 134
8.1 The Potter Box (Potter, 1965) 177
Preface
Jaime Lester
Introduction
In 2013, the same amount of data was generated in ten minutes as had been generated previously in all of recorded history (Zwitter, 2014). In the last decade,
learning analytics has evolved in education alongside the Big Data revolution.
The ability to mine and analyze large amounts of institutional data is useful
for higher education institutions, which are facing increasing environmental
pressures to provide evidence of learning, institutional accountability, and in-
creased retention and completion rates (Norris & Baer, 2013). The existence
of these data has made their use integral to the data-driven management of higher education institutional goals and practices (Slade & Prinsloo, 2013).
Due to their volume, velocity, and variety, learning analytics have the potential to bring clarity from complexity, allowing organizations to better understand trends and correlations in data (Macfadyen & Dawson, 2012; Norris & Baer, 2013, p. 13). These insights can be used to improve pedagogy, course design, student retention, and decision-making by providing personalized feedback for users. Within this context, learning and advising
management systems, based on learning analytics, are being developed to better
measure, analyze, report, and predict data related to student learning, retention,
and completion. These learning analytics-informed systems have the potential
to generate new insight into courses and student learning by creating respon-
sive feedback mechanisms that can shape data-informed decision-making as it
relates to teaching, learning, and advising.
Given the potential and increasing presence of learning analytics in higher
education, it is important to understand how learning analytics is defined,
what barriers and opportunities exist, and how it can be used to improve
organizational and individual practices, including strategic planning, course
development, teaching pedagogy, student assessment, and ethical use. This
edited book is designed to give readers a practical and theoretical founda-
tion in learning analytics in higher education, including an understanding
of the challenges and incentives that are present in the institution, in the
individual, and in the technologies themselves. The authors of this book
explore the current use and future potential of learning analytics for student
learning and data-driven decision-making, ways to effectively evaluate and
research learning analytics, integration of learning analytics into practice,
organizational barriers and opportunities for harnessing Big Data to create
and support use of these tools, and ethical considerations related to privacy
and consent.
Among questions that are explored and answered are (1) What are the foun-
dational assumptions and functions of learning analytics algorithms? (2) What
effects do learning analytics technologies have on student learning, pedagogical
development, and assessment of teaching and learning? (3) What role do insti-
tutional context, technological capacity, and individual beliefs play in promot-
ing or constraining adoption and integration of learning analytics technologies
in higher education? (4) What are the ethical considerations related to use of
learning analytics or other predictive data and associated interventions? and
(5) What are the practical implications and future research recommendations associated with learning analytics?
Defining Learning Analytics
Learning analytics has arguably grown out of the field of education data min-
ing and the explosion of Big Data that has occurred during the past decade.
Educational data mining is the development and use of research methods to leverage
large-scale or ‘big’ data from educational settings to better understand student
learning and contexts (Siemens & Baker, 2012). While there is no uniformly
accepted definition, learning analytics is generally understood to be the “mea-
surement, collection, analysis and reporting of data about learners and their
contexts, for purposes of understanding and optimizing learning and the envi-
ronments in which it occurs” (Siemens, 2013, p. 3). Learning analytics is a form
of educational data mining that specifically uses predictive analysis on Big Data
with the intention of creating platforms for intervention. One of the first ex-
amples of a learning analytics platform was the Purdue Course Signals Project,
which developed a traffic light visualization to represent student performance
in higher education courses (Arnold & Pistilli, 2012). The signals were available
to students to help them better assess their learning and to faculty to utilize new
communication tools to support student success. Behind the Signals tool were
some of the first learning analytics algorithms and new forms of data visual-
ization. Importantly, learning analytics is concerned with understanding and
inferring certain key characteristics about student learning, not a more generic
use of predictive analytics (e.g., institutional or academic analytics) for business
use (e.g., predicting student enrollment trends).
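To make the traffic-light idea concrete, the sketch below shows, in deliberately simplified Python, how such a signal might be produced: a handful of student features are combined into a risk score, which is then mapped to red, yellow, or green. The features, weights, and thresholds are illustrative assumptions only and do not represent the actual Course Signals algorithm.

```python
# Illustrative sketch of a Course Signals-style "traffic light" indicator.
# Features, weights, and cut-offs below are hypothetical assumptions.

def risk_score(grade_pct: float, lms_logins_per_week: float, prior_gpa: float) -> float:
    """Return a 0-1 risk estimate (higher = more at risk)."""
    # Normalize each input to roughly 0-1 and weight it (weights are assumptions).
    grade_risk = 1.0 - min(grade_pct / 100.0, 1.0)
    effort_risk = 1.0 - min(lms_logins_per_week / 10.0, 1.0)
    history_risk = 1.0 - min(prior_gpa / 4.0, 1.0)
    return 0.5 * grade_risk + 0.3 * effort_risk + 0.2 * history_risk

def signal(score: float) -> str:
    """Map a risk score to the signal shown to the student and instructor."""
    if score >= 0.6:
        return "red"      # high risk: prompt outreach from the instructor
    if score >= 0.3:
        return "yellow"   # moderate risk: nudge toward course resources
    return "green"        # on track

print(signal(risk_score(grade_pct=55, lms_logins_per_week=0.5, prior_gpa=2.0)))  # -> "red"
```

Real systems differ mainly in how the risk score is estimated, typically with a statistical or machine learning model trained on historical course data rather than fixed weights.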
Learning analytics is an emerging field, and the few studies currently pub-
lished have largely focused on the user side with emphasis on specific analytics tools, such as Purdue Course Signals, or data visualization (Ali, Hatala, Gašević, & Jovanović, 2012; Arnold & Pistilli, 2012; Duval, 2011; Jayaprakash, Moody, Lauría, Regan, & Baron, 2014; Kosba, Dimitrova, & Boyle, 2005; Lockyer, Heathcote, & Dawson, 2013; Park & Jo, 2015; Santos, Verbert, Govaerts, & Duval, 2013; Verbert, Duval, Klerkx, Govaerts, & Santos, 2013).
The results of many of these studies conclude that learning analytics is useful
when tracking student information (i.e., grades) and creating communication
with peers and instructors; however, poorly designed data visualizations and
communications can also inhibit use. An even smaller number of studies have
focused on the impact of organizational dynamics on learning analytics adop-
tion and use. Hora, Bouwma-Gearhart, and Park (2017), for example, found that a lack of time, incentives, and training for pedagogy negatively impacts the ability of instructors to accurately and efficiently use learning analytics
tools. Further, Klein, Lester, Rangwala, and Johri (in press) found that mis-
alignment between technological components and abilities and user needs and
practices inhibits adoption of these tools by individuals, recommending that
users be included in the design, purchase, and implementation of these tools. In response to the impact of organizational barriers and incentives, Arnold, Lonn, and Pistilli (2014) developed The Learning Analytics Readiness Instrument,
building upon similar work by Norris and Baer (2013) to assist institutions in
evaluating their capacity to integrate learning analytics into their institutional
processes and cultures. This book helps to provide a multidisciplinary approach
to the literature on learning analytics and provides multiple new research ave-
nues to further knowledge.
Audience and Need
This book is intended for anyone who works in higher education and uses
learning and/or advising management systems, especially those based on learn-
ing analytics algorithms. Information in the book will be relevant for faculty,
advisors, and administrators who are interested in the potential and challenges
related to implementation, adoption, and integration of these systems on their
campuses and within their classrooms and advising sessions. Researchers in
higher education will also be interested in the interdisciplinary and multi-
method discussion of analyzing the impact of learning analytics on student
success and organizational decision-making.
One of the main audiences for this book consists of the information technology, instructional design, and institutional research offices that are regularly inter-
facing with companies and organizations that develop learning analytics tools
for higher education. Companies such as Blackboard™, D2L, EAB, Ellucian,
Moodle, and Salesforce, in partnership with universities, have developed learn-
ing analytics tools to assist in tracking student success. The rise of the power
of new algorithms has created a marketplace with little empirical guidance for
higher education administrators who are selecting among a variety of new, and
often expensive, tools. This book provides a better understanding of the capa-
bilities of learning analytics tools, how to critically consider their use and meth-
odologies, how to develop tools that are useful to users, how to integrate these
tools into practice, and how to use these tools and their data into organizational
decision-making. This book serves as a valuable resource for all higher edu-
cation administrators who are evaluating or adopting learning analytics tools.
Other main audiences for this book are higher education faculty and ad-
visors, who are seeking to integrate learning analytics tools into their prac-
tice. Studies of learning management systems often note that these systems are not used to their full potential (Bichsel, 2012; Dahlstrom, Brooks, & Bichsel, 2014), and a more recent study (Klein, Lester, Rangwala, & Johri, in
press) identifies the challenges in adoption of learning analytics tools. Simply,
more intentionality in the form of professional development and consistent in-
stitutional decision-making is needed to support integration of learning analyt-
ics into practice. This book provides information on the relationship between
institutional decision-making and supporting widespread adoption of learning
analytics, the need for faculty professional development tied to the values that
undergird teaching and advising practice, and the importance of including us-
ers in the process of learning analytics development and implementation. Or-
ganizations that work on faculty development and those focused on student
learning and assessment will be interested in the contents of this book.
Finally, this book appeals to higher education, learning analytics, and other
education scholars who are working on the myriad of questions related to
learning analytics. Across multiple disciplines—engineering education, com-
puter science, communication, and psychology—scholars are exploring the
complexity of learning analytics in multiple education sectors. This book pro-
vides innovative approaches to learning analytics research, including those us-
ing multiple forms of data- and user-informed methods. The value of this book
is that it brings together scholars from these disciplines to explore the complex
nature of learning analytics creation, adoption, and impact on student success
within the unique context of higher education.
Overview of the Chapters
The first chapter in the book, written by Aditya Johri, explores the positive
potential of learning analytics for higher education practice and argues that
learning analytics has failed to gain widespread adoption in higher education,
especially in comparison to corporate settings. He also shows, through a series
of case studies, how successful implementation of learning analytics initiatives
is often hampered by the capacity and routines of both organizations and their
members. Drawing on organizational studies concepts of absorptive capacity
and routines, Johri outlines a new model of how colleges and universities can
more effectively adopt learning analytics by addressing issues of capacity and
routine during design and implementation.
Matthew Hora draws from a case study of a California research university to
explore the intersection of data-driven decision-making and learning analytics
in Chapter 2. Using an organizational context framework, Hora argues that ed-
ucators draw upon a variety of numeric data and other information (e.g., student
feedback, conversations with colleagues), operate within institutional contexts
that are poorly designed to facilitate continuous improvement, create novel and
often low-tech solutions to the lack of quality data, engage colleagues and stu-
dents in their use and analysis of instructional data, and respond to external
mandates for data use in very different ways. His chapter underscores the com-
plexity of data-driven decision-making and the interplay of organizational
capacity, individual routines and practices, and technological alignment that
exists in data-rich environments.
The next chapter authored by Jennifer DeBoer and Lori Breslow begins the
discussion of learning analytics to examine student learning in the classroom.
DeBoer and Breslow propose a sophisticated methodology, with a mixed-
method and a multi-stage approach that leverages small data with the emer-
gence of new Big Data used in learning analytics. This chapter also argues for
the efficacy of multidisciplinary research teams that combine the expertise of
education researchers and faculty in other disciplines (i.e., engineering and sci-
ence). With data on student learning often coming in multiple forms, such as
student surveys, exams, and transcript data, their model is a guide for who to
engage and the steps needed to conduct a thorough analysis.
Chapter 4, by Reinholz and colleagues, looks at the other side of the class-
room, the evaluation of instructors, and how to utilize learning analytics to
create more accurate teaching evaluations for promotion and tenure processes.
Using a framework that defines teaching as a scholarly activity analogous to
research, the authors outline how multiple forms of data, from students, faculty
peers, and reflections from the faculty members, themselves, can be integrated
into online learning analytics programs to more accurately and effectively
evaluate faculty teaching. The chapter concludes with a clear strategy for im-
plementation, a challenge that is outlined by Johri in the first chapter of
this book.
Chapter 5, by Knight et al., from multiple higher education institutions,
describes a mixed-method approach to learning analytics data analysis. The
authors suggest that incorporation of qualitative methods into learning an-
alytics studies (which are often quantitative in nature) allows for a clearer
understanding of student success. Largely in agreement with the central
argument of Chapter 3 by DeBoer and Breslow, Knight et al. describe a
process of engaging users, in this case, undergraduate student researchers, to
construct and make meaning of data, or potential data, produced via learning
analytics methods. The chapter continues by showing results from learn-
ing analytics analysis and the limitations of explaining those results without
the engagement of users like undergraduate students. This chapter combined
with that of DeBoer and Breslow creates a compelling argument for more
mixed-method and user-engaged models for learning analytics in the higher
education setting.
Prinsloo and Slade, in Chapter 6, outline a major concern in the collection
and analysis of Big Data, including learning analytics, in higher education—
student consent. The chapter begins with a broad overview of consent from
the medical research tradition and turns to consent in the digital environment.
Prinsloo and Slade effectively argue that the digital environment provides
unique complexities to consent in the form of data property, intent of data use,
control over data, and privacy. As learning analytics is concerned with the anal-
ysis of existing Big Data often collected for other purposes, these arguments are
directly relevant. The chapter concludes with broader ethical considerations
and recommendations for consent.
In Chapter 7, Klein and Hess provide an overview of how learning analytics
data can inform student learning outcomes assessment efforts in higher educa-
tion. The chapter begins with an overview of traditional assessment measures
and then explores how the timely, visualized, personalized, and predictive na-
ture of learning analytics data can enhance those efforts. Using examples from
tools and approaches researched in extant theoretical and empirical studies,
they show that use of learning analytics in assessment provides dynamic for-
mative feedback to users, allowing them to make more timely, informed de-
cisions during the learning process. They also highlight the need for learning
analytics-enhanced assessment to be inclusive of informed and empowered data
users (per the arguments by Johri, DeBoer and Breslow, and Knight et al., in
this book), be built on trusted foundations, and be cognizant of the specific
implications of using learning analytics data in practice. The chapter concludes
with recommendations for implementation of learning analytics-enhanced as-
sessment initiatives.
The last chapter of the book is by Matthew Pistilli, an individual who was
integral to the Purdue Course Signals Project. His chapter outlines the broad themes
across the other chapters and presents future implications and practical con-
siderations for learning analytics in higher education. Organized around four
major questions, Pistilli argues that learning analytics is a growing and dynamic
field that requires careful and thoughtful implementation. For example, Pistilli
specifically outlines the complexity of student data and how the diversity of
students in higher education today leads to a lack of uniformity in data. He
concludes and provides a response to a central question: How should—not
can—data be used, and to what ends?
References
Ali, L., Hatala, M., Gašević, D., & Jovanović, J. (2012). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education, 58, 470–489.
Arnold, K. E., Lonn, S., & Pistilli, M. D. (2014, March). An exercise in institutional reflection: The learning analytics readiness instrument (LARI). In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 163–167). New York, NY: ACM.
Arnold, K. E., & Pistilli, M. D. (2012, April). Course signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 267–270). New York, NY: ACM.
Bichsel, J. (2012, August). Analytics in higher education: Benefits, barriers, progress, and recommendations (research report). Louisville, CO: EDUCAUSE Center for Applied Research. Retrieved from http://net.EDUCAUSE.edu/ir/library/pdf/ERS1207/ers1207.pdf.
Dahlstrom, E., Brooks, D. C., & Bichsel, J. (2014). The current ecosystem of learning management systems in higher education: Student, faculty, and IT perspectives (research report). Louisville, CO: EDUCAUSE, September 2014. Available from www.educause.edu/ecar.
Duval, E. (2011, February). Attention please!: Learning analytics for visualization and recommendation. In Proceedings of the 1st international conference on learning analytics and knowledge (pp. 9–17). New York, NY: ACM.
Hora, M. T., Bouwma-Gearhart, J., & Park, H. J. (2017). Data driven decision-making in the era of accountability: Fostering faculty data cultures for learning. The Review of Higher Education, 40(3), 391–426.
Jayaprakash, S. M., Moody, E. W., Lauría, E. J., Regan, J. R., & Baron, J. D. (2014). Early alert of academically at-risk students: An open source analytics initiative. Journal of Learning Analytics, 1(1), 6–47.
Klein, C., Lester, J., Rangwala, H., & Johri, A. (in press). Learning analytics tools in higher education: Adoption at the intersection of institutional commitment and individual action. The Review of Higher Education.
Kosba, E., Dimitrova, V., & Boyle, R. (2005, July). Using student and group models to support teachers in web-based distance education. In International Conference on User Modeling (pp. 124–133). Berlin, Heidelberg: Springer.
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459.
Norris, D. M., & Baer, L. L. (2013). Building organizational capacity for analytics. Educause Learning Initiative, 7–56.
Park, Y., & Jo, I. H. (2015). Development of the Learning Analytics Dashboard to Support Students' Learning Performance. J. UCS, 21(1), 110–133.
Santos, J. L., Verbert, K., Govaerts, S., & Duval, E. (2013, April). Addressing learner issues with StepUp!: An evaluation. In Proceedings of the third international conference on learning analytics and knowledge (pp. 14–22). Leuven: ACM.
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400.
Siemens, G., & d Baker, R. S. (2012, April). Learning analytics and educational data mining: towards communication and collaboration. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 252–254). ACM.
Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning analytics dashboard applications. American Behavioral Scientist, 57(10), 1500–1509.
Acknowledgements
This book was supported in part by a grant from the National Science Founda-
tion under grant IIS-1447489.
1
Absorptive Capacity and
Routines
Understanding Barriers to Learning Analytics
Adoption in Higher Education
Aditya Johri
Introduction
As I sit here writing this near the start of a new semester, it is hard to imagine
that it is almost a decade since the term ‘Big Data’, in its current incarnation,
“the mining and processing of petabytes’ worth of information to gain insights
into customer behavior, supply chain efficiency and many other aspects of busi-
ness performance” (Pearson & Wegener, 2013, p. 1), was first introduced in the
mainstream media by The Economist (2010). Since then, the notion of data an-
alytics has infused almost all thinking about how organizations go about their
business; and data-driven organizations and organizations driving data-driven
practices, including infrastructures such as cloud computing, have become the
jewels of the business world (e.g., Amazon™, Google™, etc.). It was reported
in a recent study of more than 400 large companies conducted by Bain &
Company that early adopters of Big Data analytics had a significant lead over
the rest of the corporate world (Pearson & Wegener, 2013). The companies
that had adopted Big Data analytics, according to this report, were (1) twice as
likely to be in the top quartile of financial performance within their industries,
(2) five times as likely to make decisions faster than market peers, (3) three times
as likely to execute decisions as intended, and (4) twice as likely to use data fre-
quently when making decisions (Pearson & Wegener, 2013). Given reports like
this, it is not surprising that many organizations, spanning various industries,
are looking toward data analytics as a way to propel themselves forward.
Higher education institutions are also cognizant of the potential value of
analytics to improve organizational performance. As a result, at least two lead-
ing ideas and communities—educational data mining (EDM) and Learning
Analytics (LA)—have emerged on the scene (Lester, Klein, Rangwala, & Johri,
2017). EDM, which has a more computational stance, is concerned largely with
developing, researching, and applying computerized methods to detect patterns
in large collections of educational data that would otherwise be hard or impos-
sible to analyze due to the enormous volume of data within which they exist.
EDM, as the name implies, is defined as “the application of data mining (DM)
techniques to this specific type of dataset that come from educational environ-
ments to address important educational questions” (Romero & Ventura, 2013,
p. 12). Overall, EDM researchers and practitioners analyze data generated by
any type of information system that supports learning or education,
defined broadly—schools, colleges, or universities. These data are broad and
include interactions of individual students within an educational system (e.g.,
navigation behavior, input in quizzes, and interactive exercises) but also ad-
ministrative data (e.g., school, school district, teacher), demographic data (e.g.,
gender, age), and so forth. The other allied field, LA, is concerned more with
learners directly and includes as its purview “the measurement, collection, anal-
ysis and reporting of data about learners and their contexts, for purposes of un-
derstanding and optimizing learning and the environments in which it occurs”
(Siemens et al., 2011, p. 4). Whereas LA is largely concerned with improving
learner success (Gašević, Dawson, & Siemens, 2015), the practitioners of LA
differentiate academic analytics as “the improvement of organizational pro-
cesses, workflows, resource allocation, and institutional measurement through
the use of learner, academic, and institutional data. Academic analytics, akin to
business analytics, are concerned with improving organizational effectiveness”
(Siemens et al., 2011, p. 4). For the purposes of this chapter, I am going to use
LA as a catchall for all the data and analysis techniques mentioned above.
We hear continuously about how LA has the potential to change higher
education and how these data are making inroads, but the reality at the level
of everyday work practices is different. This is not to say that higher educa-
tion organizations are not leveraging data and analytics (Arroway, Morgan,
O’Keefe, & Yanosky, 2016), but the adoption lags behind what organizations in
other sectors are doing or moving toward. According to Bichsel (2012), in
2012, between 55% and 65% of institutions reported engaging in data activity
at the level of finance and human resources, but less than 20% of institutions
reported data analytics activity in the functional areas of instructional manage-
ment, centralized information technology (IT) infrastructure, student learning,
strategic planning, alumni and advancement, research administration, library,
cost to complete a degree, human resources, facilities, faculty promotion and
tenure, faculty teaching performance, procurement, and faculty research per-
formance. A lot of the fervor is still about the potential and not necessarily
the actual implementation of LA. For instance, the most innovative work in
terms of using machine learning and DM is limited largely to researchers in
the educational research community (SOLAR, EDM, etc.). Companies that
are providing products are, for the most part, using rudimentary techniques for
data analysis and presentation—bar graphs, pie charts, and large Excel™ files.
This is complicated by data access issues, including data ethics and privacy, and
as many authors in this volume point out, that is not necessarily a bad thing.
Yet, a lack of clarity around data use has limited the development of innovative
applications. There is also limited understanding of the impact LA can have
beyond the immediate concerns that most higher education institutions, espe-
cially publicly funded ones, are facing, such as student retention. Tuition fees
are becoming an ever larger portion of the budget as public funding is declin-
ing. The other funding option is externally funded grants, which essentially
means overhead, and, therefore, this is another area in which analytical efforts
are targeted. Finally, LA is also prevalent in reporting, as accreditation con-
cerns overwhelm institutions often at the expense of institutional effectiveness.
At the infrastructure level—data warehousing, for instance—there has been
significant uptake of information technology (IT) in most higher education
institutions, and, therefore, there is a reasonable expectation that slowly LA will
percolate to other aspects of higher education institutions.
In this chapter, my goal is to shed light on what prevents a greater adoption
of LA within higher education. As opposed to other chapters in the volume, I
first use a personal perspective on LA, based on my experiences as an instructor,
an administrator, and a researcher, to shed light on what I believe are some es-
sential issues that need to be addressed. After that, I look at some recent reports
that shed light on what organizations—largely business enterprises—have to do
in order to leverage data analytics successfully. I then move toward some the-
oretical explanations for the relative lack of data analytics application in higher
education organizations and use these theoretical underpinnings to examine
three case studies from my own experience working on an LA tool research
project—cases, which are likely familiar to readers from their own experience.
Finally, I end with practical considerations for overcoming barriers to the use
of LA in higher education.
I have to start with a caveat—my personal characteristics and experiences
shape my experience of using LA. I am a technology adopter and so are most
of the people who worked on the project I refer to in this chapter. I sit in an
engineering school and teach analytics. I have a research interest in this area. I
know people I can reach out to when I need help with technology, and I know
the resources to refer to when I hit a wall. This is probably not the case for most
people on campus. Just as an example, there is wide variation in the use of a
learning management system (LMS) across the institution. Some of this has to
do with lack of technological expertise, but a lot of it has to do with a lack of
understanding of what LMS can and does add to teaching. In many cases, it
actually does little beyond acting as a repository of resources. Even in terms of
using email, which is a standard practice now, there is a diversity of tools people
use (a large part of the user base actually using Gmail™ to access the university
mail services). Therefore, there will always be a vast variation in any kind of
technology use on a campus, and this, itself, I will argue, is problematic.
Missed Opportunities for Learning Analytics
I start with reflecting on my work this week; a busy week as the new semester
starts shortly. Here are the instances of data or analytics use I can come up with
in my work. I looked at some data about my research expenditures—Excel™
charts sent to me to make sure expenditures were progressing as planned and
charges were correct. I looked at it, saw some numbers in red, and went about
taking action by emailing a few folks to ensure things were corrected. I had
six such reports to go through, and most of my action was taken when things
were way off or when things were off but I thought it was a temporary issue.
The other data I looked at were the number of students enrolled in my classes
to make sure that one of the classes had the minimum number of enrollments.
Otherwise, I would have done some more advertising and publicity for the
class. Then, I went into our LMS, Blackboard™, to set up course pages for
the upcoming semester. I copied some stuff, I updated some other stuff, and I
tweaked some settings. Now, my hope is that things will work smoothly when
the semester starts. There doesn’t seem to be a lot of analytics going on here
and minimal use of data. It is clear that as a faculty member, the use of data and
analytics is not a part of my everyday practices or of what some organizational
theorists will call my ‘routines’. The idea of routines or practices that have be-
come a norm and are embedded across the organization is a critical one for my
argument, and I will address it in detail later.
Are there instances, though, within my work practices where data and an-
alytics would have been important for me or would have helped me in some
ways? Can it be integrated into my routines—actions I have to take habitually?
Certainly. For instance, I would much rather be able to do a dynamic review
of my grant expenditures. Grant funding comes with a time stamp—it has
to be spent within a specific amount of time, and the funds can be used only
for certain activities and items as specified in the proposal. Yet, there is some
leeway wherein spending can be different than what is exactly proposed. This
means that it is important to monitor and make adjustments as the grant pe-
riod progresses. The systems in place to monitor spending—many driven by
federal regulations—make the monitoring problematic. There is always a delay
between spending and when it is posted against the grant account, for instance.
There is also a lag, as finding a student with the right expertise can take time.
All this makes running the grant very fluid, and this problem is compounded
with each additional grant. Hence, some way of continuous monitoring is es-
sential. And yes, I know there are systems that allow me to do that to some
extent, and these vary by the institution, but they are cumbersome to use and
no single system provides me with all the information I need to take action.
At the end of the day, it will take an email or a face-to-face visit with a fiscal
person or a post-award administrator to resolve the issue, and often there is no
single point of contact. The grants office is responsible for certain issues, while
the home department or college is responsible for others.
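As a rough illustration of the kind of continuous monitoring described above, the following sketch compares a grant's committed spending against the elapsed portion of its award period. The amounts, dates, and assumed posting lag are hypothetical and stand in for figures that would normally come from the institution's financial system.

```python
# Sketch of a simple grant burn-rate check. Amounts, dates, and the assumed
# posting lag are illustrative; real monitoring would pull from the
# institution's financial system rather than hard-coded numbers.
from datetime import date

award_total = 300_000.00
start, end = date(2018, 9, 1), date(2021, 8, 31)
posted_spending = 95_000.00    # charges that have hit the account
pending_spending = 12_000.00   # known charges not yet posted (the lag)

today = date(2019, 9, 1)
elapsed = (today - start).days / (end - start).days   # fraction of period used
spent = (posted_spending + pending_spending) / award_total

print(f"{elapsed:.0%} of the grant period used, {spent:.0%} of funds committed")
if spent > elapsed + 0.10:
    print("Warning: spending is running ahead of schedule")
elif spent < elapsed - 0.10:
    print("Note: spending is lagging; funds may go unspent")
```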
When it comes to teaching, in addition to the number of students, it would
be great if I could get some information about the students. Now, there is a
way in which I can log in to another system, dig through a few screens, and get
their photos and degree information, but, once again, it is cumbersome. What
I really want to know is their prior knowledge, their achievements, and their
interests. In my role as a teacher, my primary responsibility lies in ensuring
students learn. A lack of knowledge of what students actually know, except for
a few broad markers, is a real barrier to how I go about my work. Not that I
will be able to take care of all the variance in prior knowledge, but at least I will
have some idea of where the students are coming from. In an ideal world, their
years of schooling and acceptance into a program should convey some of that,
but the reality is quite different.
Finally, the LMS is a black box, where I put in content and effort but noth-
ing much comes out. In one of my classes, which I teach online, everything is
managed through our LMS, Blackboard™—the course readings and videos,
the quizzes, the discussions, the reflection assignments submitted by students;
all of this is online. Yet, I have very limited knowledge of what is actually going
on in the class until I see a submission from a student and need to grade that.
There is no overview dashboard that tells me who has looked at the content,
who is on track to meet the deadline, how long it is taking students to read
the content, and so on. The end-of-semester evaluations are not mandatory,
and, therefore, it is hard to interpret and use that data to revise the course (al-
though it is still used to evaluate faculty as that is the only data point available).
The data go somewhere, somebody benefits—makes a lot of money off my
effort—but nothing feeds back to my work. And yes, there are ways to better
monitor the use of the system—you can turn on the option to store the number
of views, for instance—but most of that information is summative. Formative
analytics is unfortunately missing.
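A formative overview of this sort does not require anything exotic; given even a flat export of LMS click events, a few lines of code could summarize who has engaged with the week's required materials. The sketch below assumes a hypothetical export of (student, item, timestamp) records rather than any particular Blackboard feature.

```python
# Minimal sketch of a formative "who is on track" summary built from an
# assumed LMS event export of (student_id, item_id, timestamp) records.
# Field names and the export format are assumptions for illustration only.
from collections import defaultdict

events = [
    ("s01", "week3_reading", "2019-02-10T14:05:00"),
    ("s01", "week3_quiz",    "2019-02-11T09:30:00"),
    ("s02", "week3_reading", "2019-02-12T22:15:00"),
    # s03 has no recorded activity this week
]
required_items = {"week3_reading", "week3_quiz"}
roster = {"s01", "s02", "s03"}

seen = defaultdict(set)
for student, item, ts in events:
    # Record which required items each student has opened, ignoring extras.
    if item in required_items:
        seen[student].add(item)

for student in sorted(roster):
    done = len(seen[student])
    status = "on track" if done == len(required_items) else f"{done}/{len(required_items)} items viewed"
    print(student, status)
```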
This is the crux of the issue—none of my work practices incentivize me to
put effort into utilizing analytics more effectively. I still try my best to incor-
porate these data, because I want to work more efficiently. From research prac-
tices, to teaching practices, to advising, nothing is built to draw on or benefit
from data and analytics, and, hence, nothing does. The incentive for grants is
real—Who wants to go over budget? So, I pay some attention. And even if I
want to change the way I teach or run grants using more data and analytics,
it is hard to do unless the infrastructure is in place. To some extent, these is-
sues are personal, and organizational members need to be invested and willing
to make changes, but without the infrastructure in place, these jobs become
much harder. Why, after all this effort, is this still the case? I try to address this
issue in this chapter. I’m not technology averse or analytic averse; I even have
multiple grants on this topic and write about it. So, why hasn’t it made it into
my practice? Is this lack of integration of LA into my practice a problem, and,
if so, what is the solution? Here are some practical ideas, and later, I will discuss
why these often fail to make their way into higher education.
Practical Ideas for Success with Learning Analytics
Let’s take a normative look at what needs to be done if one wants to leverage
analytics in a meaningful manner. Nothing is more normative than prescrip-
tions by professional consulting firms such as McKinsey, so I am drawing on
multiple reports and papers from them, including the following: Arellano, DiLeonardo, and Felix (2017); Brown, Kanagasabai, Pant, and Pinto (2017); Chui, Henke, and London (2017); and Kirkland and Wagner (2017).
Data Capture and Availability
One of the first issues that needs to be addressed for any form of analytics to
be performed is the capture of data. Without data—useful data—there is no
scope for any analysis to be performed. The proliferation of digitization across
organizations means that it is possible to capture a wide variety of data and also
to acquire large volumes of it. For instance, a retailer now has access not only
to sales data and customer information through their credit cards but also to their
online customer profiles and even log data for every action that they perform
on the retailer’s website. In higher education organizations, similarly, there is
the opportunity to capture a variety of data about students such as their incom-
ing Grade Point Average (GPA), high school performance, their interaction
with an LMS, and even their swipe access data using their student identity
card. Of course, these increased data bring with them numerous challenges for
capture, storage, and analysis, especially in regard to whether useful data are
being captured. For instance, if we take the mission of a higher education insti-
tution to be improving student learning, we need to then think about whether
the data that are captured and can be analyzed assist us with this mission. As
of now, we have very little data that speak to learning or help us understand students’ cognitive processes or misconceptions. We have grades and GPA infor-
mation, which is more a signal of achievement rather than cognition. At best,
it is an indirect marker of knowledge. To leverage useful LA data that assess
learning, disparate data sources need to be monitored
and stored—from student admission and enrollment data to their activities on
the LMS.
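One way to picture the capture problem is as a consolidated record per student that joins these disparate sources. The sketch below defines such a record; the field names are illustrative assumptions, since real student information systems, LMSs, and card-access systems each have their own schemas and access rules.

```python
# Hypothetical consolidated student record combining the data sources named
# above. Field names are illustrative; real systems (SIS, LMS, card access)
# each have their own schemas and access restrictions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class StudentRecord:
    student_id: str
    incoming_gpa: float            # admissions / SIS data
    high_school_percentile: float  # prior performance
    lms_sessions: List[str] = field(default_factory=list)   # LMS interaction log ids
    swipe_events: List[str] = field(default_factory=list)   # ID-card access events

    def has_learning_signal(self) -> bool:
        """Crude check: is there anything beyond achievement markers?

        Grades and GPA signal achievement; only behavioral traces (LMS use,
        swipes) even begin to approximate engagement, and none of these
        directly measure cognition or misconceptions.
        """
        return bool(self.lms_sessions or self.swipe_events)
```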
Modeling and Analysis
The second important step in the analytic process is the availability and use
of different mathematical models that can take useful data and turn them into
something actionable—and provide insights that allow us to better under-
stand an issue. There are dozens, if not more, models or techniques available
for analyzing data, including those that draw on traditional statistics and so-
cial science such as statistical models, visualization, social network analysis,
sentiment analysis, influence analytics, discourse analysis, concept analysis, and
sense-making models, and those that draw on computational DM such as clas-
sification, clustering, Bayesian modeling, relationship mining, and discovery
with models (Romero & Ventura, 2013). These techniques have been used for
predicting student performance, providing feedback for supporting instruc-
tors, recommending problems or contents to students, creating alerts for stake-
holders such as students to complete a task, domain modeling to describe the
domain of instruction in terms of concepts, and for planning and scheduling fu-
ture courses. These applications though have largely used existing techniques,
and very little development has taken place of techniques that are unique to
LA or modified for LA. Therefore, it is an open question as to the value these
techniques add and also an open area of research to develop techniques that are
driven by LA requirements.
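As one concrete instance of the techniques listed above, predicting student performance can be framed as a standard classification problem over LMS-derived features. The sketch below fits an off-the-shelf logistic regression to invented data; it illustrates the reuse of existing techniques noted here, not a method specific to LA.

```python
# Sketch: predicting a pass/fail outcome from LMS-derived features using an
# off-the-shelf classifier (logistic regression). The feature set and data
# are invented; the point is that current LA work mostly reuses standard
# techniques like this rather than methods developed specifically for LA.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [logins per week, quiz average, forum posts]; label 1 = passed.
X = np.array([
    [8, 0.85, 12],
    [5, 0.70,  4],
    [1, 0.40,  0],
    [6, 0.65,  3],
    [0, 0.30,  1],
    [9, 0.90, 15],
])
y = np.array([1, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Estimated probability that a new, low-engagement student passes the course.
new_student = np.array([[2, 0.50, 1]])
print(model.predict_proba(new_student)[0][1])
```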
Embedded Analytics within Organization for Action
Once data and techniques are available, the real challenge of LA, which is to
add value to the organization, begins. There are two steps in this process:
(1) embedding of LA across the organization and (2) creation of practices that
leverage LA capabilities. In order to add value to the organization, LA tools
and their data have to be embedded across existing practices, or new practices
have to be created across the institution (Pistilli, Willis, & Campbell, 2014).
In other words, LA has to scale across all people that can actually use the data
and techniques to make a difference. This is one of the most prominent barri-
ers to use of LA in higher education. In order to make informed decisions, an
organization needs human resources with expertise, users have to be trained
to use data-driven practices, and user functions have to be aligned with tech-
nological capabilities (Bean and Kiron, 2013). It’s a tall order, and one way to
think more about it is in terms of an organization’s absorptive capacity and
of instantiation, or some would say reification, of that capacity in everyday
routines.
Absorptive Capacity and Routines
One lens to examine the diffusion of LA among faculty members is absorptive
capacity, an organizational theory that was introduced by Cohen and Levinthal
(1990). Developed to describe the behavior of a firm, this perspective posits
that the firm’s absorptive capacity—or an organization’s ability to recognize,
assimilate, and apply new information in innovative ways—greatly relies upon
the firm’s prior related knowledge (Lane, Koka, & Pathak, 2006). The authors
argue that prior learning shapes subsequent learning (Cohen & Levin-
thal, 1990). In addition, cumulative experience in utilizing new and external
knowledge increases an individual’s absorptive capacity. Furthermore, Cohen
and Levinthal (1990) note that an organization’s absorptive capacity depends
on the absorptive capacities of key ‘gatekeepers’ within the organization who
interface with the external environment and can translate new information
to be useable within the organization. If new ideas—such as LA—are too
distant from an organization’s existing knowledge base and practices, Cohen
and Levinthal’s (1990) theory would predict that it would be difficult for
those ideas or innovations to gain traction, diffuse, and become sustainable.
Within higher education, absorptive capacity has largely been used to ex-
plain diffusion (or nondiffusion) of innovations in the context of university-
industry relationships (e.g., Azagra-Caro, Archontakis, Gutiérrez-Gracia, &
Fernández-de-Lucio, 2006), but in recent years, scholars have also started to
use the concept to examine organizational dynamics. For instance, Da Silva
and Davis (2011) used absorptive capacity to explain faculty members’ research scholarship and proposed that individual characteristics, such as task motiva-
tion and creativity, relate to a faculty member’s ability to generate creative,
new research. They also posited that a faculty member will leverage external
sources of research-related knowledge to produce creative research above and
beyond those individual characteristics if they have prior relevant knowledge
to do so or have extrinsic motivation via institutional policies expecting high
research productivity (Da Silva & Davis, 2011). Furthermore, faculty who
perceive support for creative research from supervisors, colleagues, and non-
work sources (i.e., family and friends) also would theoretically have a stronger
relationship between their creative performance and innovative performance.
In other words, perceptions of support impact faculty members’ potential ab-
sorptive capacity—or the ability to identify, acquire, and assimilate external
knowledge—and their realized absorptive capacity—the ability to exploit and
implement that knowledge (Zahra & George, 2002). Therefore, the absorp-
tive capacity framework helped identify both individual- and organizational-
level factors that influenced faculty members’ research innovation. In a similar
vein, the uptake of LA within higher education can be seen as the ability of indi-
vidual users—faculty, staff, or students—and of the organization—assessment
office—to acquire and assimilate external knowledge of LA and
then exploit it for their purposes. This could mean that faculty develop pro-
ficiency with using LMS and are able to provide students a better learning
experience. It can mean formative and continuous assessment of programs
through data that come from multiple sources such as student performance, LMS,
course assignments, etc. In some ways, this seems simple enough, so what is
the barrier? It is routines or the habitual and procedural use of these capabil-
ities across the institution.
The concept of routines, which can take the form of standard operating pro-
grams, procedures, norms, habits, and so on, has been advanced across a range
of organizational theories (Cyert & March, 1963; March & Simon, 1958). In
simple terms, routines consist of rules, heuristics, and norms at different levels
of organization activities (Pentland & Rueter, 1994). A critical element of rou-
tines, as opposed to other practices, is that routines are practices that become a
standard (Pentland & Feldman, 2005; Pentland & Rueter, 1994). A distinction
is made by Lewin, Massini, and Peeters (2011) between meta-routines that are
higher level routines and are associated with a bundle of specific lower level
routines that can be seen as practiced routines (standard operating procedures)
that express a higher level meta-routine. From the perspective of this chapter,
the important aspect of routines is that they can be considered to constitute the
building blocks of organizational capabilities. If something has to be made a part
of how things work, they have to become routinized.
According to Lewin et al. (2011), who advance a routine-based theory of ab-
sorptive capacity, the overall effectiveness of absorptive capacity is determined
by the extent to which organizations develop processes that address routines,
both at the organizational level and at the level of individually practiced routines.
tive capacity and routines interact as absorptive capacity can enable or restrict
change in routines by moderating exploration. One common way in which
this can happen is that an organization can create routines that actually bring
new ideas into the organization. Yet, the success of this routine will depend
on how selected routines (within this meta-routine) play out. If these routines
discourage variation by the way in which selection is made, overall absorptive
capacity is reduced.
Any organization at any given time is full of routines, and higher educa-
tion institutions are no different. There are routines as simple as regular emails
through mailing lists that employees receive, regular alerts from the LMS, reg-
ular faculty meetings, and so on. However, for an institution to grow and
innovate, it is critical that routines change and new routines get designed and
adopted (Lewin, Massini, & Peeters, 2011). Routines in and of themselves are adaptable, and they evolve over time as new knowledge, often in the form of new people and innovations, is introduced. Then, through a selection and retention process, some changes become permanent. At the time a new routine gets estab-
lished or an old routine gets modified, it is hard to predict the outcome of this
routine; therefore, there is an element of trying things out to see what works.
Routines that work will often get replicated. For instance, if one department
is successful at implementing some form of student advising that works, then
others will follow suit. Some routines get reified and formally embedded or-
ganizationally in the form of rules, procedures, norms, or habits, and others
are contextual and idiosyncratic to a unit or a department within the organi-
zation. For example, every organization has procedures to allow students to
enroll in classes and withdraw from classes. Not every department has Friday
happy hours, and different research centers might have different kinds of proj-
ect meetings.
Proposed Framework for Learning Analytics Capacity Building
The most direct examination of how absorptive capacity can support higher
education innovation relevant to LA comes from the paper by Martin, Massy, and Clarke (2003) on absorptive capacity and learning technologies. Martin et al.
(2003) use absorptive capacity to explain why e-learning, despite its potential
to become a booming industry, did not diffuse more rapidly in Europe. They
propose a model that integrates adoption, diffusion, and assimilation processes
related to e-learning and advance several propositions. Drawing on Zahra and George's (2002) reconceptualization of absorptive capacity, the proposed
framework also makes a distinction between potential and realized capacities.
I propose a modified version of their model as a framework to examine how
the failure of LA to become routinized within higher education is a barrier to
its adoption. I incorporate more directly the idea of routines in the framework
to propose that the absorptive capacity for LA is influenced by two sets of an-
tecedent factors: (1) the nature of LA—the technology, the techniques, and the
data—that is available to an organization or an individual and (2) the capacity
or prior knowledge that exists in the organization or individual to utilize LA.
In the case of LA, in particular, the distinction between potential and realized
capacities is strongly applicable. As I argued earlier, the emphasis so far within LA as it relates to higher education has been on its potential rather than
what has been realized. Similar to Martin et al. (2003), I propose that potential
capacities can be further subdivided into two specific ones—the acquisition of
knowledge and the assimilation of knowledge. Realized capacities, on the other
hand, relate to the transformation of knowledge and the exploitation of knowledge.
Acquisition
This dimension means an organization’s or individual’s dynamic capacity to iden-
tify and acquire external knowledge about LA. According to Martin et al. (2003),
this dynamicity has three important subcomponents—the potential speed, intensity,
and direction of knowledge acquisition. In practice, this can mean the speed with
which new appointments are made of experts or the speed with which an individual
updates their knowledge. The intensity relates to the depth of knowledge ­
acquired
related to a software or technique—how much prior knowledge is needed to
acquire new knowledge? The direction refers to the target – is the new knowledge
needed by or for students, faculty, or advisors.
Assimilation
This dimension refers to the organization’s processes or individual’s work prac-
tices that allow them to understand and act on information or knowledge about
LA they acquire from other sources (Martin, Massy, & Clarke, 2003). One critical issue with assimilation is often technological jargon, complicated algorithms, or a lack of understanding of data that do not match the existing knowledge base or heuristics of the organization or of individuals. Overcom-
ing this barrier then requires retraining or hiring of new experts. It might also
mean changing existing processes to allow for new information to enter the
system. It is commonly acknowledged in higher education organizations that a
lack of trained personnel is a barrier to assimilation of LA. For instance, person-
nel trained in analytics are hard to recruit, as the skills are not easily available
and private industry is often able to lure trained academic personnel. Therefore,
a new LA system can require significant retraining of staff, pulling them from
their existing routines and practices.
Transformation
In the context of LA, transformation can be thought of as the ability to further
develop existing knowledge by fusing it with new knowledge. Transformation
has often been associated with the capability to take two previously incom-
patible or incongruous frames of references and combine them in a novel way
to produce a new model or schema (Zahra & George, 2002). This fusion, if
successful, often alters the manner in which an organization or individual per-
ceives itself and interprets its environment. This transformation is akin to the
creation of new routines or work practices, and, in many ways, it is rare but
also a truly innovative aspect of successful organizations. For many higher education institutions, this means being able to introduce cloud computing into their infrastructure, continuously analyze new data, and create actionable insights to improve efficiency and effectiveness.
Exploitation
The end game, if LA is to be influential in higher education, is its practical application across the institution and by a range of individuals who work
there—staff, faculty, administrators, advisors, etc. This means that LA is a part
of common routines across the organization so that LA is used to improve
various functions. Even though the application might be short term and still
be useful, it is the long-term application—reification in routines—that can
produce systematic and systemic change. This shift can be seen with many
institutions creating new offices and hiring new personnel that are devoted
to analytics. It can also be seen in attempts to purchase novel software that promises actionable knowledge across a range of functions. Figure 1.1 displays a synthesized analytical framework, which hypothesizes that the nature of the LA system or technology, combined with users' prior knowledge of technology, will influence adoption or non-adoption after moving through their absorptive capacity filters.
Some (Unsuccessful) Case Studies for Use of Learning Analytics
As an example of what is possible with LA, what steps can be taken to reach
that goal, and how adoption fails, I now discuss three case studies from my own
experience. The first two scenarios derive from research, and, to some extent,
application, on a project that was externally funded and on which I served as co-principal investigator. The overall goal behind the project was to better un-
derstand issues of student retention using institutionally available data and to be
able to predict and support students who were likely to struggle with or be un-
successful in their educational goals. After almost half a year of discussions and
negotiations, we were able to access the data. These data consisted of a range of
student records including their LMS participation. To protect student privacy,
each student was given a unique identifier, and we did not have access to student
names. Although some demographic data about the students were available,
they were not analyzed. The third case comes from trying to use off-the-shelf software to analyze students' success in terms of retention and graduation.
Figure 1.1 Theoretical framework (adapted and modified from Martin et al., 2003, itself adapted from Zahra & George, 2002, and others).
Case Study 1: Understanding Student Retention and
Persistence
The first challenge we wanted to tackle was understanding student reten-
tion. In particular, improving student retention in science, technology,
engineering, and mathematics, or STEM, majors has been a real concern
for higher education institutions. Given that the primary project investiga-
tor (PI) and I were both in the engineering school, one of the first projects
we undertook was to study student retention in STEM (Almatrafi, Johri, Rangwala, & Lester, 2017). We used the data of students who started in
Fall 2009 and Spring 2010 to project the retention rate in every STEM
major at each semester for eight semesters. The data included 328 students
who matriculated in engineering and 299 students who matriculated in
science for that year. We looked at all students who were admitted be-
tween 2009 and 2014, both direct admits and transfers. Transfer students
are often neglected in studies of retention and persistence, especially in
engineering, and our institution has a large number of transfer students,
so this was of special interest to us. We found that engineering students
were more persistent than science and math students, with retention rates
over 60% for engineering students compared to 40% in math, for instance.
Persistence rates for first-time students were lower than those for transfer students in engineering. Also, as has been reported previously, most migration out
of discipline occurred in the first two years of enrollment. We also found
that among the enrolled students, a large number of engineering students
(almost 20%) did not declare a major, some until late in their studies. In
contrast, in the college of science, all enrolled students had declared a
major by the eighth semester. This work was further extended to look at an even larger population of students (Chen, Johri, & Rangwala, 2018).
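As a rough illustration of the kind of analysis this case involved, the following is a minimal, hypothetical Python/pandas sketch of projecting semester-by-semester retention rates by major from a student-record extract. It is not the analysis published in Almatrafi et al. (2017); the column names (major_at_entry, terms_enrolled_in_major) and the simple "still enrolled in the starting major at term t" rule are assumptions standing in for whatever the institutional data actually contained.

# A minimal sketch (not the published analysis) of projecting retention
# by major for each of the first n semesters. Column names are hypothetical.
import pandas as pd

def retention_by_major(records: pd.DataFrame, n_semesters: int = 8) -> pd.DataFrame:
    """Share of each entry cohort still enrolled in its starting major per semester."""
    rows = []
    for major, cohort in records.groupby("major_at_entry"):
        cohort_size = len(cohort)
        for term in range(1, n_semesters + 1):
            # A student counts as retained at a given term if they were still
            # enrolled in the major at that term.
            retained = (cohort["terms_enrolled_in_major"] >= term).sum()
            rows.append({"major": major, "semester": term,
                         "retention_rate": retained / cohort_size})
    return pd.DataFrame(rows)

# Toy usage: two engineering and two math matriculants.
toy = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "major_at_entry": ["ENGR", "ENGR", "MATH", "MATH"],
    "terms_enrolled_in_major": [8, 3, 8, 2],
})
print(retention_by_major(toy, n_semesters=4))

Even a table this simple makes the comparison described above legible, since the retention curve for an engineering cohort can be read off directly against the curves for science and math cohorts.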
Case Study 2: Understanding Student Trajectories
In a second study (Almatrafi, Johri, Rangwala, & Lester, 2016; Sweeney, Rangwala, Lester, & Johri, 2016) from the same project, we compared
course trajectories of students who performed well academically and grad-
uated in four years with those who did not (low performers). The goal
was to identify factors related to how course-taking choices and degree
planning affected students’ academic performance. The dataset consisted
of information for three majors within the engineering school: civil, en-
vironmental, and infrastructure engineering (CEIE), computer science
(CS), and IT. The data included more than 13,500 records of 360 students.
The analysis showed that low performers postponed taking certain courses
until the latter end of their program, and this delay had consequences for
taking other courses and, subsequently, their graduation. We also found a trend whereby low-performing students enrolled in a set of courses within a specific semester—taking certain courses concurrently—that their high-performing counterparts did not.
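A comparable sketch for this case—again hypothetical rather than the project's actual method—is one way to surface the delayed course-taking pattern: compare the median term in which low and high performers first took each course. The column names and the GPA-based split are assumptions for illustration only.

# Hypothetical illustration: median first-take term per course, split by a
# GPA threshold. Courses whose median term is much later for low performers
# are candidates for the delayed-prerequisite pattern described above.
import pandas as pd

def median_first_take_term(enrollments: pd.DataFrame, outcomes: pd.DataFrame,
                           gpa_cutoff: float = 3.0) -> pd.DataFrame:
    """enrollments: rows of (student_id, course, term_index);
    outcomes: rows of (student_id, final_gpa)."""
    merged = enrollments.merge(outcomes, on="student_id")
    merged["group"] = merged["final_gpa"].ge(gpa_cutoff).map(
        {True: "high", False: "low"})
    # Term in which each student first took each course, then the median per group.
    first_take = (merged.groupby(["group", "course", "student_id"])["term_index"]
                  .min().reset_index())
    return (first_take.groupby(["course", "group"])["term_index"]
            .median().unstack("group"))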
Case Study 3: Off-the-Shelf Software Use
The final case study comes from my attempts to use a third-party system
purchased from a vendor by my institution. The system was designed to
allow me to better understand how students were performing in a timely
manner. As department chair at that time, it was useful for me to know
this information, but other than providing a visual representation, which can be useful, there was not a lot that I learned from the system that I did not already know from my own teaching experience and feedback from faculty, advisors, and senior administrators. There were no new insights, although it did confirm things that I knew anecdotally. In any case, I passed on the use of the system to the student advisors, as I thought they would benefit more from it. There was a community of them and so they could use the information from that system to have a dialogue. But the truth is that they knew more about what was going on than the system could ever tell them, since they were always in direct contact with the students and their information was more current than what was in the system.
Discussion and Conclusion
How can the framework I proposed earlier help us interpret these case studies?
Whether it is experimenting with an LMS such as Blackboard™, Moodle, or
Sakai, or systems for student admissions, uptake is slow and stability is hard
to achieve. As Borrego, Froyd, and Hall (2010) note, “adoption levels will be
higher in situations where change agents focus on clients’ (i.e., faculty and ad-
ministrators) needs over promoting adoption of a specific innovation” (Borrego
et al., 2010, p. 203). In technology-driven projects though, this is rarely the
case—the user almost always comes last.
Fundamentally, it comes down to the routines. The existing routines are
such that integration of new knowledge of this kind is not the reality. It is not
hard to see the breakdowns and also the inability to routinize for each of the
cases. The research project, although productive, did not result in ready-to-use
applications, and the analysis that was done was interesting for us researchers,
but we were not the target audience in terms of policy changes. Those who could apply the data, such as advisors, already had existing routines that were productive for them and that would not have been improved by the analytics produced by the research or by the off-the-shelf software that provided basic information on retention or progression of students.
Yet, what I found interesting was that the breakdown was not at the ‘poten-
tial’ stage of the process. There was significant acquisition and assimilation of
new knowledge related to LA across the organization at multiple levels. The
team had faculty and students who understood different aspects of the project
and were able to absorb different technological advances and knowledge. This
knowledge was about data, about analytical techniques, about higher education
issues, about the infrastructure needed, and so on. In addition to the faculty, other institutional actors also played a crucial role, starting with staff working
on institutional data and analytics. Through negotiations that involved many
meetings, data access was granted and some advances were made in terms of
coming up with routines—standard procedures—to respond to additional requests for data. Where no routine existed, for instance related to data-sharing practices, one was created.
All this work was exploited through analysis, and what was produced were research studies and papers; some demonstrable interfaces were also developed. What was transformed was research and teaching; what did not happen was adoption of the products into real systems that could impact decision-making for the intended audiences. So even though we created new
knowledge, through data analytics, it did not cross the barrier to adoption—it
was not used for any form of decision-making. It can be argued that a research
project does not really have to be adopted as a system and routinized, but,
in this case, we tried. We tried by building systems, studying student use, advisor use, and so on, but convincing those who have decision-making power at a higher level is where the breakdown happens. Fundamentally, this is no different from most technology-related projects in higher education
(as many of the chapter authors in this volume point out). The faculty and
staff, or students, are important for routines to form and become reified, but
bottom-up approaches are just one aspect of organizational change. In higher
education, top-down approaches are equally important, as the resources (e.g.,
funding or hiring) are controlled by those higher up in the hierarchy, and,
therefore, they can better incentivize formation of new routines.
Overall, what is present in the organization is an ecosystem of tools. Given
the range of users, functions, and roles that the user base has, it is not inconceiv-
able that for any kind of INFT, an ecosystem approach will be required. From
an organizational perspective, something central is always better—in terms of
support, security, and maintenance—but given the user base, that might not
work out. The other critical ingredient toward mass adoption is to start with
something small that is really useful and usable and then build from that. For
instance, just a simple tool that allows me to explore the class composition would be invaluable. Once I use it regularly and see its value, I am more likely to move on to more complex information and analytics. For something to become a routine, it is important that it first allows for flexibility so that the user can try it out and adapt it. The user also has to perceive that they have a choice and that the LA fills a real need. Here are a few steps that can be taken to support routinization and build LA capacity:
1. Technology design and adaptation: As is common for most technology-
adoption processes, one crucial requirement to ensure that LA is integrated
into organizational routines is to design technology using a user-centered
approach. Across a range of technological products, there is growing
evidence that using a user-centered approach is essential for user adoption
(Johri, Dufour, Lo, & Shanahan, 2013; Klein, Lester, Rangwala, & Johri, in press). This alone, though, is not sufficient, as the needs of users vary and change; technological products also need to be tailorable—by users or by those who support them—to user needs so that they can be adapted by the end user. For instance, it is essential that an LMS have a way for faculty to monitor class participation, but courses are structured differently and instructors vary in technological proficiency; adaptation of monitoring interfaces or dashboards is therefore essential for them to become part of faculty practices. Similarly, a department chair who needs to
monitor or assess all courses in the department needs to be able to use that
interface at a meta-level, and so on.
2. Sharing of practices across the organization: Given the diversity of dif-
ferent units within a higher education institution—departments, support
offices, etc.—it is important to share practices that work both within
similar kinds of job profiles and units but also across them. It is especially
important to include people who have domain knowledge about some
organizational aspect as well as knowledge of LA. For instance, advisors
who use LA proficiently can shed light on what works and how some
elements of what they do might benefit faculty. These sharing sessions
can be face-to-face but also consist of emails, mailing lists, and other
elements that fit within existing routines. Sharing of practices also has to
be both bottom-up and top-down, and itself will involve the creation of
new routines. The sharing can be led by higher-ups in the organization, by peers, or by anyone who has achieved advanced proficiency or has been able to use an LA system effectively. One way to accelerate adoption of LA would be to target specific audiences such as new faculty or other employees and ensure that they are exposed to the different systems that exist within the organization. Often, if established faculty do not use these systems, it is unlikely that new faculty will be exposed to them through their departmental mentoring. The other goal of these sessions
can be to evaluate new technologies to ensure that not every LA offer-
ing is seen as a useful tool and that there is some common evaluation or
brainstorming around its use.
3. Integrated technology and organization development: Although this
idea has been around for over two decades (Pipek, Wulf, & Johri, 2012; Wulf & Rhode, 1995), integrating technology and organization
development is an approach more appropriate for the age of analytics
than any other. The primary notion behind this idea is that any new
technology, especially any kind of INFT, cannot simply be installed
within an organization without it affecting organizational practices.
Therefore, it is important to think of technology and organizational development as occurring together and to plan from inception for both the new technology and the organization to change when the two
come together. This notion captures the two ideas advanced above that
any technology needs to be tailorable and that organization practices
need to be created to support technology adoption. These two aspects
need to work together at multiple levels—practices of individuals,
groups, and the organization as a whole. For instance, as an instructor, if I am to better understand students using the LMS, I need new dashboards and analysis, but I also need to change my practices to ensure
that I am taking what the system is telling me into account. If all I do
is monitor without any changes in my practices, the LA is not going to
be very effective.
Acknowledgements
This research was partially supported by the U.S. National Science Foundation
Awards #1447489 & #1712129. I would like to thank Jaime Lester and Carrie
Klein for comments and feedback. The work on absorptive capacity derives
from discussions with David B. Knight.
References
Almatrafi, O., Johri, A., Rangwala, H., & Lester, J. (2016). Identifying course trajectories of high achieving engineering students through data analytics. Proceedings of ASEE 2016.
Almatrafi, O., Johri, A., Rangwala, H., & Lester, J. (2017). Retention and persistence among STEM students: A comparison of direct admit and transfer students across engineering and science. Proceedings of ASEE Annual Meeting.
Arellano, C., DiLeonardo, A., & Felix, I. (2017). Using people analytics to drive business performance: A case study. McKinsey Quarterly, July 2017.
Arroway, P., Morgan, G., O'Keefe, M., & Yanosky, R. (2016). Learning analytics in higher education (Research report). Louisville, CO: EDUCAUSE Center for Applied Research. Retrieved from https://library.educause.edu/~/media/files/library/2016/2/ers1504la.pdf
Azagra-Caro, J., Archontakis, F., Gutiérrez-Gracia, A., & Fernández-de-Lucio, I. (2006). Faculty support for the objectives of university-industry relations versus degree of R&D cooperation: The importance of regional absorptive capacity. Research Policy, 35(1), 37–55.
Bean, R., & Kiron, D. (2013). Organizational alignment is key to big data success. MIT Sloan Management Review, 54(3), 6.
Bichsel, J. (2012). Analytics in higher education: Benefits, barriers, progress, and recommendations (Research report). Louisville, CO: EDUCAUSE Center for Applied Research, August 2012. Available from www.educause.edu/ecar
Borrego, M., Froyd, J. E., & Hall, T. S. (2010). Diffusion of engineering education innovations: A survey of awareness and adoption rates in US engineering departments. Journal of Engineering Education, 99(3), 185–207.
Brown, B., Kanagasabai, K., Pant, P., & Pinto, G. S. (2017). Capturing value from your customer data. McKinsey Quarterly, March 2017.
Chen, Y., Johri, A., & Rangwala, H. (2018). Running out of STEM: A comparative study across STEM majors of college students at-risk of dropping out early. Proceedings of Learning Analytics and Knowledge (LAK).
Chui, M., Henke, N., & London, S. (2017). How to win in the age of analytics. McKinsey Quarterly, January 2017.
Cohen, W. M., & Levinthal, D. A. (1990). Absorptive capacity: A new perspective on learning and innovation. Administrative Science Quarterly, 35, 128–152.
Cyert, R. M., & March, J. G. (1963). A behavioral theory of the firm. Oxford, UK: Blackwell.
Da Silva, N., & Davis, A. R. (2011). Absorptive capacity at the individual level: Linking creativity to innovation in academia. The Review of Higher Education, 24(3), 355–379.
Economist (2010). Data, data everywhere. Special report on Managing Information. www.economist.com/node/15557443
Gašević, D., Dawson, S., & Siemens, G. (2015). Let's not forget: Learning analytics are about learning. Tech Trends, 59(1), 64–71. doi:10.1007/s11528-014-0822-x
Johri, A., Dufour, M., Lo, J., & Shanahan, D. (2013). AdWiki: Socio-technical systems engineering for managing advising knowledge in higher education. International Journal of Sociotechnology and Knowledge Development, 5(1), 37–59.
Kirkland, R., & Wagner, D. (2017). The role of expertise and judgment in a data-driven world. McKinsey Quarterly, May 2017.
Klein, C., Lester, J., Rangwala, H., & Johri, A. (In press). Learning analytics tools in higher education: Adoption at the intersection of institutional commitment and individual action. The Review of Higher Education.
Lane, P. J., Koka, B., & Pathak, S. (2006). The reification of absorptive capacity: A critical review and rejuvenation of the construct. Academy of Management Review, 31(4), 833–863.
Lester, J., Klein, C., Rangwala, H., & Johri, A. (2017). Learning analytics in higher education. ASHE Monograph Series, 3(5), 9–133.
Lewin, A. Y., Massini, S., & Peeters, C. (2011). Microfoundations of internal and external absorptive capacity routines. Organization Science, 22(1), 81–98. doi:10.1287/orsc.1100.0525
March, J. G., & Simon, H. A. (1958). Organizations. New York, NY: Wiley.
Martin, G., Massy, J., & Clarke, T. (2003). When absorptive capacity meets institutions and (e)learners: Adopting, diffusing and exploiting e-learning in organizations. International Journal of Training and Development, 7(4), 228–244.
Pearson, T., & Wegener, R. (2013). Big data: The organizational challenge. Bain & Company. www.bain.com/publications/articles/big_data_the_organizational_challenge.aspx
Pentland, B. T., & Feldman, M. S. (2005). Organizational routines as a unit of analysis. Industrial and Corporate Change, 14(5), 793–815.
Pentland, B. T., & Rueter, H. (1994). Organizational routines as grammars of action. Administrative Science Quarterly, 39(3), 484–510.
Pipek, V., Wulf, V., & Johri, A. (2012). Bridging artifacts and actors: Expertise sharing in organizational ecosystems. Journal of Computer Supported Cooperative Work, 21(2–3), 261–282.
Pistilli, M. D., Willis, J. E., & Campbell, J. P. (2014). Analytics from an institutional lens: Definition, theory, design and impact. In J. A. Larusson & B. White (Eds.), Learning analytics: From research to practice (pp. 79–102). New York, NY: Springer Books. doi:10.1007/978-1-4614-3305-7
Romero, C., & Ventura, S. (2013). Data mining in education. WIREs Data Mining and Knowledge Discovery, 3, 12–27. doi:10.1002/widm.1075
Siemens, G., Gašević, D., Haythornthwaite, C., Dawson, S., Buckingham-Shum, S., Ferguson, R., … Baker, R. (2011). Open learning analytics: An integrated & modularized platform. Open University Press. https://solaresearch.org/wp-content/uploads/2011/12/OpenLearningAnalytics.pdf
Sweeney, M., Rangwala, H., Lester, J., & Johri, A. (2016). Next-term student performance prediction: A recommender systems approach. Journal of Educational Data Mining, 8(1), 22–51.
Wulf, V., & Rhode, M. (1995). Towards an integrated organization and technology development. In Proceedings of designing interactive systems (pp. 55–64). New York, NY: ACM.
Zahra, S. A., & George, G. (2002). Absorptive capacity: A review, reconceptualization and extension. Academy of Management Review, 27(2), 185–203.
2
Analytics in the Field
Why Locally Grown Continuous Improvement
Systems are Essential for Effective Data-Driven
Decision-Making
Matthew T. Hora
Introduction
Dr. Lee’s department was awash in data, including end-of-term course evalua-
tions, final exams, homework results, and students’ online activities on course
websites. In the spring of 2015, my colleagues and I visited the offices and lecture
halls of Dr. Lee and his colleagues at a large, public California research univer-
sity as part of a research project on data-driven decision-making (DDDM) in
the field. There we heard complaints about ineffective online course evaluation systems and about the value of hallway conversations with colleagues, and we observed sophisticated analytics in use alongside rudimentary, low-tech systems of formative feedback, such as a worn shoebox at the front of a classroom where students deposited 3 × 5 cards with their complaints and questions. It quickly became
apparent that for mechanical engineering departments like the one where Dr.
Lee was a faculty member, working with these various forms of data was not a
simple exercise of reporting grades at the end of each semester, but was also part
of a high-stakes exercise in disciplinary and institutional accreditation. With the
reputation and continued operations of academic programs hinging on the qual-
ity and thoroughness of accreditation reports that ran over 400 pages long, it was
no surprise that Dr. Lee observed that issues surrounding educational data were
“a pretty big deal in our department.”
The study that led me to Dr. Lee’s department was motivated by increasing
claims about the potential for sophisticated data analyses to discern patterns and
predict outcomes that could ultimately solve longstanding problems plaguing higher education such as institutional inefficiency (Lane, 2014), ineffective institutional data management systems (Cheslock, Hughes, & Umbricht, 2014), limited student learning in science courses (Wright, McKay, Hershock, Miller, & Tritz, 2014), and poor student completion rates (Picciano, 2012;
Siemens, 2013; Treaster, 2017). Advocates of DDDM in higher education also
argue that since other sectors, such as business and health care, have embraced
predictive analytics with vigor and apparent success, then higher education
must also do so (Zimpher, 2014). In the case of Learning Analytics (LA), which
entails the measurement, analysis, and reporting of student-centered data (e.g.,
course-taking patterns, keystrokes on course websites, grades) in order to im-
prove and optimize learning, one of the central arguments behind its potential
is that these increasingly large datasets could be analyzed in ways that would
add clarity, precision, and an element of prediction to postsecondary educators' understanding of how to support student success (Lane & Finsel, 2014; Long & Siemens, 2011).
Besides the analytic advantages suggested by the new technologies, com-
puter power, and data mining techniques that are closely tied to the analytics
movement, two other forces should be recognized as contributing to the in-
creasing attention being paid to Big Data and analytics in higher education: ac-
countability pressures and skepticism about teachers’ abilities to make informed
decisions. First, higher education in the early 21st century is facing a new wave of performance-oriented reforms and accountability pressures from state legis-
lators and accreditors that require institutions to establish, track, and report
data on various performance metrics, a policy development that is leading to
the development of new institutional data systems that are focused on ensur-
ing compliance with these new policies (Alexander, 2000; Rabovsky, 2014).
In doing so, institutions can be viewed as using technology and data systems
to cultivate (and respond to) a culture of accountability as opposed to using
these new tools to foster a culture of student-centered learning (Halverson & Shapiro, 2012). Consequently, some observers have argued that DDDM and analytics can be considered to be "one of the most prominent strategies for educational improvement in the country" (Coburn & Turner, 2012, p. 100).
Second, the DDDM movement in education is also informed by the notion that
teachers generally rely upon two forms of information to make instructional
decisions—intuition and anecdote—which are insufficiently objective and
robust, and that more rigorously collected and analyzed numeric data should
be used as the basis for decision-making about how courses are designed and
taught (Mandinach, 2012; Wieman, Perkins, & Gilbert, 2010). With hard
evidence in hand, so the story goes, faculty will improve their teaching, strug-
gling students will be identified and supported more efficiently, and institutions
will change and continually improve (Lane, 2014; Spillane, 2012).
While much of the writing on DDDM and LA advocates for its adoption
and/or centers on descriptions of data-related interventions, researchers are in-
creasingly employing a more descriptive and critical stance, focusing instead
on how educators think about and utilize data in their daily work (Coburn & Turner, 2011). At the heart of these investigations is the growing realization that
“Data do not objectively guide decisions on their own—people do,” (Spillane,
2012, p. 114), and that some educators and policymakers have placed too much
faith in the power of data and technology to change teaching behaviors and
student learning. Further, some analysts critique the focus on adoption and
program evaluation for ignoring ethical issues related to LA (Clow, 2013; Slade & Prinsloo, 2013); how technology could be used to foster educators' professional growth and development (Hora, Bouwma-Gearhart, & Park, 2017); and, especially, how the ultimate efficacy of a data system is dependent on how well integrated it is with the cognitive, cultural, and contextual aspects of organizational life. At the heart of many critiques of analytics is the fact that within the seemingly simple prospect of mining gigabytes of keystroke data to identify struggling students is a process whereby data are transformed by human beings—working within constraints of local cultural norms, organizational conditions, and knowledge about data and student learning—into information and then, ideally, actionable knowledge (Mandinach, 2012). As Zilvinskis, Willis,
and Borden (2017) observe,
It’s not that these new technologies and methods are unhelpful, but rather
it’s that they don’t address the more complex aspects of higher education,
including the incredible diversity and complexity of learning outcomes
across the curriculum and complex organizational arrangements.
(p. 9)
Consequently, some scholars argue that instead of continuing to advocate for
the adoption of DDDM in general and the utilization of LA in particular, the
field of education needs to better understand the existing conditions of data
use within actual schools, colleges, and universities so that data systems can
be more effective and responsive to educators' real needs (Coburn & Turner, 2011; Mandinach, 2012). In particular, given that "Effective data use requires going beyond the numbers and their statistical properties to make meaning of them" (Mandinach, 2012, p. 73), insights into how educators notice, interpret, and draw conclusions from various forms of data are needed (Coburn & Turner, 2011). Without a nuanced and ethnographically informed account of
how educators use data and LA ‘in the wild’, the field risks developing institu-
tional and departmental data systems that contribute to ineffective (or incorrect)
decision-making, are widely rejected by potential users, and fail to generate
reliable, useful, and legitimate information that faculty and administrators
can use as they go about planning courses, engaging students, and improving
programs (Foss, 2014; Hora et al., 2017). These considerations led to the principal
questions that I address in this chapter: (1) How are postsecondary educators
thinking about and using teaching-related data in their daily work? (2) How
do specific organizational features shape these decisions and practices? and
(3) What are the implications of practice-based accounts of data utilization for
the design and implementation of LA initiatives?
Unfortunately, there is “shockingly little research on what happens when
individuals interact with data in their workplace settings" (Coburn & Turner, 2012, p. 99), and the aim of this chapter is to partially ameliorate this situation
with respect to research on data use in higher education. While a promising
line of inquiry is beginning to examine how faculty and administrators,
within the constraints and affordances posed by their institutional contexts,
engage, or not, in DDDM and utilize LA (Foss, 2014; Hora et al., 2017;
Klein, Lester, Rangwala, & Johri, in press), there remains much work to be
done. For instance, the literature on LA is often replete with lists of why these
data can improve educational practice, but with less insight into how exactly
educators can use them in practice. In addition, research shows that educa-
tors make decisions based on a wide range of information—verbal, numeric,
and experiential—such that accounts of data use that ignore these varied and
trusted sources of information are incomplete and likely incommensurate
with existing behaviors. Finally, the field lacks a conceptual framework that
accounts for how cognitive, sociocultural, and contextual factors collectively
impact data use, especially in ways that move beyond lists of contextual con-
ditions and that allow for the empirical specification of relationships between
and among behavior and situations.
One of the most promising frameworks for studying DDDM and the use of
LA was developed by Coburn and Turner (2012), who draw upon insights from
theories of situated and distributed cognition to develop a model that captures
the temporal processes of data use as they unfold in specific situations. At the
heart of this framework is teacher cognition, based on the view that the use of
analytics or other forms of data are dependent on whether and how individ-
uals notice, interpret and analyze data based on preexisting beliefs and other
cognitive structures and, subsequently, construct implications about their own
teaching and/or students’ learning based on these interpretations. This frame-
work also focuses on how these processes of data interpretation and utilization
unfold within and touch upon specific aspects of the social and organizational
systems in which teachers work, so that specific features of the environment
can be identified as key leverage points that support or inhibit effective DDDM
(Spillane, 2012; Spillane, Halverson, & Diamond, 2001). In this chapter, I ex-
pand upon this framework by focusing particular attention on those organi-
zational features where information is regularly stored and retrieved, or what
is called the retention structure in research on organizational memory and
learning, such as digital databases, social networks, hardcopy files, and personal
memory (Walsh & Ungson, 1991). Insights into what specific aspects of orga-
nizational memory are implicated in educators’ use of instructional data are
important because in many ways they establish the boundaries of what types of
behaviors are desirable and feasible within organizations. Furthermore, these
elements also may play critical roles in formal and informal DDDM systems
where LA are utilized.
In this chapter, I also outline the theoretical underpinning of this framework
and demonstrate its use in an exploratory study of how a group of faculty and
administrators in a California research university used diverse types of teaching-
related data—of which LA is but one type—in practice, and how specific features
of the organizational memory influenced their decisions and actions. Analyses of
interview, classroom observation, and documentary data revealed five distinct,
yet intertwined, decision chains involving a diverse range of teaching-related
data, with little evidence that LA or sophisticated statistical analyses of numeric
data were viewed as important and salient to the educators’ work. Instead, the
educators in this study relied upon and highly valued homework and exam results,
their own self-created mid-term course evaluation systems, and low-tech infor-
mation sources such as informal student feedback and responses to open-ended
questions on course evaluations. Within this department, despite the presence
of policies mandating the collection of teaching-related data for the purpose of
disciplinary accreditation and student course evaluations, results from these exer-
cises were either not fed back to faculty or were viewed as insufficiently detailed
and reliable. Some respondents instead created their own systems for continuous
improvement, which represents an opportunity lost in terms of institutional sup-
ports for DDDM and a foreboding sign for those hoping to introduce LA into
similar departmental contexts. These results suggest that institutional structures
for continuous improvement be instituted that facilitate educators’ collection,
analysis, and reflection on various forms of data (including but not limited to ana-
lytics) while also acknowledging and respecting low-tech, individualized sources
of data and feedback systems.
These results raise questions that should be addressed by those advocating for
the widespread adoption of LA in the nation’s postsecondary institutions, espe-
cially those who argue that the sector must adopt these techniques given their
apparent impacts and success in health care, business, and management (Zimpher,
2014). Indeed, the fact that data analysts in these fields have raised questions about
the reliance on Big Data and performance-based metrics, arguing instead for the
perspectives of behavioral scientists and small data (i.e., data from small samples
and qualitative data), highlights the limitations of Big Data and DDDM (Lazer, Kennedy, King, & Vespignani, 2014; Peysakhovich & Stephens-Davidowitz,
2015). Instead, as a field, we must consider the prospect that “rigor is (being) seen as
trumping relevance, responsiveness, and often reality” (Mandinach, 2012, p. 81).
It appears that as with K-12 education, the pendulum of reform in higher
education has swung hard in the direction of relying on technical solutions and
hard data to inform and improve students' educational experiences. Instead, I agree with Mandinach (2012), who suggests that as the field of education considers the need for continuous improvement in educational practice, "There needs to be a balance between the use of data and experience" (p. 81) and far more
attention to the realities of the organizational and sociocultural conditions that
shape and define how educators approach teaching and learning.
Background: The Promise and Challenges
of Using Data-Driven Decision-Making
and Learning Analytics in Education
In higher education, the largest body of research on DDDM and LA focuses on
descriptions about innovations, interventions, and developments with LA and
DDDM, a state of affairs not dissimilar from the K-12 literature (Coburn & Turner, 2011). Researchers have documented and described how new technolo-
gies and datasets have led the shift from institutional research offices generating
static annual reports to building sophisticated, relational databases that can gen-
erate reports about student performance in real-time (Lane, 2014). This devel-
opment is largely due to the fact that with new learning management systems
(LMS) and online datasets “every instructional transaction can be immediately
recorded and added to a database” (Picciano, 2012, p. 10) and is viewed as a
transformative development in terms of providing predictive analyses in real-
time that can help support student success. Many examples of the successful use
of LA exist: Arizona State University increasing pass rates from 66% to 75%
in freshman remedial math (Kolowich, 2013); using analytics to identify intro-
ductory math as a key determinant of student persistence in nursing programs at
Georgia State University (Treaster, 2017); and, how analyses of student demo-
graphic, academic, and LMS-utilization data were used at Purdue University to
flag students at risk of failing a course, thus sparking messages to instructors that
additional tutoring or interventions were required (Arnold, 2010; Campbell,
2007). Based in part on success stories such as these, Siemens (2013) argues that
with the development of specialized journals, conferences, and research method-
ologies, the field of LA even represents an emerging discipline in its own right.
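To make concrete what such flagging can look like in its simplest form, here is a deliberately small, hypothetical rule in Python. It is not the algorithm used by Purdue's system or any of the other institutions mentioned; the thresholds and field names are invented for illustration, and production systems combine many more signals, weights, and validation.

# A toy early-warning rule in the spirit of the LMS-based flagging described
# above. Thresholds and fields are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class StudentActivity:
    student_id: str
    current_grade: float      # running course grade, 0-100
    logins_last_week: int     # LMS logins in the past 7 days
    assignments_missing: int  # assignments not yet submitted

def flag_at_risk(s: StudentActivity) -> bool:
    """Flag a student for instructor follow-up when both the grade and the
    engagement signals look weak."""
    low_grade = s.current_grade < 70
    low_engagement = s.logins_last_week < 2 or s.assignments_missing >= 2
    return low_grade and low_engagement

# Example: this student would be flagged, prompting a message to the instructor.
print(flag_at_risk(StudentActivity("a123", 64.0, 1, 3)))  # True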
But two areas of research within the field of DDDM and analytics in education
are also growing in volume and prominence that have a more skeptical view of
these movements—critical analyses and descriptive, practice-based research.
Critical perspectives on DDDM and analytics have focused on issues such as
students’ data privacy, the surveillance state, the tendency for a technocratic ap-
proach to educational management, and the commercialization of student data.
Of particular concern is the issue of student privacy, and who owns or has ac-
cess to data obtained via LMS, such as discussion posts or even essays and other
original products (Siemens, 2013; Slade & Prinsloo, 2013). Broader concerns
are also being raised about the way that institutions, policymakers and educa-
tors talk about DDDM, which is often framed in terms of economic efficiency
and accountability. Instead, some scholars argue that teachers’ and students’
interests need to be more pronounced in conversations about accountability, as
“a way of taking control of the agenda, so that the economic framing can be at
least supplemented with a concern for learning” (Clow, 2013, p. 18). Addition-
ally, some have argued that these efforts are heavy-handed attempts to control
the profession of education through overly simplistic measures (Fullan, 2010),
and that standardized assessment data and DDDM are inappropriate measures
for evaluating educational quality and supporting teacher decision-making
(Anderson, 2006; Schmelkin, Spencer, & Gellman, 1997). Foregrounding the
fact that no reform initiative is entirely innocent, or absent of particular agendas
or ideologies, Ewing (2011) notes how the value-added methodology in K-12
schools has become a rhetorical weapon and that the assumptions underlying
DDDM also demand close, critical examination. Similarly, Slade and Prinsloo
(2013) argue that more attention should be paid to how culture, politics, and
economic contexts shape how we deal with ethical issues in analytics. Drawing
on the critical research tradition in education (e.g., Apple, 2004), these scholars
posit that instead of viewing students as sources of data, they should be seen as
collaborators in data collection, analysis, and interpretation. In addition, criti-
cal scholars argue that instead of adopting an accountability mentality, analytics
should be seen as a moral practice that acts as a counter-narrative to the market-
and consumer-based ideologies governing aspects of the DDDM movement.
Another body of research that adopts a critical stance toward data-related
reforms is called practice-based research, which utilizes an ethnographic per-
spective to focus on what Cook and Brown (1999) called, “the coordinated
activities of individuals and groups in doing their ‘real work as it is informed
by particular organizations or group context’” (p. 386). This research tradi-
tion builds on observational and ethnographic studies in anthropology and
cognitive psychology, especially the groundbreaking research that Hutchins
(1995a, 1995b) conducted on the cognitive, technical, and sociocultural under-
pinnings of how teams of professionals perform tasks in their workplaces. The
value of such descriptive research, Hutchins (1995a) argued, was in dispensing
with the notion that findings from laboratory-based studies of cognition and
decision-making were universally applicable to real-world situations. One of
the advantages of more naturalistic studies of practice was that features of the
task context—whether social, cultural, structural, or physical—that influenced
behavior could be accounted for and identified. Building on these ideas, schol-
ars of data use in K-12 settings have examined how district central offices uti-
lize information (Honig & Coburn, 2008), how teachers develop professionally when discussing data and student outcomes (Horn & Little, 2010), and how
the artifacts (i.e., designed objects and policies) teachers and administrators
use when interacting with data can shape their behaviors and conclusions
(Spillane, 2012).
Unfortunately, in contrast to the robust body of work on data use in K-12 schools and districts, relatively little research exists about how people think about and use data in higher education. Andrews and Lemons (2015) stud-
ied how biology instructors made teaching-related decisions, finding that per-
sonal experience was utilized more than empirical evidence about teaching or
student learning. In a study of the use of data analytics by academic leaders, Foss
(2014) found that adoption is shaped by a combination of features including the
data system itself, the organizational context, and individual attributes of Deans,
chairs, and faculty. In particular, Foss (2014) found that for analytics-related
innovations to be embraced, data must be viewed as legitimate within the pro-
fession and discipline such that the data systems and their outputs achieve “cur-
rency” within local communities (p. 191). A similar line of inquiry examining
the adoption of LA tools found that organizational commitment, leadership
and policies all impact faculty decisions about whether or not to use analytics
as part of their teaching practices (Klein et al., in press). Finally, my colleagues
and I examined how a group of 59 faculty members used teaching-related data
in three large, public research universities (Hora et al., 2017). Results from this
study indicate that faculty drew upon a variety of data (e.g., numeric data, verbal
feedback, etc.) in ways that can be grouped into six distinct clusters of practice
that varied according to the degree of faculty involvement in creating contin-
uous improvement systems and the sophistication of the data system. In some
cases, faculty were neither involved nor utilized sophisticated data or analytic
methods, whereas in other cases faculty co-designed rigorous systems that drew
upon cutting-edge data and analytics. Consequently, given the importance of
understanding educators’ data practices and how local organizational conditions
and constraints influence them, empirical research on DDDM and analytics
requires a conceptual framework that allows researchers to document these be-
haviors and the specific contextual factors linked to them.
A Framework for Studying Data Use
in Complex Organizations
While many robust frameworks exist to study postsecondary organizations as complex systems, from Birnbaum's (1988) cybernetic systems theory and Clark's (1983) cultural systems approach to Lattuca and Stark's (2011) multidimensional framework for studying course planning, these models are not easily op-
erationalized for empirical research on faculty behaviors. This is due in part be-
cause these frameworks aim to model factors that influence educational practice
in abstract terms (e.g., leadership, culture, and technology) and in the aggregate
(e.g., in departments or even institutions), and not how individuals think, make
decisions, and behave in practice. As Stark (2000) noted after studying and
articulating a model about how course planning unfolded in higher education,
“Our work fell short of exploring in depth the actual decisions teachers make
about course plans and curriculum” (p. 435).
Fortunately, a robust model of data utilization has been developed that al-
lows for the empirical documentation of both data-related practices at fine-
grained levels and also how distinct contextual elements impact these behaviors
(Coburn & Turner, 2012). In creating this model, Coburn and Turner (2012)
specifically aimed to move beyond generating lists of contextual factors that
influence practice, or a static and coarsely grained model comprising various
boxes and arrows that hinted at causal relations among elements, but instead to
“specify the relationship between contextual conditions on the one hand and
the process of data use on the other” (p. 180). As previously noted, one of the
motivating ideas behind this model is the view that the unique characteristics
and constraints of human cognition should be at the core of analyses of DDDM,
especially how perception of pertinent stimuli (e.g., relevant data) is strongly
influenced by preexisting beliefs, experiences, and mental representations.
Additionally, the idea that perception as well as behavior is influenced by the
“raw material” of the socio-technical environment (Spillane, 2012, p. 8), such
that the context is not simply a passive backdrop to practice but an integral fea-
ture of human behavior, is another feature of this model. Finally, this approach
emphasizes that institutional contexts are not innocent or objective features
but are instead actively created by particular people and interests who have the
power to institute policies and develop organizational structures
(Little, 2011).
While Coburn and Turner’s (2012) framework is the most theoretically ro-
bust and readily operational for field research, two ideas are not integrated
into their model that are particularly important for empirical research on how
educators think about and use instructional data in real-world situations. First,
theories of organizations as socio-technical information processing systems
recast the enterprise of data use from being centered on the behaviors of au-
tonomous individuals using and interacting with information technologies and
data, to one of the institution itself as a social collective that produces and con-
structs knowledge (Pentland, 1995). This view argues that businesses, colleges, and other organizations are best seen not simply as structural contexts where behavior occurs but as social knowledge systems in which individuals and teams engage in processes of legitimizing and interpreting information as part of a
social and cultural process (Pentland, 1995). This perspective foregrounds an
essential process that is at the heart of DDDM and LA—the transformation of
data into information and actionable knowledge (Mandinach, 2012)—in a way
that moves beyond the fiction of an isolated actor engaged in sense-making
activities to implicate the entire institution as a socio-technical entity.
Second, research on organizational learning has demonstrated the impor-
tance of how “organizations encode, store, and retrieve the lessons of history
despite the turnover of personnel and the passage of time" (Levitt & March,
1988, p. 319), which is a more historic perspective on data and its impact on be-
havior than is commonly taken in research on DDDM and analytics. Research-
ers in this field have documented how important information is stored within
organizations in a variety of locations, and not simply in digital databases. These
repositories include data sources that are commonly associated with DDDM
such as hard-copy and digital databases, but they also include physical artifacts
(e.g., course syllabi), routinized practices that embody acceptable behaviors,
the organization’s structure (e.g., governance, hierarchy), and even individ-
uals’ memories—all of which Walsh and Ungson (1991) collectively call the
Exploring the Variety of Random
Documents with Different Content
Jaime Lester is Professor of Higher Education at George Mason University, USA.
Carrie Klein is a PhD Candidate, and Research and Teaching Assistant in the Higher Education Program at George Mason University, USA.
Aditya Johri is Associate Professor of Information Sciences and Technology at George Mason University, USA.
Huzefa Rangwala is Associate Professor of Computer Science at George Mason University, USA.
Learning Analytics in Higher Education
Current Innovations, Future Potential, and Practical Applications
Edited by Jaime Lester, Carrie Klein, Aditya Johri and Huzefa Rangwala

First published 2019 by Routledge
711 Third Avenue, New York, NY 10017
and by Routledge
2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2019 Taylor & Francis
The right of Jaime Lester, Carrie Klein, Aditya Johri, and Huzefa Rangwala to be identified as the authors of the editorial material, and of the authors for their individual chapters, has been asserted in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.
Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.
Library of Congress Cataloging-in-Publication Data
A catalog record for this title has been requested
ISBN: 978-1-138-30213-6 (hbk)
ISBN: 978-1-138-30217-4 (pbk)
ISBN: 978-0-203-73186-4 (ebk)
Typeset in Bembo by codeMantra
Contents
List of Tables vii
List of Figures viii
Preface ix
Acknowledgements xvi
1 Absorptive Capacity and Routines: Understanding Barriers to Learning Analytics Adoption in Higher Education 1
Aditya Johri
2 Analytics in the Field: Why Locally Grown Continuous Improvement Systems are Essential for Effective Data-Driven Decision-Making 20
Matthew T. Hora
3 Big Data, Small Data, and Data Shepherds 45
Jennifer DeBoer and Lori Breslow
4 Evaluating Scholarly Teaching: A Model and Call for an Evidence-Based Approach 69
Daniel L. Reinholz, Joel C. Corbo, Daniel J. Bernstein, and Noah D. Finkelstein
5 Discipline-Focused Learning Analytics Approaches with Users Instead of for Users 93
David B. Knight, Cory Brozina, Timothy J. Kinoshita, Brian J. Novoselich, Glenda D. Young, and Jacob R. Grohs
6 Student Consent in Learning Analytics: The Devil in the Details? 118
Paul Prinsloo and Sharon Slade
7 Using Learning Analytics to Improve Student Learning Outcomes Assessment: Benefits, Constraints, & Possibilities 140
Carrie Klein and Richard M. Hess
8 Data, Data Everywhere: Implications and Considerations 160
Matthew D. Pistilli
Editors 187
Contributors 188
Index 193
Tables
2.1 The six repositories where organizational information can be stored and accessed 29
4.1 Rubric of components of scholarly teaching 76
4.2 Summary of six perspectives of institutional change theories, from Kezar (2013) 82
4.3 A summary of the roles of three key layers for enacting scholarly and discipline-grounded teaching evaluations 83
5.1 Selected themes emerging from the focus group with students 96
5.2 Summary of themes related to data instructors would find useful 99
5.3 Change in how time is spent engaging with course content, reported in hours per week, between high-stakes tests 1 and 2 110
5.4 ID card use by student over the semester by college 113
6.1 Simple consent versus informed consent (Whitney et al., 2004, p. 55) 121
Figures
1.1 Theoretical framework (adapted and modified from Martin et al., 2003, itself adapted from Zahra & George, 2002, and others) 12
2.1 The processes of decision chain data use 32
3.1 Model for the integrated Multiple Perspective Insights framework 48
3.2 Reasons students cited for positive response to the CAF (graph courtesy of Dr. Saif Rayyan and The MIT Faculty Newsletter) 56
3.3 Our integrated set of methods/methodologies, aligned with data sources and resultant findings (modified from Chen, 2015) 58
3.4 Spearman correlations between CAF behaviors and student outcomes (adapted from Chen, 2015) 62
3.5 Comparison of the centroids of the two student clusters for their behaviors on online homework problems 63
5.1 Overall unique sessions by week grouped by final course grade 106
5.2 LMS usage by day of the week grouped by final course grade 107
5.3 Performance pathways in statics based on four distinct tests 109
5.4 Average final daily ID usage for a high-variation and a low-variation student (one semester broken into quarters) 112
5.5 Average student ID usage by day of the semester 112
5.6 Daily ID usage over the semester by GPA quintile (5th is highest) 114
6.1 A conceptual overview of the generative mechanisms for considering consent 131
6.2 Typology for consent in learning analytics 134
8.1 The Potter Box (Potter, 1965) 177
Preface
Jaime Lester

Introduction

In 2013, the same amount of data were generated in ten minutes as was generated previously in all of recorded history (Zwitter, 2014). In the last decade, learning analytics has evolved in education alongside the Big Data revolution. The ability to mine and analyze large amounts of institutional data is useful for higher education institutions, which are facing increasing environmental pressures to provide evidence of learning, institutional accountability, and increased retention and completion rates (Norris & Baer, 2013). The existence of these data has made their use integral to the data-driven management of higher education institutional goals and practices (Slade & Prinsloo, 2013).

Due to their volume, velocity, and variety, learning analytics have the potential to bring clarity from complexity, allowing organizations to better understand trends and correlations in data (Macfadyen & Dawson, 2012; Norris & Baer, 2013, p. 13). These insights can be used to improve pedagogy, course design, student retention, and decision-making by providing personalized feedback for users. Within this context, learning and advising management systems, based on learning analytics, are being developed to better measure, analyze, report, and predict data related to student learning, retention, and completion. These learning analytics-informed systems have the potential to generate new insight into courses and student learning by creating responsive feedback mechanisms that can shape data-informed decision-making as it relates to teaching, learning, and advising.

Given the potential and increasing presence of learning analytics in higher education, it is important to understand how learning analytics is defined, what barriers and opportunities exist, and how it can be used to improve organizational and individual practices, including strategic planning, course development, teaching pedagogy, student assessment, and ethical use. This edited book is designed to give readers a practical and theoretical foundation in learning analytics in higher education, including an understanding of the challenges and incentives that are present in the institution, in the individual, and in the technologies themselves. The authors of this book explore the current use and future potential of learning analytics for student learning and data-driven decision-making, ways to effectively evaluate and research learning analytics, integration of learning analytics into practice, organizational barriers and opportunities for harnessing Big Data to create and support use of these tools, and ethical considerations related to privacy and consent.

Among the questions that are explored and answered are (1) What are the foundational assumptions and functions of learning analytics algorithms? (2) What effects do learning analytics technologies have on student learning, pedagogical development, and assessment of teaching and learning? (3) What role do institutional context, technological capacity, and individual beliefs play in promoting or constraining adoption and integration of learning analytics technologies in higher education? (4) What are the ethical considerations related to use of learning analytics or other predictive data and associated interventions? and (5) What are the practical implications and future research recommendations associated with learning analytics?

Defining Learning Analytics

Learning analytics has arguably grown out of the field of educational data mining and the explosion of Big Data that has occurred during the past decade. Educational data mining is the development and use of research methods to leverage large-scale or ‘big’ data from educational settings to better understand student learning and contexts (Siemens & Baker, 2012). While there is no uniformly accepted definition, learning analytics is generally understood to be the “measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (Siemens, 2013, p. 3). Learning analytics is a form of educational data mining that specifically uses predictive analysis on Big Data with the intention of creating platforms for intervention. One of the first examples of a learning analytics platform was the Purdue Course Signals Project, which developed a traffic light visualization to represent student performance in higher education courses (Arnold & Pistilli, 2012). The signals were available to students to help them better assess their learning and to faculty to utilize new communication tools to support student success. Behind the Signals tool were some of the first learning analytics algorithms and new forms of data visualization. Importantly, learning analytics is concerned with understanding and inferring certain key characteristics about student learning, not a more generic use of predictive analytics (e.g., institutional or academic analytics) for business use (e.g., predicting student enrollment trends).
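To make the idea of a signal-style platform more concrete, the short sketch below shows, in broad strokes, how a predicted risk score might be mapped to a traffic-light signal and a suggested outreach action. This is a minimal illustration, not the actual Course Signals algorithm: the thresholds, field names, and suggested actions are hypothetical.

```python
# Minimal sketch of a traffic-light style learning analytics signal.
# Assumes a risk score in [0, 1] has already been produced by some predictive
# model; the thresholds and recommended actions below are illustrative only.

from dataclasses import dataclass

@dataclass
class StudentSnapshot:
    student_id: str
    risk_score: float  # 0 = low risk of poor performance, 1 = high risk

def signal_for(snapshot):
    """Map a predicted risk score to a signal color and a suggested action."""
    if snapshot.risk_score < 0.33:
        return "green", "No action needed; reinforce current habits."
    if snapshot.risk_score < 0.66:
        return "yellow", "Send a reminder about upcoming deadlines and office hours."
    return "red", "Prompt instructor outreach and refer to advising or tutoring."

if __name__ == "__main__":
    for s in [StudentSnapshot("A123", 0.12), StudentSnapshot("B456", 0.81)]:
        color, action = signal_for(s)
        print(s.student_id, color, "-", action)
```

The point of such a design is the feedback loop—students and instructors both see the signal early enough to act—rather than the sophistication of the underlying model.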
Learning analytics is an emerging field, and the few studies currently published have largely focused on the user side, with emphasis on specific analytics tools, such as Purdue Course Signals, or data visualization (Ali, Hatala, Gašević, & Jovanović, 2012; Arnold & Pistilli, 2012; Duval, 2011; Jayaprakash, Moody, Lauría, Regan, & Baron, 2014; Kosba, Dimitrova, & Boyle, 2005; Lockyer, Heathcote, & Dawson, 2013; Park & Jo, 2015; Santos, Verbert, Govaerts, & Duval, 2013; Verbert, Duval, Klerkx, Govaerts, & Santos, 2013). Many of these studies conclude that learning analytics is useful when tracking student information (i.e., grades) and creating communication with peers and instructors; however, poorly designed data visualizations and communications can also inhibit use. An even smaller number of studies have focused on the impact of organizational dynamics on learning analytics adoption and use. Hora, Bouwma-Gearhart, and Park (2017), for example, found that a lack of time, incentives, and training for pedagogy negatively impacts the ability of instructors to accurately and efficiently use learning analytics tools. Further, Klein, Lester, Rangwala, and Johri (in press) found that misalignment between technological components and abilities and user needs and practices inhibits adoption of these tools by individuals, recommending that users be included in the design, purchase, and implementation of these tools. In response to the impact of organizational barriers and incentives, Arnold, Lonn, and Pistilli (2014) developed The Learning Analytics Readiness Instrument, building upon similar work by Norris and Baer (2013), to assist institutions in evaluating their capacity to integrate learning analytics into their institutional processes and cultures. This book helps to provide a multidisciplinary approach to the literature on learning analytics and provides multiple new research avenues to further knowledge.

Audience and Need

This book is intended for anyone who works in higher education and uses learning and/or advising management systems, especially those based on learning analytics algorithms. Information in the book will be relevant for faculty, advisors, and administrators who are interested in the potential and challenges related to implementation, adoption, and integration of these systems on their campuses and within their classrooms and advising sessions. Researchers in higher education will also be interested in the interdisciplinary and multi-method discussion of analyzing the impact of learning analytics on student success and organizational decision-making.

One of the main audiences for this book is information technology staff, instructional designers, and institutional research offices that are regularly interfacing with companies and organizations that develop learning analytics tools for higher education. Companies such as Blackboard™, D2L, EAB, Ellucian, Moodle, and Salesforce, in partnership with universities, have developed learning analytics tools to assist in tracking student success. The rise of the power of new algorithms has created a marketplace with little empirical guidance for higher education administrators who are selecting among a variety of new, and often expensive, tools. This book provides a better understanding of the capabilities of learning analytics tools, how to critically consider their use and methodologies, how to develop tools that are useful to users, how to integrate these tools into practice, and how to use these tools and their data in organizational decision-making. This book serves as a valuable resource for all higher education administrators who are evaluating or adopting learning analytics tools.

Other main audiences for this book are higher education faculty and advisors, who are seeking to integrate learning analytics tools into their practice. Studies of learning management systems often note that these systems are not used to their full potential (Bichsel, 2012; Dahlstrom, Brooks, & Bichsel, 2014), and a more recent study (Klein, Lester, Rangwala, & Johri, in press) identifies the challenges in adoption of learning analytics tools. Simply, more intentionality in the form of professional development and consistent institutional decision-making is needed to support integration of learning analytics into practice. This book provides information on the relationship between institutional decision-making and supporting widespread adoption of learning analytics, the need for faculty professional development tied to the values that undergird teaching and advising practice, and the importance of including users in the process of learning analytics development and implementation. Organizations that work on faculty development and those focused on student learning and assessment will be interested in the contents of this book.

Finally, this book appeals to higher education, learning analytics, and other education scholars who are working on the myriad questions related to learning analytics. Across multiple disciplines—engineering education, computer science, communication, and psychology—scholars are exploring the complexity of learning analytics in multiple education sectors. This book provides innovative approaches to learning analytics research, including those using multiple forms of data- and user-informed methods. The value of this book is that it brings together scholars from these disciplines to explore the complex nature of learning analytics creation, adoption, and impact on student success within the unique context of higher education.

Overview of the Chapters

The first chapter in the book, written by Aditya Johri, explores the positive potential of learning analytics for higher education practice and argues that learning analytics has failed to gain widespread adoption in higher education, especially in comparison to corporate settings. He also shows, through a series of case studies, how successful implementation of learning analytics initiatives is often hampered by the capacity and routines of both organizations and their members. Drawing on the organizational studies concepts of absorptive capacity and routines, Johri outlines a new model of how colleges and universities can more effectively adopt learning analytics by addressing issues of capacity and routine during design and implementation.

Matthew Hora draws from a case study of a California research university to explore the intersection of data-driven decision-making and learning analytics in Chapter 2. Using an organizational context framework, Hora argues that educators draw upon a variety of numeric data and other information (e.g., student feedback, conversations with colleagues), operate within institutional contexts that are poorly designed to facilitate continuous improvement, create novel and often low-tech solutions to the lack of quality data, engage colleagues and students in their use and analysis of instructional data, and respond to external mandates for data use in very different ways. His chapter underscores the complexity of data-driven decision-making and the interplay of organizational capacity, individual routines and practices, and technological alignment that exists in data-rich environments.

The next chapter, authored by Jennifer DeBoer and Lori Breslow, begins the discussion of using learning analytics to examine student learning in the classroom. DeBoer and Breslow propose a sophisticated methodology, with a mixed-method and multi-stage approach that leverages small data alongside the emergence of new Big Data used in learning analytics. This chapter also argues for the efficacy of multidisciplinary research teams that combine the expertise of education researchers and faculty in other disciplines (i.e., engineering and science). With data on student learning often coming in multiple forms, such as student surveys, exams, and transcript data, their model is a guide for who to engage and the steps needed to conduct a thorough analysis.

Chapter 4, by Reinholz and colleagues, looks at the other side of the classroom, the evaluation of instructors, and how to utilize learning analytics to create more accurate teaching evaluations for promotion and tenure processes. Using a framework that defines teaching as a scholarly activity analogous to research, the authors outline how multiple forms of data, from students, faculty peers, and reflections from the faculty members themselves, can be integrated into online learning analytics programs to more accurately and effectively evaluate faculty teaching. The chapter concludes with a clear strategy for implementation, a challenge that is outlined by Johri in the first chapter of this book.

Chapter 5, by Knight et al., from multiple higher education institutions, describes a mixed-method approach to learning analytics data analysis. The authors suggest that incorporation of qualitative methods into learning analytics studies (which are often quantitative in nature) allows for a clearer understanding of student success. Largely in agreement with the central argument of Chapter 3 by DeBoer and Breslow, Knight et al. describe a process of engaging users, in this case, undergraduate student researchers, to construct and make meaning of data, or potential data, produced via learning analytics methods. The chapter continues by showing results from learning analytics analysis and the limitations of explaining those results without the engagement of users like undergraduate students. This chapter, combined with that of DeBoer and Breslow, creates a compelling argument for more mixed-method and user-engaged models for learning analytics in the higher education setting.

Prinsloo and Slade, in Chapter 6, outline a major concern in the collection and analysis of Big Data, including learning analytics, in higher education—student consent. The chapter begins with a broad overview of consent from the medical research tradition and turns to consent in the digital environment. Prinsloo and Slade effectively argue that the digital environment provides unique complexities to consent in the form of data property, intent of data use, control over data, and privacy. As learning analytics is concerned with the analysis of existing Big Data often collected for other purposes, these arguments are directly relevant. The chapter concludes with broader ethical considerations and recommendations for consent.

In Chapter 7, Klein and Hess provide an overview of how learning analytics data can inform student learning outcomes assessment efforts in higher education. The chapter begins with an overview of traditional assessment measures and then explores how the timely, visualized, personalized, and predictive nature of learning analytics data can enhance those efforts. Using examples from tools and approaches researched in extant theoretical and empirical studies, they show that use of learning analytics in assessment provides dynamic formative feedback to users, allowing them to make more timely, informed decisions during the learning process. They also highlight the need for learning analytics-enhanced assessment to be inclusive of informed and empowered data users (per the arguments by Johri, DeBoer and Breslow, and Knight et al., in this book), be built on trusted foundations, and be cognizant of the specific implications of using learning analytics data in practice. The chapter concludes with recommendations for implementation of learning analytics-enhanced assessment initiatives.

The last chapter of the book is by Matthew Pistilli, an individual who was integral to the Purdue Signals Project. His chapter outlines the broad themes across the other chapters and presents future implications and practical considerations for learning analytics in higher education. Organized around four major questions, Pistilli argues that learning analytics is a growing and dynamic field that requires careful and thoughtful implementation. For example, Pistilli specifically outlines the complexity of student data and how the diversity of students in higher education today leads to a lack of uniformity in data. He concludes and provides a response to a central question: How should—not can—data be used, and to what ends?

References

Ali, L., Hatala, M., Gašević, D., & Jovanović, J. (2012). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education, 58, 470–489.
Arnold, K. E., Lonn, S., & Pistilli, M. D. (2014, March). An exercise in institutional reflection: The learning analytics readiness instrument (LARI). In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 163–167). New York, NY: ACM.
Arnold, K. E., & Pistilli, M. D. (2012, April). Course signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 267–270). New York, NY: ACM.
Bichsel, J. (2012, August). Analytics in higher education: Benefits, barriers, progress, and recommendations (research report). Louisville, CO: EDUCAUSE Center for Applied Research. Retrieved from http://net.EDUCAUSE.edu/ir/library/pdf/ERS1207/ers1207.pdf.
Dahlstrom, E., Brooks, D. C., & Bichsel, J. (2014). The current ecosystem of learning management systems in higher education: Student, faculty, and IT perspectives (research report). Louisville, CO: EDUCAUSE, September 2014. Available from www.educause.edu/ecar.
Duval, E. (2011, February). Attention please!: Learning analytics for visualization and recommendation. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 9–17). New York, NY: ACM.
Hora, M. T., Bouwma-Gearhart, J., & Park, H. J. (2017). Data driven decision-making in the era of accountability: Fostering faculty data cultures for learning. The Review of Higher Education, 40(3), 391–426.
Jayaprakash, S. M., Moody, E. W., Lauría, E. J., Regan, J. R., & Baron, J. D. (2014). Early alert of academically at-risk students: An open source analytics initiative. Journal of Learning Analytics, 1(1), 6–47.
Klein, C., Lester, J., Rangwala, H., & Johri, A. (in press). Learning analytics tools in higher education: Adoption at the intersection of institutional commitment and individual action. The Review of Higher Education.
Kosba, E., Dimitrova, V., & Boyle, R. (2005, July). Using student and group models to support teachers in web-based distance education. In International Conference on User Modeling (pp. 124–133). Berlin, Heidelberg: Springer.
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459.
Norris, D. M., & Baer, L. L. (2013). Building organizational capacity for analytics. EDUCAUSE Learning Initiative, 7–56.
Park, Y., & Jo, I. H. (2015). Development of the learning analytics dashboard to support students’ learning performance. Journal of Universal Computer Science, 21(1), 110–133.
Santos, J. L., Verbert, K., Govaerts, S., & Duval, E. (2013, April). Addressing learner issues with StepUp!: An evaluation. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 14–22). Leuven: ACM.
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400.
Siemens, G., & Baker, R. S. (2012, April). Learning analytics and educational data mining: Towards communication and collaboration. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 252–254). ACM.
Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning analytics dashboard applications. American Behavioral Scientist, 57(10), 1500–1509.
Acknowledgements

This book was supported in part by a grant from the National Science Foundation under grant IIS-1447489.
1 Absorptive Capacity and Routines
Understanding Barriers to Learning Analytics Adoption in Higher Education
Aditya Johri

Introduction

As I sit here writing this near the start of a new semester, it is hard to imagine that it is almost a decade since the term ‘Big Data’, in its current incarnation, “the mining and processing of petabytes’ worth of information to gain insights into customer behavior, supply chain efficiency and many other aspects of business performance” (Pearson & Wegener, 2013, p. 1), was first introduced in the mainstream media by The Economist (2010). Since then, the notion of data analytics has infused almost all thinking about how organizations go about their business; and data-driven organizations and organizations driving data-driven practices, including infrastructures such as cloud computing, have become the jewels of the business world (e.g., Amazon™, Google™, etc.). It was reported in a recent study of more than 400 large companies conducted by Bain & Company that early adopters of Big Data analytics had a significant lead over the rest of the corporate world (Pearson & Wegener, 2013). The companies that had adopted Big Data analytics, according to this report, were (1) twice as likely to be in the top quartile of financial performance within their industries, (2) five times as likely to make decisions faster than market peers, (3) three times as likely to execute decisions as intended, and (4) twice as likely to use data frequently when making decisions (Pearson & Wegener, 2013). Given reports like this, it is not surprising that many organizations, spanning various industries, are looking toward data analytics as a way to propel themselves forward.

Higher education institutions are also cognizant of the potential value of analytics to improve organizational performance. As a result, at least two leading ideas and communities—educational data mining (EDM) and Learning Analytics (LA)—have emerged on the scene (Lester, Klein, Rangwala, & Johri, 2017). EDM, which has a more computational stance, is concerned largely with developing, researching, and applying computerized methods to detect patterns in large collections of educational data that would otherwise be hard or impossible to analyze due to the enormous volume of data within which they exist. EDM, as the name implies, is defined as “the application of data mining (DM) techniques to this specific type of dataset that come from educational environments to address important educational questions” (Romero & Ventura, 2013, p. 12). Overall, EDM researchers and practitioners analyze data generated by any type of information system that supports learning or education, defined broadly—schools, colleges, or universities. These data are broad and include interactions of individual students within an educational system (e.g., navigation behavior, input in quizzes, and interactive exercises) but also administrative data (e.g., school, school district, teacher), demographic data (e.g., gender, age), and so forth. The other allied field, LA, is concerned more with learners directly and includes as its purview “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (Siemens et al., 2011, p. 4). Whereas LA is largely concerned with improving learner success (Gašević, Dawson, & Siemens, 2015), the practitioners of LA differentiate academic analytics as “the improvement of organizational processes, workflows, resource allocation, and institutional measurement through the use of learner, academic, and institutional data. Academic analytics, akin to business analytics, are concerned with improving organizational effectiveness” (Siemens et al., 2011, p. 4). For the purposes of this chapter, I am going to use LA as a catchall for all the data and analysis techniques mentioned above.

We hear continuously about how LA has the potential to change higher education and how these data are making inroads, but the reality at the level of everyday work practices is different. This is not to say that higher education organizations are not leveraging data and analytics (Arroway, Morgan, O’Keefe, & Yanosky, 2016), but the adoption lags behind what organizations in other sectors are doing or moving toward. According to Bichsel (2012), in 2012, between 55% and 65% of institutions reported engaging in data activity at the level of finance and human resources, but less than 20% of institutions reported data analytics activity in the functional areas of instructional management, centralized information technology (IT) infrastructure, student learning, strategic planning, alumni and advancement, research administration, library, cost to complete a degree, human resources, facilities, faculty promotion and tenure, faculty teaching performance, procurement, and faculty research performance. A lot of the fervor is still about the potential and not necessarily the actual implementation of LA. For instance, the most innovative work in terms of using machine learning and DM is limited largely to researchers in the educational research community (SOLAR, EDM, etc.). Companies that are providing products are, for the most part, using rudimentary techniques for data analysis and presentation—bar graphs, pie charts, and large Excel™ files.
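As a rough illustration of the kind of pattern detection EDM work involves, the sketch below clusters students by their activity in a learning management system. The per-student counts, column meanings, and choice of k-means are assumptions made for the example, not a technique prescribed in this chapter.

```python
# Illustrative sketch of an EDM-style analysis: clustering students by their
# LMS activity. The data and column meanings are hypothetical; k-means is
# just one of many techniques that could be applied.

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-student counts: [logins, quiz attempts, forum posts]
activity = np.array([
    [42, 10, 7],
    [40, 12, 5],
    [6, 1, 0],
    [8, 2, 1],
    [38, 9, 6],
    [5, 0, 0],
])
student_ids = ["s01", "s02", "s03", "s04", "s05", "s06"]

# Group students into two behavioral clusters (e.g., high vs. low engagement).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(activity)

for sid, label in zip(student_ids, labels):
    print(sid, "-> cluster", label)
```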
This is complicated by data access issues, including data ethics and privacy, and as many authors in this volume point out, that is not necessarily a bad thing. Yet, a lack of clarity around data use has limited the development of innovative applications. There is also limited understanding of the impact LA can have beyond the immediate concerns that most higher education institutions, especially publicly funded ones, are facing, such as student retention. Tuition fees are becoming an ever larger portion of the budget as public funding is declining. The other funding option is externally funded grants, which essentially means overhead, and, therefore, this is another area in which analytical efforts are targeted. Finally, LA is also prevalent in reporting, as accreditation concerns overwhelm institutions, often at the expense of institutional effectiveness. At the infrastructure level—data warehousing, for instance—there has been significant uptake of IT in most higher education institutions, and, therefore, there is a reasonable expectation that slowly LA will percolate to other aspects of higher education institutions.

In this chapter, my goal is to shed light on what prevents a greater adoption of LA within higher education. As opposed to other chapters in the volume, I first use a personal perspective on LA, based on my experiences as an instructor, an administrator, and a researcher, to shed light on what I believe are some essential issues that need to be addressed. After that, I look at some recent reports that shed light on what organizations—largely business enterprises—have to do in order to leverage data analytics successfully. I then move toward some theoretical explanations for the relative lack of data analytics application in higher education organizations and use these theoretical underpinnings to examine three case studies from my own experience working on an LA tool research project—cases that are likely familiar to readers from their own experience. Finally, I end with practical considerations for overcoming barriers to the use of LA in higher education.

I have to start with a caveat—my personal characteristics and experiences shape my experience of using LA. I am a technology adopter and so are most of the people who worked on the project I refer to in this chapter. I sit in an engineering school and teach analytics. I have a research interest in this area. I know people I can reach out to when I need help with technology, and I know the resources to refer to when I hit a wall. This is probably not the case for most people on campus. Just as an example, there is a wide variation in just using a learning management system (LMS) across the institution. Some of this has to do with a lack of technological expertise, but a lot of it has to do with a lack of understanding of what an LMS can and does add to teaching. In many cases, it actually does little beyond acting as a repository of resources. Even in terms of using email, which is a standard practice now, there is a diversity of tools people use (a large part of the user base actually using Gmail™ to access the university mail services). Therefore, there will always be a vast variation in any kind of technology use on a campus, and this, itself, I will argue, is problematic.
Missed Opportunities for Learning Analytics

I start by reflecting on my work this week—a busy week, as the new semester starts shortly. Here are the instances of data or analytics use I can come up with in my work. I looked at some data about my research expenditures—Excel™ charts sent to me to make sure expenditures were progressing as planned and charges were correct. I looked at them, saw some numbers in red, and went about taking action by emailing a few folks to ensure things were corrected. I had six such reports to go through, and most of my action was taken when things were way off or when things were off but I thought it was a temporary issue. The other data I looked at were the number of students enrolled in my classes, to make sure that one of the classes had the minimum number of enrollments. Otherwise, I would have done some more advertising and publicity for the class. Then, I went into our LMS, Blackboard™, to set up course pages for the upcoming semester. I copied some stuff, I updated some other stuff, and I tweaked some settings. Now, my hope is that things will work smoothly when the semester starts. There doesn’t seem to be a lot of analytics going on here and minimal use of data. It is clear that as a faculty member, the use of data and analytics is not a part of my everyday practices or of what some organizational theorists will call my ‘routines’. The idea of routines or practices that have become a norm and are embedded across the organization is a critical one for my argument, and I will address it in detail later.

Are there instances, though, within my work practices where data and analytics would have been important for me or would have helped me in some ways? Can it be integrated into my routines—actions I have to take habitually? Certainly. For instance, I would much rather be able to do a dynamic review of my grant expenditures. Grant funding comes with a time stamp—it has to be spent within a specific amount of time, and the funds can be used only for certain activities and items as specified in the proposal. Yet, there is some leeway wherein spending can be different than what is exactly proposed. This means that it is important to monitor and make adjustments as the grant period progresses. The systems in place to monitor spending—many driven by federal regulations—make the monitoring problematic. There is always a delay between spending and when it is posted against the grant account, for instance. There is also a lag, as finding a student with the right expertise can take time. All this makes running the grant very fluid, and this problem is compounded with each additional grant. Hence, some way of continuous monitoring is essential. And yes, I know there are systems that allow me to do that to some extent, and these vary by the institution, but they are cumbersome to use and no single system provides me with all the information I need to take action. At the end of the day, it will take an email or a face-to-face visit with a fiscal person or a post-award administrator to resolve the issue, and often there is no single point of contact. The grants office is responsible for certain issues, while the home department or college is responsible for others.
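As a rough illustration of the kind of continuous monitoring described above, the snippet below compares actual spending against a straight-line spending plan and flags categories that are drifting. The figures, category names, tolerance, and linear burn-rate assumption are all hypothetical, not a description of any actual grants system.

```python
# Hedged sketch: flag grant budget categories whose spending is off-plan.
# Assumes spending accrues roughly linearly over the award period; the
# categories, amounts, and 10% tolerance below are illustrative only.

def flag_off_plan(budget, spent, fraction_of_period_elapsed, tolerance=0.10):
    """Return {category: deviation} for categories outside the tolerance band."""
    flags = {}
    for category, total in budget.items():
        expected = total * fraction_of_period_elapsed
        deviation = (spent.get(category, 0.0) - expected) / total
        if abs(deviation) > tolerance:
            flags[category] = round(deviation, 2)
    return flags

budget = {"student wages": 60000, "travel": 8000, "materials": 5000}
spent = {"student wages": 12000, "travel": 5500, "materials": 2400}

# Halfway through the award period.
print(flag_off_plan(budget, spent, fraction_of_period_elapsed=0.5))
# {'student wages': -0.3, 'travel': 0.19}
```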
  • 26.
    Absorptive Capacity andRoutines 5 When it comes to teaching, in addition to the number of students, it would be great if I could get some information about the students. Now, there is a way in which I can log in to another system, dig through a few screens, and get their photos and degree information, but, once again, it is cumbersome. What I really want to know is their prior knowledge, their achievements, and their interests. In my role as a teacher, my primary responsibility lies in ensuring students learn. A lack of knowledge of what students actually know, except for a few broad markers, is a real barrier to how I go about my work. Not that I will be able to take care of all the variance in prior knowledge, but at least I will have some idea of where the students are coming from. In an ideal world, their years of schooling and acceptance into a program should convey some of that, but the reality is quite different. Finally, the LMS is a black box, where I put in content and effort but noth- ing much comes out. In one of my classes, which I teach online, everything is managed through our LMS, Blackboard™—the course readings and videos, the quizzes, the discussions, the reflection assignments submitted by students; all of this is online. Yet, I have very limited knowledge of what is actually going on in the class until I see a submission from a student and need to grade that. There is no overview dashboard that tells me who has looked at the content, who is on track to meet the deadline, how long is it taking students to read the content, and so on. The end-of-semester evaluations are not mandatory, and, therefore, it is hard to interpret and use that data to revise the course (al- though it is still used to evaluate faculty as that is the only data point available). The data go somewhere, somebody benefits—makes a lot of money off my effort—but nothing feeds back to my work. And yes, there are ways to better monitor the use of the system—you can turn on the option to store the number of views, for instance—but most of that information is summative. Formative analytics is unfortunately missing. This is the crux of the issue—none of my work practices incentivize me to put effort into utilizing analytics more effectively. I still try my best to incor- porate these data, because I want to work more efficiently. From research prac- tices, to teaching practices, to advising, nothing is built to draw on or benefit from data and analytics, and, hence, nothing does. The incentive for grants is real—Who wants to go over budget? So, I pay some attention. And even if I want to change the way I teach or run grants using more data and analytics, it is hard to do unless the infrastructure is in place. To some extent, these is- sues are personal, and organizational members need to be invested and willing to make changes, but without the infrastructure in place, these jobs becomes much harder. Why, after all this effort, is this still the case? I try to address this issue in this chapter. I’m not technology averse or analytic averse; I even have multiple grants on this topic and write about it. So, why hasn’t it made it into my practice? Is this lack of integration of LA into my practice a problem, and, if so, what is the solution? Here are some practical ideas, and later, I will discuss why these often fail to make their way into higher education.
  • 27.
    6 Aditya Johri PracticalIdeas for Success with Learning Analytics Let’s take a normative look at what needs to be done if one wants to leverage analytics in a meaningful manner. Nothing is more normative than prescrip- tions by professional consulting firms such as McKinsey, so I am drawing on multiple reports and papers from them including the following: Arellano, DiL- eonardo, and Felix (2017), Brown, Kanagasabai, Pant, and Pinto (2017), Chui, Henke, and London (2017), Kirkland and Wagner (2017). Data Capture and Availability One of the first issues that needs to be addressed for any form of analytics to be performed is the capture of data. Without data—useful data—there is no scope for any analysis to be performed. The proliferation of digitization across organizations means that it is possible to capture a wide variety of data and also to acquire large volumes of it. For instance, a retailer now has access not only to sales data and customer information through their credit cards but also their online customer profiles and even log data for every action that they perform on the retailer’s website. In higher education organizations, similarly, there is the opportunity to capture a variety of data about students such as their incom- ing Grade Point Average (GPA), high school performance, their interaction with an LMS, and even their swipe access data using their student identity card. Of course, these increased data bring with them numerous challenges for capture, storage, and analysis, especially in regard to whether useful data are being captured. For instance, if we take the mission of a higher education insti- tution to be improving student learning, we need to then think about whether the data that are captured and can be analyzed assist us with this mission. As of now, we have very little that data speak to learning that helps us understand students’ cognitive process or misconceptions. We have grades and GPA infor- mation, which is more a signal of achievement rather than cognition. At best, it is an indirect marker of knowledge. To leverage useful LA data that assess learning requires the collection of disparate data, sources need to be monitored and stored—from student admission and enrollment data to their activities on the LMS. Modeling and Analysis The second important step in the analytic process is the availability and use of different mathematical models that can take useful data and turn them into something actionable—and provide insights that allows us to better under- stand an issue. There are dozens, if not more, models or techniques available for analyzing data, including those that draw on traditional statistics and so- cial science such as statistical models, visualization, social network analysis,
  • 28.
    Absorptive Capacity andRoutines 7 sentiment analysis, influence analytics, discourse analysis, concept analysis, and sense-making models, and those that draw on computational DM such as clas- sification, clustering, Bayesian modeling, relationship mining, and discovery with models (Romero Ventura, 2013). These techniques have been used for predicting student performance, providing feedback for supporting instruc- tors, recommending problems or contents to students, creating alerts for stake- holders such as students to complete a task, domain modeling to describe the domain of instruction in terms of concepts, and for planning and scheduling fu- ture courses. These applications though have largely used existing techniques, and very little development has taken place of techniques that are unique to LA or modified for LA. Therefore, it is an open question as to the value these techniques add and also an open area of research to develop techniques that are driven by LA requirements. Embedded Analytics within Organization for Action Once data and techniques are available, the real challenge of LA, which is to add value to the organization, begins. There are two steps in this process: (1) embedding of LA across the organization and (2) creation of practices that leverage LA capabilities. In order to add value to the organization, LA tools and their data have to be embedded across existing practices, or new practices have to be created across the institution (Pistilli, Willis, Campbell, 2014). In other words, LA has to scale across all people that can actually use the data and techniques to make a difference. This is one of the most prominent barri- ers to use of LA in higher education. In order to make informed decisions, an organization needs human resources with expertise, users have to be trained to use data-driven practices, and user functions have to be aligned with tech- nological capabilities (Bean and Kiron, 2013). It’s a tall order, and one way to think more about it is in terms of an organization’s absorptive capacity and of instantiation, or some would say reification, of that capacity in everyday routines. Absorptive Capacity and Routines One lens to examine the diffusion of LA among faculty members is absorptive capacity, an organizational theory that was introduced by Cohen and Levinthal (1990). Developed to describe the behavior of a firm, this perspective posits that the firm’s absorptive capacity—or an organization’s ability to recognize, assimilate, and apply new information in innovative ways—greatly relies upon the firm’s prior related knowledge (Lane, Koka Pathah, 2006). The authors argue that that prior learning shapes subsequent learning (Cohen Levin- thal, 1990). In addition, cumulative experience in utilizing new and external knowledge increases an individual’s absorptive capacity. Furthermore, Cohen
  • 29.
    8 Aditya Johri andLevinthal (1990) note that an organization’s absorptive capacity depends on the absorptive capacities of key ‘gatekeepers’ within the organization who interface with the external environment and can translate new information to be useable within the organization. If new ideas—such as LA—are too distant from an organization’s existing knowledge base and practices, Cohen and Levinthal’s (1990) theory would predict that it would be difficult for those ideas or innovations to gain traction, diffuse, and become sustainable. Within higher education, absorptive capacity has largely been used to ex- plain diffusion (or nondiffusion) of innovations in the context of university- industry relationships (e.g., Azagra-Caro, Archontakis, Gutiérrez-Gracia, Fernández-de-Lucio, 2006), but in recent years, scholars have also started to use the concept to examine organizational dynamics. For instance, Da Silva and Davis (2011) use absorptive capacity to explain faculty members’ research scholarship and proposed that individual characteristics, such as task motiva- tion and creativity, relate to a faculty member’s ability to generate creative, new research. They also posited that a faculty member will leverage external sources of research-related knowledge to produce creative research above and beyond those individual characteristics if they have prior relevant knowledge to do so or have extrinsic motivation via institutional policies expecting high research productivity (Da Silva Davis, 2011). Furthermore, faculty who perceive support for creative research from supervisors, colleagues, and non- work sources (i.e., family and friends) also would theoretically have a stronger relationship between their creative performance and innovative performance. In other words, perceptions of support impact faculty members’ potential ab- sorptive capacity—or the ability to identify, acquire, and assimilate external knowledge—and their realized absorptive capacity—the ability to exploit and implement that knowledge (Zahra George, 2002). Therefore, the absorp- tive capacity framework helped identify both individual- and organizational- level factors that influenced faculty members’ research innovation. In a similar vein, the uptake of LA within higher education can be seen as the ability of indi- vidual users—faculty, staff, or students—and of the organization—assessment office—to be able to acquire and assimilate external knowledge of LA and then exploit it for their purposes. This could mean that faculty develop pro- ficiency with using LMS and are able to provide students a better learning experience. It can mean formative and continuous assessment of programs through data comes from multiple sources such as student performance, LMS, course assignments, etc. In some ways, this seems simple enough, so what is the barrier? It is routines or the habitual and procedural use of these capabil- ities across the institution. The concept of routines, which can take the form of standard operating pro- grams, procedures, norms, habits, and so on, has been advanced across a range of organizational theories (Cyert March, 1963; March Simon, 1958). In
  • 30.
    Absorptive Capacity andRoutines 9 simple terms, routines consist of rules, heuristics, and norms at different levels of organization activities (Pentland Rueter, 1994). A critical element of rou- tines, as opposed to other practices, is that routines are practices that become a standard (Pentland Feldman, 2005; Pentland Rueter, 1994). A distinction is made by Lewin, Massini, and Peeters (2011) between meta-routines that are higher level routines and are associated with a bundle of specific lower level routines that can be seen as practices routines (standard operating procedures) that express a higher level meta-routine. From the perspective of this chapter, the important aspect of routines is that they can be considered to constitute the building blocks of organizational capabilities. If something has to be made a part of how things work, they have to become routinized. According to Lewin et al. (2011) who advance a routine-based theory of ab- sorptive capacity, the overall effectiveness of absorptive capacity is determined by the extent to which organizations develop processes that address routines, both at the organizational level and the individual practiced routines. Absorp- tive capacity and routines interact as absorptive capacity can enable or restrict change in routines by moderating exploration. One common way in which this can happen is that an organization can create routines that actually bring new ideas into the organization. Yet, the success of this routine will depend on how selected routines (within this meta-routine) play out. If these routines discourage variation by the way in which selection is made, overall absorptive capacity is reduced. Any organization at any given time is full of routines, and higher educa- tion institutions are no different. There are routines as simple as regular emails through mailing lists that employees receive, regular alerts from the LMS, reg- ular faculty meetings, and so on. However, for an institution to grow and innovate, it is critical that routines change and new routines get designed and adopted (Lewin, Massini, Peeters, 2011). Routines in and of themselves are adaptable, and they evolve over time as new knowledge, often as new people and innovations, are introduced. Then, through a selection and retention pro- cess, some changes become a constant. At the time a new routine gets estab- lished or an old routine gets modified, it is hard to predict the outcome of this routine; therefore, there is an element of trying things out to see what works. Routines that work will often get replicated. For instance, if one department is successful at implementing some form of student advising that works, then others will follow suit. Some routines get reified and formally embedded or- ganizationally in the form of rules, procedures, norms, or habits, and others are contextual and idiosyncratic to a unit or a department within the organi- zation. For example, every organization has procedures to allow students to enroll in classes and withdraw from classes. Not every department has Friday happy hours, and different research centers might have different kinds of proj- ect meetings.
  • 31.
    10 Aditya Johri ProposedFramework for Learning Analytics Capacity Building The most direct examination of how absorptive capacity can support higher education innovation relevant to LA comes from Martin, Massy, and Clarke (2003)’s paper on absorptive capacity and learning technologies. Martin et al. (2003) use absorptive capacity to explain why e-learning, despite its potential to become a booming industry, did not diffuse more rapidly in Europe. They propose a model that integrates adoption, diffusion, and assimilation processes related to e-learning and advance several propositions. Drawing on Zahra and George (2002)’s reconceptualization of absorptive capacity, the proposed framework also makes a distinction between potential and realized capacities. I propose a modified version of their model as a framework to examine how the failure of LA to become routinized within higher education is a barrier to its adoption. I incorporate more directly the idea of routines in the framework to propose that the absorptive capacity for LA is influenced by two sets of an- tecedent factors: (1) the nature of LA—the technology, the techniques, and the data—that is available to an organization or an individual and (2) the capacity or prior knowledge that exists in the organization or individual to utilize LA. In the case of LA, in particular, the distinction between potential and realized capacities is strongly applicable. As I argued earlier, the emphasis so far within LA as it related to higher education has been on the potential of it rather than what has been realized. Similar to Martin et al. (2003), I propose that potential capacities can be further subdivided into two specific ones—the acquisition of knowledge and the assimilation of knowledge. Realized capacities, on the other hand, relate to the transformation of knowledge and the exploitation of knowledge. Acquisition This dimension means an organization’s or individual’s dynamic capacity to iden- tify and acquire external knowledge about LA. According to Martin et al. (2003), this dynamicity has three important subcomponents—the potential speed, intensity, and direction of knowledge acquisition. In practice, this can mean the speed with which new appointments are made of experts or the speed with which an individual updates their knowledge. The intensity relates to the depth of knowledge ­ acquired related to a software or technique—how much prior knowledge is needed to acquire new knowledge? The direction refers to the target – is the new knowledge needed by or for students, faculty, or advisors. Assimilation This dimension refers to the organization’s processes or individual’s work prac- tices that allow them to understand and act on information or knowledge about LA they acquire from other sources (Martin, Massy, Clarke, 2003). One
  • 32.
    Absorptive Capacity andRoutines 11 critical issue with assimilation often is the technological jargon, complicated algorithms, or lack of understanding of data that does not match an existing knowledge base or heuristics of the organization or of individuals. Overcom- ing this barrier then requires retraining or hiring of new experts. It might also mean changing existing processes to allow for new information to enter the system. It is commonly acknowledged in higher education organizations that a lack of trained personnel is a barrier to assimilation of LA. For instance, person- nel trained in analytics are hard to recruit, as the skills are not easily available and private industry is often able to lure trained academic personnel. Therefore, a new LA system can require significant retraining of staff, pulling them from their existing routines and practices. Transformation In the context of LA, transformation can be thought of as the ability to further develop existing knowledge by fusing it with new knowledge. Transformation has often been associated with the capability to take two previously incom- patible or incongruous frames of references and combine them in a novel way to produce a new model or schema (Zahra George, 2002). This fusion, if successful, often alters the manner in which an organization or individual per- ceives itself and interprets its environment. This transformation is akin to the creation of new routines or work practices, and, in many ways, it is rare but also a truly innovative aspect of successful organizations. For many higher ed- ucation institutions, this means being able to introduce cloud computing in its infrastructure, continuously analyze new data, and create actionable insights to improve efficiency and effectiveness. Exploitation The end game, if LA has to be influential in higher education, is its practi- cal application across the institution and by a range of individuals who work there—staff, faculty, administrators, advisors, etc. This means that LA is a part of common routines across the organization so that LA is used to improve various functions. Even though the application might be short term and still be useful, it is the long-term application—reification in routines—that can produce systematic and systemic change. This shift can be seen with many institutions creating new offices and hiring new personnel that are devoted to analytics. It can also be seen in an attempt to purchase novel software that promises actionable knowledge across a range of function. Figure 1.1 displays a synthesized analytical framework, which hypothesizes that the nature of the LA system or technology combined with users’ prior knowledge with tech- nology will influence adoption or non-adoption after moving through their absorptive capacity filters.
  • 33.
    12 Aditya Johri Some(Unsuccessful) Case Studies for Use of Learning Analytics As an example of what is possible with LA, what steps can be taken to reach that goal, and how adoption fails, I now discuss three case studies from my own experience. The first two scenarios derive from research, and, to some extent, application, on a project that was externally funded and on which I served as co-principle investigator. The overall goal behind the project was to better un- derstand issues of student retention using institutionally available data and to be able to predict and support students who were likely to struggle with or be un- successful in their educational goals. After almost half a year of discussions and negotiations, we were able to access the data. These data consisted of a range of student records including their LMS participation. To protect student privacy, each student was given a unique identifier, and we did not have access to student names. Although some demographic data about the students were available, they were not analyzed. The third case comes from trying to use an off-the- shelf software to analyze students’ success in terms of retention and graduation. Figure 1.1 Theoretical framework (adapted and modified from Martin et al., 2003, itself adapted from Zahra George, 2002, and others). Case Study 1: Understanding Student Retention and Persistence The first challenge we wanted to tackle was understanding student reten- tion. In particular, improving student retention in science, technology, engineering, and mathematics, or STEM, majors has been a real concern for higher education institutions. Given that the primary project investiga- tor (PI) and I were both in the engineering school, one of the first projects we undertook was to study student retention in STEM (Almatrafi, Johri, Rangwala, Lester, 2017). We used the data of students who started in Fall 2009 and Spring 2010 to project the retention rate in every STEM major at each semester for eight semesters. The data included 328 students who matriculated in engineering and 299 students who matriculated in
  • 34.
    Absorptive Capacity andRoutines 13 science for that year. We looked at all students who were admitted be- tween 2009 and 2014, both direct admits and transfers. Transfer students are often neglected in studies of retention and persistence, especially in engineering, and our institution has a large number of transfer students, so this was of special interest to us. We found that engineering students were more persistent than science and math students, with retention rates over 60% for engineering students compared to 40% in math, for instance. Persistence rates for first-time students were lower than transfer students in engineering. Also, as has been reported previously, most migration out of discipline occurred in the first two years of enrollment. We also found that among the enrolled students, a large number of engineering students (almost 20%) did not declare a major, some until late in their studies. In contrast, in the college of science, all enrolled students had declared a major by the eighth semester. This work was further expended to look at an ever larger population of students (Chen, Johri, Rangwala, 2018). Case Study 2: Understanding Student Trajectories In a second study (Almatrafi, Johri, Rangwala, Lester, 2016; Sweeney, Rangwala, Lester, Johri, 2016) from the same project, we compared course trajectories of students who performed well academically and grad- uated in four years with those who did not (low performers). The goal was to identify factors related to how course-taking choices and degree planning affected students’ academic performance. The dataset consisted of information for three majors within the engineering school: civil, en- vironmental, and infrastructure engineering (CEIE), computer science (CS), and IT. The data included more than 13,500 records of 360 students. The analysis showed that low performers postponed taking certain courses until the latter end of their program, and this delay had consequences for taking other courses and, subsequently, their graduation. We also found a trend, whereby low performing students enrolled in a set of courses within a specific semester—took certain courses concurrently—that the high performing counterparts did not. Case Study 3: Off-the-Shelf Software Use The final case study comes from my attempts to use a third-party system purchased from a vendor by my institution. The system was designed to allow me to better understand how students were performing in a timely manner. As department chair at that time, it was useful for me to know
  • 35.
    14 Aditya Johri Discussionand Conclusion How can the framework I proposed earlier help us interpret these case studies? Whether it is experimenting with LMS such as BlackBoard™, Moodle, or Sakai, or systems for student admissions, uptake is slow and stability is hard to achieve. As Borrego, Froyd, and Hall (2010) note, “adoption levels will be higher in situations where change agents focus on clients’ (i.e., faculty and ad- ministrators) needs over promoting adoption of a specific innovation” (Borrego et al., 2010, p. 203). In technology-driven projects though, this is rarely the case—the user almost always comes last. Fundamentally, it comes down to the routines. The existing routines are such that integration of new knowledge of this kind is not the reality. It is not hard to see the breakdowns and also the inability to routinize for each of the cases. The research project, although productive, did not result in ready-to-use applications, and the analysis that was done was interesting for us researchers, but we were not the target audience in terms of policy changes. Those who could apply the data, such as advisors, had already existing routines that were productive for them and which would not have been improved by the analytics produced by the research or the off-the-shelf software that provided basic in- formation on retention or progression of students. Yet, what I found interesting was that the breakdown was not at the ‘poten- tial’ stage of the process. There was significant acquisition and assimilation of new knowledge related to LA across the organization at multiple levels. The team had faculty and students who understood different aspects of the project and were able to absorb different technological advances and knowledge. This knowledge was about data, about analytical techniques, about higher education issues, about the infrastructure needed, and so on. In addition to the faculty other institutional actors also played a crucial role starting with staff working this information, but other than providing a visual representation, which can be useful, there was not a lot that I learned from the system that I did not already know from my own teaching experience and feedback from faculty, advisors, and senior administrators. There were no new insights, although it did confirm things what I knew anecdotally. In any case, I passed on the use of the system to the student advisors, as I thought they would benefit more from it. There was a community of them and so they could use the information from that system to have a dialogue. But the truth is that they knew more about what was going on than the system could ever tell them, since they were always in direct contact with the students and their information was more current than what was in the system.
  • 36.
    Absorptive Capacity andRoutines 15 on institutional data and analytics. Through negotiations that involved many meetings, data access was granted and some advances were made in terms of coming up with routines—standard procedures—to respond to additional request for data. Where no routine existed, for instance, related to data sharing practices, they were created. All this work was exploited through analysis and what was produced was research studies and papers; there were also some demonstrable inter- faces developed. What was transformed was research and teaching; what did not happen was adoption of the products into real systems that could impact decision-making about intended audiences. So even though we created new knowledge, through data analytics, it did not cross the barrier to adoption—it was not used for any form of decision-making. It can be argued that a research project does not really have to be adopted as a system and routinized, but, in this case, we tried. We tried by trying to build systems, studying student use, advisor use, and so on, but convincing those who have decision-making power at a higher level is where the breakdown happens. Fundamentally, this is not any different than most technology-related project in higher education (as many of the chapter authors in this volume point out). The faculty and staff, or students, are important for routines to form and become reified, but bottom-up approaches are just one aspect of organizational change. In higher education, top-down approaches are equally important, as the resources (e.g., funding or hiring) are controlled by those higher up in the hierarchy, and, therefore, they can better incentivize formation of new routines. Overall, what is present in the organization is an ecosystem of tools. Given the range of users, functions, and roles that the user base has, it is not inconceiv- able that for any kind of INFT, an ecosystem approach will be required. From an organizational perspective, something central is always better—in terms of support, security, and maintenance—but given the user base, that might not work out. The other critical ingredient toward mass adoption is to start with something small that is really useful and usable and then build from that. For instance, just a simple tool that allows me to explore the class composition will be invaluable. Once I use it regularly and see its value, I am more likely to move on to more complex information and analytics. For something to become a routine, it is important that it allows first for flexibility so that the user can try it out and be able to adapt it. The user also has to perceive that they have a choice and that the LA fills a real need. Here are a few steps that can be taken to support routinizing to build LA capacity: 1. Technology design and adaptation: As is common for most technology- adoption processes, one crucial requirement to ensure that LA is integrated into organizational routines is to design technology using a user-centered approach. Across a range of technological products, there is growing evidence that using a user-centered approach is essential for user adoption
  • 37.
    16 Aditya Johri (Johri,Dufour, Lo, Shanahan, 2013; Klein, Lester, Rangwala, Johri, in press). This alone though is not sufficient, as the needs of users vary and changes and technological products also need to be tailorable by users or those who are supporting users to user needs so that they can be adapted by the end user. For instance, it is essential that LMS has a way for faculty to monitor class participation, but depending on how a course is struc- tured and how technologically advanced the instructor is, and, therefore, adaptation of monitoring interfaces or dashboards is essential for it to be- come part of faculty practices. Similarly, a department chair who needs to monitor or assess all courses in the department needs to be able to use that interface at a meta-level, and so on. 2. Sharing of practices across the organization: Given the diversity of dif- ferent units within a higher education institution—departments, support offices, etc.—it is important to share practices that work both within similar kinds of job profiles and units but also across them. It is especially important to include people who have domain knowledge about some organizational aspect as well as knowledge of LA. For instance, advisors who use LA proficiently can shed light on what works and how some elements of what they do might benefit faculty. These sharing sessions can be face-to-face but also consist of emails, mailing lists, and other elements that fit within existing routines. Sharing of practices also has to be both bottom-up and top-down, and itself will involve the creation of new routines. The sharing can be led by higher-ups in the organizations, by peers, or by anyone who has achieved advanced proficiency or has been able to use a LA system effectively. One way to accelerate adoption of LA would be to target specific audiences such as new faculty or other employees and ensure that they are exposed to the different systems that exist within the organization. Often, if other established faculty do not use these systems, it is likely that new faculty will be exposed to them through their departmental mentoring. The other goal of these sessions can be to evaluate new technologies to ensure that not every LA offer- ing is seen as a useful tool and that there is some common evaluation or brainstorming around its use. 3. Integrated technology and organization development: Although this idea has been around for over two decades (Pipek, Wulf, Johri, 2012; Wulf Rhode, 1995), integrating technology and organization development is an approach more appropriate for the age of analytics than any other. The primary notion behind this idea is that any new technology, especially any kind of INFT, cannot simply be installed within an organization without it affecting organizational practices. Therefore, it is important to think of technology and organizational development as occurring together and planning for inception for both the new technology and the organization to change when the two
  • 38.
    Absorptive Capacity andRoutines 17 come together. This notion captures the two ideas advanced above that any technology needs to be tailorable and that organization practices need to be created to support technology adoption. These two aspects need to work together at multiple levels—practices of individuals, groups, and the organization as a whole. For instance, as an instructor, if I have to better understand students using LMS, I need new dash- boards and analysis, but I also need to change my practices to ensure that I am taking what the system is telling me into account. If all I do is monitor without any changes in my practices, the LA is not going to be very effective. Acknowledgements This research was partially supported by the U.S. National Science Foundation Awards # 1447489 1712129. I would like to thank Jaime Lester and Carrie Klein for comments and feedback. The work on absorptive capacity derives from discussions with David B. Knight. References Almatrafi, O., Johri, A., Rangwala, H., Lester, J. (2016). Identifying course trajec- tories of high achieving engineering students through data analytics. Proceedings of ASEE 2016. Almatrafi, O., Johri, A., Rangwala, H., Lester, J. (2017). Retention and persistence among STEM students: A comparison of direct admit and transfer students across engineering and science. Proceedings of ASEE Annual Meeting. Arellano, C., DiLeonardo, A., Felix, I. (2017). Using people analytics to drive busi- ness performance: A case study. McKinsey Quarterly, July 2017. Arroway, P., Morgan, G., O’Keefe, M., Yanosky, R. (2016). Learning analytics in higher education (Research report). Louisville, CO: EDUCAUSE Center for Applied Re- search. Retrieved from https://library.educause.edu/~/media/files/library/2016/2/ ers1504la.pdf Azagra-Caro, J., Archontakis, F., Gutiérrez-Gracia, A., Fernández-de-Lucio, I. (2006). Faculty support for the objectives of university-industry relations versus de- gree of RD cooperation: The importance of regional absorptive capacity. Research Policy, 35(1), 37–55. Bean, R., Kiron, D. (2013). Organizational alignment is key to big data success. MIT Sloan Management Review, 54(3), 6. Bichsel, J. (2012). Analytics in higher education: Benefits, barriers, progress, and recommen- dations (Research Report). Loiusville, CO: EDUCAUSE Center for Applied Re- search, August 2012. Available from www.educause.edu/ecar Borrego, M., Froyd, J. E., Hall, T. S. (2010). Diffusion of engineering education in- novations: A survey of awareness and adoption rates in US engineering departments. Journal of Engineering Education, 99(3), 185–207. Brown, B., Kanagasabai, K., Pant, P., Pinto, G. S. (2017). Capturing value from your customer data. McKinsey Quarterly, March 2017.
  • 39.
    18 Aditya Johri Chen,Y., Johri, A., Rangwala, H. (2018). Running out of STEM: A comparative study across STEM majors of college students at-risk of dropping out early. Proceed- ings of Learning Analytics and Knowledge (LAK). Chui, M., Henke, N., London, S. (2017). How to win in the age of analytics. ­ McKinsey Quarterly, January 2017. Cohen, W. M., Levinthal, D. A. (1990), Absorptive capacity: A new perspective on learning and innovation. Administrative Science Quarterly, 35, 128–152. Cyert, R. M., March, J. G. (1963). A behavioral theory of the firm. Blackwell, Oxford, UK. Da Silva, N., Davis, A. R. (2011). Absorptive capacity at the individual level: Linking creativity to innovation in academia. The Review of Higher Education, 24(3), 355–379. Economist (2010). Data, data everywhere. Special report on Managing Information, www.economist.com/node/15557443 Gašević, D., Dawson, S., Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. Tech Trends, 59(1), 64–71. doi:10.1007/s11528-014-0822-x Johri, A., Dufour, M., Lo, J., Shanahan, D. (2013). AdWiki: Socio-technical systems engineering for managing advising knowledge in higher education. International Journal of Sociotechnology and Knowledge Development, 5(1), 37–59. Kirkland, R., Wagner, D. (2017). The role of expertise and judgment in a data-driven world. McKinsey Quarterly, May 2017. Klein, C., Lester, J., Rangwala, H., Johri, A. (In Press). Learning analytics tools in higher education: Adoption at the intersection of institutional commitment and individual action. The Review of Higher Education. Lane, P. J., Koka, B., Pathak, S. (2006). The reification of absorptive capacity: A critical review and rejuvenation of the construct. Academy of Management Review, 31(4) 833–863. Lester, J., Klein, C., Rangwala, H., Johri, A. (2017). Learning analytics in higher education. ASHE Monograph Series, 3(5), 9–133. Lewin, A. Y., Massini, S., Peeters, C. (2011) Microfoundations of internal and external absorptive capacity routines. Organization Science, 22(1), 81–98. doi:10.1287/ orsc.1100.0525 March, J. G., Simon, H. A. (1958). Organizations. New York, NY: Wiley. Martin, G., Massy, J., Clarke, T. (2003). When absorptive capacity meets institutions and (e)learners: adopting, diffusing and exploiting e-learning in organizations. Inter- national Journal of Training and Development, 7(4), 228–244. Pearson, T., Wegener, R. (2013). Big data: The organizational challenge. Bain Company. www.bain.com/publications/articles/big_data_the_organizational_challenge.aspx Pentland, B. T., Feldman, M. S. (2005). Organizational routines as a unit of analysis. Industrial and Corporate Change, 14(5), 793–815. Pentland, B. T., Rueter, H. (1994). Organizational routines as grammars of action. Administrative Science Quarterly, 39(3), 484–510. Pipek, V., Wulf, V., Johri, A. (2012). Bridging artifacts and actors: Expertise sharing in organizational ecosystems. Journal of Computer Supported Cooperative Work, 21(2–3), 261–282. Pistilli, M. D., Willis, J. E., Campbell, J. P. (2014). Analytics from an institutional lens: Definition, theory, design and impact. In J. A. Larusson B. White (Eds.), Learning analytics: From research to practice (pp. 79–102). New York, NY: Springer Books. doi:10.1007/978-1-4614-3305-7
  • 40.
    Absorptive Capacity andRoutines 19 Romero, C., Ventura, S. (2013). Data mining in education. WIREs Data Mining Knowledge Discovery, 3, 12–27. doi:10.1002/widm.1075 Siemens, G., Gašević, D., Haythornthwaite, C., Dawson, S., Buckingham-Shum, S., Ferguson, R., … Baker, R. (2011). Open learning analytics: An integrated modular- ized platform. Open University Press. https://solaresearch.org/wp-content/uploads/ 2011/12/OpenLearningAnalytics.pdf Sweeney, M., Rangwala, H., Lester, J., Johri, A. (2016). Next-term student per- formance prediction: A recommender systems approach. Journal of Educational Data Mining, 8(1), 22–51. Wulf, V., Rhode, M. (1995). Towards an integrated organization and technology development. In Proceedings of designing interactive systems (pp. 55–64). New York, NY: ACM. Zahra, S. A., George, G. (2002), Absorptive capacity: A review, reconceptualization and extension. Academy of Management Review, 27(2), 185–203.
  • 41.
    2 Analytics in theField Why Locally Grown Continuous Improvement Systems are Essential for Effective Data-Driven Decision-Making Matthew T. Hora Introduction Dr. Lee’s department was awash in data, including end-of-term course evalua- tions, final exams, homework results, and students’ online activities on course websites. In the spring of 2015, my colleagues and I visited the offices and lecture halls of Dr. Lee and his colleagues at a large, public California research univer- sity as part of a research project on data-driven decision-making (DDDM) in the field. There we heard complaints about ineffective online course evaluation systems and the value of hallway conversations with colleagues, observed sophis- ticated analytics in use but also rudimentary and low-tech systems of formative feedback such as a worn shoebox at the front of a classroom where students deposited 3 × 5 cards with their complaints and questions. It quickly became apparent that for mechanical engineering departments like the one where Dr. Lee was a faculty member, working with these various forms of data was not a simple exercise of reporting grades at the end of each semester, but was also part of a high-stakes exercise in disciplinary and institutional accreditation. With the reputation and continued operations of academic programs hinging on the qual- ity and thoroughness of accreditation reports that ran over 400 pages long, it was no surprise that Dr. Lee observed that issues surrounding educational data were “a pretty big deal in our department.” The study that led me to Dr. Lee’s department was motivated by increasing claims about the potential for sophisticated data analyses to discern patterns and predict outcomes that could ultimately solve longstanding problems ­ plaguing higher education such as institutional inefficiency (Lane, 2014), ineffective ­ institutional data management systems (Cheslock, Hughes, Umbricht, 2014), limited student learning in science courses (Wright, McKay, Hershock, Miller, Tritz, 2014), and poor student completion rates (Picciano, 2012;
  • 42.
    Analytics in theField 21 Siemens, 2013; Treaster, 2017). Advocates of DDDM in higher education also argue that since other sectors, such as business and health care, have embraced predictive analytics with vigor and apparent success, then higher education must also do so (Zimpher, 2014). In the case of Learning Analytics (LA), which entails the measurement, analysis, and reporting of student-centered data (e.g., course-taking patterns, keystrokes on course websites, grades) in order to im- prove and optimize learning, one of the central arguments behind its potential is that these increasingly large datasets could be analyzed in ways that would add clarity, precision, and an element of prediction to postsecondary educators understanding of how to support student success (Lane Finsel, 2014; Long Siemens, 2011). Besides the analytic advantages suggested by the new technologies, com- puter power, and data mining techniques that are closely tied to the analytics movement, two other forces should be recognized as contributing to the in- creasing attention being paid to Big Data and analytics in higher education: ac- countability pressures and skepticism about teachers’ abilities to make informed decisions. First, higher education in the early 21st century is facing a new wave performance-oriented reforms and accountability pressures from state legis- lators and accreditors that require institutions to establish, track, and report data on various performance metrics, a policy development that is leading to the development of new institutional data systems that are focused on ensur- ing compliance with these new policies (Alexander, 2000; Rabovsky, 2014). In doing so, institutions can be viewed as using technology and data systems to cultivate (and respond to) a culture of accountability as opposed to using these new tools to foster a culture of student-centered learning (Halverson Shapiro, 2012). Consequently, some observers have argued that DDDM and analytics can be considered to be “one of the most prominent strategies for educational improvement in the country” (Coburn Turner, 2012, p. 100). Second, the DDDM movement in education is also informed by the notion that teachers generally rely upon two forms of information to make instructional decisions—intuition and anecdote—which are insufficiently objective and robust, and that more rigorously collected and analyzed numeric data should be used as the basis for decision-making about how courses are designed and taught (Mandinach, 2012; Wieman, Perkins, Gilbert, 2010). With hard evidence in hand, so the story goes, faculty will improve their teaching, strug- gling students will be identified and supported more efficiently, and institutions will change and continually improve (Lane, 2014; Spillane, 2012). While much of the writing on DDDM and LA advocates for its adoption and/or centers on descriptions of data-related interventions, researchers are in- creasingly employing a more descriptive and critical stance, focusing instead on how educators think about and utilize data in their daily work (Coburn Turner, 2011). At the heart of these investigations is the growing realization that “Data do not objectively guide decisions on their own—people do,” (Spillane,
  • 43.
    22 Matthew T.Hora 2012, p. 114), and that some educators and policymakers have placed too much faith in the power of data and technology to change teaching behaviors and student learning. Further, some analysts critique the focus on adoption and program evaluation for ignoring ethical issues related to LA (Clow, 2013; Slade Prinsloo, 2013); how technology could be used to foster educators’ profes- sional growth and development (Hora, Bouwma-Gearhart, Park, 2017); and, especially, how the ultimate efficacy of a data system is dependent on how well integrated they are with the cognitive, cultural, and contextual aspects of orga- nizational life. At the heart of many critiques of analytics is the fact that within the seemingly simple prospect of mining gigabytes of keystroke data to iden- tify struggling students is a process whereby data are transformed by human ­ beings—working within constraints of local cultural norms, ­ organizational conditions, and knowledge about data and student learning—into information and then ideally actionable knowledge (Mandinach, 2012). As Zilvinskis, ­ Willis, and Borden (2017) observe, It’s not that these new technologies and methods are unhelpful, but rather it’s that they don’t address the more complex aspects of higher education, including the incredible diversity and complexity of learning outcomes across the curriculum and complex organizational arrangements. (p. 9) Consequently, some scholars argue that instead of continuing to advocate for the adoption of DDDM in general and the utilization of LA in particular, the field of education needs to better understand the existing conditions of data use within actual schools, colleges, and universities so that data systems can be more effective and responsive to educators’ real needs (Coburn Turner, 2011; Mandinach, 2012). In particular, given that “Effective data use requires going beyond the numbers and their statistical properties to make meaning of them” (Mandinach, 2012, p. 73), insights into how educators notice, inter- pret, and draw conclusions from various forms of data are needed (Coburn Turner, 2011). Without a nuanced and ethnographically informed account of how educators use data and LA ‘in the wild’, the field risks developing institu- tional and departmental data systems that contribute to ineffective (or incorrect) decision-making, are widely rejected by potential users, and fail to generate reliable, useful, and legitimate information that faculty and administrators can use as they go about planning courses, engaging students, and improving programs (Foss, 2014; Hora et al., 2017). These considerations led to the principal questions that I address in this chapter: (1) How are postsecondary educators thinking about and using teaching-related data in their daily work? (2) How do specific organizational features shape these decisions and practices? and (3) What are the implications of practice-based accounts of data utilization for the design and implementation of LA initiatives?
  • 44.
    Analytics in theField 23 Unfortunately, there is “shockingly little research on what happens when individuals interact with data in their workplace settings” (Coburn Turner, 2012, p. 99), and the aim of this chapter is to partially ameliorate this situation with respect to research on data use in higher education. While a promising line of inquiry is beginning to examine how faculty and administrators, within the constraints and affordances posed by their institutional contexts, engage, or not, in DDDM and utilize LA (Foss, 2014; Hora et al., 2017; Klein, Lester, Rangwala, Johri, in press), there remains much work to be done. For instance, the literature on LA is often replete with lists of why these data can improve educational practice, but with less insight into how exactly educators can use them in practice. In addition, research shows that educa- tors make decisions based on a wide range of information—verbal, numeric, and experiential—such that accounts of data use that ignore these varied and trusted sources of information are incomplete and likely incommensurate with existing behaviors. Finally, the field lacks a conceptual framework that accounts for how cognitive, sociocultural, and contextual factors collectively impact data use, especially in ways that move beyond lists of contextual con- ditions and that allow for the empirical specification of relationships between and among behavior and situations. One of the most promising frameworks for studying DDDM and the use of LA was developed by Coburn and Turner (2012), who draw upon insights from theories of situated and distributed cognition to develop a model that captures the temporal processes of data use as they unfold in specific situations. At the heart of this framework is teacher cognition, based on the view that the use of analytics or other forms of data are dependent on whether and how individ- uals notice, interpret and analyze data based on preexisting beliefs and other cognitive structures and, subsequently, construct implications about their own teaching and/or students’ learning based on these interpretations. This frame- work also focuses on how these processes of data interpretation and utilization unfold within and touch upon specific aspects of the social and organizational systems in which teachers work, so that specific features of the environment can be identified as key leverage points that support or inhibit effective DDDM (Spillane, 2012; Spillane, Halverson, Diamond, 2001). In this chapter, I ex- pand upon this framework by focusing particular attention on those organi- zational features where information is regularly stored and retrieved, or what is called the retention structure in research on organizational memory and learning, such as digital databases, social networks, hardcopy files, and personal memory (Walsh Ungson, 1991). Insights into what specific aspects of orga- nizational memory are implicated in educators’ use of instructional data are important because in many ways they establish the boundaries of what types of behaviors are desirable and feasible within organizations. Furthermore, these elements also may play critical roles in formal and informal DDDM systems where LA are utilized.
  • 45.
    24 Matthew T.Hora In this chapter, I also outline the theoretical underpinning of this framework and demonstrate its use in an exploratory study of how a group of faculty and administrators in a California research university used diverse types of teaching-­ related data—of which LA is but one type—in practice, and how specific features of the organizational memory influenced their decisions and actions. Analyses of interview, classroom observation, and documentary data revealed five distinct, yet intertwined, decision chains involving a diverse range of teaching-related data, with little evidence that LA or sophisticated statistical analyses of numeric data were viewed as important and salient to the educators’ work. Instead, the educators in this study relied upon and highly valued homework and exam results, their own self-created mid-term course evaluation systems, and low-tech infor- mation sources such as informal student feedback and responses to open-ended questions on course evaluations. Within this department, despite the presence of policies mandating the collection of teaching-related data for the purpose of disciplinary accreditation and student course evaluations, results from these exer- cises were either not fed back to faculty or were viewed as insufficiently detailed and reliable. Some respondents instead created their own systems for continuous improvement, which represents an opportunity lost in terms of institutional sup- ports for DDDM and a foreboding sign for those hoping to introduce LA into similar departmental contexts. These results suggest that institutional structures for continuous improvement be instituted that facilitate educators’ collection, analysis, and reflection on various forms of data (including but not limited to ana- lytics) while also acknowledging and respecting low-tech, individualized sources of data and feedback systems. These results raise questions that should be addressed by those advocating for the widespread adoption of LA in the nation’s postsecondary institutions, espe- cially those who argue that the sector must adopt these techniques given their apparent impacts and success in health care, business, and management (Zimpher, 2014). Indeed, the fact that data analysts in these fields have raised questions about the reliance on Big Data and performance-based metrics, arguing instead for the perspectives of behavioral scientists and small data (i.e., data from small samples and qualitative data), highlights the limitations of Big Data and DDDM (Lazer, Kennedy, King, Vespignani, 2014; Peysakhovich Stephens-Davidowitz, 2015). Instead, as a field, we must consider the prospect that “rigor is (being) seen as trumping relevance, responsiveness, and often reality” (Mandinach, 2012, p. 81). It appears that as with K-12 education, the pendulum of reform in higher education has swung hard in the direction of relying on technical solutions and hard data to inform and improve student’s educational experiences. Instead, I agree with Mandinach (2012) who suggests that as the field of education considers the need for continuous improvement in educational practice, that, “There needs to be a balance between the use of data and experience” (p. 81) and far more attention to the realities of the organizational and sociocultural conditions that shape and define how educators approach teaching and learning.
Background: The Promise and Challenges of Using Data-Driven Decision-Making and Learning Analytics in Education

In higher education, the largest body of research on DDDM and LA consists of descriptions of innovations, interventions, and new developments, a state of affairs not dissimilar from the K-12 literature (Coburn & Turner, 2011). Researchers have documented and described how new technologies and datasets have led institutional research offices to shift from generating static annual reports to building sophisticated, relational databases that can generate reports about student performance in real time (Lane, 2014). This development is largely due to the fact that, with new learning management systems (LMS) and online datasets, "every instructional transaction can be immediately recorded and added to a database" (Picciano, 2012, p. 10), and it is viewed as transformative because it enables real-time predictive analyses that can help support student success. Many examples of the successful use of LA exist: Arizona State University increased pass rates from 66% to 75% in freshman remedial math (Kolowich, 2013); Georgia State University used analytics to identify introductory math as a key determinant of student persistence in nursing programs (Treaster, 2017); and Purdue University analyzed student demographic, academic, and LMS-utilization data to flag students at risk of failing a course, prompting messages to instructors that additional tutoring or interventions were required (Arnold, 2010; Campbell, 2007); a simplified, illustrative sketch of this kind of flagging rule is included later in this chapter. Based in part on success stories such as these, Siemens (2013) argues that with the development of specialized journals, conferences, and research methodologies, the field of LA even represents an emerging discipline in its own right.

But two areas of research within the field of DDDM and analytics in education that take a more skeptical view of these movements are also growing in volume and prominence: critical analyses and descriptive, practice-based research. Critical perspectives on DDDM and analytics have focused on issues such as students' data privacy, the surveillance state, the tendency toward a technocratic approach to educational management, and the commercialization of student data. Of particular concern is the issue of student privacy and who owns or has access to data obtained via the LMS, such as discussion posts or even essays and other original products (Siemens, 2013; Slade & Prinsloo, 2013). Broader concerns are also being raised about the way that institutions, policymakers, and educators talk about DDDM, which is often framed in terms of economic efficiency and accountability. Instead, some scholars argue that teachers' and students' interests need to be more pronounced in conversations about accountability, as "a way of taking control of the agenda, so that the economic framing can be at least supplemented with a concern for learning" (Clow, 2013, p. 18). Additionally, some have argued that these efforts are heavy-handed attempts to control the profession of education through overly simplistic measures (Fullan, 2010),
and that standardized assessment data and DDDM are inappropriate measures for evaluating educational quality and supporting teacher decision-making (Anderson, 2006; Schmelkin, Spencer, & Gellman, 1997). Foregrounding the fact that no reform initiative is entirely innocent, or absent of particular agendas or ideologies, Ewing (2011) notes how the value-added methodology in K-12 schools has become a rhetorical weapon, and that the assumptions underlying DDDM also demand close, critical examination. Similarly, Slade and Prinsloo (2013) argue that more attention should be paid to how culture, politics, and economic contexts shape how we deal with ethical issues in analytics. Drawing on the critical research tradition in education (e.g., Apple, 2004), these scholars posit that instead of viewing students as sources of data, they should be seen as collaborators in data collection, analysis, and interpretation. In addition, critical scholars argue that instead of adopting an accountability mentality, analytics should be seen as a moral practice that acts as a counter-narrative to the market- and consumer-based ideologies governing aspects of the DDDM movement.

Another body of research that adopts a critical stance toward data-related reforms is practice-based research, which utilizes an ethnographic perspective to focus on what Cook and Brown (1999) called "the coordinated activities of individuals and groups in doing their 'real work as it is informed by particular organizations or group context'" (p. 386). This research tradition builds on observational and ethnographic studies in anthropology and cognitive psychology, especially the groundbreaking research that Hutchins (1995a, 1995b) conducted on the cognitive, technical, and sociocultural underpinnings of how teams of professionals perform tasks in their workplaces. The value of such descriptive research, Hutchins (1995a) argued, lay in dispensing with the notion that findings from laboratory-based studies of cognition and decision-making were universally applicable to real-world situations. One of the advantages of more naturalistic studies of practice was that features of the task context—whether social, cultural, structural, or physical—that influenced behavior could be accounted for and identified. Building on these ideas, scholars of data use in K-12 settings have examined how district central offices utilize information (Honig & Coburn, 2008), how teachers develop professionally when discussing data and student outcomes (Horn & Little, 2010), and how the artifacts (i.e., designed objects and policies) teachers and administrators use when interacting with data can shape their behaviors and conclusions (Spillane, 2012).

Unfortunately, in contrast to the robust body of work on data use in K-12 schools and districts, relatively little research exists about how people think about and use data in higher education. Andrews and Lemons (2015) studied how biology instructors made teaching-related decisions, finding that personal experience was utilized more than empirical evidence about teaching or student learning. In a study of the use of data analytics by academic leaders, Foss (2014) found that adoption is shaped by a combination of features, including the
data system itself, the organizational context, and the individual attributes of deans, chairs, and faculty. In particular, Foss (2014) found that for analytics-related innovations to be embraced, data must be viewed as legitimate within the profession and discipline such that the data systems and their outputs achieve "currency" within local communities (p. 191). A similar line of inquiry examining the adoption of LA tools found that organizational commitment, leadership, and policies all impact faculty decisions about whether or not to use analytics as part of their teaching practices (Klein et al., in press). Finally, my colleagues and I examined how a group of 59 faculty members used teaching-related data in three large, public research universities (Hora et al., 2017). Results from this study indicate that faculty drew upon a variety of data (e.g., numeric data, verbal feedback) in ways that can be grouped into six distinct clusters of practice, which varied according to the degree of faculty involvement in creating continuous improvement systems and the sophistication of the data system. In some cases, faculty were neither involved in designing these systems nor utilized sophisticated data or analytic methods, whereas in other cases faculty co-designed rigorous systems that drew upon cutting-edge data and analytics. Consequently, given the importance of understanding educators' data practices and how local organizational conditions and constraints influence them, empirical research on DDDM and analytics requires a conceptual framework that allows researchers to document these behaviors and the specific contextual factors linked to them.

A Framework for Studying Data Use in Complex Organizations

While many robust frameworks exist for studying postsecondary organizations as complex systems, from Birnbaum's (1988) cybernetic systems theory and Clark's (1983) cultural systems approach to Lattuca and Stark's (2011) multidimensional framework for studying course planning, these models are not easily operationalized for empirical research on faculty behaviors. This is in part because these frameworks aim to model the factors that influence educational practice in abstract terms (e.g., leadership, culture, and technology) and in the aggregate (e.g., in departments or even institutions), rather than how individuals think, make decisions, and behave in practice. As Stark (2000) noted after studying and articulating a model of how course planning unfolds in higher education, "Our work fell short of exploring in depth the actual decisions teachers make about course plans and curriculum" (p. 435).

Fortunately, a robust model of data utilization has been developed that allows for the empirical documentation of data-related practices at a fine-grained level as well as of how distinct contextual elements impact these behaviors (Coburn & Turner, 2012). In creating this model, Coburn and Turner (2012) specifically aimed to move beyond generating lists of contextual factors that influence practice, or a static and coarsely grained model comprising various
boxes and arrows that hinted at causal relations among elements, but instead to "specify the relationship between contextual conditions on the one hand and the process of data use on the other" (p. 180). As previously noted, one of the motivating ideas behind this model is the view that the unique characteristics and constraints of human cognition should be at the core of analyses of DDDM, especially how the perception of pertinent stimuli (e.g., relevant data) is strongly influenced by preexisting beliefs, experiences, and mental representations. Additionally, the model incorporates the idea that perception as well as behavior is influenced by the "raw material" of the socio-technical environment (Spillane, 2012, p. 8), such that the context is not simply a passive backdrop to practice but an integral feature of human behavior. Finally, this approach emphasizes that institutional contexts are not innocent or objective features but are instead actively created by particular people and interests who have the power to institute policies and develop organizational structures (Little, 2011).

While Coburn and Turner's (2012) framework is the most theoretically robust and readily operationalized for field research, two ideas that are particularly important for empirical research on how educators think about and use instructional data in real-world situations are not integrated into their model. First, theories of organizations as socio-technical information-processing systems recast the enterprise of data use from being centered on the behaviors of autonomous individuals using and interacting with information technologies and data to being centered on the institution itself as a social collective that produces and constructs knowledge (Pentland, 1995). This view argues that businesses, colleges, and other organizations are best seen as social knowledge systems, rather than merely structural contexts where behavior occurs, in which individuals and teams engage in processes of legitimizing and interpreting information as part of a social and cultural process (Pentland, 1995). This perspective foregrounds an essential process that is at the heart of DDDM and LA—the transformation of data into information and actionable knowledge (Mandinach, 2012)—in a way that moves beyond the fiction of an isolated actor engaged in sense-making activities to implicate the entire institution as a socio-technical entity.

Second, research on organizational learning has demonstrated the importance of how "organizations encode, store, and retrieve the lessons of history despite the turnover of personnel and the passage of time" (Levitt & March, 1988, p. 319), which is a more historical perspective on data and its impact on behavior than is commonly taken in research on DDDM and analytics. Researchers in this field have documented how important information is stored within organizations in a variety of locations, and not simply in digital databases. These repositories include data sources that are commonly associated with DDDM, such as hard-copy and digital databases, but they also include physical artifacts (e.g., course syllabi), routinized practices that embody acceptable behaviors, the organization's structure (e.g., governance, hierarchy), and even individuals' memories—all of which Walsh and Ungson (1991) collectively call the retention structure.
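To make the idea of a retention structure more concrete, the following minimal sketch (not drawn from the chapter itself; all class names, fields, and example entries are hypothetical) shows one way a department might catalog where its teaching-related data actually resides, in the spirit of the repositories Walsh and Ungson describe, and check which sources are never fed back to instructors.

```python
# Illustrative only: a small data model for auditing a department's
# "retention structure." Names and example entries are hypothetical.
from collections import defaultdict
from dataclasses import dataclass
from enum import Enum, auto


class Repository(Enum):
    """Locations where organizational memory is retained (after Walsh & Ungson, 1991)."""
    DIGITAL_DATABASE = auto()   # e.g., LMS gradebooks, institutional dashboards
    HARDCOPY_FILE = auto()      # e.g., printed course evaluations
    PHYSICAL_ARTIFACT = auto()  # e.g., course syllabi, exams
    ROUTINE = auto()            # e.g., accreditation reporting cycles
    STRUCTURE = auto()          # e.g., governance and committee records
    PERSONAL_MEMORY = auto()    # e.g., an instructor's recollection of feedback


@dataclass
class DataSource:
    name: str                   # what the data is (e.g., "midterm evaluation")
    repository: Repository      # where it is retained
    fed_back_to_faculty: bool   # whether results ever reach instructors


def audit_by_repository(sources: list[DataSource]) -> dict[Repository, int]:
    """Count how many teaching-related data sources sit in each repository."""
    counts: dict[Repository, int] = defaultdict(int)
    for source in sources:
        counts[source.repository] += 1
    return dict(counts)


def never_fed_back(sources: list[DataSource]) -> list[str]:
    """Name the sources whose results do not reach faculty (the lost-opportunity
    pattern described earlier in this chapter)."""
    return [s.name for s in sources if not s.fed_back_to_faculty]


# Hypothetical entries mirroring the kinds of sources discussed in the chapter.
sources = [
    DataSource("LMS clickstream reports", Repository.DIGITAL_DATABASE, False),
    DataSource("Accreditation assessment results", Repository.ROUTINE, False),
    DataSource("Open-ended course evaluation comments", Repository.HARDCOPY_FILE, True),
    DataSource("Informal student feedback after class", Repository.PERSONAL_MEMORY, True),
]
print(audit_by_repository(sources))
print(never_fed_back(sources))
```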
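Returning to the kind of real-time flagging described in the background discussion above (the Purdue example), the sketch below illustrates, in deliberately simplified form, how a handful of LMS-utilization and performance signals might be combined into a risk flag that triggers a message to an instructor. It is not Purdue's actual algorithm; all field names, weights, and thresholds here are hypothetical and chosen purely for illustration.

```python
# A deliberately crude early-warning flag: combine a few signals into a
# 0-1 risk score and flag students above a threshold. All values hypothetical.
from dataclasses import dataclass


@dataclass
class StudentSnapshot:
    student_id: str
    current_grade_pct: float   # running grade in the course, 0-100
    logins_last_week: int      # LMS logins in the past seven days
    assignments_missing: int   # assignments not submitted to date
    prior_gpa: float           # institutional record, 0.0-4.0


def risk_score(s: StudentSnapshot) -> float:
    """Return a 0-1 risk score from a crude weighted combination of signals."""
    score = 0.0
    if s.current_grade_pct < 70:
        score += 0.4
    if s.logins_last_week < 2:
        score += 0.2
    if s.assignments_missing >= 2:
        score += 0.25
    if s.prior_gpa < 2.5:
        score += 0.15
    return min(score, 1.0)


def flag_at_risk(students: list[StudentSnapshot], threshold: float = 0.5) -> list[str]:
    """Return IDs of students whose score meets the alert threshold,
    i.e., those for whom an instructor notification would be generated."""
    return [s.student_id for s in students if risk_score(s) >= threshold]


roster = [
    StudentSnapshot("s001", 62.0, 1, 3, 2.1),  # flagged
    StudentSnapshot("s002", 88.0, 5, 0, 3.6),  # not flagged
]
print(flag_at_risk(roster))
```

In practice such hand-set rules are usually replaced by statistical models trained on historical outcomes, but even this crude version makes visible the judgments (which signals count, how much, and at what threshold) that, as this chapter argues, educators must be able to interrogate rather than accept as neutral outputs.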
applicable. This question really turns on the largest utilizable emergent pencil from the eyepiece. It used to be commonly stated that ⅛ inch for the emergent pencil was about a working maximum, leading to a magnification of 8 per inch of aperture of the objective. This, in view of our present knowledge of the eye and its properties, is too low an estimate of pupillary aperture. It has been well known for more than a decade that in faint light, when the eye has become adapted to its situation, the pupil opens up to two or three times this diameter, and there is no doubt that a fifth or a fourth of an inch of aperture can be well utilized, provided the eye is properly dark-adapted. For scrutinizing faint objects, comet sweeping, and the like, one should therefore have one ocular of very wide field and a magnifying power of 4 or 5 per inch of aperture, the main point being to secure a field as wide as practicable. One may use for such purposes either a very wide field Huyghenian or, if cross wires are to be used, a Kellner form. Fifty degrees of field is perfectly practicable with either.

As regards the rest of the eyepiece equipment, the observer may well suit his own convenience and resources. Usually one ocular of about half the maximum power provided will be found extremely convenient and perhaps oftener used than either the high or low power. Oculars of intermediate power and adapted for various purposes will generally find their way into any telescopic equipment. And as a last word: do not expect to improve bad conditions by magnifying. If the seeing is bad with a low power, cap the telescope and await a better opportunity.
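As a quick illustration of the arithmetic behind these figures, the short sketch below (the function names are ours, added purely for illustration) computes the emergent-pencil diameter as aperture divided by magnification, so that 8x per inch of aperture yields a ⅛-inch pencil, while 4 to 5x per inch yields the ¼- to ⅕-inch pencil that a dark-adapted eye can use.

```python
# Exit pupil ("emergent pencil") arithmetic for a visual telescope.

def exit_pupil_inches(aperture_in: float, magnification: float) -> float:
    """Diameter of the emergent pencil for a given aperture and power."""
    return aperture_in / magnification


def magnification_for_pencil(aperture_in: float, pencil_in: float) -> float:
    """Magnification that yields a desired emergent-pencil diameter."""
    return aperture_in / pencil_in


aperture = 4.0  # a 4-inch objective, as an example
print(exit_pupil_inches(aperture, 8 * aperture))   # 0.125 -> 1/8 inch at 8x per inch
print(magnification_for_pencil(aperture, 0.25))    # 16.0  -> 4x per inch of aperture
print(magnification_for_pencil(aperture, 0.2))     # 20.0  -> 5x per inch of aperture
```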
APPENDIX

WORK FOR THE TELESCOPE

To make at first hand the acquaintance of the celestial bodies is, in and of itself, worth the while, as leading the mind to a new sense of ultimate values. To tell the truth, the modern man on the whole knows the Heavens less intimately than did his ancestors. He glances at his wrist-watch to learn the hour and at the almanac to identify the day. The rising and setting of the constellations, the wandering of the planets among the stars, the seasonal shifting of the sun's path—all these are a sealed book to him, and the intricate mysteries that lie in the background are quite unsuspected. The telescope is the lifter of the cosmic veil, and even for merely disclosing the spectacular it is a source of far-reaching enlightenment. But for the serious student it offers opportunities for the genuine advancement of human knowledge that are hard to overestimate.

It is true that the great modern observatories can gather information on a scale that staggers the private investigator. But in this matter fortune favors the pertinacious, and the observer who settles to a line of deliberate investigation and patiently follows it is likely to find his reward. There is so much within the reach of powerful instruments only that these are in the main turned to their own particular spheres of usefulness. For modest equipment there is still plenty of work to do.

The study of variable stars offers a vast field for exploration, most fruitful perhaps with respect to the irregular and long-period changes of which our own Sun offers an example. Even in solar study there are transient phenomena of sudden eruptions and of swift changes that escape the eye of the spectro-heliograph, and admirable work can
be done, and has been done, with small telescopes in studying the spectra of sun spots. Temporary stars visible to the naked eye or to the smallest instruments turn up every few years, and their discovery has usually fallen to the lot of the somewhat rare astronomer, professional or amateur, who knows the field of stars as he knows the alphabet. The last three important novæ fell to the amateurs—two to the same man. Comets are to be had for the seeking by the persistent observer with an instrument of fair light-grasp and field; one distinguished amateur found a pair within a few days, acting on the theory that small comets are really common and should be looked for—most easily, it should be added, by one who knows his nebulæ.

And within our small planetary system lies labor sufficient for generations. We know little even about the superficial characters of the planets, still less about their real physical condition. We are not even sure about the rotation periods of Venus and Neptune. The clue to many of the mysteries requires eternal vigilance rather than powerful equipment, for the appearance of temporary changes may tell the whole story. The old generation of astronomers who believed in the complete inviolability of celestial order has for the most part been gathered to its fathers, and we now realize that change is the law of the universe. Within the solar system there are planetary surfaces to be watched, asteroids to be scanned for variability or change of it, meteor swarms to be correlated with their sources, and occultations to be minutely examined; and when one runs short of these, our nearest neighbor the Moon offers a wild and physically unknown country for exploration. It is suspected, with good reason, of dynamic changes, to say nothing of the possible last remnants of organic life.

Much of this work is well within the useful range of instruments of three to six inches aperture. The strategy of successful investigation lies in turning attention upon those things which are within the scope of one's equipment, and selecting those which give promise of yielding to a well-directed attack. And to this end efforts correlated
with those of others are earnestly to be advised. It is hard to say too much of the usefulness of directed energies like those of the Variable Star Association and similar bodies. They not only organize activities to an important common end, but strengthen the morale of the individual observer.