Be Careful What You Wish For! Learning Analytics and the Emergence of Data-Driven Practices in Higher Education



Introduction
In an age of data measurement, 1 we are witnessing the development of tools and techniques to capture, transmit, store, and analyze data that enable businesses, governments, healthcare, and welfare institutions to identify "knowledge" about human behavior in unprecedented ways. Big data and analytics offer the promise and potential of providing a better foundation for financial and organizational decisions and of streamlining data-related practices more effectively. 2 The higher education sector is no exception to the emerging data-driven practices in society. With the pervasive use of learning management systems (LMSs) facilitating access to and storage of large-scale datasets, higher education institutions (HEIs) have started to pay attention to the promises entrenched in big data and data mining techniques to support learning, teaching, and administrative activities in more efficient ways. 3 In this context, HEIs, particularly in the USA, the UK, and Australia, are implementing learning analytics to better understand and support student learning. 4 Even elsewhere, we see vast investments in learning management systems that enable the analysis of learning behavior by capturing student data on their academic interactions.
The purpose of this chapter is to introduce learning analytics (LA), exemplify how LA has currently been implemented in higher education, and critically discuss the ethical issues and concerns that arise when LA is introduced into higher education.

What Is Learning Analytics?
Learning analytics is a fast-developing research area within the field of technology-enhanced learning (TEL) that has emerged during the last decade. 5 In particular, LA has its roots in various fields such as data science, artificial intelligence, practices of recommender systems, online marketing, and business intelligence. 6 The Society for Learning Analytics Research (SOLAR) 7 situates it at the intersection of learning (e.g., educational research, educational technology), analytics (e.g., statistics, visualization, computer/data sciences, artificial intelligence), and human-computer interaction (e.g., usability, participatory design, sociotechnical systems thinking). As a consequence of such a cross-disciplinary characterization, the field has been defined in varied ways. Still, a common understanding is: "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs." 8 What is new with capturing a vast amount of student data is the teachers' (and the higher education institutions') chance to design pedagogical interventions based on analyzing the massive amount of data and the links between them. Among the types of analyses allowing for changes in educational practices, SOLAR includes: (1) predicting student success, (2) identifying students at risk of failing or dropping out of their studies, (3) supporting student development of lifelong learning skills and strategies, (4) providing personalized and timely feedback to students regarding their learning, (5) supporting the development of essential skills such as collaboration, critical thinking, communication, and creativity, (6) developing student awareness by supporting self-reflection, and (7) supporting quality learning and teaching by providing empirical evidence on the success of pedagogical innovations.

The technologies of learning analytics
By combining powerful analytics technologies and cutting-edge data mining techniques with large-scale datasets, learning analytics in higher education promises to identify patterns that enhance student learning and make teaching more effective and efficient. In the specific context of higher education, data is generated primarily from the use of learning management systems (e.g., ItsLearning), open online learning environments (e.g., Moodle), open social platforms (e.g., LinkedIn), e-learning, intelligent tutoring systems, and forums, among others. Students' data can most often be accessed, combined, and analyzed using complex algorithms to inform decisions. 9 The large-scale datasets, or "big data," are often described according to the following specific attributes: volume (massive size), velocity (of updating data), validity (accuracy), variety (various formats), venue (location), vocabulary (context), and value (usefulness). 10 Anthony G. Picciano explains that, as university student record-keeping systems maintain information on students' grades in each course, institutions could hypothetically use it to trace student performance patterns over time. 11 Similar processes can be illustrated in the analytics applications used in e-commerce. 12 For instance, companies like Netflix, using recommender models and algorithms, examine website traffic, customers' purchases, or navigation patterns to determine which customers are more or less likely to buy a particular product (e.g., a series). Based on these patterns, companies like Netflix send notifications to customers of new series or movies as they become available. 13 Following Picciano, analytics of this sort are beginning to be used in higher education for predicting student performance, outcomes, and persistence.
Consequently, a group of students attending, for instance, a 15-week online course can generate thousands of transactions per student, which can be captured and used to feed an LA application to optimize students' future choices. Such transactions can also be integrated with other data sources (e.g., personal data, health data, police records, bank data) coming from university information systems or elsewhere. This type of real-time monitoring of student transactions can, in turn, generate alerts that give course instructors a chance to intervene to assist the student in time. 14
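The kind of real-time alerting described here can be illustrated with a minimal sketch. The transaction format and the seven-day inactivity window below are assumptions invented for illustration; an actual LMS would supply its own event schema and thresholds:

```python
from datetime import datetime, timedelta

# Hypothetical transaction format: (student_id, kind, timestamp, value).
def inactivity_alerts(transactions, now, window=timedelta(days=7)):
    """Return ids of students whose most recent recorded activity is older
    than the assumed inactivity window, so an instructor can follow up."""
    last_seen = {}
    for student, _kind, ts, _value in transactions:
        if student not in last_seen or ts > last_seen[student]:
            last_seen[student] = ts
    return sorted(s for s, ts in last_seen.items() if now - ts > window)
```

A course instructor could run such a check against the day's exported transaction log; real LA applications layer far richer rules and models on top of the same basic idea of comparing each student's activity trace against an expected pattern.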

Learning analytics: some examples
Among the most common forms of data analytics, monitoring individual student performance is the one most frequently applied in today's LA. This type of analytics can be better grasped by the example of diagnostic testing via diagnostic assessment systems. As a teacher, at the end of a learning sequence, suppose that you assign your students a diagnostic test. You design your test, choosing 20 questions from the reading material and the material discussed and lectured about in class. You have subdivided the questions into five categories, A, B, C, D, and E. The students take the diagnostic test, and, when you analyze the results, you notice that a majority of students lack knowledge in category D. What you do next is design a follow-up class to address the lack of understanding in category D, assign additional reading to the students, or create a digital learning resource, such as a video, where you clarify the category in question. As a teacher, you could also use the data to see how your teaching on different themes develops over a more extended time by collecting the same or similar data points.
That is LA in its most basic form: it requires data, data analysis, and ideally some change in behavior on the part of the teacher and/or the student to close potential gaps that the analysis reveals.

11 Picciano argues that: "In a big data scenario, data would be collected for each student transaction in a course, especially if the course was delivered electronically online. Every student entry on a course assessment, discussion board entry, blog entry, or wiki activity could be recorded, generating thousands of transactions per student per course. Furthermore, this data would be collected in real or near real time as it is transacted and then analyzed to suggest courses of action. Analytics software is evolving to assist in this analysis." Anthony G. Picciano, "The Evolution of Big Data and Learning Analytics in American Higher Education," Journal of Asynchronous Learning Networks 16, no. 3 (2012): 9-20, at 12.
12 Picciano, "The Evolution of Big Data."
13 Picciano, "The Evolution of Big Data."
14 Picciano, "The Evolution of Big Data."
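The category analysis in the diagnostic-test example can be sketched in a few lines of Python. The question-to-category mapping and the 60% mastery threshold below are illustrative assumptions, not features of any particular assessment system:

```python
from collections import defaultdict

# Hypothetical layout: 20 questions cycling through five categories A-E.
QUESTION_CATEGORY = {q: "ABCDE"[(q - 1) % 5] for q in range(1, 21)}

def category_scores(results):
    """results maps each question id to a list of booleans (one per student,
    True if answered correctly). Returns the class's score per category."""
    correct, total = defaultdict(int), defaultdict(int)
    for question_id, answers in results.items():
        cat = QUESTION_CATEGORY[question_id]
        correct[cat] += sum(answers)
        total[cat] += len(answers)
    return {cat: correct[cat] / total[cat] for cat in total}

def weak_categories(results, threshold=0.6):
    """Categories where the class scored below the assumed mastery threshold."""
    return sorted(cat for cat, score in category_scores(results).items()
                  if score < threshold)
```

A teacher could feed exported test results into such a routine and use the output (say, a weak category D) to plan the follow-up class described above; a diagnostic assessment system does essentially this aggregation behind its dashboard.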
In a similar vein, students may be encouraged to self-identify weaknesses in their understanding. Following the same steps as described above, students could take a test and be presented with their score, usually via a dashboard. From the information offered via the dashboard, the student could then identify gaps in their knowledge and study to reduce those gaps; in the best of worlds, the students would be better prepared for exams, score better on tests, and so on.
There are also more advanced ways to work with LA. For example, through machine learning algorithms, one could pull together multiple sources of data on students' attendance and interaction on the university's LMS to make predictions about the risk of student dropout or the success or failure of courses and programs. Some of the most common data points include login information (tracking not only when students log in but also how long they interact with teaching material) and student performance and activity data on downloads, quizzes, and video views online via the LMS. All of this could also include third-party tools and data from social media. 15 Some examples of use are highlighted by Picciano, who points to the LA application developed at Rio Salado Community College in Arizona in the USA, which enrolls more than 41,000 students in online courses. Rio Salado has implemented the PACE (Progress and Course Engagement) analytics application, which automatically tracks student progress and prompts teachers to introduce interventions if needed. The PACE application emphasizes personalizing the learning experience, meaning that it helps nontraditional students reach their educational goals through programs and services tailored to individual needs. Another example is Pattern, developed and implemented at Purdue University. Pattern is a service offered to students to measure study habits and provide analytics and insights into their learning.
Furthermore, predictive analytics has been identified as improving the feasibility of effective early intervention strategies aimed at supporting at-risk students before they fail. 16 Such interventions may include specific recommendations for improvement facilitated by the mapping of student activity and student profiles. 17
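A rule-of-thumb version of such an early-warning model might combine a few engagement features into a single risk score. The features, weights, and cut-off below are invented for illustration and merely stand in for what a trained model (e.g., logistic regression over LMS clickstream data) would learn from historical outcomes:

```python
def risk_score(logins_per_week, avg_quiz_score, materials_viewed_ratio):
    """Toy dropout-risk score in [0, 1]; higher means more at risk.
    Weights and the 5-logins-per-week 'healthy' rate are illustrative."""
    engagement = min(logins_per_week / 5.0, 1.0)
    return round(1.0 - (0.4 * engagement
                        + 0.4 * avg_quiz_score
                        + 0.2 * materials_viewed_ratio), 3)

def flag_at_risk(students, cutoff=0.5):
    """students maps a name to (logins_per_week, avg_quiz_score in [0, 1],
    materials_viewed_ratio in [0, 1]); returns names above the assumed cutoff."""
    return sorted(name for name, feats in students.items()
                  if risk_score(*feats) >= cutoff)
```

The point of the sketch is only the shape of the pipeline: engagement features in, a ranked risk estimate out, and a threshold that triggers the kind of intervention recommendation the literature describes.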

Learning analytics: promises and expectations
LA applications are often presented as offering unbounded possibilities and a grand narrative of modernity that resonates with the rhetoric of techno-romanticism. 18 Such narratives have recently gained prominence and can be associated with the idea that access to and analysis of students' data will improve the quality and value of the learning experience within schools and universities. 19 More specifically, the use of LA in universities is expected to contribute "to quality assurance and quality improvement, boosting retention rates, assessing and acting on differential outcomes for students and as an enabler for the introduction of adaptive learning," 20 as well as to enhance students' success by identifying students at risk, 21 and consequently to increase organizational productivity 22 and competitiveness. 23 The use of LA is also expected to facilitate students' informed decision-making, helping them change their learning strategies accordingly, and support self-regulated learning. 24 On this note, Simon Knight et al., 25 as well as Anna Kruse and Rob Pongsajapan, 26 identify LA as offering much potential to understand how services to students and student learning can be improved. The promises and expectations associated with LA are ingrained in the novel technical developments in artificial intelligence and the higher education sector's socioeconomic landscape. As such, the progressive image of LA technologies and the emergence of the data-driven practices these technologies stimulate may be explained by the following reasons: (1) Universities worldwide are operating in an increasingly complex and competitive environment. They are expected to adjust to national and global economic changes while ensuring that their quality is relevant. 27 (2) Many universities worldwide have adopted learning management systems and are offering online education.
As a consequence of the increasing use of these digital learning platforms, it is today possible to access online data repositories and make sense of the data to intervene in current teaching practices.
(3) Learning management systems are evolving into more powerful data collection devices with the power to generate clickstream data on all student interactions online, presenting universities with means to better monitor and predict students' learning trajectories and to inform teaching and assessment strategies. 28 (4) There are today concrete opportunities for higher education institutions to "go beyond the individual student level and share their disparate datasets or even link data at a federal level which presents further opportunities for analytical insights at an even larger scale." 29 (5) With the increasing trust in data, there is a tendency to "value what we measure rather than to measure what we value." As such, it has become a common practice to "focus discussions about education almost exclusively on the measurement and comparison of educational outcomes." 30 Furthermore, access to and analysis of large-scale datasets have been suggested to be an opportunity for higher education to reinvent its business model and its decision-making processes about educational outcomes. 31 In this context, it is not surprising that LA is presented as something inherently positive and approached as an excellent means to better understand the complexity inherent to student learning and current teaching practices. 32 However, as pointed out by Paul Prinsloo and Sharon Slade, 33 the real-time automatic gathering, processing, storage, and analysis of students' data are not neutral acts. Accessing, storing, and analyzing such data entail making a series of decisions that involve ethical, legal, 34 and moral considerations. In this context, the use of LA in education is contested on several points, and we will return to those shortly.

27 Daniel, "Big Data and Analytics."

Learning Analytics and Data-Driven Practices: Understanding the Complexity of a Sociotechnical Phenomenon in Higher Education
Looking at the LA research field's development over time, we observe a gradual shift away from the technical-driven promises associated with LA toward the real and tangible complexities of introducing LA into educational institutions. In this regard, Olga Viberg et al. underscore that, while the literature in the field highlights that LA has the potential to improve learning and teaching, overall there is little evidence that shows such improvements take place. 35 The emerging data-driven practices associated with the use of LA by universities have yet to demonstrate the claimed effectiveness of these technologies for improving student learning, teaching, and, consequently, higher education quality. To date, the use of LA seems to have been realized through the extraction and reporting of student data, instead of "prescribing personalized and timely support strategies that can aid teaching quality and improve student learning experiences." 36 In this regard, Mervat Adib Bamiah et al. add that the use of big data and analytics promised in the adoption of LA by HEIs is today not fully implemented for a diversity of reasons, including "the complexity of identifying the relevant data which requires knowledge, resources and time." 37 There is also a need to develop specific competencies, such as data literacy and ethical literacy, as well as transparent and human-centered interfaces to LA applications. 38

Stakeholders in learning analytics
Despite the availability of large-scale datasets, powerful analytical tools, and cutting-edge data mining techniques, deploying LA in the higher education sector is a complex endeavor. Part of the complexity stems from the different types of stakeholders and practices involved in developing, designing, deploying, and using learning analytics technologies. 39 Questions such as: Who gets access to aggregate and communicate the data? What does all this data mean? For what ends is data collected and analyzed? Who benefits from data-driven practices? are all critical questions 40 for teachers, students, administrators, academics, EdTech developers, and university technical staff to discuss within educational communities. Such questions help clarify the various stakeholders' roles and responsibilities in the emergence of data-driven practices in higher education and map the vast array of technologies and practices involved. More precisely, following Olugbenga Adejo and Thomas Connolly, administrators, teachers, and students are acknowledged as the primary stakeholders, followed by course developers/researchers, computer/network administrators, technicians, and data analysts. They are all involved in strategizing about LA implementation and deciding about the importance, acceptance, and suitability of such analytics technologies for the students, their learning, and the learning environment. 41 In turn, the social and epistemic relationships maintained by such diverse stakeholders are configured and shaped by data, files, programs, and other resources used and shared across different systems bound to different educational practices. Within this complex sociotechnical configuration, two points of view can be distinguished. 42
On the one hand, there is the point of view focused on the HEI, most often called academic analytics, which supports institutional, operational, and financial decision-making processes 43 and is concerned with data governance. On the other hand, there is the point of view focused on the students and their rights vis-à-vis their data, which Slade and Prinsloo exemplify in their socio-critical framework for learning analytics. 44 These, at times conflicting, points of view involve a series of challenges that lie at the core of the deployment of LA applications. 45 The conflation of diverse types of techniques, applications, and multiple stakeholders' roles and mandates constitutes dynamic, complex sociotechnical arrangements that make the appraisal of ethical and moral issues and their implications challenging for education. This is perhaps one reason why so little is still understood about the ethical and moral concerns linked to the use of LA and the emergence of data-driven practices in universities.

Ethical and moral challenges concerning the use of learning analytics in higher education
Issues about the ethical and moral considerations inherent to the access, storage, and analysis of student data were introduced by Sharon Slade and Paul Prinsloo in 2013. The authors argued that, although using student data to gain knowledge about learners' behavior may be advantageous for students, instructors, and institutions, and of value for understanding students' learning and constructing didactical interventions, it raises significant ethical and moral considerations. Such considerations are connected to the location and interpretation of data, informed consent, privacy, the deidentification of data, and classification and data management. 46 Within this framework, these authors propose (1) understanding LA as "moral practice," (2) addressing students as agents, (3) acknowledging that student identity and performance are temporal dynamic constructs, (4) asserting that student success is a complex and multidimensional phenomenon, (5) putting the focus on transparency, and (6) acknowledging the fact that higher education cannot afford not to use data. 47 Compared with the many studies promising enhanced student performance through LA technologies, only a small number of empirical research studies have engaged with the ethics of LA systems. More particularly, a review of such literature 48 reflects a tension between, on the one hand, a need to find solutions via concrete guidelines that help institutions deal with the new ethical and moral challenges brought by LA and, on the other hand, a need to problematize student data and data mining techniques for higher education learning and teaching purposes.
Two examples are of interest here. One sets out to focus on an institutional perspective 49 or academic analytics 50 that engages with ethics as barriers that halt the LA field's development and the institutional benefits associated with LA. 51 The other is driven by the interest to raise awareness and engender conversations about the opportunities and concerns that come with developing data-driven practices from a student perspective. 52 Interestingly, the latter argues that producing ethical guidelines or policies on the issue alone is insufficient and points instead to cultivating ethical practices to deal with the use of LA and its effects, including risks for students and teachers in higher education. This argument makes explicit that the implementation of policies concerning the ethical use of student data is not enough. These instruments need to be carefully introduced via a communication strategy to explain "how individual subjects are affected by specific applications of learning analytics, to off-set this complexity" (cf. Open University Policy). 53
Recent work that critically problematizes the adoption of LA in higher education points to the transient, context-sensitive, and temporal character of student identity, 54 the meaning of student success in university, 55 the risk of viewing students as sources of data and passive recipients of services and tuition, 56 the risks of putting into practice decision-making structures increasingly beholden to algorithms without necessarily understanding how they work, 57 the educational triage in open distance education, 58 and the obligation to act. 59 All these themes engage with the issue of information justice, 60 which not only concerns the ideological nature of the data captured and in use but also calls for a critical stance underscoring that "just because you can access, store and analyze student data doesn't mean you should." 61

50 An example of such policy implementation is the development of a code of practice for Jisc (Joint Information Systems Committee) in the UK. Sclater's code of practice identifies no fewer than 86 distinct issues comprised in a taxonomy of ethical, legal, and logistical issues for learning analytics aimed "to set out the responsibilities of educational institutions to ensure that learning analytics is carried out responsibly, appropriately and effectively, addressing the key legal, ethical and logistical issues which are likely to arise." Niall Sclater, "Developing a Code of Practice for Learning Analytics," Journal of Learning Analytics 3, no. 1 (2016): 16-42, at 31. The Open University's Policy on Ethical Use of Student Data, influenced by Slade and Prinsloo, offers a set of guiding principles aimed to provide a framework for the ethical application of learning analytics. These principles are: (1) "Learning analytics is an ethical practice that should align with core organizational principles, such as open entry to undergraduate level study. (2) The Open University has a responsibility to all stakeholders to use and extract meaning from student data for the benefit of students where feasible. (3) Students should not be wholly defined by their visible data or our interpretation of that data. (4) The purpose and the boundaries regarding the use of learning analytics should be well defined and visible. (5) The University is transparent regarding data collection, and will provide students with the opportunity to update their own data and consent agreements at regular intervals. (6) Students should be engaged as active agents in the implementation of learning analytics (e.g. informed consent, personalized learning paths, interventions). (7) Modelling and interventions based on analysis of data should be sound and free from bias. (8) Adoption of learning analytics within the Open University requires broad acceptance of the values and benefits (organizational culture) and the development of appropriate skills across the organization." "Policy on Ethical Use of Student Data for Learning Analytics," Open University, https://help.open.ac.uk/documents/policies/ethical-use-of-student-data/files/22/ethical-use-of-student-data-policy.pdf.
51 Sclater, "Developing a Code of Practice."
52 Slade and Prinsloo, "Learning Analytics."

Looking Ahead
With learning analytics available in current institutional learning management systems, the era of data-driven educational practices may have just begun. As we have seen in this chapter, diverse stakeholders need to be involved in discussions about the potentials, benefits, and risks of capturing, analyzing, and making decisions based on the digital traces that students leave behind, often without knowing what they leave behind. In this context, important questions arise:
• Will the integration of LA help students develop critical and creative thinking and collaborative and communication skills? Or will LA merely help students to pass university assignments and exams?
• Will emerging data-driven practices in higher education enable us to better understand student learning and success? Or will they encourage the emergence of surveillance practices and the invasion of student privacy at universities?
• Will the adoption of LA in educational practices scaffold academics and institutional leadership to embrace education as a public good? Or will it become instrumental in supporting new public management ideals in higher education?
Considering the development of LA, we see it as imperative to build on a socio-critical approach 62 and engage with the multiple questions and multifaceted aspects of using such technologies in everyday higher educational practices. It is crucial to bear in mind that what is at play here is no less than the development of new data-driven, also known as evidence-based, practices. They are underpinned and configured by a set of assumptions and potential biases that need to be detected, unpacked, and seriously discussed by the entire group of stakeholders (e.g., academics, administrative staff, institutional leaders, students, information technology support, developers, companies, and lawyers). In this respect, we call for discussion regarding both the potentials and the challenges entrenched in the use of LA in current educational practices. Moreover, we suggest the following insights to organize such a necessary discussion.

61 Slade and Prinsloo, "Learning Analytics."
Educational data-driven practices are highly context sensitive
Notably, the approaches to data reflected in the LA literature seem to overlook the central role that the context of the captured data plays in the analysis and interpretation of such data. Techniques such as prediction, clustering, relationship mining, the distillation of data, and discovery with models, 63 just like applications for modeling user knowledge, behavior, experience, and knowledge domains, as well as applications for creating user profiles, trend analysis, personalization, and adaptation, 64 all imply human judgment and understanding of the particular microcontexts in which the data is imbued. 65 As frictions between microcontexts tied to different norms, values, and goals are likely to surface in student data access and analysis, it is imperative to make them explicit and negotiate them. 66 Although there is a potential benefit for institutions, teachers, and students in analyzing data in educational settings, analyses that do not consider the context of such data remain incomplete, unreliable, and potentially nonsensical. 67 Considering context in the analysis of large-scale datasets is a challenging task requiring technical competence, judgment, and care if we understand student identity and performance as temporal dynamic constructs associated with myriad microcontexts. 68

Educational data-driven practices are not synonymous with evidence-based practice
In Sweden, as we have seen in the UK and the USA, there is a growing trend toward "evidence-based educational practice." 69 In this sense, it is vital to distinguish between educational data and evidence. There seems to be confusion in the LA literature, which often draws parallels with the medical domain. 70 There is a body of knowledge and professional practice in medicine that develops based on establishing effective interventions.
In the field of education, to speak about effective teaching becomes meaningless, as "the question that always needs to be asked is, effective for what?" 71 Following Gert Biesta, what counts as effective in the educational sector is tightly related to what is "educationally desirable," which, in turn, makes educational data the object of subjective interpretation and human judgment.

67 Cf. boyd and Crawford, "Critical Questions for Big Data."
68 Slade and Prinsloo, "Learning Analytics."
69 This trend is explained by Gert Biesta, who points out that proponents of evidence-based education stress "that education is too important to allow it to be determined by unfounded opinion, whether of politicians, teachers, researchers or anyone else. They call for a culture in which evidence is valued over opinion and argue that any approach to decision making that is not evidence-based is simply pre-scientific." Biesta, "Why 'What Works' Won't Work," 4.
70 Biesta explains this distinction by putting attention to the view of professional practice that is embedded in evidence-based practice: "Central to evidence-based practice is the idea of effective intervention. Evidence-based practice conceives of professional action as intervention, and looks to research for evidence about the effectiveness of interventions. Research needs to find out, in other words, 'what works,' and the main if not the only way of doing this, so it is often argued, is through experimental research, most notably in the form of randomized controlled trials." Biesta, "Why 'What Works' Won't Work," 7.
71 Biesta, "Why 'What Works' Won't Work," 7.
Innovative educational data-driven practices are not sustainable per se
From previous experiences, we know that innovations, to be sustainable, always need to be woven into the fabric of the everyday practices of all stakeholders involved. 72 In that respect, we pay special attention to the university teachers, whose data literacy, ethical standpoints (shaped by institutional practices), and legal considerations 73 are key for making sense of the student data captured and of LA's technical potential to design sound pedagogical interventions. In the literature reviewed, it is notable that questions about the technological (i.e., data literacy) and ethical competence that, for instance, teachers need in order to harness the potential of LA are rarely even evoked. It seems to be implicitly assumed that university teachers (and the university staff concerned with the treatment of student data) will become data-literate during the LA deployment process and will engage with it without resisting or discussing its pedagogical value. 74 In this context, it is essential to ask how the deployment of LA will contribute to the professional competence development of university teachers and even whether the emergence of data-driven practices in higher education will liberate teachers or suppress them. And where will the data capturing and the data analysis stop? 75

72 Teresa Cerratto Pargman and Marcelo Milrad, "Beyond Innovation in Mobile Learning: Towards Sustainability in Schools," in Mobile Learning: The Next Generation, eds. John Traxler and Agnes Kukulska-Hulme (Abingdon: Routledge, 2016), 154-178.
73 See Magnusson Sjöberg in this volume.
74 These matters are further underscored by boyd and Crawford: "In addition to questions of access, there are questions of skills. Wrangling APIs, scraping, and analyzing big swathes of data is a skill set generally restricted to those with a computational background. When computational skills are positioned as the most valuable, questions emerge over who is advantaged and who is disadvantaged in such a context. This, in its own way, sets up new hierarchies around 'who can read the numbers,' rather than recognizing that computer scientists and social scientists both have valuable perspectives to offer." boyd and Crawford, "Critical Questions for Big Data," 674.

Conclusion
In this chapter, we introduced learning analytics (LA). We discussed some examples of LA applications, and we touched upon the emergence of data-driven practices in higher education. We mapped the complexity of LA seen as a sociotechnical phenomenon in higher education. Lastly, we discussed the following three insights intending to provoke discussion about ethical issues inherent to the emergence of data-driven practices in higher education: (1) educational data-driven practices are highly context sensitive, (2) educational data-driven practices are not synonymous with evidence-based practices, and (3) innovative educationaldata-driven practices are not sustainable per se. As a concluding remark, we identify the almost boundless opportunities provided by LA. The capacity to capture, store, analyze, and predict based on large-scale datasets represents a profound change at the level of current educational ethos in current Nordic university practices. But, as practitioners in higher education, we need to be careful what we wish for. We are convinced that the emergence of LA creates a fundamental shift in how we think about learning and teaching in higher education. So we simultaneously acknowledge that data-driven practices need to be carefully configured by the academic freedom and educational values embedded in critical pedagogy. 76