A Call for Third-Order Change in Learning Analytics
By Bodong Chen
June 6, 2019
[Disclaimer: It’s summer time, meaning time for some bold statements.]
“Any educational intervention, for the obvious, common-sense reasons mentioned above, can do harm… ignoring side effects is one of the main reasons for the perpetual wars and pendulum swings in education.” — Yong Zhao (2018)
Education often turns to other disciplines for inspiration. In medicine, precision medicine “takes into account individual variability in genes, environment, and lifestyle for each person” when treating diseases. In business, business intelligence harnesses information and analytics to improve and optimize decisions and performance. All good stuff, right?
Indeed, the emergence of learning analytics—now often referred to as ‘AI in Education’—was motivated by these very analogies. With more and richer data, as well as stronger data science methods, education is poised to better understand and support learning.
However, as pointed out by Yong Zhao, we rarely talk about ‘side effects’ in education when discussing changes and reforms. Nor are many of us talking about types of learning and learning cultures outside the ‘curriculum box’, or outside the mainstream, accountability-centric education system.
Work tagged with ‘learning analytics’ is probably facing the same challenges as earlier educational interventions, especially those related to the integration of EdTech into education systems. While precision medicine is a great idea, is “precision education” necessarily a good one? If precision medicine optimizes the conditions for combating a disease, what outcomes would precision education optimize for? And outcomes worthwhile for whom?
Different Levels of Learning
Let me first turn to the ‘learning’ part of learning analytics.
In his (2001) book Sustainable Education, Stephen Sterling delineates three levels of learning. First-order learning is about the ‘basic’ acquisition of knowledge and skills. Second-order learning deals with ‘learning about learning’ and requires reflexivity. Third-order learning recognizes the existing paradigm of learning and facilitates paradigmatic reconstruction to transform learning.
Of course, such a categorization is coarse to a certain degree. A useful exercise, though, is to scan articles from the Learning Analytics and Knowledge (LAK) conference or the Journal of Learning Analytics (JLA) to see what learning analytics research cares about when discussing learning. I would not be surprised if many, if not most, articles were found to deal with first-order learning, such as earning a passing grade in an introductory algebra course. This is not to say that passing a course is not a worthwhile educational goal. Rather, we need to think more broadly about what learning means and entails.
We need more learning analytics work that pushes our views of learning, given we are now better aided by data, AI, dashboards, and so on. We need to, for instance, look at nurturing higher-order competencies, as demonstrated by this JLA special issue. We need to look at nascent (or simply more natural) learning contexts that involve dynamic, messy interactions among learners, artifacts, information systems, and the physical world. Recent work on ‘Collaboration Translucence’ is one great example. We need to build ‘pedagogical biases’ (Scardamalia & Bereiter, 2008), biases conducive to fresh ideas of learning, into learning analytics applications to combat the traditional fixation on first-order learning and to help educators unlearn and relearn.
All in all, to realize its full potential, discourse around learning analytics should not be predominantly about first-order learning. While the field is progressing steadily (e.g., making its way into the board rooms of provost offices), we should probably set bolder educational goals for learning analytics.
Different Aims of Education
“I believe that education, therefore, is a process of living and not a preparation for future living.” — John Dewey, My Pedagogic Creed
Of course, we do not agree on the aims of education, just as we do not agree on the meaning of learning, or on whether pineapple is a legitimate pizza topping.
When discussing, designing, and integrating learning analytics at an educational institution, we need to recognize that the education system is multi-functional and carries a mix of very different aims like the following (Sterling, 2001):
- Liberal: To develop the individual and his/her potential
- Socializing: To replicate society and culture and promote citizenship
- Transformative: To encourage change toward a fairer society and better world
- Vocational: To train people for employment
There are fundamental tensions among these educational aims, fueling debates such as whether education ought to be intrinsic (an end in itself) or instrumental (a means to an end).
Echoing my earlier criticism of the fixation on first-order learning, I also caution against a predominant emphasis on the vocational aim of education. Graduation rates and employability are indeed important, but only insofar as they serve individual and societal well-being. Without attending to the other aims, the vocational aim would almost certainly fail to stand on its own.
A Call for Third-Order Change
If learning analytics is serving a narrow band of learning and educational aims, it may lead to changes—but most likely only at the functional level.
In learning analytics, much work serves the functional realm of education (not learning), revolving around the current institution and its established systems. Earlier attempts to distinguish learning analytics from academic analytics were well-intended but futile. It seems academic analytics would naturally focus more on the functional side, while learning analytics could truly dig into complex processes of learning using sophisticated data science methods. Of course, both tracks are progressing. But I observe a chasm between people whose jobs sit on the functional side and those who care about the learning side.
Few learning analytics efforts I witness aim at making a third-order change; that is, harnessing data analytics to see learning differently. Startups may conveniently claim to totally transform learning or disrupt education, but unfortunately many of their claims fall flat. In the adaptive learning space, for instance, we already know the story of Knewton. And we now have SmartSparrow and Squirrel AI. A problem with many “disruptive” technologies is that on Day One they align themselves with a dated educational paradigm and never look back. Still too often, EdTech products harnessing learning analytics merely contribute to the functional and managerial aspects of education, such as tracking student progress in a prescribed curricular space. In such cases, third-order change can hardly occur.
And yet, the side effects of the managerialist use of analytics in education cannot be neglected. We were warned long ago that the managerialist approach tends to corrode what a ‘good education’ could be (Sterling, 2001), leading towards:
- a narrowing of what counts as achievement to that which can be measured
- deprofessionalization of teachers, who become technicians rather than reflective practitioners
- decline in teacher-led innovation
- a creeping up of marks over time as it is in everybody’s interests to demonstrate ‘high standards’
- valuing what can be measured, rather than measuring what is valued
To move towards third-order change, analytics in education needs to broaden its scope from perfecting functional/managerial/mechanistic tasks to considering complex, emergent processes in the ‘ecosystem’. At a basic level, a learning analytics application needs to be clearly positioned within the existing institutional structure in relation to the other ‘agents’ in place, namely people and information systems.
- Looking inward at the application: rather than focusing on single variables, it could focus on a set of relations and the whole; rather than powering external evaluation of student learning (or “turning students into numbers on a teacher dashboard”), it could aim to support self-assessment and self-discovery by students; rather than striving for normative values and homogenization, it could nurture heterogeneity among learners in a class.
- Looking outward from the application: we need to design for a meaningful social life of learners, and to design interactions between the application and other information systems as well as ‘locations’ of learning (Edwards & Usher, 2007).
To a certain degree, I’m advocating for some type of ‘ecological design’ of learning analytics. It’s certainly vague right now. It is clear, though, that the field of learning analytics needs to develop a set of principles and toolkits, things like the feminist chatbot design approach, to help us achieve third-order change. How can we get there?
References
- Edwards, R., & Usher, R. (2007). Globalisation & Pedagogy: Space, Place and Identity. Routledge.
- Scardamalia, M., & Bereiter, C. (2008). Pedagogical biases in educational technologies. Educational Technology, 45(3), 3–11.
- Sterling, S. (2001). Sustainable Education: Re-Visioning Learning and Change. Schumacher Briefings. Green Books for the Schumacher Society.