What role should A.I. play in education?

By Bodong Chen in blog

March 15, 2022

Listening to podcasts has become a new habit of mine during the pandemic: while cooking or doing dishes, on the way to pick up my toddler, or when my eyes need a break from the screen. Last week, I listened to an interesting episode of the EdSurge Podcast titled What Role Should AI Play in Education? A Venture Capitalist and an EdTech Critic Face Off. The episode features a discussion between edtech critic Neil Selwyn and venture capitalist Ryan Craig. I thought the conversation was fantastic in many ways and encourage you to give it a listen.


(Photo credit: Pixabay)

I enter this conversation as someone who does design research in close collaboration with teachers and students. I do not see myself as an expert in AI in education, but I follow many respected colleagues in AIED, learning analytics, and educational data mining who are doing important work on this topic and putting together special issues like this one that are shaping the discourse.

There were a few issues raised in the episode that caught my attention.

1. “Is this really AI?”

To a certain degree, this question is asked because EdTech companies face a strong temptation to market their products as AI-capable. It is an intelligent response to a widely used marketing strategy deployed to make products more attractive to buyers who do not understand, or do not have time to understand, the technical details. But the more fake AI tools are spotted, the more distrust builds, prompting this very question: “Is this really AI?”

But this may not be the only reason people ask the question. Another is that the educational benefits promised by the products are underwhelming to end users. The full question might be “Is this really AI, cuz it’s not that smart to me?” Designing AI products in education is probably less about engineering the smartest AI capabilities and more about putting AI in the right spot to augment educational practice. In other words, it’s important to ask:

  • What education problems are solved by the product?
  • Whose problems are solved by the product?

As Jeremy Roschelle wisely suggests,

Start from what is good teaching and learning. And not from what AI can do for me.

2. How can educators be involved?

While educators may know a great deal about good teaching and learning, unfortunately, they are often not involved in the design process, creating a disconnect between products and the realities of education.

Of course, many AI startups would claim that their tools are backed by “the science of learning.” But that very claim disregards, or disrespects, extensive work showing the difficulty of transplanting or scaling innovations in education. Here, a Silicon Valley value system is married to a positivist research paradigm, transforming a research-practice disconnect found in some educational research into a scaled-up version powered by AI technologies.

Okay, I respect post-positivist research and understand that not all companies or tool developers work this way. There are probably many amazing teams out there trying very hard to make their AI products human-centered. However, given that so much of the hype around AI is driven by capital, a profit-oriented value system and a focus on efficiency may discourage work that takes time and relies on partnership building and co-design.

The same can be said about the current rush to “revolutionize” or “disrupt” education through the metaverse. While educators may have a spot at the table, simply having a spot is unlikely to be enough if we do not proactively work through the tensions between the value systems subscribed to by different groups of people.

3. How should we think about education being “slow”?

In an earlier episode of the EdSurge Podcast (sorry for so many mentions of the podcast), Larry Cuban made an interesting comment about education being slow. “Is being slow really a bad thing?” Not necessarily, when we recognize learning as a gradual process, when we see students as whole persons, and when we nurture connections between schools and society.

By the same token, it’s probably okay for education to be slow in adopting technological innovations, especially when we recognize the infrastructuring work that negotiates human-technology relations and seeks to meaningfully reconfigure the sociotechnical conditions of new practice.

Maybe this question applies to the broader society as well. Maybe we do need software to help us slow down.
