Graham Brown-Martin – What does AI mean for Education?

Graham Brown-Martin asks: Will learner-centred AI be banned from classrooms like smartphones?

So what is all this AI related news that we keep reading about in the media and how it’s going to take all our jobs?

Well, here’s what I think. Western society is in transition as it reaches the logical conclusion of an industrialisation process that began around 1760, in which we have built smarter and smarter machines that replicate and replace anything we can measure: railroads and cars replacing horses, factory machines transforming craft production into mass production. Whilst a future containing sentient machines and strong AI is uncertain, what is certain is that the computer processing capability to provide natural language understanding (NLU), instantaneous fact recall and simulated problem-solving is, relatively speaking, just around the corner. Perhaps no more than 10 years away.

Whilst some of this will appear to be intelligent, it will be like a sophisticated chess computer, i.e. it will still be what boffins call “weak AI”. But tasks that rely on measurement, rapid fact recall and analysis will be replaced by AI. In a sense, the last couple of hundred years of industrialisation, and the capitalism upon which it was built, have been leading to this moment.

The AI that we read about in sales brochures, promotions or news broadcasts is just algorithms that simulate intelligence very powerfully, often based on huge datasets. Soon almost everything you buy, at least digitally speaking, will boast about its AI capability, but let’s be absolutely clear about this: it’s a simulation, with all the same biases as anything else that originates from the human mind and hands.

When we hear about AI being used for education and learning, and we’re going to be bombarded with this in every sales presentation any time now, it’s from the perspective of last century’s understanding of what school and education are for. What I mean by this is that AI-assisted technology will be designed to process students towards passing tests and, where possible, to replace teaching staff.

Given the volume of teachers leaving the profession, as well as the substantial increase in demand for teachers as the world attempts to meet the UN’s Sustainable Development Goals (SDGs), replacing teaching staff with machines seems like a pretty good bet. Machines don’t unionise, don’t get sick, don’t suffer from stress, don’t need a salary and are 100% consistent in delivering a curriculum and testing assimilation. What’s not to like?

Ed Rensi, the former CEO of McDonald’s, has already suggested at a shareholders’ meeting bringing in robots as thousands of McDonald’s workers demand a union and a $15-an-hour minimum wage.

In the education world we’re already seeing commercial organisations, for example Bridge International Academies, actively pursuing strategies against teaching unions in countries where they can get away with it. Some argue that this is to disrupt the status quo, whereas others argue it is a way of reducing both costs and the quality of provision.

Recently a university teaching assistant was replaced by an AI without students even noticing which, in my opinion, speaks volumes about our outdated approach to education.

But let’s look at this from a different perspective.

What if the AI was student-centred and every student had their own personal AI that they had grown up with and learned to work with to solve complex, abstract problems?

Well, I’m pretty sure that such a technology would, like smartphones today, be banned from many of our classrooms and definitely from the examination hall. Yet the world that these same students, the ones in school today, will join will be one augmented by extremely sophisticated “weak AI”: AI that can understand and respond in natural language, and that can retrieve facts and information and analyse them far faster than any human mind.

As humans, our only advantage over these machines is that we do, in fact, possess “strong AI”, and yet we have an education system that demands we compete with the “weak AI” of machines. To me, this just doesn’t make sense.
One has to ask at what point do we “jump the chasm” and accept that students will be growing up with personalised AI systems that will help them navigate their world and massively amplify their problem-solving capacity. Or will we continue to pretend that this century and its affordances stop at the school gates so that we can tacitly maintain the business models of corporations from a bygone era?
