Discussion and debate on AI have ranged from the utopian to the apocalyptic. Technophiles look forward to a future in which AI-equipped robots perform all the tasks human beings would rather not. A minority of AI sceptics fear a takeover by Terminator-style androids bent on our destruction. A more plausible downside scenario is one in which AI simply leaves many human beings with too little to do.
For better or worse, it is safe to say few areas of life will be left untouched by AI. Education is no exception, and education thinkers are split on whether AI will improve or harm children’s learning. Many commentators who wholeheartedly embrace AI in education tend, implicitly at least, to hold a “21st century learning” (21CL) perspective. One of the hallmarks of 21CL is its proponents’ enthusiasm for adopting new technology in education. Other features of the approach are the promotion of student-led learning and the de-emphasis of traditional school subjects in favour of so-called 21st-century skills, such as critical thinking. While critical thinking is indeed a vitally important skill, it depends on knowledge of the kind learned in the traditional subjects.
It is unsurprising that generative AI appeals to 21CL thinkers. It is the latest thing in information technology. It appears to offer an unprecedented opportunity for students to direct their own learning. And it is a plausible alternative to students personally holding knowledge.
In the latter respect AI is not greatly different from the internet. For years, almost any knowledge has been just a few mouse clicks and keyboard strokes away. But AI makes accessing knowledge conversational, which is engaging for students.
Furthermore, AI can write much better than just about any student – and for that matter, just about any adult. Some 21CL thinkers have claimed that AI obviates the tedium of learning to write well, leaving students free to focus on critical thinking and other 21st-century skills.
Since the turn of the millennium, many English-speaking countries have adopted curricula based on 21CL ideas. These curricula emphasise critical thinking but de-emphasise knowledge. New Zealand’s current curriculum, published in 2007, is an example. The knowledge it specifies is very thin. Instead, it places great emphasis on five “key competencies”, one of which is thinking.
There is nothing unique to the 21st century about thinking, of course. Human beings have been doing it for tens of thousands of years. What is novel about the 21CL take on critical thinking is the implicit assumption that children can be taught critical thinking with minimal personally held knowledge.
According to the curriculum, “Thinking is about using creative, critical, and metacognitive processes to make sense of information, experiences, and ideas.” It goes on to say that “competent thinkers and problem-solvers actively seek, use, and create knowledge”.
The curriculum acknowledges that thinkers “seek” and “use” knowledge. Nowhere, however, does it make clear enough that, to be critical thinkers in any meaningful sense, we must hold a great deal of knowledge personally. And the lack of specified knowledge in the curriculum sends a message that personally held knowledge isn’t all that important.
The ability to write is also an important contributor to clear thinking. Thinking occupies short-term memory, which has a very limited capacity. By committing thoughts to text, rather than holding every relevant idea in short-term memory while we develop them, writing allows us to partially overcome that limit and cope with far more complexity.
The paramount risk of AI to education is that it may be used as a substitute for teaching students knowledge and writing skills. If that happens, it will undermine rather than assist students in learning to think critically.
But 21CL thinkers are not entirely misguided about the potential for AI to contribute to education. Using chatbots to access knowledge may well be more enjoyable and engaging than looking it up on Wikipedia or – and I’m showing my age here – reading books. But we must ensure educational AI is trained to engage with students in a way that results in them building their own stores of knowledge.
Similarly, AI could provide great personal tutors to help students learn to write. Rather than producing an essay from scratch on a student’s behalf, AI could give valuable feedback on drafts. For example, it could identify errors, weak arguments and plagiarism, and coach students on how to make improvements.
A good principle for using AI in education is not to allow students to use it as a substitute for learning knowledge or skills themselves, where that knowledge or skill is an important foundation for later learning. If AI is used to provide feedback to students, rather than to do things for them, it could be a powerful aid to learning and a support for human teachers in their work. Used in that way, AI can make a strong contribution to the development of students’ critical thinking.