From little Acorns – a brief history of computer games in the classroom

This article by Rhys James Jones, Senior Lecturer in Digital Media, Swansea University, was first published on The Conversation on Friday, June 9.

Play has always been central to growing up – whether it’s in the street, on a playing field, or in the structured formality of teachers’ quizzes.

These days, tablet computers are in nearly every pupil’s hands and children learn through computer games – both in the classroom and at home. Children’s coding initiatives and tiny computers such as the Raspberry Pi and the BBC’s micro:bit have also become big hits in the education world, helping to teach computer basics in playful ways.

But while it’s tempting to see the gamification of education as a new development, there is in fact a long history of children using computer games to help with their learning – which goes right back to the 1970s.

This was the decade during which computers first inched out of the research lab and into everyday life, bringing the idea of a home or personal computer closer to reality. In 1974, Ted Nelson, a US pioneer of information technology, wrote what is often considered “the first personal computer book” – Computer Lib/Dream Machines.

It was in this book that, with uncanny foresight, Nelson suggested pupils of the future would use hyperlinked documents and touchscreens to widen their knowledge.

Away from Nelson’s speculation, the classroom reality was more mundane. Few schools could afford computers of their own. And for those that could, computer science meant punching instructions onto paper tape – a form of data storage consisting of a long strip of paper in which holes are punched.

But in the late 1970s, something changed – at least in the UK. A worried government, concerned about Japanese innovation and the threat of automation, commissioned a report from the Manpower Services Commission (MSC) and the BBC to look into how to develop computer literacy initiatives.

Designed to raise computer awareness, these initiatives happily coincided with the rise of microprocessors, which were enabling the manufacture of cheaper, smaller machines. The BBC invited UK companies to submit proposals for a Microcomputer System built to a predefined specification. A system proposed by a young company in Cambridge was chosen, and Acorn’s BBC Microcomputer was born.

Simpler and faster

The BBC Micro, along with some other machines, could be bought by schools at half price via government subsidies.

Their beige cases and red function keys became a familiar sight from primary through to university level. But they were still expensive: a discounted “Beeb” and monitor would cost a school more than £1,000 at today’s prices.

Learning to program was simpler and faster on the Beeb than in years past, with paper tape replaced by a monitor and a beginner’s coding language known as BASIC – which stands for “Beginner’s All-purpose Symbolic Instruction Code”. This meant many more students were able to engage with computer science.
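For those who never saw one, here is a minimal, illustrative sketch – not taken from any real title – of the sort of BBC BASIC program a pupil might have typed in:

10 PRINT "WHAT IS 7 TIMES 8";
20 INPUT ANSWER
30 IF ANSWER = 56 THEN PRINT "WELL DONE" ELSE PRINT "TRY AGAIN"
40 END

A few numbered lines like these could be typed, run and tweaked in seconds – a far cry from punching holes in paper tape.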

The rise of the games

Then there were the games. Despite excellent arcade clones and some true originals, the cost of the unsubsidised Beeb made it difficult for the machine to gain a foothold as a home videogame system.

Perhaps its educational image didn’t help either – and maybe it never quite shook off what comedian Simon Munnery described as “the stench of school … most of the games would be … Isn’t Geography Nice?”

The Beeb’s dominance in schools led to a torrent of educational software of varying quality. Indeed, many of these early educational “games” aided nothing more than rote learning.

But educational initiatives helped push the boundaries, particularly in science and maths. The best-remembered games were imaginative, often programmed by teachers themselves, and learning happened by stealth. For example, the fondly recalled Granny’s Garden, while limited, took players on a puzzle-solving journey to avoid traps and witches – all rendered in teletext graphics.

Adventure was also at the heart of L: A Mathemagical Adventure, which drew on shades of Lewis Carroll to build players’ numeracy skills while encouraging them to reflect on their experience.

Straddling home and school, Acorn’s software arm, Acornsoft, used surprisingly entertaining artificial intelligence techniques to test and extend subject-based learning. Younger, newly literate learners could encounter Podd, a floating head that performed various actions at their typed command.

But in the 21st century, it’s not just learning, but the whole education system that has become a game – as schools, classes, teachers and students strive to top the board in league tables and PISA rankings. At the same time, teachers’ unions, children and parents all argue against excessive assessment and testing.

Maybe, then, we should all learn from the classroom videogame pioneers of the past few decades. Although it’s clear that game-based learning has a role to play in education, there still needs to be a point to it – within a wider context.

And while educational games can be creative and innovative, they are at their best when they don’t smell too much of school.


This article was originally published on The Conversation. Read the original article.