Why children should NOT be taught to code

There is growing enthusiasm for the idea that children should be taught digital coding. Yet what assumptions is this based upon, and how valid are they?

In October this year, the BBC will be providing a free pocket-sized computer called the Micro Bit to all one million incoming secondary school students in the UK. With the support of organizations like Code Club and Coder Dojo, children will be taught to write basic computer programs using the BBC’s Make It Digital website. According to the BBC, the Micro Bit will address the UK’s ‘digital skills gap’ and enable young people to ‘express themselves digitally’.
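To give a concrete sense of what such ‘basic computer programs’ look like, here is a minimal sketch in MicroPython, one of the languages the Micro Bit supports. It is purely illustrative, and not taken from the BBC’s Make It Digital materials.

```python
# A minimal Micro Bit sketch in MicroPython (illustrative only):
# scroll a greeting, then show a heart whenever button A is pressed.
from microbit import *

display.scroll("Hello")

while True:
    if button_a.was_pressed():
        display.show(Image.HEART)  # built-in image on the 5x5 LED grid
        sleep(1000)                # pause for one second
        display.clear()
```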

With the BBC about to undergo a root and branch review driven by a hostile Culture Minister, the Micro Bit might seem like a better investment than further inflated salaries for its executives or a set of new swivelling chairs for The Voice. Personally, I’m not so sure.

Initiatives like the Year of Code, the Hour of Code, Codecademy and Code Clubs have been accompanied by a tsunami of hype. In the UK, the existing ICT curriculum has been refreshed by an injection of coding, although this is a task for which few teachers (especially in primary schools) have been adequately prepared. Coding – or what we used to call programming – has become yet another educational technology whose time has come.

In fact, computer programming in schools is far from new. I can personally recall similar initiatives when the first computers arrived in schools back in the late 1970s. Indeed, the BBC is trumpeting its Micro Bit by comparing it with the success of the BBC Micro, which it claims inspired the early computer game designers of the 1980s.

Yet somewhere along the line, teaching programming was largely abandoned, for reasons that would be interesting to review. Ben Williamson – always a reliable commentator on such matters – has discussed the complex history of such developments, suggesting that different aspects of computing have been blurred together. Coding, he argues, is far from being the same thing as computer science: it was historically seen as a fairly low-level skill.

The contemporary argument for teaching coding seems to be two-fold. The first aspect takes us back to the 1970s, when MIT Professor Seymour Papert developed the programming language LOGO and the Turtle device. For Papert, programming was a means of teaching logical or ‘procedural’ thinking, especially in the context of mathematics. However, his claims about the value of this ‘constructionist’ approach became progressively inflated over time.
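For readers who never encountered LOGO, the flavour of Papert’s approach survives in Python’s standard turtle module, a descendant of the LOGO Turtle. The classic exercise of drawing a square by repeating ‘move forward, turn left’ is the kind of task Papert saw as practice in procedural thinking; the sketch below is a generic illustration, not a reproduction of his materials.

```python
# Drawing a square with Python's turtle module, in the spirit of LOGO's
# "FORWARD 100 LEFT 90" exercises (illustrative sketch).
import turtle

t = turtle.Turtle()
for _ in range(4):      # a square: four equal sides, four right-angle turns
    t.forward(100)      # move 100 units in the current direction
    t.left(90)          # turn 90 degrees anticlockwise

turtle.done()           # keep the window open until it is closed
```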

I’ve written about this at some length in my book Beyond Technology. The argument depends upon assumptions about learning transfer – the idea that learning in one context will automatically transfer across to others. This is to conceive of the brain as a kind of muscle: a good workout in the coding gym will have payoffs when we need our logical thinking skills to solve problems elsewhere. Similar claims are often made for learning the game of chess, or Latin. Yet there is no convincing evidence that learning computer programming enables children to develop more general problem-solving skills, let alone that it will ‘teach you how to think’, as its advocates claim.

The second argument for teaching coding is an economic one. In the UK, the key document here is the Next Gen report by Ian Livingstone and Alex Hope, published in 2011 by Nesta, the government’s innovation think tank. The report drew attention to an apparent skills shortage in the area of video games and visual effects, and recommended that it should be solved by introducing compulsory Computer Science courses in schools, to replace the existing ICT curriculum.

In revisiting this report, it’s worth emphasising that Livingstone and Hope are both creative entrepreneurs in the media industries, rather than computer scientists. They argue that Computer Science should be regarded as a core STEM (Science, Technology, Engineering and Mathematics) subject; but actually their core interest is in creative areas of media like video games, animation and visual effects.

The idea that compulsory Computer Science will create employment is certainly dubious. Computer Science graduates routinely top graduate unemployment tables. The reasons for this are complex. Black and ethnic minority students make up a much higher proportion of computing graduates than of graduates in other subjects, and they are generally more likely to be unemployed than white graduates – thus pointing to some of the limitations of attempts at ‘widening access’.

However, another factor is that the technology industry has increasingly moved many of its operations offshore – especially in the case of lower-level jobs like coding and entry-level posts that would be accessible to new graduates.

Even if there is a ‘skills shortage’ in the technology industry, it’s certainly debatable whether coding is one of the key skills that is lacking – especially if (as the enthusiasts frequently claim) coding is so easy to learn. Indeed, the idea that there is a shortage of STEM graduates more generally is also questionable: at least in the US, there are more STEM workers than available jobs.

If the government wants the UK to become a leading player in the global technology business, it may be much more in need of creative entrepreneurs than programming drones. Likewise, if we want to revive our flagging position in the games design industry, we need imaginative scenarios and compelling characters, not just lines of code.

The title of my post is deliberately provocative. If some children want to learn coding, that’s fine. Coding is probably more interesting – at least for some – than so many of the other things educational policy-makers have been shoving down their throats for the past twenty years. It’s certainly more interesting than the ICT curriculum – that wasteland of spreadsheets and file management – that it seems to be replacing. And if it enables young people to ‘express themselves digitally’ (whatever that means), then perhaps it will be worthwhile.

Yet what’s self-evidently missing here is any more critical understanding of technology, and its role in society, politics and culture. Without this, compulsory coding would seem to be just another way of disciplining children, or wasting their time.