It’s funny how programmers, coders, and hackers in Hollywood movies are always made to look somehow not quite normal. I think the most relatable character I’ve seen so far has been Benedict Cumberbatch’s portrayal of the mathematical prodigy Alan Turing in The Imitation Game (some normal guy…), with the rest being generally very over- or underweight outcasts who are brilliant at what they do but seemingly inept at basic human interaction.

This sort of misrepresentation reflects the common (not to say ubiquitous) prejudice that coding is simply not for everyone; and it’s not for everyone – so the refrain goes – for the simple reason that it is hard. Really hard.

Here at WBS Coding School we have seen countless people who do not fit the Hollywood mould graduate from our courses and go on to work in tech, so we already know the above myth is just a myth. Yet the question lingers – why does coding look so hard? It’s not a simple discipline to master by any stretch, but neither is it a thing reserved for number-wizards with IQs over 140. Then why do people perceive it that way?

There are various answers to this question, but I’m going to make the argument that they all effectively boil down to the same reason. Let’s list the possibilities together, and you’ll see what I mean.

The first reason coding looks hard is that it is written in its own language, one so alien in its letters and symbols that it might as well be in another alphabet. Brackets, apostrophes, inverted commas, exclamation marks, and arithmetical operators all have different meanings in programming languages than they do in everyday writing (to say nothing of compound oddities like, say, Python’s “walrus operator”). Newcomers confronting a simple page of code could be forgiven for feeling they are staring at a slab of hieroglyphs.
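To make that concrete, here is a minimal Python sketch (the snippet is purely illustrative; any modern language would serve just as well): the colon-and-equals of the walrus operator, the exclamation mark in !=, and the braces inside the quoted string all press familiar punctuation into unfamiliar service.

```python
# Familiar punctuation, unfamiliar meanings: a tiny illustrative example.
numbers = [3, 1, 4, 1, 5, 9]

# ":=" (the walrus operator) assigns a value and uses it in the same expression,
# and "!=" means "not equal to", not an exclamation.
if (count := len(numbers)) != 0:
    # Quotes delimit text, and the braces splice a value into the middle of it.
    print(f"The list contains {count} numbers.")
```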

This language (or, more aptly, these languages, for there are many) seems so foreign because it isn’t designed for humans to communicate ideas to one another. It’s made for computers to execute commands.

The above point blurs into the next: programming looks so hard because it is so intimately related to computers. It is sometimes easy to forget how new an invention these devices are. Our grandparents didn’t use them at all. Our parents used only a tiny fraction of the functions we use today. And they are not simple tools – understanding how a computer works requires some fairly advanced concepts in mathematics, electrical physics, and engineering.

As though learning a new language and working closely with computers didn’t sound hard enough already, there’s also the fact that programming appears to require a different way of thinking from the one we normally use to solve problems. There is some truth to this, although contrary to what many believe it’s not about learning to think smarter – it’s about learning to think dumber, which is a lot harder than it sounds. Computers are, in many if not most ways, extremely stupid, and learning to code is largely about learning to think like a computer. That means switching off the many parts of the brain that do clever things for us on autopilot, and approaching a problem in terms of steps and instructions so basic we normally don’t even notice them, as the sketch below illustrates.
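Here is a small Python sketch of a made-up everyday task, finding the largest number in a list, written out as the kind of tiny, explicit steps a computer actually needs (the task and variable names are just illustrative):

```python
# Thinking "dumber": spell out steps a human would never consciously perform.
numbers = [12, 7, 45, 3, 28]

largest_so_far = numbers[0]       # 1. assume the first number is the largest
for number in numbers[1:]:        # 2. look at each remaining number in turn
    if number > largest_so_far:   # 3. compare it with the current best guess
        largest_so_far = number   # 4. if it's bigger, remember it instead
print(largest_so_far)             # 5. only now report the answer: 45
```

A person glances at the list and simply “sees” the answer; the computer has to be walked through every single comparison, one at a time.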

Ironically, few of these apparent obstacles reflect the real challenges of learning to code, which have more to do with persistence, precision, and a willingness to keep updating one’s knowledge. Even more ironically, diverse as they appear, these obstacles all stem from the same underlying problem.

The real (and probably only) reason coding looks hard is that it wasn’t part of your education as a child.

Think about it. Is any alphabet so difficult that only geniuses can write in it? No; foreign alphabets look inscrutable to us as adults, but we all learned equally complicated alphabets as children, and Hollywood doesn’t stereotype literacy as a mark of incredible brilliance. Are computers hard to use? They can be, but look at the digital literacy of children today and you’ll see what a difference growing up with a tool (and its newest functions) makes. Does programming require you to think differently? Sure, but so does geometry the first time you encounter it, and kids start learning about squares and triangles in primary school.

Programming, like the computer itself, is very new. It appeared in the job market long after we had established our education systems, so it is only natural that the latter should still be lagging behind the former. Indeed, most programmers only begin writing actual code when they reach university, often as an ancillary part of some other STEM subject. Before that, many have no idea what programming is really about.

Make no mistake, coding can be hard. It’s a serious professional discipline that demands dedication and mental agility. But the stereotype that only the world’s smartest can do it is a vacuous cultural construct, one that has its roots in the fact that almost none of us encountered coding in any way, shape, or form as children. Like reading and writing, it’s something almost anyone can do, but which feels vastly more complicated and unintuitive if you first try to learn it as an adult.

If you want to learn to code, then think about it the way you used to: as a child.