Real Programmers Don’t Use ChatGPT

Once upon a time, writing instructions for a computer meant understanding its innermost workings - you wrote code in the machine’s own binary language, and the interface was cards with holes punched in them. That meant writing (and therefore, thinking) in machine code.

For a sense of what that could look like, search for “The Story of Mel, a Real Programmer”. Or for a similar kind of story that’s maybe a little more accessible, “Real Programmers Don’t Use PASCAL” - from an age where it made complete sense to say that no “real programmer” would ever talk to a computer using a mouse.1

As time went on, tools and languages were developed to make writing software easier, computers got faster and easier to use, and the kind of efficiency in writing code that really mattered in the 1970s didn’t matter as much any more. (Why bother spending weeks and months making your code run 50% faster when, in the next 18 months, your new computer is going to run 200% faster?)

So, you would get ‘simpler’ programming languages that would essentially generate the machine code for you. If I want to write a quick and easy program, I’ll probably do it in a language like Python. My Python code gets interpreted - essentially, converted into machine code as it runs (as opposed to a ‘compiled’ language, where the source code gets converted into machine code before it runs). But Python itself is written in a language like C - which is much ‘closer to the metal’. (My Python code can run on any computer that runs Python, but the actual Python application on my M1 Mac wouldn’t be able to run on a Windows computer - you’d need the Windows version of Python to run my Python code on a Windows machine.) Yet I can write Python code without having to understand anything about the layers of code running beneath it. (Sure - it can help if I want to do something like multithreading; my point is that I can get by without knowing those layers even exist. I write Python, I run it; it runs.)

So, all of these different coding languages would stack up on top of one another; my Python code gets converted to something my operating system can understand, which gets converted to something the actual chips in my computer can do something with.
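
To make that layering a little more concrete, here’s a minimal sketch - an illustration, not anything from a real project - using Python’s built-in dis module, which shows the bytecode that the CPython interpreter (itself a program written in C) runs on my behalf; a layer I can happily ignore day to day:

```python
import dis

# A trivial function - the kind of Python I actually write.
def add(a, b):
    return a + b

# dis.dis() prints the bytecode instructions that the CPython
# interpreter (written in C) executes when add() is called -
# one of the layers sitting between my code and the chips.
dis.dis(add)
```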

Another example: when I learnt to make iPhone apps (in the early days of the App Store), you needed to specifically allocate chunks of memory to be used by a particular process, and then tell the app when that memory was finished with so it could be re-allocated to another process. If you didn’t, that memory would stay locked up and could never be re-used, and you could end up with an app demanding more memory than the phone had available - at which point your app (if not your phone) would crash. If you started learning how to make iPhone apps today, you wouldn’t have to deal with that sort of thing, because there are newer systems in iOS that - to some extent - take care of it for you.
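
Python never asked me to do that kind of bookkeeping - the runtime counts references and reclaims memory once nothing points at an object any more, which is loosely analogous to what iOS now does for app developers. A minimal sketch (ImageBuffer is a made-up stand-in, not a real API):

```python
import weakref

class ImageBuffer:
    """Made-up stand-in for a chunk of memory an app might hold on to."""
    pass

buf = ImageBuffer()
probe = weakref.ref(buf)       # lets us check whether the object is still alive

print(probe() is not None)     # True - something still references the buffer
del buf                        # drop the last reference...
print(probe() is None)         # True - the runtime reclaimed the memory automatically
```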

A more recent example: I started learning how to make web pages and websites about 20 years ago; that meant learning HTML, CSS and JavaScript as a bare minimum - a working knowledge of Unix and SQL was also helpful (because to have a website, you would need a server to run it on). Languages like PHP made it much easier to build bigger websites, then frameworks like WordPress or Drupal, themselves written in PHP, made it even easier. Today, anyone can build a website without ever needing to understand what an HTML tag is; services like Squarespace, Wix etc. deal with a whole bunch of problems for you, and for a few bucks a month, you don’t even need to know that those problems exist.

So - today, we’re starting to see a backlash to the AI hype that, to me, looks like the same sentiment as “Real Programmers Don’t Use PASCAL”. Today, I can tell a Large Language Model what I want to do in fairly plain English and it can generate the code that will do it. Is that fundamentally any different to telling my computer in a human-readable language what I want it to do, and the computer turning that into machine code? Maybe today it isn’t quite the same - but it is still early days. (Today’s AI is the worst it will ever be...)

So, the idea that large language models will ‘write the code’ and make coding skills irrelevant isn’t really anything new. Disruptive - sure. But writing instructions that a computer will convert to a different set of instructions is just a continuation of a trend that goes back to the 1970s - if not further.

I like the idea that “the programming language of the future will be English” - which isn’t to say that an understanding of the underlying code won’t still be useful. But as someone writing high-level code that, at least 90% of the time, I don’t really expect to run on anything other than my own computer, I’m not sure how useful that sort of understanding is ever going to be to people like me. I know what regular expressions can do - but it’s a lot easier to explain to ChatGPT what I want a regular expression to do than it is to write one myself. Writing Python (or Ruby, or PHP) has always involved a certain amount of Googling, reading Stack Overflow, or reading books - today, I just have a ChatGPT window open (and a GPT-powered Copilot in VSCode).
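
For what it’s worth, here’s the flavour of thing I mean - a made-up example, not one from a real project: a pattern for pulling ISO-style dates out of a block of text, which is quicker to describe to ChatGPT in a sentence than to write and test by hand:

```python
import re

# "Find dates written like 2024-03-17" - the kind of request I'd rather
# phrase in English than translate into a pattern myself.
date_pattern = re.compile(r"\b\d{4}-\d{2}-\d{2}\b")

text = "The invoice was issued on 2024-03-17 and paid on 2024-04-02."
print(date_pattern.findall(text))   # ['2024-03-17', '2024-04-02']
```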

I'm glad that I've learned how to write code, and I'm sure that knowledge is going to be useful for the next few years. I'm just not sure if it's something worth teaching children any more.

  1. Also, written at a time when it was completely uncontroversial to compare "real programmers" to "real men" - the title is a reference to Real Men Don't Eat Quiche, a book that satirised masculinity stereotypes. I'm not sure how much of the satire would have been recognised by its readers at the time, though.