> The use of anthropomorphic terminology when dealing with computing systems is a symptom of professional immaturity.
Fuck all the way off with this "truth"! So much coding is made more enjoyable and profitable by people having fun like this. KoboldAI is one of my favorite projects, and like it or not, attention is driven to projects that are cute and leverage human perception for their benefit. Mascots are even an important part of whether a technology becomes popular or not. This isn't stodgy IBM black-tie mainframe-driven development anymore, nor should it be.
I know tons of professional developers who live and breathe computer science and enjoy having fun with their terminology and with how they choose to represent and discuss computers, anthropomorphizing languages, projects, interfaces, iconography, etc.
As with a lot of other topics I'd love to say "just as long as it's consistent", but in the world we live in right now, I just feel like a lot of this attitude adds selfish noise.
There's a far more cynical part of my mind that says that's the whole point. How else do you extract value sometimes?
Yes please.
I was raised in a household where many, many things were anthropomorphized, even socks; but raised as a subhuman pet, well...
I self-identify as transhuman. It wasn't anything I chose but transhumanism chose me. I am intimately and inextricably connected with electronics and machines. Consider C-3PO, Anakin becoming Vader, or Adams' Eddie/Marvin.
To this day I surprise and amuse people by naming my devices and treating them as sentient. My devices, my home networks, are pets, or children, or plants that I care for, that I feed, that sort of help/serve me [perhaps that part is backwards].
And there is a certain sub-sentience, an autonomy, to many advanced systems. Anything connected to the Internet has a discernible mind and soul--you cannot deny this! How many decades has marketing referred to the CPU as the "brain" of the computer, or what have you?
It's weird when people only want to discuss my meatspace activities, as if cyberspace is irrelevant or invisible?
This mythology of "uploading our consciousness into the cloud" is well underway. Children build robots and corporations write software, imbuing it with their own business logic, lore, and logos. The Terminator franchise is not wrong, but most people still experience computing as benign or even pro-human.
How did Dijkstra intend this to be received? I started reading this at face value and was nodding along until the comments about BASIC programmers being "mentally mutilated" and "use of COBOL cripples the mind", which seem more like he was trolling?
That said, this one really is a truth: "Simplicity is prerequisite for reliability."
And this one is no longer true since LLMs: "Projects promoting programming in "natural language" are intrinsically doomed to fail."
> And this one is no longer true since LLMs: "Projects promoting programming in "natural language" are intrinsically doomed to fail."
Not convinced that this is no longer true... yet
Doesn't your addition of a "yet" imply the parent is correct? We don't need to have a working counterexample in existence for that "truth" to be wrong, we just need to see that such a project is not "intrinsically doomed to fail".
Twenty years ago such projects were intrinsically doomed to fail. Today, they are on the cusp of not failing.
Kinda humorous, but this reflects what I've seen when people rely heavily on LLMs for programming: https://www.youtube.com/watch?v=vlXdUU5pd_0
It's always vibe coding. It was just harder with Google and SO.
He wasn't trolling; the memo was intended for other similar academics, and the targets were all chosen to be outside of their bubble: on the "them" side rather than "us". None of his colleagues were using PL/I, Fortran, or Cobol, so they could just have a chuckle. They weren't running a business that depended on a single vendor, with crushing complexity, so hahaha, those guys are idiots. Etc.
I think LLMs prove that this is true.
I think you have answered your own question.
There are other ways to attain reliability. Redundancy is one.
There is nothing simple about the way the Internet works but it continues to be proven robust against everything from temporary outage to nation-state revolution.
Exactly. I think Dijkstra was off on this one. Simplicity may be a prerequisite for a software engineer's sanity but there's no natural law that simple is reliable. If anything, nature trends towards highly interconnected, distributed systems with plenty of redundancy - just like the internet.
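To put a toy number on the redundancy point: if each of n replicas is independently available with probability a, the system is only down when all of them are, so overall availability is roughly 1 - (1 - a)^n. A minimal sketch of that arithmetic (the 99% figure is made up, and independence is the big assumption; correlated failures erode the benefit fast):

    #include <cmath>
    #include <cstdio>

    // Toy model: n independent replicas, each available with probability a.
    // The system is up as long as at least one replica is up, so overall
    // availability is 1 - (1 - a)^n.
    int main() {
        const double a = 0.99;  // per-replica availability (assumed)
        for (int n = 1; n <= 4; ++n) {
            double system = 1.0 - std::pow(1.0 - a, n);
            std::printf("replicas=%d  availability=%.6f\n", n, system);
        }
        return 0;
    }

With those made-up numbers, three replicas already take you from 99% to 99.9999%, which is the whole appeal of redundancy as a route to reliability that isn't simplicity.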
> That said, this one really is a truth: "Simplicity is prerequisite for reliability."
How's COBOL not simple though? COBOL is still in use in major banks today. We're not talking about an old Commodore 64 (love that machine by the way) still used by a lone mechanic in some rural area to compute wheel alignment (which does exist too): we're talking about at least hundreds of millions of lines of COBOL still in use throughout the world. Maybe billions of lines.
And it all just works.
COBOL has proved its reliability. I don't remember the language as particularly hard: a bit of a straitjacket, but it isn't complicated, it's simple.
Dijkstra is not implying anywhere that COBOL is "complex".
On the other hand, I would not even call COBOL in itself reliable. I used it twice in my career, and it always needed a tremendous amount of handholding from users and developers to run, and very often the main user HAD to be a developer.
The first time was in a major bank, and the second time was in a major university. My job in both cases was to migrate away from it and end up with a system that could run independently rather than needing a developer to babysit it.
> In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to FORTRAN, so that they can share each other's programs, bugs included.
This is pretty wild when you think about it. I wouldn't expect a lab to check another lab's work by re-writing their code (although, I'd love to hear some examples!), but if you don't, you're really powerless against whatever bugs they wrote into their scientific code.
You do not need to reimplement software to test it.
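One way to do that without rewriting it is to probe the program with inputs whose answers are known analytically and compare. A rough sketch of the idea, where trapezoid is a hypothetical stand-in for the scientific code under test, not anyone's actual lab software:

    #include <cassert>
    #include <cmath>
    #include <functional>

    // Hypothetical stand-in for "the other lab's code": a plain
    // trapezoidal-rule integrator.
    double trapezoid(const std::function<double(double)>& f,
                     double lo, double hi, int steps) {
        double h = (hi - lo) / steps;
        double sum = 0.5 * (f(lo) + f(hi));
        for (int i = 1; i < steps; ++i) sum += f(lo + i * h);
        return sum * h;
    }

    int main() {
        const double pi = std::acos(-1.0);
        // Known analytic result: the integral of sin(x) over [0, pi] is exactly 2,
        // so the black box can be checked against it without reimplementing it.
        double got = trapezoid([](double x) { return std::sin(x); }, 0.0, pi, 100000);
        assert(std::fabs(got - 2.0) < 1e-6);
        return 0;
    }

It won't catch every bug, but it catches the class of bug you would otherwise inherit silently by sharing programs.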
1975, also surprisingly few conversations on this one. Here are the few with more than three comments:
https://news.ycombinator.com/item?id=24776336 - Oct 2020 (73 comments)
https://news.ycombinator.com/item?id=4926615 - Dec 2012 (67 comments)
https://news.ycombinator.com/item?id=2279260 - Mar 2011 (74 comments)
https://www.cs.utexas.edu/~EWD/ - for this and many more writings by Dijkstra.
I think the truth on display is that living inside an artificial system that:
- is designed for you
- never talks back
- can be mastered
selects for (and breeds!) a deep sense of arrogance and entitlement
> never talks back
"Can be mastered" might disqualify your example.
“Can be mastered” != “never talks back”
> prof.dr.Edsger W.Dijkstra
Ah.
I don't know if he'd be pleased or dismayed to learn that here in 2025, programming is often a whole lot easier than pure mathematics. Programming is often small science experiments performed on immensely complex systems. Math is still actually math.
Writing bad programs is still easy. Writing good programs is still hard. No one agrees what "good" and "bad" mean in this context.
"For instance: with respect to COBOL you can really do only one of two things: fight the disease or pretend that it does not exist"
A better question might be, how should you proceed when your truth does not want to be heard?
Flee the country before it's too late.
Mainly, you tell all the truth, but you tell it slant; success in circuit lies. No wait, what the fuck am I saying, I'm Dijkstra!
I just loved Nerdwriter1's short analysis of that poem: https://youtu.be/55kqNg88JqI
The submitter really should have waited on this a bit longer. The 50th anniversary of this essay is June 18 but submitting it again then would be considered a [dupe].
As a self-taught programmer, I have just "experienced" Dijkstra through various tidbits like this[1], and from the bits I've read he comes across as a bit of a pompous asshat. However, strong opinions are fertile ground for discussion, so I assume it's a deliberate strategy, and I can see why he keeps getting mentioned.
That said, I found this one particularly interesting given the recent rise of LLMs:
Projects promoting programming in "natural language" are intrinsically doomed to fail.
I assume he's referring to languages like COBOL and SQL, the latter still going strong, but I can't help but think that this part will change a lot in the coming decades.
Sure, we'll likely still have some intermediary language with strict syntax, just like how LLVM, C#, and similar have their IL, but it's hard to imagine that in 2050 the majority of programming will still be done by typing regular programming languages like JavaScript or C++.
[1]: I have of course learned about his algorithm
He comes across like that because he was. From him you always get profound wisdom mixed with denigrating statements, a lot of the time about mental capacity. He was an asshole, a brilliant one, but an asshole.
I, by the way, don’t celebrate him as other people do. I don’t think there’s ever an excuse for anyone to behave like that, no matter how brilliant.
Again, I've not read all of his works, but based on the ones I have, such as this, I wouldn't say brilliant.
In this piece alone he makes several statements that were objectively false and/or narrow-minded even back when he wrote it.
Not saying he wasn't smart, but being so narrow-minded precludes brilliance in my view.
So yeah, I've never really understood the fame he got either.
It's hard to get people to see intelligence and arrogance (or other bad traits) as going hand in hand. And society doesn't exactly encourage benevolence either. I think Dijkstra was right about many things, but I can take those views and choose how to act on them, not just parrot Dijkstra as an all-knowing god. Although I will be the first to say that I am too arrogant for my own good. But still. Don't be like me or like Dijkstra, I guess.
I think some people mistake arrogance for wisdom. It's more prevalent in these days of social media, where people mistake strong words and loud proclamations for truth, but I suspect similar things were happening on a smaller scale in the past.
The former is also still going strong; the US Social Security Administration has sixty million lines of the stuff (as well as half of banks and approximately 80% of card transactions).
COBOL, at this point, has outlived Dijkstra and is poised to be a language-in-use for longer than he was a human-in-breathing. So I suspect he missed the mark on that one.
(I think, personally, we hackers have a bad habit of deciding a language that doesn't fit our favorite problem domains is a bad language. There are reasons, other than simple inertia, that COBOL sticks around in places where the main task is turning written laws into computer code...).
I don't consider COBOL to be a good indicator of programming quality. Although, frankly, I'd say that about most software. Generally products stick around not because they're good but because replacing them would be too costly. And generally products get in not because they're good but because they're decent and get lucky with timing. I saw another HN post today about Excel. I think Excel is on the better end of the software spectrum. Still, the main reason we don't yet have "Excel++ but for real better" everywhere is simply that developing such a thing and then getting traction for it would be infeasible. I'll say again that I don't consider COBOL remotely impressive for still underlying many important computer systems. Those systems are important because of their purpose, and the implementation is always in tension with that purpose.
> don't consider COBOL to be a good indicator of programming quality
COBOL wasn't designed for that. The intention was a language that non-coders could code in. This was the precursor to things like FIT, which was a precursor to today's Cucumber, like some history of child abuse carried on over the generations, so we still suffer today.
Ada was the one designed for quality.
https://en.wikipedia.org/wiki/Ada_(programming_language)
>you can really do only one of two things: fight the disease or pretend that it does not exist.
Choosing to fight the disease will make enemies of your friends, it will rob you of your peace of mind, it will rob you of your faith in humanity, and bring you no closer to curing said disease.
The choice of CS departments to adopt the latter strategy is one made from either wisdom or game theory, and it is in the self-interest of rational actors.
>But, Brethren, I ask you: is this honest? Is not our prolonged silence fretting away Computing Science's intellectual integrity? Are we decent by remaining silent? If not, how do we speak up?
If you are vexed by these questions, ask not the very same, but rather whether the intellectual integrity of public discourse (including nominally professional subsets of it) is worth your sense of safety, your sense of sanity, and your social circle, because you'll pay with those every time you play the unwinnable game of trying to convince the world of uncomfortable truths.
"Programming is one of the most difficult branches of applied mathematics; the poorer mathematicians had better remain pure mathematicians."
While this might have been true in 1975 with FORTRAN, I wonder if this holds true today?
> FORTRAN --"the infantile disorder"--, by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use.
We're gonna be stuck with C++ for at least a thousand years, aren't we.
C++ will outlive all of us.
In general, you can assume that any technology or standard which had significant market share during a growth period will have, at the very least, a long tail of continued use for the foreseeable future. Stuff that's in use and works doesn't get replaced unless the alternatives beat out the switching cost.
For another example, I typed this comment on a QWERTY keyboard.
I'm doing my part, still keeping C++ systems alive so the next generation can see the horrors created by our predecessors. At least this particular program doesn't have the same problems as the last one I worked on, where some genius decided to make their own shared_ptr and did it wrong, among many other bizarre choices.
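For the curious, the usual failure points in a home-grown shared_ptr are the reference count (it has to live in one place shared by all copies and, if the pointer crosses threads, be atomic) and copy assignment (it has to release the old object and survive self-assignment). A purely illustrative sketch of those two spots, not the code from that project, and no reason to prefer it over std::shared_ptr:

    #include <atomic>
    #include <utility>

    // Minimal reference-counted pointer, illustrating the parts hand-rolled
    // versions most often get wrong. std::shared_ptr already does all of this.
    template <typename T>
    class RefPtr {
        T* ptr_ = nullptr;
        std::atomic<long>* count_ = nullptr;   // shared between all copies

        void release() {
            if (count_ && count_->fetch_sub(1) == 1) {
                delete ptr_;
                delete count_;
            }
        }

    public:
        RefPtr() = default;
        explicit RefPtr(T* p) : ptr_(p), count_(new std::atomic<long>(1)) {}

        RefPtr(const RefPtr& other) : ptr_(other.ptr_), count_(other.count_) {
            if (count_) count_->fetch_add(1);
        }

        RefPtr& operator=(RefPtr other) {      // copy-and-swap: the old object is
            std::swap(ptr_, other.ptr_);       // released by other's destructor,
            std::swap(count_, other.count_);   // and self-assignment stays safe
            return *this;
        }

        ~RefPtr() { release(); }

        T* get() const { return ptr_; }
        T& operator*() const { return *ptr_; }
        T* operator->() const { return ptr_; }
    };

    int main() {
        RefPtr<int> a(new int(42));
        RefPtr<int> b = a;   // count goes to 2
        b = a;               // assignment over an existing copy: still one int, freed once
        return 0;
    }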
I think there is room for nuance. For example, it is true to say, “Programming requires intelligence.” It is also unnecessary to say, “You can’t program if you’re dumb.” One of these statements is productive and not insulting, the other isn’t.
Just my two cents.
> For example, it is true to say, “Programming requires intelligence.” It is also unnecessary to say, “You can’t program if you’re dumb.”
I promise that if you get far enough along in your career you will realize this is very much not true. It's a thing people like to believe, but there are plenty of deeply stupid programmers out there with long, annoying careers.
I feel like the definition of "stupid" here is important.
There are a lot of incredibly clever programmers out there who will construct intricate webs of abstracted hell because they are clever. These guys are mostly not "stupid". One of my colleagues working on one of these code bases with me described it as "very smart people doing very stupid things".
And yes, they all have long and annoying careers.
Contrast this with the swathes of what I would characterise as "rat cunning" programmers who were everywhere during the Y2K crisis. They knew just enough to be dangerous, did some truly stupid things and disappeared from the programming world afterwards. The unkind might say they all turned into systems architects and project managers.
My observation is that there's likely a sweet spot, though it's a multidimensional space, so it's not a single sweet spot.
A programmer can find a niche where their skill has value for a long period of time, even if their situation (mental flexibility, willingness to learn, etc.) precludes exiting the niche. It can be a challenge to work with someone like that if you have to interface with them and their approach is to pull you all the way over to where they are.
... but sometimes that's how it is. And I've seen plenty of smart programmers smother their ability to provide value for real people under analysis paralysis while the "stupid" programmers bull-charge in with the first approach they can think of and write some ugly, stupid, spaghetti, working tools.
Is that nuance? Or just a politeness analysis of two semantically identical statements?
Maybe “nuance” isn’t the right word.
My point is, there are usually ways to phrase identical truths with which the writer does not intend to insult.
They don't seem exactly the same. I think there's a general sense that most people are in the middle. Neither intelligent nor dumb. Those are the outliers. It's not so bad to be in the middle.
When you figure it out, please do something about Gaza.