Computer Says No (koenvangilst.nl)
108 points by vnglst 34 days ago | 45 comments




The gist ("product becomes a black box") applies to any abstraction. It could apply to high-level languages (including so-called "low-level" languages like C), and few people criticize those.

But LLMs are particularly insidious because they're an especially leaky abstraction. If you ask an LLM to implement something:

- First, there's only a chance it will output something that works at all

- Then, it may fail on edge-cases

- Then, unless it's very trivial, the code will be spaghetti, so neither you nor the LLM can extend it
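A toy illustration of the edge-case point (this is a hypothetical example, not from the article): a plausible-looking median function of the kind an LLM might produce, which handles the happy path but quietly breaks elsewhere.

```python
def median(xs):
    # Looks right at a glance, and works for odd-length lists...
    xs = sorted(xs)
    return xs[len(xs) // 2]

# Happy path:
median([3, 1, 2])        # -> 2
# Edge cases:
# median([])             # IndexError: list index out of range
# median([1, 2, 3, 4])   # -> 3, but the true median is 2.5
```

The code reviews fine until someone feeds it an even-length or empty list, which is exactly the kind of failure the list above describes.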

Contrast a language like C, where the original source can't be recovered from the assembly, but the assembly is almost certainly correct. When GCC or Clang does fail, only an expert can figure out why, but that happens rarely enough that there's always an expert available to look at it.

Even if LLMs get better, English itself is a bad programming language, because it's imprecise and not modular. Tasks like "style a website exactly how I want" or "implement this complex algorithm" can't be described without inventing jargon or being extremely verbose, at which point you'd spend less effort, and write less, using a real programming language.
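A small sketch of the verbosity gap (the data and ordering rule here are invented for illustration): an English spec like "sort the users by last name, ignoring case, and break ties by first name" leaves room for misreading, while the code states the rule exactly, in less space.

```python
users = [
    ("ada", "Lovelace"),
    ("alan", "turing"),
    ("grace", "Hopper"),
]

# "Sort by last name, case-insensitively; break ties by first name" --
# one line of code, and unambiguous in a way the English sentence isn't.
ordered = sorted(users, key=lambda u: (u[1].lower(), u[0].lower()))
```

Any precise English restatement of that key function ends up longer than the lambda itself, which is the commenter's point.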

If people end up producing all code (or art) with AI, it won't be through prompts, but fancy (perhaps project-specific) GUIs if not brain interfaces.


For those unaware, the title is a reference to a Little Britain skit

https://www.youtube.com/watch?v=sX6hMhL1YsQ&themeRefresh=1

sva_ 31 days ago

16 years ago
signal10 31 days ago

And the software still hasn't changed.
thesimon 32 days ago

> This is a great insight. For software engineers coding is the way to fully grasp the business context.

> By programming, they learn how the system fits together, where the limits are, and what is possible. From there they can discover new possibilities, but also assess whether new ideas are feasible.

Maybe I have a different understanding of "business context", but I would argue the opposite. AI tools allow me to spend much more time on the business impact of features, think of edge cases, talk with stakeholders, talk with the project/product owners. Often there are features that stakeholders dismiss that seemed complex and difficult in the past, but are much easier now with faster coding.

Code was almost never the limiting factor before. It's the business that is the limit.

thepasch 31 days ago

It’s perplexing: it's as if the majority of people who insist that using AI coding assistance is guaranteed to rob you of application understanding and business context aren’t considering that not every prompt has to be an instruction to write code. You can, like, ask the agent questions. “What auth stack is in use? Where does the event bus live? Does the project follow SoC or are we dealing with pasta here? Can you trace these call chains and let me know where they’re initiated?”

If anything, I know more about the code I work on than ever before, and at a fraction of the effort, lol.

cipher 31 days ago

The learning science here matters though - there's decent evidence that effortful retrieval and problem-solving builds retention in ways passive querying doesn't. Asking an agent "where does the event bus live" is different from tracing it yourself. Both give you the fact, but one more reliably produces the intuition. Whether that distinction matters for professional work is less clear to me.
egwor 32 days ago

I think that for the average developer this might be true. Excellent developers, though, spend a lot of time thinking about edge cases and ensuring that the system/code is supportable: it explains what the problems are so that issues can be resolved quickly, the code is written in a style that provides protection from other parts of the system failing, and so on. This isn't done in average code, and I don't see AI doing this at all.
bushido 31 days ago

I love the framing here.

However, I think what a lot of people don't realize is the reason a lot of executives and business users are excited about AI and don't mind developers getting replaced is because product is already a black box.


They'll start minding when things start breaking. In the mean time I'll work on stuff AI is still not so great at.
thbb123 31 days ago

I'm more and more convinced that top execs are the most likely to be advantageously replaced by an LLM.

They navigate such complex decision spaces, full of compromises, tensions, and political knots, that their important decisions are ultimately made on gut feeling.

Replace the CEO with an LLM whose system prompt is carefully crafted and vetted by the board of directors, with an adequate digital twin of the company to project its moves, and I'm sure it would serve the shareholders' interests much better.

Next up: apply the same recipe to government executive power. Couldn't be much worse than orange man.


I really like the framing here (via Richard Sennett / Roland van der Vorst): craft is a relationship with the material. In software, that “material consciousness” is built by touching the system—writing code, feeling the resistance of constraints, refactoring, modeling the domain until it clicks.

If we outsource the whole “hands that think” loop to agents, we may ship faster… but we also risk losing the embodied understanding that lets us explain why something is hard, where the edges are, and how to invent a better architecture instead of accepting “computer says no.”

I hope we keep making room for “luxury software”: not in price, but in care—the Swiss-watch mentality. Clean mechanisms, legible invariants, debuggable behavior, and the joy of building something you can trust and maintain for years. Hacker News needs more of that energy.

jopsen 31 days ago

> I hope we keep making room for “luxury software”

The risk of treating the source fed to the compiler as a black box is pretty high.

Because if your program is a black box compiled by a black box, things might get really sporty.

There are many platform libraries and such that we probably do not want as black boxes, ever. That doesn't mean an AI can't assist, but that you'll still rewrite and review whatever insights you got from the AI.

If the latest game or Instagram app is a black box nobody can decipher, who cares? If such an app goes on to be a billion-dollar success, I'll feel sorry for the engineers tasked with figuring out why the computer says no.

drakinosh 31 days ago

Hacker News also needs less of your LLM-generated spam
amelius 32 days ago

If your computer says No just ditch it and buy a non-Apple computer.
zetanor 32 days ago

One with Windows 11?
tom 31 days ago

But which non-Apple computer doesn't say no in ways that are just as opaque? Linux will refuse to sleep correctly on your hardware, Windows decides your driver is unsigned. The refusals just move around.

The quote at the center of this posting uses poetic language like "dialogue" to make a certain class of work seem virtuous. I find it very difficult to take this kind of framing seriously. What are the data to back this up? We are at the phase of LLM use where the data on productivity gains and their impact hasn't arrived yet. Postings like this fill the vacuum with fuzzy language.
ktpsns 32 days ago

To be honest, the same applies when a developer gets promoted to team lead. I experienced this myself: I no longer stayed in touch with the code being written. The reasons are slightly different (for me it was a lack of time and documentation).

I am a scientist who rarely collaborates (unlike programmers and unlike most scientists).

When I wrote a paper in collaboration some time ago, it felt very weird to have large parts of the paper that I had superficial knowledge of (incidentally, I had retyped everything my co-author did, but in my own notation) but no profound knowledge of how it was obtained, of the difficulties encountered. I guess this is how people who started vibe coding must feel.

stewrat 31 days ago

After watching speedrun videos on YouTube, my hunch is that there will always be countervailing forces to the degradation of our ability to write code. There are some as-yet-unborn wizards who can't help but bring it all the way back to writing assembly, just because it's hard. And they will be rewarded for it.

If you are a painter, you don't need to know chemical formulas for the pigments you're using to have a direct touch to your creation.

For me, that material consciousness in computers was always in grasping the way a system works holistically. To feel the system. To treat it as almost a living organism.

Lord_Zero 31 days ago

I would argue that as a painter some knowledge of chemicals is VERY important. For example, do you need oil-based or latex paint for a specific surface? Is it indoor or outdoor? Do you need primer on that surface or not? If it's metal, do you need rust converter first?
axel 31 days ago

Ran a PDP-11 shop in the 80s. Every interrupt, every memory page - you knew it all. That intimacy evaporated somewhere around the third or fourth abstraction layer. Nobody feels a Kubernetes cluster. They just hope it behaves.

This isn’t the right way to use AI to write code at work. It shouldn’t become a black box: make the process iterative, be precise about architecture, and guide the AI carefully to keep the code in a state that a human can instantly drop in on if needed. Use AI as a keyboard extension, not a blindfold.
skybrian 31 days ago

If you do decide to get in touch again, you can ask the agent for assistance understanding the code and you'll get a lot of help. Seems like it's easier than ever?

It's not like working with legacy codebases used to be.

kitd 31 days ago

> feeling resistance

In a software context, I wonder what impact the language used has on that sense of "resistance"?

iSnow 32 days ago

To an extent, AI can help by explaining what the code does. So the computer says why it says "no".
g-b-r 31 days ago

Or it says random stuff, pretending that's the reason.
aford 31 days ago

'Black box' in the systems-theory sense just means a component you interact with via inputs and outputs -- not inherently bad. The real complaint, I think, is that AI-generated code is inscrutable even to the people who nominally wrote it.