Until now, software has been designed. Every bit of it has been deliberately made just so. We talk about "architecting" and "building" software, much like you'd architect and build civil engineering projects - houses, bridges, and roads.
In No Silver Bullet: Essence and Accidents of Software Engineering by Frederick P. Brooks, Jr., we first meet another metaphor: that of growing software.
If, as I believe, the conceptual structures we construct today are too complicated to be specified accurately in advance, and too complex to be built faultlessly, then we must take a radically different approach.
Let us turn to nature and study complexity in living things, instead of just the dead works of man. Here we find constructs whose complexities thrill us with awe. The brain alone is intricate beyond mapping, powerful beyond imitation, rich in diversity, self-protecting, and self-renewing. The secret is that it is grown, not built.
So it must be with our software systems.
— Frederick P. Brooks, Jr., No Silver Bullet: Essence and Accidents of Software Engineering
Still, when Brooks talked about "growing" software, he meant a largely deliberate engineering approach. The idea was that we would create the final product in small steps, our understanding of the problem improving as we continuously applied our ever-growing solution to it. He never hinted that, like the farmers of old who "engineered" their sheep by picking the woolliest ones for breeding, we'd merely be nudging the direction of a process we barely understood - one that, as dog breeders know only too well, frequently produces unwanted results that plague entire lineages. In fiction, Vernor Vinge gets a bit closer. In his A Fire Upon the Deep, programming as we do it today is not a thing. Instead, programmer-archaeologists sift through millennia-old archives of software and assemble them into something functional. It is still "built", but as the central plot of that book tells us, not understood.
With AI-assisted "vibe coding", we may be getting more Vinge than Brooks.
Coding with AI works when you know what you want and you know what works - you just want the AI to go look in its latent space for the actual text of the code. Designing with AI works great, too, when you know what you want and can push the AI on those points to ensure that the suggested design really does what you want. But in both cases the AI is not much more than a glorified typist with a large library of templates, or a junior developer with a search engine. Both are immensely useful in their own way, letting the developer focus on the problem, but neither really lets us attack a new class of problems. We're doing the old stuff, bigger and better and faster. Excellent productivity gains, but still - old stuff.
Those who have tried to vibe-code production systems have found out, just like the farmers of old, that you need to cull undesirable traits. This can be incredibly difficult, because they may not be as obvious as the desirable ones: the code runs and you get your webpage, which is obviously good; but since the LLM you used was trained on all the code out there, and 90% of everything sucks, the code also has massive security vulnerabilities, which are non-obvious but very bad:
In the case of the startup Enrichlead, though, AI was definitely the culprit. Its founder boasted on social media that 100% of his platform's code was written by Cursor AI, with "zero hand-written code". Just days after its launch, it was found to be full of newbie-level security flaws - allowing anyone to access paid features or alter data. The project was shut down after the founder failed to bring the code up to an acceptable security standard using Cursor.
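What does a "newbie-level" flaw like that actually look like? Here is a minimal, hypothetical sketch - TypeScript/Express, with endpoint and function names invented for illustration, not taken from Enrichlead. The page loads, the data comes back, the demo looks great; nothing ever checks who is asking or whether they paid.

```typescript
// Hypothetical sketch of broken access control - the kind of flaw that
// "works" in every demo. (Illustrative only, not the actual Enrichlead code.)
import express from "express";

const app = express();
app.use(express.json());

// Reading: returns any user's "paid" enriched leads to whoever asks.
// The server never authenticates the caller or checks a subscription -
// it simply trusts the userId in the URL.
app.get("/api/leads/:userId", (req, res) => {
  res.json(getEnrichedLeads(req.params.userId));
});

// Writing: any caller can overwrite any user's data, for the same reason.
app.post("/api/leads/:userId", (req, res) => {
  saveLeads(req.params.userId, req.body);
  res.json({ ok: true });
});

// Invented data-layer stubs, only here to keep the sketch self-contained.
function getEnrichedLeads(userId: string) {
  return [{ userId, company: "Acme Corp", email: "ceo@acme.example" }];
}
function saveLeads(userId: string, leads: unknown) {
  // ...persist to the database, again without any ownership check...
}

app.listen(3000);
```

The trouble is that the happy path behaves identically with or without the missing check, which is exactly why this trait is so hard to cull: the wool is visible, the flaw is not.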
We may end up growing software, but just as we got tired of "growing" plants and animals and started genetically engineering them directly, maybe the "design" road is the preferable one after all.