
@ Danny Morabito
2025-03-23 11:39:41
I don't believe in "vibe coding" – it's just the newest Silicon Valley fad trying to give meaning to their latest favorite technology, LLMs. We've seen this pattern before with blockchain: suddenly Non-Fungible Tokens appeared, followed by Web3 startups promising to revolutionize everything from social media to supply chains. VCs couldn't throw money fast enough at anything with "decentralized" (in name only) in the pitch deck. Andreessen Horowitz launched billion-dollar crypto funds, while Y Combinator batches filled with blockchain startups promising to be "Uber for X, but on the blockchain."
The metaverse mania followed, with Meta betting its future on digital worlds where we'd supposedly hang out as legless avatars. Decentralized autonomous organizations – decentralized in name only – emerged as the next big thing: supposedly democratic internet communities that mostly ended up as vehicles for quick-money scams.
Then came the **inevitable collapse**. The **FTX implosion** in late 2022 revealed fraud, **Luna/Terra's death spiral** wiped out billions (including my ten thousand dollars), while **Celsius and BlockFi** froze customer assets before bankruptcy.
By 2023, **crypto winter** had fully set in. The SEC started aggressive enforcement actions, while users realized that **blockchain technology had delivered almost no practical value** despite a decade of promises.
Blockchain's promises tapped into **fundamental human desires** – decentralization resonated with a generation disillusioned by traditional institutions. Evangelists presented a utopian vision of **freedom from centralized control**. Perhaps most significantly, crypto offered **a sense of meaning** in an increasingly abstract world, which made the obvious warning signs of scams easier to overlook.
The technology itself had failed to solve any real-world problems at scale. By 2024, the once-mighty crypto ecosystem had become a **cautionary tale**. Venture firms quietly scrubbed blockchain references from their websites while founders pivoted to AI and large language models.
---
Most people reading this are likely fellow bitcoiners and nostr users who understand that Bitcoin is blockchain's only valid use case. But I shared that painful history because I believe the AI hype cycle will follow the same trajectory.
Just like with blockchain, we're now seeing VCs who once couldn't stop talking about "Web3" falling over themselves to fund anything with "AI" in the pitch deck. The buzzwords have simply changed from "decentralized" to "intelligent."
**"Vibe coding"** is the perfect example – a trendy name for what is essentially just **fuzzy instructions to LLMs**. Developers who've spent years honing programming skills are now supposed to believe that "vibing" with an AI is somehow a legitimate methodology.
This might be controversial to some, but obvious to others:
> **Formal, context-free grammars will always remain essential for building precise systems, regardless of how advanced natural-language technology becomes**
The mathematical precision of programming languages provides a foundation that human language's ambiguity can never replace. Programming requires precision – languages, compilers, and processors operate on explicit instructions, not vibes. What "vibe coding" advocates miss is that beneath every AI-generated snippet lies the same deterministic rules that have always governed computation.
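To make that ambiguity concrete, here's a minimal sketch (the instruction and data are invented for illustration): one English sentence admits two faithful implementations that disagree, while each implementation means exactly one thing.

```python
# The "vibe": the English instruction "sort the users by name".
users = ["alice", "Bob", "carol"]

# Reading 1: plain lexicographic sort; uppercase letters order before lowercase.
print(sorted(users))                 # ['Bob', 'alice', 'carol']

# Reading 2: case-insensitive sort, which is probably what the human meant.
print(sorted(users, key=str.lower))  # ['alice', 'Bob', 'carol']
```

A compiler never faces this choice: whichever line you write, it means one thing, every time.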
**LLMs don't understand code** in any meaningful sense—they've just ingested enormous datasets of human-written code and can predict patterns. When they "work," it's because they've seen similar patterns before, not because they comprehend the underlying logic.
This creates a dangerous dependency. Junior developers "vibing" with LLMs might get working code without understanding the fundamental principles. When something breaks in production, they'll lack the knowledge to fix it.
Even experienced developers can find themselves in treacherous territory when relying too heavily on LLM-generated code. What starts as a productivity boost can transform into a **dependency crutch**.
The real danger isn't just technical limitations, but the **false confidence** this workflow instills. Developers begin to believe they understand systems they've merely instructed an AI to generate – fundamentally different from understanding code they've written themselves.
We're already seeing the warning signs: projects cobbled together with LLM-generated code that work initially but become **maintenance nightmares** when requirements change or edge cases emerge.
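Here's a hypothetical but representative sketch of that failure mode: the snippet below looks like typical generated Python, passes the obvious first test, and hides a classic pitfall that only surfaces on the second call.

```python
# Plausible-looking generated code: passes the happy-path demo.
def add_tag(tag, tags=[]):    # BUG: the default list is created once, at
    tags.append(tag)          # definition time, and shared across all calls
    return tags

print(add_tag("urgent"))      # ['urgent']            (demo works, ship it?)
print(add_tag("later"))       # ['urgent', 'later']   (state leaked between calls)

# The fix requires knowing Python's evaluation model, not a better vibe:
def add_tag_fixed(tag, tags=None):
    if tags is None:
        tags = []
    tags.append(tag)
    return tags
```

A developer who understands default-argument evaluation spots this in review; one who only prompts ships it.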
The **venture capital money** is flowing exactly as it did with blockchain. Anthropic raised billions, OpenAI is valued astronomically while still losing billions a year, and countless others are competing to build ever-larger models with vague promises. Every startup now claims to be "AI-powered" regardless of whether it makes sense.
Don't get me wrong—there's genuine innovation happening in AI research. But "vibe coding" isn't it. It's a marketing term designed to make fuzzy prompting sound revolutionary.
**Cursor** perfectly embodies this AI hype cycle. It's an AI-enhanced code editor built on VS Code that promises to revolutionize programming by letting you "chat with your codebase." Just like blockchain startups promised to "revolutionize" industries, Cursor promises to transform development by adding LLM capabilities.
Yes, Cursor can be **genuinely helpful**. It can explain unfamiliar code, suggest completions, and help debug simple issues. After trying it for just an hour, I found the autocomplete to be MAGICAL for simple refactoring and basic functionality.
But the marketing goes far beyond reality. The suggestion that you can simply describe what you want and get production-ready code is **dangerously misleading**. What you get are approximations (the first failure mode is sketched right after this list) with:
- **Security vulnerabilities** the model doesn't understand
- **Edge cases** it hasn't considered
- **Performance implications** it can't reason about
- **Dependency conflicts** it has no way to foresee
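To illustrate the first bullet, here's a hypothetical sketch (SQLite in memory, invented table and data): the string-built query style that dominates tutorial code, and therefore training data, is injectable, while the fix is a one-line change that only someone who understands the layer beneath will insist on.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "s3cret"), ("bob", "hunter2")])

# The pattern an LLM has seen in a thousand tutorials: string-built SQL.
def find_user_unsafe(name):
    return conn.execute(
        f"SELECT * FROM users WHERE name = '{name}'"
    ).fetchall()

print(find_user_unsafe("alice"))         # [('alice', 's3cret')]: it "works"
print(find_user_unsafe("x' OR '1'='1"))  # dumps every row: classic SQL injection

# The parameterized query a reviewing developer would demand:
def find_user_safe(name):
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user_safe("x' OR '1'='1"))    # []: input treated as data, not SQL
```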
The most concerning aspect is how such tools are marketed to beginners as shortcuts around learning fundamentals. "Why spend years learning to code when you can just tell AI what you want?" This is reminiscent of how crypto was sold as a get-rich-quick scheme requiring no actual understanding.
When you "vibe code" with an AI, you're not eliminating complexity—you're **outsourcing understanding** to a black box. This creates developers who can prompt but not program, who can generate but not comprehend.
The **real utility of LLMs in development** is in augmenting existing workflows:
- Explaining unfamiliar codebases
- Generating boilerplate for well-understood patterns
- Suggesting implementations that a developer evaluates critically (sketched after this list)
- Assisting with documentation and testing
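As a small, hypothetical instance of that third bullet: the model suggests a one-liner that is subtly wrong for the caller's requirement, and the developer's judgment, not a better prompt, is what catches it.

```python
# Suggested implementation of "remove duplicates from a list":
def dedupe_suggested(items):
    return list(set(items))            # drops duplicates, but loses order

# What ships after critical review, because order matters to the caller:
def dedupe_reviewed(items):
    return list(dict.fromkeys(items))  # dicts preserve insertion order (3.7+)

print(dedupe_suggested([3, 1, 3, 2]))  # arbitrary order, e.g. [1, 2, 3]
print(dedupe_reviewed([3, 1, 3, 2]))   # [3, 1, 2]: first occurrence wins
```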
These uses involve the model as a **subordinate assistant** to a knowledgeable developer, not as a replacement for expertise. This is where the technology adds value—as a sophisticated tool in skilled hands.
Cursor is just a better hammer, not a replacement for understanding what you're building. Its actual value emerges in the hands of developers who understand what happens beneath the abstractions – developers who can recognize when AI suggestions make sense and when they don't, because they have the fundamental knowledge to evaluate the output critically.
This is precisely where the **"vibe coding" narrative falls apart**.