
What Vibe Coding Actually Changes (It's Not What You Think)
Jerome used to manage teams of 20 to 25 developers. Today, he ships production software alone. Same CI/CD pipeline, same staging environment, same architectural rigor. No dev team.
Most of what you read about vibe coding is about what people build. A weekend project. A prototype. A demo. Jerome's story is different. He's been at this long enough to see what actually shifts when AI enters a real development workflow. And the answer has almost nothing to do with code generation.
One person, every stage, at the same time.
The thing Jerome does now that was structurally impossible before is manage multiple features at different stages of the development lifecycle, all in parallel. One feature in requirements. Another in implementation. A third in testing. A fourth getting pushed to production.
This used to require a team for a simple reason. No one can hold that much context. You forget the architectural decisions you made on Feature A while you're deep in testing Feature C. But AI holds the context. When Jerome switches between features and loses the thread, he asks. It remembers.
For a CTO, this reframes the question entirely. The bottleneck was never engineering hours. It was coordination overhead, context loss, and the cost of keeping humans aligned across stages. Remove that friction and a single experienced person covers ground that used to require a squad.
Jerome compared it to being put in a Ferrari after driving an old car your whole life. Exhilarating. Also exhausting. The cognitive load is real, even when the tooling is extraordinary.
Guardrails make maintenance easy. Their absence makes it brutal.
Jerome was blunt about his first vibe-coded app. It worked. It shipped. And it became a mess to maintain. Not because vibe coding produces bad code, but because nothing forced the code into a clean structure.
His second app was a different story. A senior developer helped him set proper architecture standards, design guidelines, and separation of concerns before a single line of code was generated. The result: new features slotted in cleanly, bugs stayed contained, and the AI itself performed better because it was operating within well-defined boundaries.
This is the part missing from most vibe coding conversations. The act of generating code is trivial now. The discipline of structuring a codebase so it stays maintainable at scale still requires judgment that comes from years of shipping production software. Jerome called it "compound engineering" — making sure each new brick reduces future maintenance rather than adding to it. AI won't do this on its own. It needs to be told. It needs constraints.
The processes haven't changed. Who runs them has.
Jerome was emphatic about something that might surprise people expecting a revolution. He still writes requirements. He still documents everything in GitHub issues. He still runs CI/CD, staging, and production environments as separate phases. He still tests in production after deployment.
None of these steps went away. What changed is that the conversation that used to span a product manager, a tech lead, and three developers now happens between Jerome and Claude. The back-and-forth that filled a two-week sprint resolves in a session. The handoffs disappeared, but the process held.
For CTOs who spent years building these workflows, his advice was direct. Don't throw them out. They're the reason his second app works and his first one didn't. Vibe coding accelerates execution. It doesn't replace engineering discipline.
The pull request question no one wants to answer.
I pushed Jerome on what might be the most uncomfortable operational question in this shift. If AI generates most of the code and the volume increases dramatically, who reviews it?
The traditional pull request model assumes human reviewers with enough context and time. That assumption is already cracking under the weight of AI-generated output. Jerome's take: maybe the answer isn't more reviewers. Maybe it's making the person who generates the code fully responsible for its quality, the way organizations once moved from separate QA teams to developer-owned testing.
He stopped short of saying PRs should be killed. But he acknowledged that the model as we know it may not survive the shift. For CTOs, this isn't theoretical. It's an operational question arriving faster than most org charts can adapt.
What senior knowledge actually means now.
The temptation is to look at Jerome's story and see a future where you need far fewer engineers. That misses the point. What Jerome demonstrated is that the value of experience just changed shape. The people who know how to logically separate systems, anticipate compounding risks, and define architecture standards that prevent a codebase from collapsing six months later — those people went from important to indispensable.
Jerome said it plainly. He doesn't think he could have done this coming out of school. The tools are accessible. The judgment to use them well still takes years to build.
The org chart is under pressure. The engineering methodology isn't. And the people who will define what comes next aren't the fastest coders. They're the ones who know which code should never be written in the first place.
—
Jerome Pasquero is a technologist and entrepreneur who has spent the past 15+ years building AI systems and software products used by millions of people. He previously served as Director of Machine Learning at Sama and held roles at Element AI and BlackBerry, where he was a prolific inventor with more than 150 patents to his name. He is currently co-founder of AI Vibe.