At Fern, alongside building out our multiplex infrastructure, we’ve also been exploring how AI can supercharge our small and mighty team. As a designer, I’ve been particularly curious about what AI means for the future of design. While this topic is well-trodden on design and crypto Twitter (yes, I still call it Twitter), I wanted to try it firsthand and form my own opinion.

A lot of the discourse focuses on using AI to build products from scratch. But what about the more common case: building on top of existing systems? This post is a quick recap of my experiment doing exactly that, along with some takeaways. These are early insights, and they'll evolve as the tools do.

I was curious: how does interactive prototyping change with these new tools?

Part 1: Becoming a vibe-coder by accident

To explore these questions, I built a sample app: a mini FAQ page that pulled real help desk tickets, synthesized them into the top FAQs, and displayed them in an interface styled to match our existing Fern for Business UI (roughly the flow sketched below).
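
For context, here is a minimal sketch of that tickets-to-FAQs flow. The endpoint, types, and `callModel` helper are all placeholders I'm using for illustration, not the actual stack or APIs from the experiment.

```ts
// Hypothetical shape of a help desk ticket pulled from a support API.
interface Ticket {
  subject: string;
  body: string;
}

// Hypothetical shape of a synthesized FAQ entry.
interface Faq {
  question: string;
  answer: string;
}

// Placeholder: fetch recent tickets from whatever help desk API is in use.
async function fetchTickets(endpoint: string, apiKey: string): Promise<Ticket[]> {
  const res = await fetch(endpoint, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`Ticket fetch failed: ${res.status}`);
  return res.json();
}

// Placeholder: ask a language model to cluster tickets into top FAQs.
// The prompt wording and response parsing would depend on the provider.
async function synthesizeFaqs(tickets: Ticket[]): Promise<Faq[]> {
  const prompt =
    "Summarize the following support tickets into the 5 most common FAQs, " +
    "returned as JSON [{question, answer}]:\n\n" +
    tickets.map((t) => `- ${t.subject}: ${t.body}`).join("\n");
  const raw = await callModel(prompt); // provider-specific call, stubbed below
  return JSON.parse(raw) as Faq[];
}

// Stub for whichever LLM API the app actually talks to.
declare function callModel(prompt: string): Promise<string>;
```

The resulting FAQs would then be rendered in a UI styled after Fern for Business; that front-end layer is omitted here.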

My stack:

I might have been in a bubble or something, but it was only after the experiment that I realized there was a term for this. Vibe coding. I had become a vibe-coder by accident.