How We Actually Build Digital Products
Over the past few years, we've learned what works and what doesn't when creating design tools. Our approach isn't revolutionary—it's just honest work that gets better results because we skip the nonsense.
Starting With Real Problems
Most software begins with features someone thought would be clever. We start by watching designers struggle with their current tools. What takes too long? Where do people get stuck? Those frustrations become our roadmap.
Last year, we spent three months just observing how agencies handle client revisions. Turned out the problem wasn't version control—it was that nobody could quickly show clients why certain design decisions mattered. So we built explanation tools instead of more file management.
This means some of our solutions look nothing like competitors. And that's fine. We're solving for actual workflows, not feature checklists.
What Guides Our Work
Speed Matters More
Every interaction should feel instant. We test on older hardware because not everyone has the latest MacBook. If something takes more than 200ms, we rework it until it doesn't.
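As a rough sketch of what enforcing a latency budget can look like in a test harness (the helper name and budget constant here are illustrative, not our actual tooling):

```python
import time

LATENCY_BUDGET_MS = 200  # the budget from the rule above

def within_budget(fn, budget_ms=LATENCY_BUDGET_MS):
    """Run an interaction handler and report whether it
    finished inside the latency budget."""
    start = time.perf_counter()
    fn()
    elapsed_ms = (time.perf_counter() - start) * 1000
    return elapsed_ms <= budget_ms, elapsed_ms

# Example: a cheap operation comfortably inside the budget.
ok, ms = within_budget(lambda: sum(range(10_000)))
```

A check like this can run in CI against recorded interactions, so regressions past the budget fail the build instead of reaching users.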
Simple First, Always
Our rule: if a feature needs more than two sentences to explain, it's probably too complicated. We've killed plenty of "cool" ideas because they added cognitive load without earning their keep.
Designers Lead Development
Engineers don't decide how tools should feel—designers do. That might seem obvious, but you'd be surprised how often tech constraints drive product decisions. We flip that dynamic.
Learn From Usage Data
We track how people actually use our tools, not how we think they will. When we see someone doing something weird repeatedly, we assume they're solving a problem we didn't anticipate.
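One simple way to surface those repeated "weird" behaviors is to count actions that fall outside the expected workflow (the action names and threshold below are hypothetical, for illustration only):

```python
from collections import Counter

# Hypothetical set of actions the tool was designed around.
EXPECTED_ACTIONS = {"open", "edit", "save", "export"}

def unexpected_patterns(events, min_count=3):
    """Flag actions outside the expected set that recur often
    enough to suggest a workaround for a missing feature."""
    counts = Counter(e for e in events if e not in EXPECTED_ACTIONS)
    return {action: n for action, n in counts.items() if n >= min_count}

log = ["open", "edit", "duplicate-layer", "duplicate-layer",
       "duplicate-layer", "save"]
flagged = unexpected_patterns(log)  # → {"duplicate-layer": 3}
```

When an action shows up this often outside the designed path, it usually means users are solving a real problem the tool should handle directly.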
Break Things Privately
We test destructively with small groups before wider releases. Better to annoy 50 beta users than 5,000 paying customers. Early feedback catches the stuff internal testing misses.
Document Decisions, Not Code
Our docs explain why we built things certain ways, not just how they work. Future team members need context about rejected approaches and the constraints that shaped what shipped.
How Projects Actually Unfold
Research (2-3 weeks)
We interview users, review support tickets, and analyze where people currently get stuck. This phase produces a problem statement that everyone agrees on before we write any code.
Prototype (1-2 weeks)
Quick, messy builds to test core concepts. These usually look terrible but let us validate ideas cheaply. We've killed projects at this stage because the fundamental approach didn't feel right.
Build (4-8 weeks)
Actual development with daily check-ins between designers and engineers. We adjust as we go—rigid specs don't survive contact with real implementation constraints. Weekly demos keep everyone aligned.
Beta Testing (2-3 weeks)
Selected users get early access. We watch session recordings obsessively during this phase. The goal isn't finding bugs—it's seeing whether people understand the tool without reading documentation.
Launch & Learn (Ongoing)
We monitor usage patterns intensely for the first month, then at regular intervals afterward. Most features get refined three or four times based on how people actually use them versus how we expected they would.
What Our Team Actually Does
Callum Thorne
Product Development Lead
I spend half my time saying no to feature requests. Not because they're bad ideas, but because they'd make the product harder to use. My job is protecting simplicity when everyone wants to add just one more thing.
Bronwen Kelsall
UX Research Director
I watch a lot of screen recordings. Sounds boring, but you learn everything from watching someone struggle with your interface for five minutes. The gap between what we intend and what users experience—that's where my work happens.
See How We Work With Clients
Our methodology shapes every project we take on. If you're curious how this approach might apply to your design challenges, let's have a proper conversation about it.
Talk About Your Project