
My Step-by-Step User Testing Workflow

Published on July 25, 2025
Contributors
Michelle Warner
Founder, TinyBird Creative, Inc.
Earlier this week, I ran a user testing training with my team and found myself wondering: why does user testing still feel so foreign to a lot of product teams?
After reflecting on past experiences and chatting with the crew, the consensus was pretty clear—it usually comes down to a lack of budget, or to testing being handled by a separate research or strategy team.
That might work in a big org, but for smaller teams or scrappier projects, it’s clunky. And honestly, it feels counterintuitive. Shouldn’t the designers who are creating the experience get direct feedback from the people they’re designing for?
My hot take: Designers should always be involved in user testing firsthand.
No more hand-offs into a black box where the results come back in a slide deck and you have no idea what the user actually said or did. I’ve been there—it’s frustrating.
So I figured I’d share the process I’ve honed over time. These steps might not apply to every single testing scenario, but the core structure has served me (and my teams) well.

1. Make a Testing Plan

I know this sounds basic, but you’d be surprised how many people skip it. I treat testing plans like I treated lab reports in college—document everything. They keep everyone aligned and make it easier to look back later.
Here’s what I include:
  • Problem statement
    Example: “Users are having trouble finding what they’re looking for on our makeup e-comm site.”
  • Goal
    Something like: “Test whether a new IA (information architecture) structure helps users navigate the site more intuitively.”
  • Study details
    What tests are we running? How many users? Mobile or desktop? Moderated or unmoderated?
  • Audience
    Clearly state who you’re testing—even if it’s just “open to everyone.”
  • Test breakdown
    Write out your exact intro script, task prompts, follow-up surveys—everything. This gives you a chance to refine before you ever hit publish.
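If it helps to keep plans consistent across projects, those same fields can live in a simple structured template. Here’s a rough sketch of what mine might look like—every field name and example value below is just illustrative, not a standard:

```python
# A minimal, illustrative testing-plan template.
# All field names and example values are hypothetical placeholders.
testing_plan = {
    "problem_statement": (
        "Users are having trouble finding what they're looking for "
        "on our makeup e-comm site."
    ),
    "goal": "Test whether a new IA structure helps users navigate more intuitively.",
    "study_details": {
        "method": "unmoderated usability test",
        "participants": 8,
        "device": "mobile",
    },
    "audience": "open to everyone",
    "test_breakdown": {
        "intro_script": "Thanks for joining! There are no wrong answers...",
        "tasks": ["Find a red lipstick under $20 and add it to your cart."],
        "follow_up_survey": ["How easy was that task, from 1 to 5?"],
    },
}

# Quick sanity check: flag any section left blank before sharing the plan.
missing = [section for section, value in testing_plan.items() if not value]
print("Missing sections:", missing or "none")
```

Even if you never run it as code, writing the plan this explicitly makes it obvious when a section is still empty.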

2. Build the Actual Test

Yes, you have to actually build the test (shocking, I know). But the number of times I’ve seen this step go sideways—even with a solid plan—is wild. Each type of test (card sort, usability, five-second test, etc.) has its own setup needs. I won’t get into all of them here (although now I want to write a follow-up blog), but one rule applies across the board:
Always QA your test before releasing it.
Run through it yourself as if you were a participant. Test how the prototype loads, how the flows behave, and how it all works on the testing platform. I’ve had sessions fail because a prototype took too long to load and users bailed. That’s not just annoying—it’s costly if you're burning through test credits.
Also yes, I once saw a test that had 75 questions. And yes, users were mad by the end of it. Don’t be that person.

3. Synthesize Your Data

Now the (sometimes painful) part: making sense of the results. This is where you go from stats, quotes, and user footage to actionable insights. It can be tedious, but it’s also where the magic happens.
Now, I have mixed feelings about AI—but I do see its value in certain situations. One of those is definitely data synthesis. I like to think of it more as a "rename layers" assistant than a "design the whole product" tool. It won’t do the heavy thinking for you, but it can help wrangle messy raw data into something you can actually work with.
Let’s say you’re running a card sort in FigJam and don’t want to pay for a full testing suite. You can screenshot your test results, feed them into ChatGPT or another tool, and use prompts to organize, summarize, and identify trends. It’s not doing the thinking for you—it’s just clearing the table so you can.
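To make that concrete, here’s the kind of tallying a tool (or a few lines of script) can handle for you once the raw card-sort data is transcribed. The participant data and category names below are made up for illustration—this is a sketch of the idea, not a full analysis pipeline:

```python
from collections import Counter

# Hypothetical card-sort results transcribed from FigJam: each dict is
# one participant's mapping of card -> the category they sorted it into.
results = [
    {"Lipstick": "Lips", "Mascara": "Eyes", "Blush": "Face", "Eyeliner": "Eyes"},
    {"Lipstick": "Lips", "Mascara": "Eyes", "Blush": "Cheeks", "Eyeliner": "Eyes"},
    {"Lipstick": "Lips", "Mascara": "Eyes", "Blush": "Face", "Eyeliner": "Face"},
]

# For each card, find the most common category and how strongly
# participants agreed on it.
for card in results[0]:
    votes = Counter(participant[card] for participant in results)
    category, count = votes.most_common(1)[0]
    agreement = count / len(results)
    print(f"{card}: {category} ({agreement:.0%} agreement)")
```

Cards with low agreement (like “Blush” above) are exactly the ones worth a closer look in your IA.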

4. Iterate, Iterate, Iterate

I know timelines are tight. But if you have the opportunity to test again, take it. You’ll never catch everything in the first round, and that’s okay.

The goal isn’t perfection—it’s learning fast, adapting, and shipping better. Design is full of curveballs. Iteration is how you catch them. And if one extra week of testing gets you a better product (and happier users), that’s a trade worth making. Hopefully, your client sees it that way too 😉

Good luck out there in your testing adventures.
❤️ Michelle