Putting Your Data To Work

At its core, the value of user testing comes from the aha moments when you watch people interact with your product and hear their thoughts about it. You see people struggle with the same elements. But to get to that point, you have to watch hours of video, take notes, and come up with themes. We aim to change this.

Analyzing the results of a user test can be slow and tedious if you follow the standard approach: watch all of the recordings, take notes, and then try to tease out patterns. But you can streamline this process if you design good task prompts and take measurements by asking survey questions, such as our scale question. When you measure things, we show you at a glance how your test went, letting you, for example, jump straight to the participants who had the most trouble. We do this through something we call smart tiles.

A smart tile annotated.
Smart tiles are how we summarize the things you measure, including survey questions such as scales.

Clicking a smart tile instantly loads the participant session replays, sorted from worst to best experience, so you can start to understand what's behind the quantitative scores.

Meaning, on tap.
Smart tiles are your jumping off point to explore what people liked the most or the least.

Different types of measurements help answer different questions. For example, you can see which problems people encountered most often, or which parts of the experience they found most engaging.

Each test type is different

Each test type takes a subtly different approach to analysis, but the overall process is the same: you jump from the high-level summary data to the details of what happened, eventually arriving at the "why": the moment you have an insight.

For more ideas about approaching your data, see the docs page for each test type.

Iterating

Once you’ve analyzed your results, you’ll come away with insights about how to improve your prototype or site. After you implement those changes, you’ll want to test again to see whether they led to measurable improvements. SoundingBox makes it easy to re-run your test by letting you save any test definition as a template. With each iteration, you can learn something new about how to improve the experience on your site or prototype.