# Analyzing Results

We aim to simplify and streamline the process of analyzing UX data so you can get to insights faster and plan your next iteration.

Analyzing the results of a user test can be a slow and tedious process if you follow the standard approach: watch all of the recordings, take notes, and then try to tease out patterns. This process can be much faster if you’ve designed good task prompts and taken measurements by asking survey questions, like our scale question. When you measure things, data blocks can show you at a glance how your test went, letting you jump straight to the participants who had the most trouble, for example.

TIP

Clicking on a data block instantly loads up participant session replays, sorted from worst to best experience, so that you can start to understand what’s behind the quantitative scores.
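
If it helps to make that ordering concrete, here’s a minimal Python sketch of sorting sessions worst-first by their scale score. The `Session` shape, field names, and 1–7 scale are hypothetical illustrations, not SoundingBox’s actual data model.

```python
from dataclasses import dataclass

@dataclass
class Session:
    participant: str
    scale_score: int  # hypothetical scale: 1 (very difficult) .. 7 (very easy)

def worst_first(sessions: list[Session]) -> list[Session]:
    """Order replays so the lowest-scoring experiences come first."""
    return sorted(sessions, key=lambda s: s.scale_score)

replay_queue = worst_first([
    Session("P1", 6),
    Session("P2", 2),
    Session("P3", 4),
])
for s in replay_queue:
    print(s.participant, s.scale_score)  # P2 comes first: the roughest experience
```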

# Data Blocks

Data blocks are your jumping-off point for exploring what people liked the most or the least.

Different types of measurements can help answer different questions. For example, you can see what types of problems people encountered most often or what people found most engaging.
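
As a rough illustration of the “most often” kind of question, tallying tagged problems is just a frequency count. The problem tags below are invented for the example.

```python
from collections import Counter

# One entry per problem a participant hit, tagged by type (hypothetical tags).
observed_problems = [
    "navigation", "terminology", "navigation",
    "load-time", "navigation", "terminology",
]

# most_common() ranks problem types from most to least frequent.
for problem, count in Counter(observed_problems).most_common():
    print(f"{problem}: hit {count} times")
```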

# Analysis by Goal

Depending on your goal, there are subtly different approaches to analysis, but whichever approach you take, the work is the same: jumping from the high-level summary data to the details of what happened, and eventually arriving at the "why", the moment when you have an insight.

For deeper-dive guides on analyzing your data for various research goals, see these doc pages.

# Iterating

Once you’ve analyzed your results, you’ll come away with insights about how to improve what you’re building. After you implement those changes, test again to see whether they led to improvements. SoundingBox makes it easy to re-run your test by letting you clone your previous study. With each iteration, you can learn new things about how to improve the experience.
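
To make “test again to see whether the changes led to improvements” concrete, here’s a tiny sketch comparing mean scale scores across two runs of the same study; the scores are invented for illustration.

```python
from statistics import mean

# Scale scores from the original study and its cloned re-run (made-up data).
iteration_1 = [2, 3, 4, 3, 2]
iteration_2 = [4, 5, 4, 6, 5]

delta = mean(iteration_2) - mean(iteration_1)
print(f"Mean score moved by {delta:+.1f} points between iterations")
```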

Learn more about our iteration tools here.