At its core, the value of user testing comes from the aha moments when you watch people interact with your product and hear their thoughts about it. You see people struggle with the same elements. But to get to that point, you have to watch hours of video, take notes, and come up with themes.
Rather than making you watch each video in no particular order, SoundingBox shows you instantly how your test went and lets you jump first to the participants who had the most problems. We do this through something we call smart tiles.
Clicking a smart tile instantly loads participant session replays, sorted from worst to best experience, so you can start to understand what’s behind the quantitative scores.
Different types of measurements can help answer different questions. For example, you can see what types of problems people encountered most often.
Or what aspects of the product people found most engaging.
Measurement also lets you make objective comparisons. SoundingBox was designed to make comparing experiences simple and powerful. You can compare your current site or prototype to similar sites to get a read on which are the most usable or engaging.
Using the same view, you can load multiple tests side by side and compare how a prototype performs with each iteration.
Of course, you can also filter by task or participant group to make comparisons over time.
We created SoundingBox to accelerate analysis by bringing the benefits of quantitative and qualitative user testing together in one platform. Rather than sifting through hours of video or theorizing about what drove the results of a test, use SoundingBox to get to insights faster.