
Improve your startup with product usability testing

No matter which stage your product is at, you can always improve it.

Here's a simple rule: always test everything!

What you can test

  • demand for the product (a simple landing page with a subscription form)
  • design concept (A/B testing or a survey)
  • clickable prototype (usability testing)
  • MVP
  • And, of course, the mature product.

Let’s look at a mobile application for investing in alternative assets in the luxury domain, where we tested a clickable prototype.

I prepared the design in Figma and connected all the screens to see how the app would work after launch. Then we conducted several testing sessions: we sent people a Figma link and watched them complete basic tasks in the app and work with the information.

Why you need usability testing

You and your team are biased. No matter how confident you are about your design, or how confident the founder is about the product, no one except real first-time users can tell you how convenient, clear, and logical your app is.

That’s why usability testing sessions are a great way to evaluate a fresh new design.

Who to run usability tests with

You don’t need to invite a lot of people to such sessions. Usually, a founder has some acquaintances interested in the product or the domain. Invite these people, because their interest will get them highly motivated to test the app to its fullest.

It is also good practice to test the app with at least two groups of people. In our case, we had people with investment portfolios and some knowledge of the domain, and those who had only taken their first steps and were doubtful about the topic and its value for them.

For our sessions, we selected eight people, plus one developer from our team who was not yet immersed in the project and didn’t know the flow or the features.

Template for the table with participants

We conducted each interview individually. Two people participated in each session from our side: one who led the conversation, and one who marked all comments, mistakes, and questions. This way, the moderator was never distracted by taking notes and could focus on the interview.

What we tested

We tested a few basic flows:

  • onboarding and registration
  • viewing basic asset information and analyzing details
  • booking shares during the initial offering period
  • buying and selling shares

We also dedicated part of each interview to learning about participants’ investing experience, their investment decision process, and recent developments in the domain.

Example of the table with cases.

What we discovered in usability testing

Here are some insights we gained from product usability testing:

  • Many people don’t understand specific terms such as listing, bid, or ask. It’s tricky: if people have invested before, they know these terms well, and it’s better to use them. If they haven’t, the terms mean nothing to them. Also, the assets our app offered weren’t the first thing people would typically invest in. So we decided to simplify some terms and add descriptions and onboarding guidance.
  • People want to know more about the app and how it works. We added a detailed process description to the landing page, highlighting both the steps people need to take and the technical process that goes on in the background.
  • People needed more information about the assets. The basic info was good, but not enough. Still, we couldn’t add more information because it was just an MVP version of the product, and adding more would delay the launch for months. So instead, we created a list of the items people were interested in and looking for.
  • People really like real photos and the minimal interface; they told us they are tired of illustrations and fancy visuals. Since we were building a financial app, where trust is really important, this was a great highlight for us.

Some respondents also shared their doubts about the general idea of the app and about investing in such assets.

Table of words, aka Product Reaction Cards

For the first time in my life, I decided to test a new methodology: product reaction cards. At the end of each interview, I showed the participant a table with different words, both positive and negative, and asked them to choose 3-5 words that described their experience with the product.

Example of Product Reaction Cards table.

It turned out to be very useful. This experience showed that even though there were some flaws in the app, we did the UX well. For example, people sometimes said that they didn’t understand some steps, or that there wasn’t enough information, but they still chose words like “simple,” “clear and clean,” “great,” “crisp,” “professional and serious,” and “accessible.”

It means their overall impression of using the app was good and that they would use the app and come back to it.
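As a rough illustration, the card picks from such sessions are easy to tally and rank by frequency. The card words and picks below are hypothetical, just to show the shape of the analysis:

```python
from collections import Counter

# Hypothetical picks: each participant chose 3-5 words from the card table
picks = [
    ["simple", "clear", "professional"],
    ["clean", "simple", "accessible", "serious"],
    ["confusing", "simple", "great"],
]

# Count how often each word was chosen across all sessions
tally = Counter(word for session in picks for word in session)

# Most frequently chosen words first
for word, count in tally.most_common():
    print(f"{word}: {count}")
```

Sorting by frequency like this makes the dominant impressions (and any recurring negative words) stand out at a glance.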

How to organize results from usability testing

After the sessions, we ranked the places to improve by importance, criticality, and difficulty. We divided the final list into “what can be done immediately, with little effort” and “what is good to add after the MVP.”

In the first group, we included all the changes for the copy, small logical changes, pictures, etc.

Adding new functionality and data went to the second group.
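The ranking and splitting step above can be sketched in a few lines. The findings, scores, and difficulty threshold here are made up for illustration; the idea is just to sort by impact and bucket by effort:

```python
# Hypothetical findings from the sessions, scored 1-5 on each axis
findings = [
    {"issue": "unclear term 'bid/ask'", "importance": 5, "criticality": 4, "difficulty": 1},
    {"issue": "missing asset details",  "importance": 4, "criticality": 3, "difficulty": 5},
    {"issue": "copy tweak on landing",  "importance": 3, "criticality": 2, "difficulty": 1},
]

# Rank: most important and most critical first; easiest first among ties
findings.sort(key=lambda f: (-f["importance"], -f["criticality"], f["difficulty"]))

# Split: low-effort fixes go before the MVP launch, the rest come after
now = [f for f in findings if f["difficulty"] <= 2]
later = [f for f in findings if f["difficulty"] > 2]
```

Even a rough scoring like this keeps the post-session discussion focused on a shared, ordered list rather than on whoever remembers a problem most vividly.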

Example of the table with results.

Negative vs Positive outcomes

In addition to the improvement lists, we also highlighted the positive feedback.

Of course, the main focus was on identifying problem areas to fix, but it’s very important to see the positive outcomes of the sessions as well. It helps founders see their strengths and helps the team confirm they are working in the right direction.

What’s next?

Right now the application is being finalized, and the founder is looking for investments. Once we start the next stage, we’ll run another round of tests to understand how the implemented changes worked out and how the developed product works as a whole.

Founders often skip the testing phase because it seems long and expensive. In fact, it can save you a lot of money on fixing mistakes later.

The approach can vary from product to product: testing can be short or unmoderated, use automation tools, or cover only individual functions or screens. But skipping testing altogether is risky, especially when you’re working on a product that doesn’t yet have a large established audience or serious competitors.

Meet the author 🤝

Elena Grechits is the Head of Design at Paralect and Co-founder of the Salt & Bold design agency. Learn more about her design and leadership approach in this interview:

