
University Library, University of Illinois at Urbana-Champaign

Usability Testing - Best Practices: Home

So, you have a new website or piece of software. To make it as useful as possible, you need actual users to test it. This LibGuide lists some of the best practices that have been established for conducting usability tests.

Usability Testing: Best Practices

Remember, you are testing the software or website, not the users. 

Make this clear to the participants. They will point out things they don’t like. Expect them to do things you didn’t plan for. After all, that is why you are taking your time to get their input. Realize that they may hate some feature that is your delight. If they perceive any rejection or evaluation of their opinion, you will have lost them for the rest of the interview.

Expect to go through several iterations of testing and fixing. 

If you test early (with paper drawings and users’ imagination) the iterations will be easier. Track the problems at each stage as well as the changes made. Some changes will have unexpected consequences. Keep good records. The earlier you discover an issue (good or bad), the easier it is to make the software better. This means that the development team needs to be on board before you begin. At least one developer should be involved with the testing.

Decide the makeup of your desired user group.

Are they experienced with "X"? Do they all come from a certain background or work environment? Aim for a mix of testers: many who resemble your expected users and some who differ from them. Keep track of each tester's characteristics so that later you can draw correlations like, "The testers over fifty had trouble reading the blue-on-grey text." If you can make your site work for everyone, you will have a better chance of wider acceptance. And if you have a few important users, make sure they can quickly find what they need.

Plan what you are going to do during the testing sessions.

A test scenario should explore the user experience rather than the features of the software. Use a set script that doesn't change between participants. Don't guide the user: you can explain the scenario, but don't help them with the software. The words you use in the scenario should not be clues to words in the interface. Start with general questions, then draw out details. Spend more time watching and taking notes than talking. You want to find out what the user will do when you are not there.

Make the user comfortable before the test begins.

You want the experience to be as relaxed as browsing the website from home. Don't have more than one or two observers in the room; it is hard to act naturally with people watching over your shoulder.

Keep the testing short.

The interaction with the software should be no longer than about twenty minutes. Most people get tired after that and their feedback becomes less informative.

Let the user do the work.

After setting up whatever starting scenario you want, don’t touch the mouse or keyboard. The point of the testing is to see what a user might do. You won’t be there to help the users after the site goes live, so now is your chance to figure out how to make the software help them.

Encourage the user to describe what they are thinking at each step.

It is not enough to know that they clicked a certain location; you need to know why they thought that location would work. Maybe they clicked the search box because it was in the upper right rather than because it had the word "Search" beside it. Knowing that, when you change the layout you will know what to keep (the location) rather than accidentally breaking functionality you didn't understand.

Review the recording and make notes while the session is fresh.

Editing the recordings is usually for reports and presentations rather than evaluating the software. Clips can be extracted to give examples of issues for people that weren’t present in the testing session. Remember to preserve people’s privacy and treat them with respect.

Review your notes and the recordings again right after that entire round of testing is finished.

Make a list of things the users did, including what they liked and didn't like. For each issue, note how many people hit it, which user groups they belonged to, and how many times it came up across sessions. This lets you focus on the issues that will have the greatest impact.
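The tallying step above can be sketched in a few lines of Python. This is only an illustration: the participant IDs, user groups, and issue labels are invented, and a real study would pull these from your session notes.

```python
from collections import Counter, defaultdict

# Hypothetical session notes: (participant_id, user_group, issue) tuples.
# All names and issues here are invented examples.
observations = [
    ("p1", "undergrad", "missed search box"),
    ("p2", "faculty",   "missed search box"),
    ("p3", "undergrad", "confused by jargon on homepage"),
    ("p4", "staff",     "missed search box"),
    ("p5", "faculty",   "confused by jargon on homepage"),
]

# Count how often each issue appeared overall.
issue_counts = Counter(issue for _, _, issue in observations)

# Record which user groups hit each issue.
groups_hit = defaultdict(set)
for pid, group, issue in observations:
    groups_hit[issue].add(group)

# Rank issues by frequency so the most common problems get attention first.
for issue, count in issue_counts.most_common():
    print(f"{count}x {issue} (groups: {', '.join(sorted(groups_hit[issue]))})")
```

Sorting by frequency and noting the affected groups makes it easy to see whether a problem is widespread or confined to one kind of user.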

After you have a list of issues, talk to the software developers to find out what changes could possibly fix each issue.

Find out which changes are easy or hard to implement and how effectively each might address its issue. Listen to the developers, listen to the users, and then balance the needs of the users against the difficulty of development.

Include the higher-ups.

Make sure they have been presented with the highlights from the user testing as well as the input from the developers. Develop an action strategy and then act on it. After the changes have been incorporated, go through another round of user testing with new participants. Don’t assume that you fixed the problems. Assume that extra problems may have been created. Only real user testing can show how the software will be received.

Extra Tips

Here are a few useful tips and tricks for usability testing.

  • Not everyone uses the software in the way that you do. Other people will expect to find different things. Your software may be great for you, but other people are not you. This is why you do usability testing. You want to discover how other people will try to use your website.
  • When recording, have a slightly wide camera position. Expect the user to move in and out, right and left.
  • Remember to get consent or release forms signed before a session begins.
  • If a user gets completely stuck, the moderator can tell the user what to do and then ask how it could have been made clearer.
  • Find out why users clicked on the links they chose, and find out what they expected to see after clicking on the link. 
  • Ask participants what they think they can do on a page and what they expect to happen when they click on each link.
  • Share the results with the testers. They have become invested in it and will be waiting to see the final product.
  • Pay attention to the search terms that users enter into search boxes. Developers can make those keywords prominent.
  • Never use buzz words or jargon when talking to a user.
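The search-term tip above can be put into practice with a simple tally. This sketch assumes the queries typed during sessions have been collected into a plain list; the terms shown are invented examples, not real data.

```python
from collections import Counter

# Hypothetical search terms collected across testing sessions.
queries = [
    "course reserves", "reserve a room", "course reserves",
    "interlibrary loan", "ILL", "course reserves", "reserve a room",
]

# Normalize lightly so variants like "Course Reserves" merge with "course reserves".
normalized = [q.strip().lower() for q in queries]

# The most frequent terms are candidates for prominent labels or links.
top_terms = Counter(normalized).most_common(3)
for term, count in top_terms:
    print(f"{term}: {count}")
```

Even a rough count like this tells developers which words users actually reach for, so page labels and links can use the same vocabulary.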


Scholarly Commons

306 Main Library
Drop-ins welcome
Monday-Friday 9am-6pm
Phone: 217-244-1331