
University Library, University of Illinois at Urbana-Champaign

Usability Testing: Best Practices

So, you have a new website or piece of software. To make it as useful as possible, you need real users to test it. This LibGuide lists best practices and resources for conducting usability tests.

Usability Testing Best Practices

Before the session

  • Remember, you are testing the software or the website, not the user
    Make this clear to the participants. They will point out things they don’t like, and they will do things you didn’t plan for; after all, that is why you are taking the time to get their input. Accept that they may hate features you designed. If they sense that their opinions are being judged or rejected, you will lose them for the rest of the session.

  • Expect to go through several iterations of testing and fixing
    If you test early (even with paper sketches and the users’ imagination), iterating will be easier. Track the problems found at each stage as well as the changes made, because some changes will have unexpected consequences. Keep good records. The earlier you discover an issue, the easier it is to improve the software. This means the development team needs to be on board before you begin; at least one developer should be involved in the testing.

  • Determine the makeup of your desired user group
    Are they experienced with "X"? Do they all come from a certain background or work environment? Aim for a mix of testers: mostly people who resemble your expected users, plus a few who don’t. Keep track of each tester’s characteristics so that later you can draw correlations like, “The testers over fifty had trouble reading the blue-on-grey text” (a minimal logging sketch follows this list). If you can make your site work for everyone, you have a better chance of broad acceptance; and if you have a few especially important users, you can make sure they quickly find what they need.

  • Create your tasks for the testing sessions
    A test scenario should explore the user experience rather than the features of the software. Use a set script that doesn’t change between participants. Don’t guide the user: you can explain the scenario, but don’t help them with the software, and the wording of the scenario should not hint at named features in the interface. Start with general questions, then move into greater detail. Spend more time watching and taking notes than talking; you want to find out what the user does when you are not there.

  • Get consent or release forms signed before a session begins
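
The participant tracking described above can be as simple as a spreadsheet. Below is a minimal Python sketch of one way to log tester characteristics to a CSV file so you can look for correlations later; the file name and field names are hypothetical examples, not a required schema.

```python
import csv
import os

# Hypothetical field names -- adapt to whatever characteristics matter for your user group.
FIELDS = ["participant_id", "age_range", "role", "prior_experience", "notes"]

def append_participant(path, row):
    """Append one tester's characteristics to a CSV log, writing a header for a new file."""
    is_new = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow(row)

# Example entry (made up for illustration).
append_participant("participants.csv", {
    "participant_id": "P03",
    "age_range": "50-59",
    "role": "graduate student",
    "prior_experience": "uses the current site weekly",
    "notes": "had trouble reading the blue-on-grey text",
})
```

Even a log this small makes it easy to answer questions like “did the older testers share a particular problem?” once the sessions are over.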

 

During the session

  • Make the user comfortable before the test begins
    You want the experience to feel as relaxed as using the website at home. Don’t have more than one or two observers in the room; it is hard to act naturally with people watching over your shoulder.

  • Keep the testing short
    The interaction with the software should be no longer than about twenty minutes. Most people get tired after that and their feedback becomes less informative.

  • Let the user do the work
    After setting up whatever starting scenario you want, don’t touch the mouse or keyboard. The point of the testing is to see what a user might do. You won’t be there to help the users after the site goes live, so now is your chance to figure out how to make the software help them.

  • Encourage the user to describe what they are thinking at each step
    It is not enough to know that they clicked in a certain location. You need to know why they thought that location would work. Maybe they clicked on the search box because it was in the upper right, not because it had the word “Search” beside it. Knowing that, when you later change the layout you will know what to keep (the location) rather than accidentally breaking behavior you didn’t understand.

 

After the session

  • Review the recording and make notes while the session is fresh
    Editing the recordings is usually for reports and presentations rather than for evaluating the software. Clips can be extracted to illustrate issues for people who weren’t present at the testing session. Remember to preserve participants’ privacy and treat them with respect.

  • Review your notes and the recordings again right after the entire round of testing is finished
    Make a list of the things the users did, including what they liked and didn’t like. Count how many people hit each issue, which user groups they belonged to, and how many times each issue came up across sessions; this lets you focus on the issues that will have the greatest impact (see the tally sketch after this list).

  • After you have a list of issues, talk to the software developers to find out what changes could fix each one
    Find out which changes are easy or hard to implement and how well each change is likely to address the issue. Listen to the developers, listen to the users, and then balance the users’ needs against the difficulty of development.

  • Include the higher-ups
    Make sure they have been presented with the highlights from the user testing as well as the input from the developers. Develop an action strategy and then act on it. After the changes have been incorporated, go through another round of user testing with new participants. Don’t assume that you fixed the problems. Assume that extra problems may have been created. Only real user testing can show how the software will be received.
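
As mentioned in the review step above, a quick tally of the observed issues helps you rank them by impact. Here is a minimal Python sketch, assuming your session notes have been distilled into (participant, user group, issue) observations; the data shown are invented for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical observations distilled from session notes: (participant, user_group, issue).
observations = [
    ("P01", "undergraduate", "could not find the hours page"),
    ("P02", "faculty",       "could not find the hours page"),
    ("P02", "faculty",       "missed the advanced-search link"),
    ("P03", "graduate",      "could not find the hours page"),
]

# Count how often each issue came up, and which user groups it affected.
issue_counts = Counter(issue for _, _, issue in observations)
groups_per_issue = defaultdict(set)
for _, group, issue in observations:
    groups_per_issue[issue].add(group)

# Most frequent issues first, with the user groups they affected.
for issue, count in issue_counts.most_common():
    groups = ", ".join(sorted(groups_per_issue[issue]))
    print(f"{count}x  {issue}  (groups: {groups})")
```

A ranking like this is also a convenient artifact to bring to the conversations with the developers and the higher-ups.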

Extra Tips

Here are a few useful tips for usability testing.

  • Not everyone uses the software the way you do. Different people expect to find different things; your software may work great for you, but other people are not you. That is why you do usability testing: to discover how other people will actually try to use your software or website.
  • When recording, frame the camera slightly wide. Expect the user to move in and out and from side to side.
  • If a user gets completely stuck, the moderator can tell the user what to do and then ask how it could have been made clearer.
  • Find out why users clicked on the links they chose, and find out what they expected to see after clicking on each link. 
  • Ask participants what they expect they can do on a given page and what they expect to happen when they click on given links.
  • Share the results with the testers. They have become invested in it and will be interested in seeing the final product.
  • Pay attention to search terms that users enter into search boxes. Developers can make these keywords prominent.
  • Never use buzz words or jargon when talking to a user.
