The Interaction Designer's Coffee Break - Weekly postings and quarterly articles about interaction design
   
 

BROWSE GUUUI POSTINGS

 

11

Tips on moderating open-ended usability tests

Listening labs are Mark Hurst's open-ended version of the traditional think-aloud test. He has put together some tips on how to moderate an open-ended test.

Some highlights:
- Don't write out specific tasks before the test, since the test should be based on where, how, and why people will use the site
- Don't lead the user in any way
- Act only on the lead of the user
- Avoid opinion-based questions
- Avoid conditional or theoretical "if" questions since they won't spotlight users' real-world actions
- Keep the user in "use mode", and avoid "critique mode"

Links:

  • The article Four Words to Improve User Research

Henrik Olsen - January 25, 2005

Permanent link Comments (0)

See also: Tips and guidelines (65) 


 

12

Usability Test Data Logger

The Usability Test Data Logger is an Excel spreadsheet developed by Todd Zazelenchuk, which can be used to collect, analyse, and present results of usability tests. It allows you to measure task completion rates, analyse questionnaire data, and summarise participant comments. It automatically generates charts and includes a timer to measure task completion times.
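
The spreadsheet does this bookkeeping in Excel, but the core calculation it automates is simple. As a rough illustration only (not the spreadsheet's actual formulas, and with made-up sample data), computing a completion rate and mean time on task from logged results might look like this:

  # Illustration of the kind of summary the Data Logger produces --
  # not the spreadsheet's actual formulas. Each tuple is one participant's
  # result for a single task: (completed, time_in_seconds). Sample data only.
  results = [
      (True, 74), (True, 58), (False, 120), (True, 91), (True, 66),
  ]

  successful_times = [seconds for completed, seconds in results if completed]
  completion_rate = len(successful_times) / len(results)
  mean_time = sum(successful_times) / len(successful_times)

  print(f"Task completion rate: {completion_rate:.0%}")            # 80%
  print(f"Mean time on task (successes only): {mean_time:.0f} s")  # 72 s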

Links:

  • The Usability Test Data Logger

Henrik Olsen - August 17, 2004 - via Column Two

Permanent link Comments (0)

See also: Tools (51) 


 

13

Conduct usability tests regularly and constantly

According to Janice Fraser, usability testing is most effective when it is a low-stress, routine activity rather than a special event that requires a lot of attention. Successful organizations conduct usability tests on a regular, fixed schedule, integrate the results quickly into the product, and spend less money doing so.

To develop an effective culture of usability, you should:
- Test regularly and constantly (once a month or more)
- Train a couple of staff members to conduct the tests
- Test with five people at a time
- Perform the tests in-house
- Keep reports crisp and to the point
- Make changes immediately
- Leave recruiting to others

Links:

  • The article The Culture of Usability

Henrik Olsen - July 15, 2004

Permanent link Comments (0)

See also: Tips and guidelines (65) 


 

14

Calculating confidence intervals for usability tests

Imagine a usability test where five out of five participants completed all tasks successfully. What are the chances that 50 or 1,000 users would also have a 100% completion rate? By calculating confidence intervals, you can tell that, at a 95% confidence level, the completion rate in the wider population could be as low as 48%.

In his article, Jeff Sauro shows us how to calculate confidence intervals for usability tests.
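
As a rough sketch of the kind of calculation involved (not necessarily the exact interval method used in the article), the adjusted Wald interval, a common choice for small-sample completion rates, can be computed like this:

  # Adjusted Wald (Agresti-Coull) confidence interval for a completion rate.
  # Illustrative sketch only -- not code from Sauro's article.
  import math

  def adjusted_wald_interval(successes, trials, z=1.96):
      """Approximate 95% confidence interval for a binomial proportion."""
      # Add z^2/2 successes and z^2 trials before computing the proportion.
      n_adj = trials + z ** 2
      p_adj = (successes + z ** 2 / 2) / n_adj
      margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
      return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

  # Five out of five participants completed the task.
  low, high = adjusted_wald_interval(5, 5)
  print(f"95% CI for the completion rate: {low:.0%} - {high:.0%}")
  # Roughly 51% - 100% with this method; the exact lower bound depends on
  # which interval you choose.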

Links:

  • The article Restoring Confidence in Usability Results

Henrik Olsen - July 08, 2004

Permanent link Comments (0)

See also: Tips and guidelines (65) 


 

15

Five users in a test is not enough

The discussion about how many users are enough for a usability test has been going on for years. Research by Jakob Nielsen and Tom Landauer, showing that tests with five users reveal an average of 85% of usability problems, has been taken as proof that five is enough.

According to Laura Faulkner, Nielsen and Landauer's prediction is right: five users will reveal 85% of usability problems - on average. In a study, she found that the share of problems found by individual sets of five users ranged from nearly 100% down to only 55%. Thus, by relying on a single set of five users, we run the risk of missing nearly half the problems.

Dr. Eric Schaffer concludes that "…for a routine usability test run 12 people for each segment. For an important one where the stakes are high run 30. If resources are really tight, you can drop to five-six per segment, but this is bad."
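
For context, the 85% average comes from Nielsen and Landauer's problem-discovery model, which estimates the share of problems found with n users as 1 - (1 - p)^n, where p is the average probability that a single user uncovers any given problem (they reported p of roughly 0.31). A quick sketch of how the expected discovery rate grows with the sample sizes mentioned above, assuming that average:

  # Expected share of usability problems found with n test users, using the
  # problem-discovery model 1 - (1 - p)^n. p = 0.31 is the average per-user
  # detection rate reported by Nielsen and Landauer; Faulkner's point is
  # that real values vary widely around this average.

  def problems_found(n_users, p=0.31):
      return 1 - (1 - p) ** n_users

  for n in (5, 12, 30):
      print(f"{n:>2} users: about {problems_found(n):.0%} of problems found")
  # Roughly 84% at 5 users, 99% at 12 and close to 100% at 30 -- on average;
  # a particular set of five users may find far fewer.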

Links:

Henrik Olsen - July 01, 2004

Permanent link Comments (0)


 

16

How to measure a website's value

How does one measure how well a website communicates the value of its offerings? Jared M. Spool from UIE has come up with a variant of usability testing called Inherent Value Testing. It goes something like this:

1. Recruit a minimum of six loyal users and six inexperienced users who meet the target profile.
2. Start with the experienced group and ask them to give you a tour of the site and share the features they use and like the best.
3. Bring in your new users and let them work through the same scenarios as the experienced users; find out which pages they visit, what they like, and what they don't like about the site.
4. Compare the results and find out whether the site revealed the same benefits to the new users as those you heard about from the experienced users.

Observing experienced users will tell you which offerings people appreciate, while observing new users will tell you which valuable offerings people fail to spot.

Links:

  • The article Inherent Value Testing
  • The article Conducting Inherent Value Testing

Henrik Olsen - February 29, 2004

Permanent link Comments (0)


 

17

Listening labs vs. think aloud tests

If you have heard about Mark Hurst, you have probably also heard about listening labs. In his October 1, 2003 newsletter, Mark explains the how and why of the method.

Listening labs are Mark's version of the traditional think-aloud test. But instead of predefining tasks for the users to carry out, you give them tasks on the fly based on what they want to do on the site. This way, the test will not only tell you whether users can do what you want them to, but also whether the product can do what the users want to do.

Links:

  • The article Four Words to Improve User Research

Henrik Olsen - October 01, 2003

Permanent link Comments (0)


 

18

User research techniques in comic book form

Dan Willis has created a condensed overview of some of the core techniques used in information architecture. The descriptions are in comic book form and serve as entertaining reminders of some of our development options. Willis' one-pagers cover sitepath diagramming, topic mapping, free listing, card sorting, and personas.

Links:

  • IA Classics: Tools of the Trade in Comic Book Form

Henrik Olsen - April 28, 2003 - via Usability Views

Permanent link Comments (0)

See also: Site and flow diagramming (4)  Posters (5)  Card sorting (8)  Personas (13)  The design process (14) 


 

19

A test of screen recording systems

Karl Fast has made a thorough test of three screen recording systems to decide which is most suitable in usability testing settings.

His conclusions:
"ScreenCam had the best performance of any program tested, but the lack of support for Windows 2000 and XP makes it hard to recommend."

"Camtasia offers the best blend of performance, features, and ease of use among the programs tested." "The only drawback is price, but at $150 it's still within the range of almost every budget. Highly recommended."

"My first impression of HyperCam was that for $30 I was getting what I paid for. But once I fiddled with it and found the "secret" of using Camtasia's TSCC codec, I was entirely satisfied. Unless you need the extra features of Camtasia, HyperCam will probably do the job..."

Links:

  • The article Recording Screen Activity During Usability Testing
  • Lotus ScreenCam
  • TechSmith Camtasia
  • Hyperionics HyperCam

Henrik Olsen - April 06, 2003

Permanent link Comments (0)

See also: Tools (51) 


 

20

Usability Myths Need Reality Checks

Will Schroeder looks at some common usability myths that have cemented themselves into our profession's foundation and questions how they got there.

Links:

  • UIE - Usability Myths Need Reality Checks

Tim Lucas - March 23, 2003

Permanent link Comments (5)

See also: Research (93)  Web page design (23)  Navigation (46) 



Browse GUUUI postings

Methods and the design process

Usability testing (30)  Prototyping and wireframing (32)  Cost-justification and ROI (19)  The design process (14)  Personas (13)  Requirement Analysis (12)  Card sorting (8)  Implementing user-centred design (7)  Expert reviews (6)  Web log analysis (7)  Eye-tracking (7)  Site and flow diagramming (4)  Use Cases (3) 

Design elements

Navigation (46)  Web page design (23)  Search (24)  Guidelines and Standards (10)  Links (12)  Text (13)  Forms (11)  Ads (6)  Site design (8)  Shopping Carts (5)  Error handling (5)  Sections (5)  Home pages (2)  Design patterns (4)  E-mails (1)  Personalization (1)  Sitemaps (1)  Print-friendly (1)  Help (2)

General aspects

E-commerce (21)  Accessibility (11)  Information architecture (12)  Persuasive design (13)  Visual design (14)  Search engines (7)  Credibility, Trust and Privacy (6)  Web applications (2)  Intranets (1) 

Technology

Flash (6)  URLs (3)  Download time (2)  Javascript (3)  Web standards (2)  Browsers (2) 

Humor

Cartoons (8)  Funny tools and games (10)  Bad designs (7)  Fun with Jakob Nielsen (6)  Designs with humor (3)  Fun music and videos (4)  Fun posters (2)  Funny 404 pages (2)  Misc humor (3) 

Resource types

Research (93)  Tips and guidelines (65)  Tools (51)  Books (32)  Cases and Examples (12)  Interviews (10)  Primers (9)  GUUUI articles (8)  Posters (5)  Online books (5)  Glossaries (2)  People and organisations (2) 

Information sources

Blogs (11)  Websites (9)  Discussion lists (4)  News (3)  Newsletters (3)  Online magazines (3)  Wikis (1) 

 

 
     