
BROWSE GUUUI POSTINGS

 

11

Formal vs. informal usability reports

Formal reports are the most common way of documenting usability studies, but according to Jakob Nielsen, informal reports are faster to produce and often a better choice.

"You can maximize user interface quality by conducting many rounds of testing as part of an iterative design process. To move rapidly and conduct the most tests within a given time frame and budget, informal reports are the best option."

Links:

  • The article Formal Usability Reports vs. Quick Findings

Henrik Olsen - April 25, 2005



 

12

Test review of Morae

NetworkWorldFusion has tested Morae, a software tool for usability analysis from TechSmith that records video and audio of the users along with system data (e.g. mouse clicks, keystrokes, web page changes). Their overall rating is "very good".

Pros:
- Affordable
- Annotates collected data indicating web page changes, mouse clicks, keystrokes, text data appearing on screen, and window events such as opening and closing applications

Cons:
- Remote monitoring and management capabilities could be improved
- Captured data can get quite large (in the gigabyte range)
- Only supports Windows and works best with Internet Explorer

Links:

  • More about Morae at TechSmith.com
  • Review of Morae Recorder

Henrik Olsen - February 09, 2005


See also: Tools (52) 


 

13

Tips on moderating open-ended usability tests

Listening labs are Mark Hurst's open-ended version of the traditional think-aloud test. He has put together some tips on how to moderate an open-ended test.

Some highlights:
- Don't write out specific tasks before the test, since the test should be based on where, how, and why people will use the site
- Don't lead the user in any way
- Act only on the lead of the user
- Avoid opinion-based questions
- Avoid conditional or theoretical "if" questions since they won't spotlight users' real-world actions
- Keep the user in "use mode", and avoid "critique mode"

Links:

  • The article Four Words to Improve User Research

Henrik Olsen - January 25, 2005


See also: Tips and guidelines (66) 


 

14

Usability Test Data Logger

The Usability Test Data Logger is an Excel spreadsheet developed by Todd Zazelenchuk, which can be used to collect, analyse, and present results of usability tests. It allows you to measure task completion rates, analyse questionnaire data, and summarise participant comments. It automatically generates charts and includes a timer to measure task completion times.
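The spreadsheet does all of this in Excel; purely as an illustration of the kind of summary calculations such a logger automates, here is a minimal Python sketch (the record layout, task names, and helper functions are hypothetical, not Zazelenchuk's actual format):

from dataclasses import dataclass
from statistics import mean

# Hypothetical record of one participant's attempt at one task -- a rough
# stand-in for a row in a test log, not the spreadsheet's real layout.
@dataclass
class TaskAttempt:
    participant: str
    task: str
    completed: bool
    seconds: float

def completion_rate(attempts, task):
    """Share of participants who completed the given task."""
    rows = [a for a in attempts if a.task == task]
    return sum(a.completed for a in rows) / len(rows)

def mean_time_on_success(attempts, task):
    """Average time on task, counting successful attempts only."""
    times = [a.seconds for a in attempts if a.task == task and a.completed]
    return mean(times) if times else None

log = [
    TaskAttempt("P1", "find contact page", True, 42.0),
    TaskAttempt("P2", "find contact page", True, 65.5),
    TaskAttempt("P3", "find contact page", False, 120.0),
]
print(f"Completion rate: {completion_rate(log, 'find contact page'):.0%}")
print(f"Mean time (successes only): {mean_time_on_success(log, 'find contact page'):.1f} s")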

Links:

  • The Usability Test Data Logger

Henrik Olsen - August 17, 2004 - via Column Two


See also: Tools (52) 


 

15

Conduct usability tests regularly and constantly

According to Janice Fraser, usability testing is most effective when it is a low-stress, routine activity rather than a special event that requires a lot of attention. Successful organizations conduct usability tests on a regular, fixed schedule, integrate the results quickly into the product, and spend less money.

To develop an effective culture of usability, you should:
- Test regularly and constantly (once a month or more)
- Train a couple of staff members to conduct the tests
- Test with five people at a time
- Perform the tests in-house
- Keep reports crisp and to the point
- Make changes immediately
- Leave recruiting to others

Links:

  • The article The Culture of Usability

Henrik Olsen - July 15, 2004


See also: Tips and guidelines (66) 


 

16

Calculating confidence intervals for usability tests

Imagine a usability test where five out of five participants completed all tasks successfully. What are the chances that 50 or 1,000 users would show the same 100% completion rate? By calculating confidence intervals, you can tell that, with 95% confidence, the true completion rate could be anywhere from as low as 48% up to 100%.

In his article, Jeff Sauro shows us how to calculate confidence intervals for usability tests.
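As a rough illustration of the kind of calculation involved (not necessarily the exact method Sauro uses), here is a small Python sketch that computes 95% confidence intervals for five successes out of five, once with the exact binomial (Clopper-Pearson) bound and once with the adjusted Wald approximation; the function names are purely illustrative:

import math

def exact_ci_all_successes(n):
    """Clopper-Pearson (exact binomial) 95% interval for the special case
    where every one of n participants succeeds: the upper bound is 100%
    and the lower bound works out to (alpha / 2) ** (1 / n) with alpha = 0.05."""
    alpha = 0.05
    return (alpha / 2.0) ** (1.0 / n), 1.0

def adjusted_wald_ci(successes, n):
    """Adjusted Wald (Agresti-Coull) 95% interval, a common small-sample
    approximation for completion rates."""
    z = 1.96  # standard normal quantile for a two-sided 95% interval
    n_adj = n + z ** 2
    p_adj = (successes + z ** 2 / 2.0) / n_adj
    half_width = z * math.sqrt(p_adj * (1.0 - p_adj) / n_adj)
    return max(0.0, p_adj - half_width), min(1.0, p_adj + half_width)

low, high = exact_ci_all_successes(5)
print(f"Exact 95% CI for 5/5 completions: {low:.0%} to {high:.0%}")  # ~48% to 100%
low, high = adjusted_wald_ci(5, 5)
print(f"Adjusted Wald 95% CI for 5/5:     {low:.0%} to {high:.0%}")  # ~51% to 100%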

Links:

  • The article Restoring Confidence in Usability Results

Henrik Olsen - July 08, 2004


See also: Tips and guidelines (66) 


 

17

Five users in a test is not enough

The discussion about how many users are enough for a usability test has been going on for years. Research by Jakob Nielsen and Tom Landauer, showing that tests with five users reveal an average of 85% of usability problems, has been taken as proof that five is enough.
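For context, the 85% figure comes from Nielsen and Landauer's cumulative-detection model, in which the share of problems found with n users is 1 - (1 - L)^n, where L is the average probability (around 31% in their data) that a single user exposes any given problem. A minimal Python sketch of that curve:

def share_of_problems_found(n_users, detection_rate=0.31):
    # Nielsen/Landauer model: expected share of usability problems uncovered
    # by n_users, assuming each user independently exposes any given problem
    # with probability detection_rate (roughly 31% in their published data).
    return 1.0 - (1.0 - detection_rate) ** n_users

for n in (1, 3, 5, 12, 15):
    print(f"{n:2d} users -> ~{share_of_problems_found(n):.0%} of problems found")
# Five users gives ~84%, which is where the often-quoted 85% figure comes from.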

According to Laura Faulkner, Nielsen and Landauer's prediction is right: five users will reveal 85% of usability problems, on average. In her study, however, she found that the proportion of problems found with five users ranged from nearly 100% down to only 55%. Relying on a single set of five users thus runs the risk of missing nearly half the problems.

Dr. Eric Schaffer concludes that "…for a routine usability test run 12 people for each segment. For an important one where the stakes are high run 30. If resources are really tight, you can drop to five-six per segment, but this is bad."

Links:

Henrik Olsen - July 01, 2004



 

18

How to measure a website's value

How does one measure how well a website communicates the value of its offerings? Jared M. Spool from UIE has come up with a variant of usability testing called Inherent Value Testing. It goes something like this:

1. Recruit a minimum of six loyal users and six inexperienced users who meet the target profile.
2. Start with the experienced group and ask them to give you a tour of the site, sharing the features they use and like the best.
3. Bring in the new users and let them work through the same scenarios as the experienced users; find out which pages they visit and what they like and don't like about the site.
4. Compare the results to find out whether the site revealed the same benefits to the new users as those you heard from the experienced users.

Observing experienced users will tell you which offerings people appreciate, while observing new users will tell you which valuable offerings people fail to spot.

Links:

  • The article Inherent Value Testing
  • The article Conducting Inherent Value Testing

Henrik Olsen - February 29, 2004



 

19

Listening labs vs. think aloud tests

If you have heard about Mark Hurst, you have probably also heard about listening labs. In his October 1, 2003 newsletter, Mark explains the how and why of the method.

Listening labs are Mark's version of the traditional think-aloud test. But instead of predefining tasks for users to carry out, you give them tasks on the fly based on what they want to do on the site. This way, your test will tell you not only whether users can do what you want them to, but also whether the product can do what users want to do.

Links:

  • The article Four Words to Improve User Research

Henrik Olsen - October 01, 2003



 

20

User research techniques in comic book form

Dan Willis has created a condensed overview of some of the core techniques used in information architecture. The descriptions are in comic book form and serve as entertaining reminders of some of our development options. Willis' one-pagers cover sitepath diagramming, topic mapping, free listing, card sorting, and personas.

Links:

  • IA Classics: Tools of the Trade in Comic Book Form

Henrik Olsen - April 28, 2003 - via Usability Views


See also: Site and flow diagramming (4)  Posters (5)  Card sorting (8)  Personas (13)  The design process (14) 

