
PREDICTING USABILITY PROBLEMS WITH USER GUIDES WITH EXPERT HEURISTIC EVALUATION

by Nicole Lampson




Institution: California State University – Northridge
Department: Psychology
Year: 2015
Keywords: user testing; Dissertations, Academic – CSUN – Psychology.
Posted: 02/05/2017
Record ID: 2134800
Full text PDF: http://hdl.handle.net/10211.3/145009


Abstract

Master of Arts in Psychology, Human Factors and Applied Psychology

This graduate project compares human factors methods for improving the effectiveness of a consumer user guide used to perform common automobile functions. The major methods applied during this process were: (1) Heuristic Evaluation (HE); (2) User Testing (UT); and (3) re-design of the user guide. Three scenarios were selected as the focus of the HE, UT1, and UT2. During the HE, human factors experts were asked to identify any usability problems in the user guide involving the selected scenarios. In UT1, representative users were asked to complete the scenarios while following the user guide, with performance measured by failures, reported frustration, and time to complete. The user guide was then re-designed, using human factors principles, to address all problems uncovered during HE and UT1, and was re-tested in UT2 with five different users to validate the design changes and compare performance against UT1.

The first hypothesis was that HE could identify at least 80% of the usability problems observed in UT1. The second hypothesis was that, if all usability problems (from both HE and UT1) were addressed in a re-design of the user guide, a re-test (UT2) would not introduce new usability problems and would take less time to complete the scenarios.

The first hypothesis was supported: 93% (10 of 11) of the usability problems observed during UT1 were discovered using HE, with the one unpredicted problem appearing in UT1 as reported frustration. The second hypothesis was not supported, as one new usability problem was introduced during UT2 (two failures on that step) and one problem uncovered earlier was not mitigated by the design change (three participants reported frustration). However, UT2 did show improved overall performance compared to UT1. In addition, addressing the four problems identified only by HE in the re-design did not lead to new usability problems for those steps in UT2. This shows that HE can be trusted to find the majority of usability problems with a system of this kind (one that requires no training to use), but re-designs should be verified with a second UT.

Advisors/Committee Members: Blake, Tyler (advisor); Quilici, Jill L. (committee member).