
The Australian Library Journal

Web usability testing in a polytechnic library

Debby R Wegener, May Goh-Ong Ai Moi, and Mae Lim Mei Li

Manuscript received September 2003


Introduction

Web usability testing. The experts say we should conduct it regularly. They say we do not need a working website as a paper prototype will work just as well, and they tell us we need only a handful of test subjects and a questionnaire to get useful results. Some experts even give us a choice as to the type of testing, for example, focus groups, surveys, and heuristic evaluation. These experts come from all walks of life, from the library world to the information technology domain, and they all agree on one thing. Web usability testing is absolutely vital to the success of your website. And we agree with them wholeheartedly.

We, the Temasek Polytechnic Library's web team, have recently redesigned our digital library portal (http://spark.tp.edu.sg) in conjunction with our first round of usability testing. When our new portal was being developed last year, we felt we knew what the patrons wanted (and needed), so we did not worry too much about the fact that we could not conduct usability tests. We made a concerted effort to put ourselves in our students' shoes and, after getting some valuable feedback from the non-library members of the portal team, we came up with a design.

It soon became obvious after its launch, however, that our patrons were having problems using the portal. We were receiving questions through our online feedback system that made us realise that a number of our patrons could not even find the link for the library catalogue. If they could not find a simple catalogue link, what else were they not able to find? It was becoming clear that the patrons needed to be involved in the development of 'their' website, but the question was how to do this.

In the literature

A select review of the literature pointed to usability testing as the latest way of getting a website up to scratch. Part of a larger process called usability engineering, which involves making products easy to use (Battleson, Booth and Weintrop, 2001), usability testing is generally understood to involve users performing a number of tasks on a prototype site. The testing itself, however, can be carried out using a range of different methods or techniques.

Hom (Guenther, 2003) and Pace (2003) both refer to a toolbox of various testing methods such as inspections, interviews and questionnaires. Regardless of the method used, the best practice would be to involve the users from the beginning. They are the people who will be using the website so their point of view is essential.

While the heart of usability testing involves ensuring patrons are able to use the website to do what they need to do, it is also important to note that this process is:

  • very, very important to the success of a website;
  • quick and easy to perform;
  • not a one-off task.

The value of usability testing to the success of a library website cannot be over-emphasised. Members of a web/design team can get so involved in the creation of the website that they can find themselves unable to see the wood for the trees (Gore and Hirsh, 2003). It is frighteningly easy to lose sight of the perspective of your patrons. Simply setting some tasks and watching how your patrons approach these tasks on the website can provide some startling insights into the way they work.

Fortunately, getting to know how your patrons work does not necessarily require a lot of time or money. Jakob Nielsen (1998), a well-known usability engineering expert, states that with a bit of experience you would need only five users and two days. Nielsen does maintain, however, that at least fifteen users are needed to uncover all of a design's usability problems (Battleson, Booth and Weintrop, 2001). Even so, he suggests testing with only five users at a time, distributing the budget across a number of small tests instead of spending it all on one.
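Nielsen's figures rest on a simple model of problem discovery, set out in his work with Tom Landauer rather than in the sources cited above: if each tester independently uncovers roughly the same fixed proportion L of a site's usability problems, then n testers between them find a share of 1 - (1 - L)^n. A minimal sketch in Python, assuming the average discovery rate of 31 per cent per tester that Nielsen and Landauer reported (an illustrative figure, not one measured for our site):

    # Share of usability problems found by n testers, assuming each
    # tester independently uncovers a fixed proportion L of them.
    # L = 0.31 is the average Nielsen and Landauer reported; it is an
    # illustrative assumption, not a figure measured for any one site.
    def problems_found(n, L=0.31):
        return 1 - (1 - L) ** n

    for n in (1, 5, 15):
        print(f"n={n}: {problems_found(n):.0%} of problems found")
    # n=1: 31%, n=5: 84%, n=15: 100% (rounded)

Under this model five testers already surface about 84 per cent of the problems, which is why several small rounds of testing can give better value than one large one.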

Making use of a number of smaller tests ties in with the idea that web usability testing should be an iterative process. As the world wide web is constantly growing and developing, a website that never changes will soon lose its appeal, so regular updates are essential. Running regular usability tests to poll patron opinion is a good way to ensure that these changes remain in line with patrons' needs. The bonus, of course, is that the more you test, the easier the process becomes.

Our methodology

Our first step was to redesign our digital library portal. We started with a high-content design template provided by Eccher (2002) and adapted it to include information about our library and our services. We wanted to give pride of place to our research gateway, a single access point to many of our resources, but we did not want this gateway to 'overshadow' all our other resources and services.

We found Johnson (2003) to be particularly helpful with his outline of the mistakes that people make with web design. He presents a very clear picture by showing what not to do. There are many rules and guidelines that can be followed when designing websites, but we found these the most important:

  • avoid library jargon at all costs: it is more incomprehensible to your patron than you may realise.
  • do not treat your website like a print resource. Patrons will not scroll down through reams of information. They scan web pages, so keep them short and sweet and avoid large chunks of text.
  • make sure your design looks good across most screen resolutions.

Once we had the new design prototype, the next step was to decide how best to involve our patrons. Our research suggested the most efficient way would be to observe them performing a number of set tasks on the new site. We already had a fair idea, from our online feedback system, of the problems our patrons were experiencing with the new portal, so we used their feedback as the basis for a set of task questions and interview questions. Our intention was that, while we watched, the patrons would work through the tasks and then answer the interview questions. We tested both questionnaires on two very helpful library staff members, made a few adjustments based on their feedback, and were ready to begin the actual testing.

We decided to engage fifteen students and fifteen academic and administrative staff, as we felt this would give us a good cross-section of our users. We also included fifteen library staff on the basis that, because they work constantly with the patrons, they should be able to provide us with some valuable insights. We were extremely fortunate with our selected respondents, as we had no problems persuading them to participate in our testing. Apart from a new and improved portal, all we offered as an incentive was candy, and only a handful of people said they were too busy to spend time with us. We selected our respondents randomly from those students and staff already in the library, as this was the quickest and easiest way for us. We did not expect our results to be greatly skewed by this as we were, at this stage, mainly interested in those who use the library anyway. We managed to get full-time and part-time students in their first, second and third years of study. We also found staff from all the academic departments of the polytechnic.

We conducted the tests using a laptop with the prototype website on CD-ROM. We also made the existing website available so that, by toggling between the old and new versions, respondents were reminded of what they had been using. We were fortunate in that most of the testing could take place during the term break, so there were not many patrons around for us to disturb.

Looking at the results

The testing was spread out over fifteen days, so as not to get in the way of our other duties, and the average time spent by each respondent was only fifteen minutes. Even though the shortest time taken was a mere five minutes, the results were all extremely useful. In some cases, the results were even surprising. Our biggest surprise came with our choice of labels. Although we are often told not to use jargon, librarians still have the tendency to do this, and we were no exception. We had chosen the name 'InfoWise', in the belief that our patrons would easily identify the label as belonging to our quarterly library newsletter. During the testing, however, we found this label meant very little to some respondents. This meant that a very useful system for disseminating valuable library information was going to waste.

Another very useful system, one for publishing the latest library announcements, had been designed for us during our digital library portal project. We had named this system 'e-Bulletin' and confidently assumed that everyone would realise its purpose. We soon discovered during the testing that this system was being passed over. The respondents either did not know what it was, or thought it was a bulletin board on which to post messages. Even some of our library staff were having problems in this area. Yet another example involving library jargon was 'Ask a librarian'. Many library websites include this link as a place for patrons to contact the library staff to ask questions, complain, or provide feedback. Some respondents were familiar with such a link and had used it, for example, to find out when the library closed during the vacation. As it turned out, however, many of our patrons considered this a place to ask research questions only, so we were missing out on very valuable patron feedback.

An even more obscure example of patrons not knowing what we were talking about involved OPAC. Before the launch of our portal, we had links to 'Web OPAC' on our site to distinguish it from the Windows version of our library catalogue. Our new portal was implemented with a new library system and only one version of OPAC, so we changed the link to read OPAC (instead of Web OPAC). As many of our patrons refer to OPAC as 'oh-pee-ay-cee' instead of 'oh-pack', removing the 'Web' from 'Web OPAC' proved too confusing for them.

Other findings revealed that more than half of our student respondents had either never, or hardly ever, used our portal despite being regular library users. One of the reasons given for this was that navigation was too difficult. The students had to be taught how to use the site, and many did not have the time or the patience to seek this kind of help.

We also found it extremely interesting to watch how the respondents navigated the prototype website. Some concentrated on the information on the sides of the screen only. They would move the cursor rapidly from one side of the screen to the other, seemingly unable to see the images and text in the middle. One respondent even suggested we move all the important information to the sides. Conversely, other respondents seemed unaware of the information on the sides and concentrated instead on the middle of the screen. Two more respondents moved off the first screen with their first click and never returned. Although we were actually testing navigation of the front page, we allowed these respondents to continue as a matter of interest.

While some of the comments received were beyond the scope of the project, for example, translating the site into Chinese, most of the comments revealed exactly where improvements were needed. On the prototype site our font turned out to be too small for some to read easily, our links were too similar in colour to the text, and in some instances it was difficult to see where one link ended and the next began. In the interests of fitting everything on one page and making all the colours match, we had overlooked these points and actually made the page more difficult to use.

What pleased us the most was that approximately ninety-five per cent of the respondents were excited about the new design and layout. Some of the respondents were extremely appreciative, and went as far as thanking us for the effort we put into the new design. Some even asked us when the new design would be ready, as they could hardly wait to use it.

Our conclusions

Moving from the prototype design to the finished product did not involve nearly as much work as we had expected. Listening carefully to the feedback and complaints of our patrons helped us a lot in this regard, by giving us a good idea of their needs and expectations. Listening to the experts on web design from outside of the library profession was also helpful. And, of course, web usability testing helped us tie it all together.

We found this type of testing to be the quickest way of getting patron opinion on our website, as one round of testing highlighted points that might have taken us months to discover. And it did not cost us much in terms of time and money either. To put it very simply: if you need to know how people do things, watch them. If you need to know what they think, ask them. Web usability testing lets you do both, with a minimum of fuss.

[Figure: the portal before the testing]

[Figure: the portal after the testing]

Test questions:

  1. Where do you go to renew your books online?
  2. How do you find out how many books you may borrow?
  3. If you want to find the latest in library news and announcements, where do you go?
  4. Can you book a study room using the portal?
  5. Where would you go to send an e-mail to complain about how noisy the library is?
  6. How would you find out if the library had the book called Pioneers of Singapore?
  7. What time does the library close during the vacation period?
  8. Can you easily find the information you need? (on a scale of 1 to 5 with 1 being 'very difficult' and 5 being 'very easy')
  9. How would you rate the overall design in terms of font, colour and layout? (on a scale of 1 to 5 with 1 being 'Yuks! very poor' and 5 being 'excellent')

Interview questions:

  1. Do you actually use the current library portal? If 'yes', how often, and if 'no' why not?
  2. What would you like to see in the Lifestyles section?
  3. Is there anything you really like about this website?
  4. Do you like the way this site is organised (in terms of headings, categories etc)?

References

Battleson, Brenda, Austin Booth and Jane Weintrop (2001). 'Usability testing of an academic library website: a case study' Journal of Academic Librarianship, 27 [3] p188, viewed 14 August 2003, EBSCOhost database Academic Search Premier.

Crum, Janet (2003). 'A tale of two needs', Computers in Libraries, 23 [1] p22, viewed 3 May 2003, EBSCOhost database Academic Search Premier.

Eccher, Clint (2002). Professional web design: techniques and templates. Charles River Media: Hingham, Massachusetts.

Gore, Pamela and Sandra G Hirsh (2003). 'Planning your way to a more usable web site', Online 27 [3] p20, viewed 3 May 2003, EBSCOhost database Academic Search Premier.

Guenther, Kim (2003). 'Assessing web site usability', Online 27 [2] p65, viewed 3 May 2003, EBSCOhost database Academic Search Premier.

Nielsen, Jakob (1998). Jakob Nielsen's Alertbox for May 3, 1998: cost of user testing a website. [Online] http://www.useit.com/alertbox/980503.html

Pace, Andrew K (2003). 'The usability toolbox', Computers in Libraries, 23 [1] p50, viewed 5 May 2003, EBSCOhost database Academic Search Premier.

Useful references on website design

Carpenter, Beth (2002). 'Avoiding the pitfalls of web design', Computers in Libraries, 22 [10] p40, viewed 5 May 2003, EBSCOhost database Academic Search Premier.

Ensor, Pat (2000). 'What's wrong with cool?', Library Journal, 125 [7] p11, viewed 3 May 2003, EBSCOhost database Academic Search Premier.

Hohmann, Laura Kaspari (2001). 'Prescriptions for usable library web sites', Online 25 [4] p54, viewed 5 May 2003, EBSCOhost database Academic Search Premier.

Johnson, Jeff (2003). Web bloopers: 60 common web design mistakes and how to avoid them. Morgan Kaufmann: San Francisco, CA.

Keiser, Barbie (2002). 'Enhancing the value of your website', Online 26 [1] p54, viewed 5 May 2003, EBSCOhost database Academic Search Premier.


Biographical information

Debby R Wegener BA, HigherDip(LibSci), MAppSc(InfoStudies) is originally from Zimbabwe. Currently employed at Temasek Polytechnic in Singapore as a reference librarian, she is also leader of the Library's Webmaster and User Education team.

Goh-Ong Ai Moi, May Dip(LibStudies) is an assistant information officer at Temasek Polytechnic, Singapore. Appointed to the Digital Media Repository implementation committee, she develops and publishes the theme-based content packages. She is also one of the Temasek Polytechnic Library webmasters.

Lim Mei Li, Mae BA(Arts and Social Science), MSc(Information Studies) is a reference librarian at Temasek Polytechnic, Singapore. She is one of the webmasters taking care of the Digital Library portal. She also leads a team of staff in promotion and publicity, as well as being the editor of the library newsletter.

