Australian Library and Information Association

The Australian Library Journal

Graduate information literacy skills: the 2003 ANU skills audit

Valerie Perrett

Manuscript received January 2004


The audit

In 2003 the Graduate Information Literacy Program (GILP) at the Australian National University (ANU) introduced an online information literacy skills audit, available to all students enrolled in PhD and MPhil degrees. It was introduced for three reasons. First, we wanted to be able to give students advice about the training courses they should attend: after completing the audit, each student was sent a training needs profile recommending particular GILP courses. Second, staff teaching in the GILP program would gain a better understanding of students' skill levels, and therefore of which courses needed to be run and how often. Third, it was envisaged that the audit could provide a measure of the effectiveness of the GILP program through pre-testing and post-testing of students, linked to the information literacy training they undertook. By the end of 2003 discussions had taken place to enable this in the future.

The audit was developed using Apollo, poll software developed at ANU. A link was provided from the GILP homepage and access was restricted to research students, for two reasons: the poll software required individual responses to be sent to students, so open access would have created an unmanageable workload for GILP staff; and the focus was on the information literacy skills vital to completing a research degree. The skills tested included information searching, information management and advanced word processing.

Students were not individually scored but sectional scores were given, accompanied by advice regarding training on the following bases:

  • A strong recommendation for training was given if the student answered correctly one third or less of the questions in that section.
  • A recommendation for training was given if the student answered correctly more than one third but no more than two thirds of the questions in that section.
  • Where GILP courses extended over more than one session, recommendations could be made as to which session the student needed to attend.
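The recommendation logic above can be sketched as a small function. This is a hypothetical illustration only: the audit itself was built in the Apollo poll software, and the handling of the boundary between the two thresholds is an assumption here.

```python
def training_recommendation(correct: int, total: int) -> str:
    """Map a section score to the audit's training advice.

    Assumed thresholds, following the audit's stated rules:
    one third or less correct earns a strong recommendation,
    up to two thirds earns a recommendation, and above two
    thirds no training is recommended.
    """
    fraction = correct / total
    if fraction <= 1 / 3:
        return "strongly recommended"
    if fraction <= 2 / 3:
        return "recommended"
    return "no training needed"

# Example: two of six database questions correct
print(training_recommendation(2, 6))  # strongly recommended
```

A student answering four of six would fall in the middle band and receive an ordinary recommendation.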

During 2003, 132 graduate students completed the audit. Ethics approval was needed for this research, and 107 students gave permission for their responses to be used.1

Their skills

Information searching skills

Section 2 of the skills audit asked students to self-rate their ability to find information. Fifteen per cent rated their skills as excellent, fifty-three per cent as good, thirty per cent as satisfactory and two per cent as generally poor. Sections 3 and 4 then tested students' actual ability to search for information using online databases and the web. Each section contained six questions. The first question in each was compulsory, asking students whether they had any experience with database or web searching; if they responded 'No', they went on to the next section. The skills tested were their understanding of Boolean operator searching, their ability to use truncation, and their understanding of the web and its pitfalls. As we go through this section some of the actual questions used will be introduced.

Looking first at the combined database searching and web searching skills, the graduate students scored an average of seven out of a possible maximum of twelve. Scores ranged from one to twelve, with a median of eight.


Self assessment of information searching skills and actual information searching skills

Score   Excellent   Good   Satisfactory   Poor   (columns show self-assessed level)
 12         1         2         -          -
 11         -         1         -          -
 10         2        11         -          -
  9         2        10         4          -
  8         4         6         8          -
  7         3         8         1          -
  6         2         6         9          1
  5         2         5         4          1
  4         -         4         1          -
  3         -         1         3          -
  2         -         3         1          -
  1         -         -         1          -

  • Scores 10-12 (no further training needed): 17 students knew it (self-assessed excellent or good); 0 students may have thought training was needed (self-assessed satisfactory or poor).
  • Scores 6-9 (further training needed): 41 students may not have known it; 23 students knew it.
  • Scores 1-5 (further training needed): 15 students did not know it; 11 students knew it.

Figure 1

Did the students correctly assess their own skills? Figure 1 shows that fifty-one students assessed their skill level correctly. The other fifty-six overestimated their skills; in the case of information searching, no students underestimated them. In terms of how this may have influenced their likelihood of seeking training, Figure 1 shows that fifty-six students may not have thought they needed training when they did, and fifteen of those were seriously in need of additional training if they were to search for information effectively. The graduate students scored better at database searching than at web searching. Interestingly, however, nine students had never searched databases, while only two had no previous web searching experience.

Database searching skills

The average score for database searching skills was 3.75 out of a maximum possible score of six. Scores ranged from zero (nine students) to six (twenty-one students). The median score was five.

Did the graduate students correctly assess their database searching skills?


Self assessment of information searching skills and actual database searching skills

Score   Excellent   Good   Satisfactory   Poor   (columns show self-assessed level)
  6         3        14         4          -
  5         6        11         5          -
  4         3        12         5          1
  3         2         7         8          1
  2         2         6         6          1
  1         -         2         -          -
  0         -         5         4          -

  • Scores 5-6 (no further training needed): 34 students knew it; 9 students may have thought training was needed.
  • Scores 3-4 (further training needed): 24 students may not have known it; 14 students knew it.
  • Scores 0-2 (further training needed): 15 students did not know it; 11 students knew it.

Figure 2

Fifty-nine of the 107 students correctly assessed their database searching skills; nine underestimated their skills and thirty-nine overestimated them. Sixty-four students (fifty-nine per cent) were recommended to undertake additional training, and twenty-six of these were strongly recommended to do so.

What did the students know about database searching?

Nine students (eight per cent) had not previously searched databases: three had done their undergraduate degrees in Australia, five were Asian-educated and one was educated in North America. Questions 14 and 15 tested students' understanding of Boolean operator searching. Fifty-four students (50.5 per cent) correctly answered question 14 and fifty students (46.7 per cent) correctly answered question 15; overall, 48.6 per cent of the graduate students had a firm understanding of Boolean operator searching. Questions 16 and 17 tested students' understanding of truncation and wild-card use. Question 16 tested skill in using truncation: seventy-one students (66.3 per cent) answered correctly. Question 17 tested understanding of the reason for using the wild-card feature: sixty-one students (fifty-seven per cent) understood the use of the wild-card symbol.
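To make the skills being tested concrete, the following sketch shows how truncation, a single-character wild card and Boolean operators behave when matching titles. It is illustrative only: real database syntax varies by vendor, and the titles and helper functions here are invented for the example.

```python
import re

def term_to_regex(term: str) -> str:
    """Convert a database-style search term to a regular expression:
    '*' truncates (matches any ending), '?' is a single-character
    wild card, and everything else is matched literally."""
    pattern = ""
    for ch in term:
        if ch == "*":
            pattern += r"\w*"
        elif ch == "?":
            pattern += r"\w"
        else:
            pattern += re.escape(ch)
    return r"\b" + pattern + r"\b"

def matches(title: str, query_terms: list, operator: str = "AND") -> bool:
    """Boolean matching: AND requires every term, OR requires any."""
    hits = [re.search(term_to_regex(t), title, re.IGNORECASE) is not None
            for t in query_terms]
    return all(hits) if operator == "AND" else any(hits)

titles = ["Women in librarianship", "Library management", "Tourism economics"]
# Truncation: 'librar*' matches library, libraries, librarianship ...
print([t for t in titles if matches(t, ["librar*"])])
# Boolean AND with a wild card: both terms must appear
print([t for t in titles if matches(t, ["wom?n", "librar*"], "AND")])
```

The first search retrieves both library-related titles; the second retrieves only the title containing both a `wom?n` variant and a `librar*` variant.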

Question 18 asked students whether they could link back to library journal holdings from some databases.2 Seventy-one students (sixty-six per cent) were aware of this capability. As this feature is common in Australian university libraries, it might be thought that the students who were unaware of it would be those educated overseas. However, of the students who had previous database experience but did not answer this question correctly, nineteen were educated in Australia and eight overseas.

Section 1 of the survey asked students for demographic information to establish if there was any link between particular demographic features and database searching skills.

In Figure 3 we see some variation in skills in relation to faculties.


Figure 3

On the age variable, the group with the weakest database searching skills was students aged between 26 and 34, whose average score was 2.43. This compares with an average score of 4.04 for the under-25 group and 3.97 for the 35-and-over group. Gender differences in database searching skills were insignificant: the sixty-six female students had an average score of 3.78 and the forty-one male students an average of 3.56.

Students who had done their undergraduate degrees outside Australia had, on average, much poorer database searching skills than those educated in Australia: the overseas-educated students (forty-two) averaged 2.90 against 4.26 for the Australian-educated students (sixty-five). Broken down regionally, students educated in South Asia had the lowest skill level, with an average score of only 1.36. Students educated in the New Zealand and Pacific region averaged 4.5, North and South America 3.6, Europe and the UK 3.5, and East Asia, the Middle East and Africa 3. The weak database searching skills of the overseas-educated students point to the need to provide training for these students as soon as possible after arrival in Australia.

Analysis of the database searching skills provided evidence that was useful in planning training. It emerged that students did not really understand Boolean operators and their role in searching, and it was surprising that one third of the students did not know how to link from databases to library journal holdings; this is now stressed in training courses. We concluded that no additional database searching courses needed to be run.

Graduate students and web searching skills

The average score for web searching skills was 3.3 out of a maximum possible score of six. Scores ranged from zero (two students, both of whom had completed their undergraduate education in Australia) to six (four students, three of whom had completed their undergraduate education in Australia). The median score was three.

Did the graduate students correctly assess their web searching skills?


Self assessment of information searching skills and actual web searching skills

Score   Excellent   Good   Satisfactory   Poor   (columns show self-assessed level)
  6         1         2         1          -
  5         3        10         3          -
  4         4        19         5          -
  3         4        14        10          2
  2         3        11         9          -
  1         -         1         3          -
  0         1         -         1          -

  • Scores 5-6 (no further training needed): 16 students knew it; 4 students may have thought training was needed.
  • Scores 3-4 (further training needed): 41 students may not have known it; 17 students knew it.
  • Scores 0-2 (further training needed): 16 students did not know it; 13 students knew it.

Figure 4

Fifty-seven of the 107 students overestimated their web searching skills and may not have been aware of their need for additional training; only four underestimated their skills. Eighty-seven students (81.3 per cent) were recommended to do additional training, and of that number twenty-nine were strongly recommended to do so.

What did the graduate students know about web searching?

Only two students had no previous web searching experience. Question 20 tested students' understanding of whether a search engine searches the entire web: forty-one students (38.3 per cent) understood that a typical search engine does not. Question 21 tested knowledge of advanced search features, asking how a search might be restricted to websites from a particular country. Seventy-two students (67.3 per cent) answered correctly; twenty-four thought it would be necessary to look through a set of search results to determine their country domain.

Question 22 asked the students how the search engine Google would interpret the following search. The question read:

Using Google you enter the following search:
Tourism Bali Indonesia

Google will look for:
  • These words next to each other (a phrase search)
  • These words separated by OR (tourism or Bali or Indonesia)
  • These words separated by AND (tourism and Bali and Indonesia)
  • Will respond that your search strategy is invalid

This was a basic question about web searching: sixty students (fifty-six per cent) selected the correct answer, that Google implies an AND between the terms. Fourteen thought Google would interpret this as a phrase search and twenty-seven thought Google would imply an OR between the search terms.
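The implied AND can be illustrated in a few lines of code. This is a toy model, not how Google actually indexes or ranks; simple substring matching over invented page texts stands in for full-text searching.

```python
def implied_and_search(query: str, pages: list) -> list:
    """Mimic a search engine's implied AND: a page matches only if
    every query word appears somewhere in its text (case-insensitive)."""
    words = query.lower().split()
    return [p for p in pages if all(w in p.lower() for w in words)]

pages = [
    "Bali travel guide: tourism in Indonesia",
    "Tourism statistics for Indonesia",
    "Bali surfing spots",
]
# Only the first page contains all three terms
print(implied_and_search("Tourism Bali Indonesia", pages))
```

A phrase search or an implied OR would return a different result set, which is why the distinction matters when narrowing a search.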

Question 23 asked students what the safe search filter can be used for. The responses are worth looking at in more detail. The question read:

A safe search filter will:
  • Stop the search from stalling if there are more than 100 000 results
  • Help prevent pornographic sites from being retrieved
  • Prevent your institution from being able to record the websites you have been viewing
  • Limit your search results to reliable websites

Thirty-two students (29.9 per cent) knew that the role of the safe search filter is to help prevent pornographic websites from being retrieved. While this is disturbing, it is perhaps more alarming that forty-seven students (44.8 per cent) thought the safe search filter could be used to limit search results to reliable websites!

Question 24 tested students' knowledge of ways to limit a search so as to eliminate unreliable sites. Forty-seven students (44.8 per cent) selected the correct answer; forty-three thought that the accuracy of grammar and spelling would not help in limiting results to reliable sites.

In Figure 5 we can see that there is some relationship between web searching skills and area of study. Science students slightly outscore Fine Arts and Social Science students.


Figure 5

Comparing database and web skills with subject of study it is interesting to see that Humanities students have the biggest difference between database and web searching scores with a difference of 0.65. Science students had the least variation with only a 0.26 difference.

Students 25 years and younger had the highest average web searching score, at four. As with database searching skills, students aged between 26 and 34 had the weakest web searching skills, with an average of 3.05; the 35-and-over group averaged 3.29. Both females and males had an average score of 3.3.

Australian-educated students had a higher average web searching score (3.55) than overseas-educated students (2.97). As with database searching scores, New Zealand and Pacific-educated students outscored the Australians, with an average of 3.75. UK and European-educated students averaged 3.5, North and South American-educated students 3.3, East Asian-educated students 3.2, Middle East and African-educated students 2.5, and South Asian-educated students 1.81. This confirms a real need to provide training for overseas students as soon as they arrive; otherwise they will be disadvantaged in their studies.

Did the web searching results come as a surprise?

Web searching classes in 2002 were not well attended, and we inferred that students must be competent web searchers. Attendance increased significantly following the introduction of the skills audit, so it would seem that students too were surprised at their scores and wanted to improve their skills. The content of the web searching course was changed slightly after reviewing the audit results: metasearch engine searching was introduced and more time was spent explaining such things as safe search filters and techniques for evaluating web resources. The emphasis on search techniques remained unchanged.

Graduate students and information management skills

Section 5 tested students' skill in using one of two information management software programs, EndNote or BibTeX: ANU has a site licence for EndNote, and BibTeX is available to ANU students who use LaTeX as their word processing software. Implicit in testing these skills was the assumption that competent use of such software is a prerequisite for research students.

In the self-assessment section of the audit, students were asked to assess their ability to manage information sources, keep references and store data. This was considerably broader than the skills tested in Section 5. However, only 73 of the 107 students who completed the audit were confident of their ability to manage information, either electronically or otherwise.


Self assessment of information management skills and actual information management skills

Score   Excellent   Good   Satisfactory   Poor   (columns show self-assessed level)
  6         -         3         4          -
  5         1         4         8          3
  4         1         8         3          5
  3         2         5         5          1
  2         1         1         1          -
  1         -         -         1          -
  0         1        11        24         14

  • Scores 5-6 (no further training needed): 8 students knew it; 15 students may have thought training was needed.
  • Scores 3-4 (further training needed): 16 students may not have known it; 14 students knew it.
  • Scores 0-2 (further training needed): 14 students did not know it; 40 students knew it.

Figure 6

Only twenty-three students were exempt from a recommendation to do EndNote or BibTeX training: not surprisingly, research students came to ANU with poorly developed or no EndNote or BibTeX skills, and EndNote training courses are in heavy demand, fully booked weeks ahead. The average score for information management skills was 2.2. Fifty students (46.7 per cent) scored zero, that is, they had no previous experience in using information management software; only seven students (6.5 per cent) scored full marks. Among the students with previous experience it became clear that, while many had some understanding of the software, many did not know how to use it to interact with databases and import references into EndNote libraries.

Questions 26 to 28 tested skills in using EndNote to create bibliographic references, use styles, and manage the interaction between EndNote and Microsoft Word. Of the fifty-two students who had previous experience with EndNote, eighty-nine per cent answered these questions correctly.

Questions 29 and 30 tested students' ability to use EndNote to search databases and to export references from databases into EndNote libraries. Students who had previously used EndNote did less well here: only thirty-seven per cent answered these questions correctly. As a consequence, many students were recommended to attend only the EndNote Module B course, which focuses on importing database references into EndNote libraries.

Questions 31 to 35 were taken by the five students who had previous BibTeX experience.

In Figure 7 we see that there is a wide variation in information management skills depending on area of study.

Figure 7

Information management skills varied with age: students aged 25 and under averaged 3.4, those between 26 and 34 only 1.64, and those 35 and over 2.32. There was a marked difference between females and males, with average scores of 2.43 and 1.83 respectively. Australian-educated students had a slightly higher average (2.21) than overseas-educated students (2.1). Of the overseas-educated students, those educated in North and South America had the lowest average score, at 0.5, while students from both the New Zealand/Pacific region and Europe averaged above the Australian-educated students, with 3.75 and 2.78 respectively.

Graduate students and word processing skills

Section 6 tested students' ability to use word processing packages to the extent needed to produce a thesis of around 100 000 words. As the ANU provides access to Microsoft Word and LaTeX, these were the packages on which skills were tested. Students had an opportunity to indicate skills in other packages, but none did. All but one of the 107 students had previous word processing experience, but few had sufficient skill to produce a long document such as a thesis efficiently.

Were the students able to assess their word processing skills?


Self assessed skills and actual word processing skills

Score   Excellent   Good   Satisfactory   Poor   (columns show self-assessed level)
  6         2         -         -          -
  5         3         6         6          -
  4         5        14        11          1
  3         6        10         7          2
  2         3        14         8          1
  1         -         2         3          2
  0         -         1         -          -

  • Scores 5-6 (no further training needed): 11 students knew it; 6 students may have thought training was needed.
  • Scores 3-4 (further training needed): 35 students may not have known it; 21 students knew it.
  • Scores 0-2 (further training needed): 20 students did not know it; 14 students knew it.

Figure 8

We can see that the majority could not assess their skills accurately: fifty-six students overestimated their skills and six underestimated them. Only seventeen students were not given a recommendation to attend one or more Word training sessions.

Questions 38 and 39 tested students' knowledge of basic Word formatting, relating to skills taught in the 'Producing theses in Word module A' course: seventy-three per cent of students answered question 38 correctly and sixty per cent question 39. Question 40 asked about custom heading styles, and at this point forty-eight per cent of students could answer correctly. Question 41 tested skill with templates, which is very important if students are to produce a consistently formatted long document, especially one that results from merging a number of separate documents: eighteen per cent of students selected the right answer. Question 42 concerned producing a table of contents, which all students must produce for their thesis: seventeen per cent knew that a table of contents can be generated from styles and table entry fields.

Questions 43 to 47 tested the skills of students who were going to produce their thesis using LaTeX, a package favoured by physical science students for its superior handling of symbols. The average score of the seven students who answered the LaTeX questions was 3.86.
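For the LaTeX users among the students, the automatically generated table of contents that question 42 probed for in Word has a direct equivalent: LaTeX builds one from its sectioning commands, much as Word builds one from heading styles. A minimal sketch of a thesis skeleton (illustrative only; chapter titles are invented):

```latex
\documentclass[12pt]{report}  % 'report' provides \chapter, suiting a thesis

\begin{document}

\tableofcontents  % generated automatically from the sectioning commands below

\chapter{Introduction}
\section{Background}
Body text \ldots

\chapter{Literature review}
Body text \ldots

\end{document}
```

Because the contents page is derived from the document structure rather than typed by hand, it stays correct as chapters are added, merged or reordered.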

In Figure 9 we can see that there was some variation in word processing skills depending on the subject of study, with business students having the best scores.

Figure 9

The word processing skills of Australian- and overseas-educated students were very similar: Australian-educated students averaged 3.20 while their overseas colleagues averaged slightly higher at 3.26, and in both groups the median score was four. Within the overseas-educated group, students from the New Zealand and Pacific region averaged 4.25, and students educated in North and South America also scored well above the average, at 3.66. Students from East Asia scored least well, averaging 2.6, while South Asian-educated students scored closer to the average, with 3.09. Age had very little impact on word processing scores: the averages were 3.27 for the 25-and-under group, 3.2 for the 26-34 group and 3.28 for the 35-and-over group. There was also very little difference between females (3.24) and males (3.19).

Overall, the word processing results did not come as a surprise to us, but many students must have been dismayed by their scores and the training recommendations they were given. We were now able to recommend a particular training module. While this was good for the students, it created some problems for GILP staff, who found that demand for this module was excessive: the practice hitherto had been to offer modules as a series, which distributed demand. Inevitably we also had students attending who did not have the same skill base as those who had attended earlier modules. Obviously the audit could not test skill in everything taught in a session, and this was showing up. We decided to continue offering specific module recommendations, and considered extending the number of questions in this part of the audit. However, the audit had been designed to be completed in ten to fifteen minutes, and on average students were already taking the full fifteen minutes to complete it.

The literature review question was one of four questions in Section 7 of the audit; it attempted to test students' understanding of the nature of a literature review. It was pleasing to see that 79.4 per cent of the graduate students answered it correctly.

Skills audit and GILP feedback and the identification of additional training requirements

Section 8 of the skills audit gave students the opportunity to tell us about training they thought they needed but that we were not currently offering: this was an ideal opportunity to gather information to help in future course development. The three most popular choices were Photoshop (five students), GIS mapping (four students) and Illustrator (three students); in all, thirty-two responses were received. The second question in Section 8 gave students the opportunity to provide feedback on the audit, which was positive overall. Five students said the survey was useful and/or a good survey; comments included 'Having a survey was a great idea' and 'Good survey', some told us what they found valuable about it, and (perhaps the most interesting) one wrote 'the survey revealed the true extent of my ignorance'. This linked back to one of our main objectives in developing the audit, as we believed that the students did not know what they did not know. Six students commented that there needed to be more 'don't know' options. As most questions were not compulsory, this came as a surprise. What we considered doing was making the message about mandatory questions stand out more, pointing out that unless a question is marked with an asterisk it does not have to be answered. Most students did simply leave blank the questions they could not answer, and this is what was wanted, since we were looking to provide a guideline to their training needs.

It was no surprise that graduate students do not invariably arrive at ANU information literate. The skills audit is one step in the process of ensuring that they acquire these skills early in their graduate careers, by giving them some understanding of their current skills and more focussed advice on the training available to them.

Endnotes

1 The ANU Human Research Ethics Committee gave approval on 17 October 2003.

2 Most ANU databases provide this capability.


Biographical information

Valerie Perrett is outreach/instructional librarian, Hamilton Library, University of Hawaii at Manoa, 2550 McCarthy Mall, Honolulu, Hawaii 96822 USA; e-mail: vperrett@hawaii.edu. Until very recently Valerie was information literacy co-ordinator (graduate students) at the Australian National University. She would like to acknowledge the following people, who in their various ways contributed to the skills audit: Connie Leikas, David Young, David Thompson, Karen Visser, Brenda Chawner and Roy Perrett.

