Australian Library and Information Association
 

ALIA Information Literacy Forum

Debate topics: August 2002


I've been lurking in Canada on your interesting discussion. In many ways I feel Canadian colleges and universities are behind in integrating IL into the curriculum.

I've been experimenting with portfolio assessment for IL with the instructor of an introductory environmental studies course (who sadly isn't teaching it anymore). We ran this for three terms, with an 'information literacy portfolio' worth between 20 and 30 per cent of the course.

It wasn't close to a great example of portfolio assessment, but I think as an authentic assessment tool, portfolio assessment has got some great potential for IL.

I'd be really interested in hearing if anyone on the list has tried portfolio assessment.

More information about our efforts can be found at http://io.uwinnipeg.ca/~khunt2/cv.html (click on Portfolio Method of Instructional Evaluation). And hey, I'm always available to come over and discuss f2f :)

Cheers!

Karen Hunt
Information Literacy co-ordinator
University of Winnipeg Library
515 Portage Avenue
Winnipeg, Manitoba

Hi,

Another poster asked how evaluation of IL is conducted in special libraries.

I'm currently undertaking a Master of Arts research project in the area of information literacy instruction for journalists. Some issues that have arisen as a result of this research:

  • Unlike academic libraries, special libraries don't have any criteria or curricula for IL.
     
  • Special librarians do not have time to plan and conduct training. Many libraries have fewer than four or five staff, so it is very difficult to designate a staff member to work on training alone.
     
  • Training for journalists has been somewhat lacking in the past. While journalists often learn significant aspects of their work on the job, such as learning the individual style guide requirements of each publication, there is little in the way of formal training in most news organisations.
     
  • Many journalists want training, yet lack motivation to attend in-house sessions. They prefer external training but their employers often won't give them time away from work for it. This isn't a huge motivator for librarians either if they are seen as 'second-best'.
     
  • Special libraries have to overcome the impression that they are there to solve all of the client's problems. Historically, librarians have performed most search requests for clients. The introduction of IL instruction means a shift in responsibility for searching to the client. This is causing a lot of tension in libraries across a wide range of sectors as librarians and clients adjust to these new roles.
     
  • Training budgets are very susceptible to cuts.
     

Therefore, the problem isn't so much one of evaluation as one of having the time and resources to create structured IL programs in the first place. Many news organisations are planning and conducting training in a variety of information skill areas, but tend to do it on a one-on-one basis, answering what the journalist wants to know.

Other things that may be more relevant to special rather than academic libraries:

  • Motivation is critical in the workplace. In an academic library, IL instruction may be required. Motivation isn't really an issue. In the workplace, if a learner does not see any benefit to the training, they simply won't go.
  • Budget is also very important. Firstly, because training budgets are often very small to begin with, and secondly because managers want to see fiscal outcomes from training.

Therefore, any evaluation of IL in special libraries will have to focus on finding outcomes that are measurable in fiscal terms. For a news organisation, this could mean a reduction in staff turnover, fewer corrections and retractions, and decreased costs associated with more productive use of databases.

Fiona Bradley
MA Student
School of Media and Information
Curtin University

Dear Karen and INFOLITers,

I'm a big fan of portfolio assessment. I'm doing a research project at the moment looking at undergraduates' ways of experiencing information literacy in the context of an environmental science course. A Learning Portfolio was a major piece of assessment in the course and included:

  • tutorial preparation exercises
  • reflections on learning from lectures
  • reflections on learning from tutorials
  • field trip reflections
  • essay draft
  • completed essay
  • webography (critical analysis of five websites on topic for tutorial debate)
  • annotated bibliography on essay topic

What was great about the portfolio was that information literacy was fully embedded into it, rather than treated as a separate element.

Mandy Lupton
ANU

Hi Karen! It's great to have you virtually here! How's things in W'peg?
Getting cooler I'm guessing...

Re: Portfolio assessment. There are a few academics who are using it here, but not so much with IL yet, I don't think. The two common difficulties with portfolio assessment which I hear repeated regularly seem to be the time involved in providing formative feedback to students and the guidance required for students to complete the portfolio - ie: it doesn't fit with the usual assignment regime they are used to, so they feel concerned that what they are doing is not 'right'. It's a worry that this kind of feedback might put people off using it as an assessment method - we really need to be collecting anecdotal evidence which shows how powerful it can be...

As students and staff become more familiar with the process, I can see that perhaps its use will increase. It also ties in closely with broader graduate attribute initiatives, like student capability portfolios. When they take off, so will portfolio assessment to be sure. I hope so anyway.

Christine Bruce has used portfolios quite a bit - and particularly with regard to IL (no surprise there!). Perhaps she could provide some insight/ideas or just share with you some things to be avoided at all costs? Her e-mail is c.bruce@qut.edu.au.

Mandy, let us know how that portfolio assessment @ ANU went in terms of student performance and engagement?

Judy Peacock

Hello from New Zealand where I have been very interested to follow this discussion. I took a good look at your portfolio assessment, Karen, thanks for contributing this.

At Lincoln University we have been teaching IL as an assessed part of the curriculum for more than a decade now (makes me feel so old!) so have had plenty of time to experiment with assessment.

We teach an IL module as part of a generic first-year subject called 'Professional Studies' (I take no responsibility for the name) along with communication and computing modules. The subject is required for a number of undergraduate degrees and is tailored for those degrees. We also teach the module in a more integrated way as a four-week block embedded within other core subjects. We are currently lobbying to teach in this embedded and assessed way in a number of blocks through a student's university career rather than the first year only, but there are still plenty of obstacles to this.

Readers of this list might like to take a look at our project at
http://learn.lincoln.ac.nz/comn103/Infoproject/project.htm

which has a number of links including one to the marking schedule at
http://learn.lincoln.ac.nz/comn103/Infoproject/marksched.htm

We have mounted on the Web a sample project to give the students an idea about what is expected - this might make our assessment process clearer to you - you can follow the link or find it at http://learn.lincoln.ac.nz/comn103/Infoproject/sample.pdf

The project is scenario based and requires the students to describe and reflect on the IL process as well as to perform the basic IL steps.

The project does allow us to assess some of the higher-level skills, but the downside of this is that marking is fairly labour intensive - typically 20 minutes per project (on a good day!), plus time for cross-marking between markers (just two of us). Students are expected to spend about 20 hours doing it, though this varies wildly. We developed the project (or its prototype) before the advent of the CAUL Information Literacy Standards but find, happily, that an audit of the objectives and outcomes of the project against the standards finds that they mesh very well.

We also set exam questions where we can test higher-level skills. The module covers 'cultural issues surrounding the use of information' (CAUL Information Literacy Standard 6), where we explore the impact of managing information from an oral tradition (like the Maori tradition) with tools which grew out of a western literary tradition. We investigate Maori attitudes to information, which differ widely from western views, and also cover issues of intellectual property as they relate to Maori information. A typical long-answer exam question might be:

'Accessing traditional Maori information in libraries can be difficult. Describe some of the problems that may contribute to these difficulties. Your answer should contain reference to: (a) how Maori information is arranged and described in libraries (b) differences between the Maori and western views of information'.

Exam questions can also cover web evaluation. A typical question would suggest a scenario where your friend has been diagnosed with low iron levels (or some other ailment) and is becoming alarmed at the information she is finding on the web. She is unskilled at evaluating Web info and has e-mailed you in horror at her results. The question asks the student to write a full reply explaining to her the ways in which she can tell whether a website is reliable and useful, illustrating their answer with examples (which they can invent).

Assessing the second question is straightforward, but it's not unusual to receive racist responses to the cultural question. Our marking schedule allows us to stay objective and impartial in response to this (I hope!)

As we expand our programme on the campus there is increasing pressure on our resources. We may not always have the luxury of spending as much time marking or offering such a high level of personal feedback as we currently do. If we want to continue to assess higher-order thinking skills perhaps we need to share the marking with the academics. We have some instances where we do this but it's fairly embryonic and needs a lot of fine-tuning.

Would be glad to hear ideas on this, or any feedback on our assessment. Would also be very interested in hearing from anyone who includes in their teaching issues concerning information of indigenous peoples.

June Laird
Information Studies Librarian
Lincoln University Library

The two common difficulties with portfolio assessment which I hear repeated regularly seem to be the time involved in providing formative feedback to students and the guidance required for students to complete the portfolio - ie: it doesn't fit with the usual assignment regime they are used to, so they feel concerned that what they are doing is not 'right'. It's a worry that this kind of feedback might put people off using it as an assessment method - we really need to be collecting anecdotal evidence which shows how powerful it can be...

Yep - there are a few issues here. The time/workload issue applies to students as well, as many students perceive continuous assessment as onerous. The students in the environmental science subject (Resources, Environment and Society) didn't actually receive formal formative feedback, and this emerged as an issue. It was decided in the evaluation sessions that next year one of the tasks (probably the webography or equivalent) would be formally assessed early in the semester. The way it was done this year was that students presented the work in tutorials, but formally submitted it at the end of the semester.

Students were worried about the portfolio form of assessment if they hadn't come across it before (some had, as many primary and secondary schools use portfolio assessment). The geography lecturer involved in the course also uses portfolios with his third-year class, so he posted some models for students to have a look at. If you are interested, they are at http://sres.anu.edu.au/people/richard_baker/examples/portfolios/index.html.

The tutorial preparation consisted of 'one-page summaries' that students needed to have in order to get into the tutorial. That is, they were not allowed into the tutorial if they hadn't done a 'one-page summary'. The summaries were structured - each week students had to address particular issues. For example, the first-week summary consisted of this series of questions:

What do you want out of this course?
What do you bring with you to this course?
What do you think are the key factors for making a good discussion?
Your response to the National Museum Exercise
Your thoughts on the proposed assessment

One of the later weeks:

Come to the tutorial with a one-page summary with your estimate of what population you think Australia can sustainably support.

  • Briefly substantiate your figure by listing in order of importance the most significant resources that limit the population that this country can sustainably support
  • Come prepared to discuss and defend your opinion on the sustainable population figure and your list of the priority resources.

As you can see, the 'one-page summaries' are all designed to feed into the tutorial discussion. The population issue was integrated with the 'webography', a critical analysis of five websites. Students worked in pairs and chose a subtopic of the population debate, which they generated via a tutorial brainstorming session. The tutorial covering the brainstorm and web search and evaluation strategies was taught by the librarian. The submitted webographies were placed on the class website so that sharing of information could occur.

Interestingly, this was an example of how important it is to look at information literacy learning and teaching in context. It wasn't until the webographies were posted to the website, and I was actually able to look at all of them, that I was able to evaluate our IL teaching. For example, we gave the students a website evaluation checklist to help in the analysis. Students had to critically analyse point of view, purpose, currency and how they would use the information in the five websites for the debate. Some students worked through the checklist and addressed each factor separately, which resulted in a fragmented response, while others integrated the suggestions of the checklist into their analysis, which resulted in an integrated and holistic response. This result has convinced me not to use checklists again!!! See http://sres.anu.edu.au/people/richard_baker/examples/webographys/index.html for a couple of examples.

Mandy, let us know how that portfolio assessment at ANU went in terms of student performance and engagement?

I was extremely impressed with the level of reflection that was demonstrated in the portfolios (I collected the portfolios of my 20 research project participants). The whole subject was designed to encourage questioning and reflection, so it was a case of an integrated approach. For example, all the lectures were interactive, with structured activities involving students asking questions and picking out key issues. In addition, there were weekly panel discussions where a panel was formed to debate a particular issue, for example, global warming. The panels consisted of academics, government reps, community organisations and environmental groups. At the end of each panel there was a question session and an opportunity to follow up issues. Student engagement was subsequently very high. The same goes for the tutorials, as they consisted of debates, discussions, case studies and role-plays.

Mandy

http://www.alia.org.au/groups/infolit/debate.topics/28.08.2002.html
© ALIA