The Australian Library Journal

When 'best value' may not quite say what it means: a reflection on measuring quality in public services in the United Kingdom

Keith Trickey
'Aye' said the boozer, 'I tell you it's true sir,

Manuscript received April 2003

Have you ever been to one of those parties where the company was good and the drink flowed free and everything was going well until you realised the impact that the amount you had imbibed would have on you the following day? What follows is a review of the development of the United Kingdom obsession with 'value for money' and the performance indicators that have led to the current 'best value' approach now working its way through the public sector. I hope to identify the values and problems of the approach, offering a salutary warning to those considering following it.

The basic tools for this approach have been formulated by the Audit Commission, whose significance has grown from a useful checker of columns of figures to the major player in the United Kingdom government's assessment of quality service delivery. I confess to becoming enmeshed with the works of the Audit Commission back in 1984, when I used their document on the effectiveness of colleges to evaluate a teaching school in the college in which I was then working. I was very aware of the limits of the instrument that I used; I was also made aware of the very limited understanding the college management had of the implications of their resource decisions. That was a long time ago.

The context for public sector accountability in the United Kingdom needs to be reviewed to make sense of the position we now find ourselves in. Some of the analysis involves reviewing political approaches, as they condition the use made of particular techniques and also help in understanding the drive behind such initiatives.

Performance measurement has a two-hundred-year history, so what changed to give it such prominence in United Kingdom thinking? The current very British obsession with performance indicators evolved during the Thatcher administration in the early 1980s. This was a response to the acknowledgement that government expenditure was out of control. It was no longer possible for government ministers to predict the relationship between financial inputs and identifiable outputs, so they were unable to estimate with any precision the actual cost of delivering a policy objective. This was because governments, both local and central, were still operating to the same post-war model that had evolved to manage far smaller organisations with far smaller budget commitments, and the standard response to a problem had become to throw more money at it. The building of the British Library at the St Pancras site is a classic example of a public sector project that, post-Thatcher, was deemed financially out of control, in part because of the year-on-year budget uncertainties created by financial strictures.

The Thatcher agenda was overtly about 'value for money' but was really about cost reduction. Consequently the criteria for evaluation (efficiency, effectiveness and economy) were biased towards the lowest cost option, with quality being a lower priority. The tools developed in that period by the Audit Commission were crude, yet effective enough to allow for primitive comparison. Once indicators were available to make comparisons, comments about the relative effectiveness of a service became possible. A major political football in the United Kingdom for the last twenty years has been the education system.
The concern with education and measuring educational effectiveness, as evidenced by the introduction of a detailed National Curriculum covering education from 5 to 16, led to the evolution of a school inspection system. This system evaluated every school on a five-year cycle so that league tables could be produced to indicate the quality of each school and to inform parental choice. It was linked with performance tables based on National Curriculum levels of achievement and examination results. The instruments were crude, but popular with achievement-oriented middle-class voters, who would review possible schools by evaluating their websites and their position in the local and national league tables - in a way applying a managerial approach to the education of their children. I was always more interested in the ethos and culture of the school and the nature of the relationships between teachers and pupils, and pupils with pupils - knowing my own children were robust enough to survive and learn from the odd bit of academic rough and tumble caused by changing government or educational fashion. However, the league tables became a significant measure of the 'goodness' of a school. Naturally the activity of a school became biased towards the achievement of appropriate results in terms of the league table, as funding followed performance. Sadly, some of the fun of learning evaporated from the primary school as the focus turned to hitting the targets.

Schools fitted the inspection model well. There was a range of them serving a particular area (it is important to remember that these initiatives are driven from London and tend to be designed to suit reasonably urban environments), so one school being rewarded for excellent performance would (theoretically) act as a stimulus to other schools in the area, which risked losing pupils (and funding) to their 'excellent' neighbour. If a school was deemed to be 'failing' it could be closed or placed under (dynamic) new management. So the inspection system created a theoretical 'market' in education provision, allowing 'good' schools to increase their pupil numbers and failing schools to be eased out of the system. The flaws in this approach are evident, as the pupils in a school designated as 'failing' have their education potentially blighted if the school goes down with all hands.

In 1997, to a chorus of 'Things can only get better', the Conservative administration of John Major was swept out of power and the 'New Labour' Blair regime came in on a tidal wave of public sector expectation of better times and a more humane and equitable approach to resource allocation. However, it became very clear very quickly that the majority of the mechanisms instituted by the previous administration would stay in place and be trimmed to fit current concerns. The model to which the incoming Blair administration looked to spread the political objective of quality service to more areas of the public sector was, like many of its policies, a reworking of existing initiatives with a more humane twist. So 'Best value' (BV) became the answer, the basis for assessment being an entire local authority: a traditional county like Lancashire or Cumbria, or a metropolitan borough like Liverpool or Birmingham. The exercise required the authority to produce a vast range of documentation as evidence that it was delivering services to clear standards. These standards have to cover both cost and quality.
The impact of this requirement on authorities was considerable, as project teams were formed that removed the 'brightest and best' of local management from their frontline tasks so they could help drive forward the best value team within the authority and allow it to meet the requirements of the inspection. The Thatcher regime had the three 'e's - efficiency, effectiveness and economy - and these are bedded into the BV approach of the four 'c's: challenging, comparing, competition and consultation.

The assessment is based on two main criteria: does the authority provide a good service, and is the authority going to improve? The basic assumption in this process is that an authority can go through a continuing process of improvement. The definition of a good service involves three elements.

Are the authority's aims clear and challenging?
The authority has to have corporate aims and a community plan, neatly planned out and programmed. As for challenging the need for the 'service', this process is compromised by statutory requirements. A local authority cannot decide to stop disposing of dead bodies or providing statutory support services. However, a library information service is less tied to obligatory statutory provision - no health risk occurs if it does not open.

Does the service meet these aims?
The requirement to demonstrate and document that you are doing what you say you are doing creates a burden of record keeping and reporting which can be disproportionate, and again bleeds resources away from frontline services. However, the rules of the game require that you do it this way.

How does its performance compare?
'Best value' requires an almost obsessive concern with comparing one's own performance with that of other, similar authorities. The process can become the major organisational focus, or obsession, displacing the authority's primary reason for being as it tries to metamorphose its processes to suit best value requirements.

The second evaluation, on improvement, helps bed the BV process into the authority. The question is simple: does the BV review drive improvement? What is being looked for here is the way the BV process is driven. Has the authority fundamentally challenged what it does? Has good use been made of consultation? Have rigorous comparisons been made throughout the review? Is the authority's procurement activity competitive?

The results of a BV review are available for inspection. The review process leads to a grading of the authority in terms of its performance and attitude. Four terms are used to describe the level of activity within the authority.

Failing - these are authorities whose performance falls below the accepted or expected standard.

Coasting - the authority is taking too relaxed an approach to the BV process, and needs to focus up, get motivated and start developing service provision.

Striving - the attitude is right, and the authority is hopefully on its way to the next, coveted category.

Beacon - these authorities are like 'cities on a hill', 'whose excellence can encourage their striving colleagues to improve', according to the rhetoric.

Depending on the condition of the authority, the style and content of the inspection will vary.

Refer - if the authority is failing and appears to lack the internal capability to sort it out, this option is used and external consultants could be put into the authority to run a specific service.

Challenge - if the authority is merely 'out of condition' then a suitable challenge might do wonders to promote effectiveness.

Persuade - sadly, certain authorities do not buy into the process, although they perform fairly effectively. In this case a challenge would not work - but persuasion might.

Encourage - when an authority is making a real effort then this style is appropriate.

Celebrate - this is the Beacon recognition. I always think of the party sequence at the close of The Empire Strikes Back!

We are a nation subject to at least ten years of progressively increasing spin in political presentation. Even making that allowance, I find the quasi-evangelical language employed in BV disconcerting. When this is translated into decisions, again the risk is a triumph of style over substance. The recent designation of beacon library services contained several examples of an otherwise mediocre library service picking a specific niche area and majoring in it to achieve beacon status and the accolades, including the funding, that go with it.

The move towards a common basis for comparison was evident in the publication of Comprehensive, efficient and modern public libraries - standards and assessment in 2001. It outlined expected levels of performance and a series of indicators which libraries had to collect as part of the BV process. The purpose is basically sound - to attempt to set minimum standards of service across the public libraries in England. However, when it came to the details, some were unhelpful: specifically the loan period ('the standard is a minimum of three weeks'), which will lead services to migrate to an imposed loan period rather than reflecting local requirements.
The same is true for the maximum number of books users are allowed to borrow at one time ('the standard is eight books'). The requirement for staff helpfulness, rated as 'good' or 'very good' ('the target will be ninety-five per cent for both adults and children'), has an almost farcical quality. It is possible to understand what the standards are trying to achieve; however, the prescriptive nature of the outcome will tend to lead to the massaging of the data to yield the appropriate figure. The same weakness is seen in the standard for users reporting success in gaining information as a result of a search or an enquiry ('the target will be seventy-five per cent for both adults and children') - I assume that enquiries like 'Where's the bathroom?' will count as 'one' alongside 'What was the tonnage of potatoes imported from Egypt in 2001?' or 'Who is the bald one in the E Street Band?' However, these are the tools by which the service will be evaluated.

The United Kingdom public sector culture has, sadly, consistently been one of acceptance, so when such tasks are handed down to library services the staff get on with them, and a massive opportunity is missed. The BV process absorbs a lot of staff time, much of which can be wasted as the local service simply jumps through the hoops rather than taking the opportunity to sort out the data it requires to evaluate the quality of the service it wishes to provide to its population. From that information it could then work out how that service can be appropriately advanced to meet local information needs.

Tailpiece

I like stories, particularly metaphorical stories. When discussing Best Value with a colleague, the following story came to mind. You remember the Athenian or Venetian galleys - all those poor slaves rowing themselves to death to transport the vessel elegantly over the waters of the Mediterranean? The United Kingdom under the Conservative administration was like a massive sustained storm for the poor public sector galley slaves - pitched and soaked, half drowned, with no respect or recognition for work in a very hostile climate when even your paymasters did not value you. Then the government changed in 1997 and you thought you would be sailing into calmer waters, with a chance to recover and row more comfortably towards harbour. However, just as you were getting your breath back, the wind stiffened, the waves started to run higher and an instruction came down from 'on high': 'We think you have been doing a really excellent job, we really value your contribution, sadly we cannot increase your rations, and could you please sing in harmony as you row?'

References

Goodwin, T. Best value explained. http://localgov.monster.co.uk/articles/value

Great Britain. Audit Commission. Seeing is believing: how the Audit Commission will carry out best value inspections in England. Audit Commission, 2000. [Recent best value reports produced by local authorities are available at http://www.bvpps.audit-commission.gov.uk/; this collection is indexed by local authority name and is regularly updated to include new documents.]

Great Britain. DCMS Libraries, Information and Archives Division. Comprehensive, efficient and modern public libraries - standards and assessment. DCMS, 2001. Available from http://www.culture.gov.uk/

Related documents: DCMS. Comprehensive and efficient - standards for modern public libraries: a consultation paper. DCMS, 2000; DCMS. Standards for modern public libraries: analysis of feedback results. DCMS, 2001. Both available from http://www.culture.gov.uk/heritage/

I&DeA. Beacon councils: Round three Beacon councils - libraries as a community resource. http://www.idea.gov.uk/beacons/round3/#9

Paterson, A. B. Banjo Paterson favourites. Viking, 1992.

Keith Trickey is a part-time senior lecturer in the School of Business Information at Liverpool John Moores University and lead trainer for Sherrington Sanders. He delivers training for Cilip in 'cat and class' and personal development topics, and is an established performance coach.