Online Briefing


The NHS: Has the Additional Funding Worked?


  • NHS expenditure in England has increased from £33 bn in 1996/97 to £76 bn in 2005/06. But has the service improved?
  • Overall productivity. The Office for National Statistics estimated that the average annual change in NHS productivity from 1995 to 2003 was between minus one and zero per cent.
  • Waiting lists fell from 1,158,000 in 1997 to 857,000 in September 2004. But the National Audit Office found in 2001 that nine trusts had ‘inappropriately adjusted’ their lists, and the Audit Commission found in 2003 that there was evidence of ‘deliberate misreporting’.
  • Operations cancelled at the last minute for non-clinical reasons increased from 51,000 in 1997/98 to 66,000 in 2003/04.
  • The NAO found in 2000 that the UK had the worst record in Europe for hospital acquired infections, with about 100,000 infections a year leading to 5,000 deaths. Deaths from MRSA increased fifteenfold from 1993 to 2002.
  • Cancer and heart disease are the main causes of premature death. How does the NHS compare? A BMJ study ranked the UK 19th out of 19 countries on ‘mortality amenable to health care’.
  • Cancer survival in England is poor. The EUROCARE-3 study found that England had the lowest 5-year survival rate for lung cancer and was among the lowest for prostate cancer.
  • The NAO found in 2003 that, out of ten countries, the UK had the highest, second highest and third highest rates of ischaemic heart disease, myocardial infarction and cerebro-vascular disease respectively.
  • The OECD found in 2002 that fatalities in the UK within the first seven days of a stroke were about twice the average of the other countries studied.

Expenditure

Expenditure has increased rapidly, but have medical outcomes improved in proportion? In 1996/97 NHS expenditure in England was £33 bn; by 2004/05 it had risen to £69 bn, and in 2005/06 to £76 bn.
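
As a rough yardstick (a back-of-the-envelope sketch in cash terms, ignoring inflation and assuming steady compound growth over the nine years from 1996/97 to 2005/06):

\[
\frac{76}{33} \approx 2.30, \qquad 2.30^{1/9} \approx 1.097
\]

that is, NHS spending in England has been growing at roughly 9.7 per cent a year in cash terms.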

How have health outcomes for patients changed? We can look at some overall measures of the performance of the NHS and at some specific measures.

Productivity

Output increased by 28 per cent, while inputs grew by 32 per cent on one measure and 39 per cent on the other; the ONS therefore concluded that the average annual change in NHS productivity between 1995 and 2003 was between minus one and zero per cent.[1] When the ONS re-ran the figures under a range of alternative assumptions, NHS productivity showed a fall of between 3 and 8 per cent from 1997 to 2003.[2]
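
The ONS range can be checked with a simple compound-growth calculation (a sketch, assuming the quoted output and input growth accrues evenly over the eight years from 1995 to 2003). Dividing output growth by input growth on each of the two input measures:

\[
\frac{1.28}{1.32} \approx 0.970, \qquad 0.970^{1/8} \approx 0.996 \quad (\text{about } -0.4 \text{ per cent a year})
\]

\[
\frac{1.28}{1.39} \approx 0.921, \qquad 0.921^{1/8} \approx 0.990 \quad (\text{about } -1.0 \text{ per cent a year})
\]

which brackets the ONS conclusion of an annual change between minus one and zero per cent.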

Mortality amenable to health care

Academics at the London School of Hygiene and Tropical Medicine have tried to develop a better measure of overall health system achievement. The World Health Report of 2000 ranked countries according to ‘disability adjusted life expectancy’, which deducts a proportion of the expected years of life to allow for the reduced quality of life resulting from disability.[3] Ellen Nolte and Martin McKee compared the results of the WHO report with a new ranking of health attainment: ‘mortality amenable to health care’. The two measures produced substantially different results, and the UK’s performance was poor. On the measure that assumed half of the deaths due to ischaemic heart disease to be the result of poor health care, the UK came 19th out of 19 countries; it had been 10th out of 19 on the WHO measure. When ischaemic heart disease was excluded, the UK was 18th out of 19.[4]
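
The intuition behind ‘disability adjusted life expectancy’ can be written as a formula (a simplified sketch; the WHO’s published method uses more elaborate health-state weighting):

\[
\text{DALE} = \sum_i y_i \,(1 - w_i)
\]

where \(y_i\) is the expected number of years lived in health state \(i\) and \(w_i\) is the disability weight attached to that state (0 for full health, 1 for a state judged equivalent to death). Ten years lived with a disability weighted at 0.3, for example, count as seven healthy years.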

Waiting Lists

Because overall NHS waiting lists – the number of people waiting for admission to hospital – fell from 1,158,000 in 1997 to 857,000 by September 2004, the Department of Health has deployed waiting lists as key evidence of an improvement in performance. In December 2001, however, the National Audit Office published a report entitled Inappropriate Adjustments to NHS Waiting Lists which concluded that: ‘Nine NHS trusts inappropriately adjusted their waiting lists for some three years or more, affecting nearly 6,000 patient records. For the patients concerned this constituted a major breach of public trust and was inconsistent with the proper conduct of public business.’[5] By March 2003, when the Audit Commission produced a report entitled Waiting List Accuracy, things had hardly improved. The Commission found that while waiting lists for patients with possible breast cancer were generally well managed, ‘there was evidence of deliberate misreporting of waiting list information’ at three trusts, and in a further 19 trusts auditors found evidence of reporting errors – deriving from inadequate policies, procedures or operational systems for collecting or recording data, and from ineffective, wrongly set up or poorly integrated IT systems – in at least one of six performance indicators.[6] The report concluded that the fraudulent actions were ‘disturbing’; that ‘data quality varies widely’; that ‘a number of trusts were found to be operating in ways that seem weighted away from the interests of patients’;[7] and that where mistakes were not made intentionally ‘trusts could and should be doing [more] to reduce the likelihood of reporting errors’.[8]

As recently as March 2004, the King’s Fund noted that ‘the true scale of such inappropriate adjustments across the NHS is unknown’,[9] which at the very least implies that any evaluation of the ‘considerable achievement’ of having generally made waiting times ‘shorter than at any time in the history of the NHS’ should be treated with caution. Success, the report said, had been ‘patchy, with Wales and Northern Ireland, for example, experiencing growing problems with their waiting times.’[10] In February 2005, the King’s Fund produced another report, entitled Cutting NHS Waiting Times, which found that confidence in the current figures might be misplaced. It states that ‘there has been some success in reducing very long waiting times but average waiting times have changed very little.’[11] In addition, it argues that NHS waiting time reduction policies have relied on ‘the incorrect view that waiting lists represented a backlog that could be removed by temporary initiatives’ when in fact ‘sustainable reductions must rely on long-term policies designed to respond to a range of factors.’[12] The publication closes with the point that ‘important issues concerning the goals of policies on waiting times, demand management and the development of more appropriate targets focusing on access to care still need to be addressed.’[13]

Cancellations

In order to meet the target that patients wait no more than four hours between arriving at A&E and being treated, more patients are being admitted straight to hospital, which in turn increases the number of planned operations cancelled because of a shortage of bed space. In February 2000, the NAO found that bed unavailability was the most common cause of cancelled operations, with beds occupied by new emergency cases or by patients whose discharge had been delayed.[14] ‘Some 70 per cent of NHS acute trusts’ told the NAO ‘that the intended bed being occupied by a new emergency admission was the most common cause of a cancelled elective operation.’[15] Although at the time the UK had fewer beds per 1,000 of population (3.3) than Germany (6.4), Italy (4.5), France (4.2) and Australia (3.8),[16] the most likely explanation is not an absolute shortage of beds but, as the chairman of the House of Commons public accounts committee put it, that the findings were ‘symptomatic of poor bed management.’[17]

As with waiting lists, the sheer unreliability of the data makes it difficult to determine the scale of the problem with any exactitude. The Audit Commission has found that: ‘Many trusts had incorrect or confused policies for how to record DNAs [Did Not Attends] and cancellations. A typical example would be where, when recording outpatient appointments cancelled by the trust, the waiting time was reset incorrectly to the cancellation date rather than being left as the date the referral was received originally’.[18] Department of Health figures indicate that cancellations rose steadily from 50,505 in 1997-98, when the present government began its tenure, to 66,303 in 2003-04 – 15,798 more cancellations, an increase of 31.3 per cent. Figures for the first three quarters of 2004-05 (47,010) suggest that the numbers are still rising. The Patient’s Charter states that a patient’s elective operation should not be cancelled by the hospital on or after admission for non-medical reasons, and that where this does occur the hospital is required to treat the patient within one month of the date of cancellation. Although the number of patients not admitted within twenty-eight days of cancellation rose dramatically from 7,250 in 1997-98 to 19,087 in 2001-02, it has since fallen back, to 6,270 in 2003-04.[19]
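
The size of the rise follows directly from the Department of Health figures:

\[
66{,}303 - 50{,}505 = 15{,}798, \qquad \frac{15{,}798}{50{,}505} \approx 0.313
\]

an increase of just over 31 per cent between 1997-98 and 2003-04.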

However, by cancelling an operation more than 24 hours before the patient is due to be admitted, a hospital keeps the cancellation out of the official figures and escapes the obligation to treat the patient within one month. The Sunday Times has calculated that the NHS may be cancelling more than twice as many operations at short notice as the government has acknowledged. The paper said that figures disclosed in reports prepared by individual hospital trusts contradict ministers’ claims that 66,000 operations are cancelled a year, and that the true figure is estimated to be at least 132,000.[20] Frustratingly, ‘[t]here are no national data on the number of operations cancelled before the date of admission’, and the proportion of operations cancelled varies across NHS acute trusts.[21] That said, some hospitals do collect this information. For example, the Royal Shrewsbury Hospitals NHS Trust cancels around five operations before the day of admission for every one operation it cancels on the day.[22] Early cancellations were widespread in other trusts: according to a 2004 study by the health scrutiny panel of Worcestershire county council, only 856 of the 1,791 operations cancelled in 2003 were called off within 24 hours of the appointment. Jonathan Fielden, vice-chairman of the BMA’s consultants’ committee, said: ‘This is not uncommon. When managers are faced with losing their jobs if they miss a target they will find any way to get round that target.’[23] Clearly, the claim that Labour has abolished ‘hidden’ figures does not stand up to close examination.[24]

Hospital acquired infections

Hospital acquired infections are infections that are neither present nor incubating when a patient enters hospital. According to the NAO in February 2000, Britain had the worst record in Europe. At any one time, 9 per cent of patients – equivalent to at least 100,000 infections a year – had an infection acquired during their hospital stay. The effects varied from an extended length of stay and discomfort to prolonged or permanent disability and, in at least 5,000 patients a year, death. These infections were costing the NHS as much as £1 billion a year, and around 15 per cent could be prevented by better application of good practice, releasing resources of £150 million for alternative NHS use.[25] In the same year, the Committee of Public Accounts concluded that the lack of grip on the extent and costs of hospital acquired infections impeded NHS trusts in targeting activity and resources to best effect, and that a root and branch shift towards prevention would be needed at all levels of the NHS if hospital acquired infection were to be kept under control. But in his December 2003 report, Winning Ways,[26] the Chief Medical Officer stated that such data as were available showed that the degree of improvement had been small.

A major obstacle to tackling the spread of antibiotic-resistant bacteria is that, while patients and staff prefer hospitals that are visually clean, visual cleanliness has only a minimal impact on the spread of MRSA. Even the Department of Health’s Patient Environment Action Teams (PEATs) assess cleanliness on visual criteria alone. Many more hospitals are now rated ‘good’ by the PEATs, but over the same period (2001-02 to 2002-03) the rate of MRSA (0.17 per 1,000 bed days) did not change, according to the MRSA surveillance scheme, and between 1993 and 2002 the number of deaths increased fifteenfold.[27]

Many of the conclusions of previous NAO reports were repeated in July 2004, when Improving patient care by reducing the risk of hospital acquired infection: A progress report found that good practice in the prevention, control and management of hospital acquired infection needed to be more widely known, and that there was a lack of basic comparative information on infection rates. It expressed concern that there appeared to be a growing mismatch between what was expected of infection control teams and the staffing and other resources allocated to them, and identified considerable scope for improving performance. ‘Implementation of our and the Committee’s recommendations has been patchy… wider factors continue to impede good infection control practice and there has been limited progress in improving information on the extent and costs of hospital acquired infections. Progress in preventing and reducing the number of infections acquired while in hospital continues to be constrained by the lack of data, limited progress in implementing a national mandatory surveillance programme that meets the needs of the NHS, and a lack of evidence of the impact of different intervention strategies.’[28]

Cancer

The most comprehensive international data on the value added by healthcare systems relate to cancer survival rates. The standard measure is the percentage of cancer patients alive five years after diagnosis. Yet again, ‘[t]he performance of England is consistently at or near the bottom of the league, alternating bottom position with Scotland.’[29] The latest evidence comes from the EUROCARE-3 study, which compares results in 22 European countries up to 1999, giving five-year survival rates for 19 of them, with separate figures for England, Scotland and Wales. All three are below the European average for all cancers combined. England was below the European average for survival from liver cancer,[30] below average for breast cancer,[31] had the lowest survival rates for lung cancer (along with the lowest proportion of small cell lung cancer patients receiving chemotherapy[32]) and was among the lowest for prostate cancer.[33] Overall, England comes 11th out of 19 and Scotland 12th for survival rates among men, and England 12th and Scotland 13th for women.[34]

Circulatory disease

The situation regarding deaths from circulatory diseases – cardio-vascular diseases including ischaemic heart disease, myocardial infarction and cerebro-vascular disease – is equally worrying. Across the NAO’s group of ten countries, UK death rates for these three conditions are respectively the highest, second highest and third highest.[35] Despite recent improvements, the death rate from coronary heart disease (CHD) in the UK remains high by international standards: among developed countries, only Ireland and Finland have a higher rate. And while the death rate from CHD has been falling in the UK, it has not been falling as fast as in some other countries.[36] The statistics show that victims of heart disease, stroke or breast cancer in Britain die early, and perhaps unnecessarily, compared with other western countries. Worse still, it seems that access to care is being limited according to age: Roger Dobson, a regular contributor to the BMJ, reports an international study which found that the proportion of health spending devoted to those aged 65 and over in England and Wales is not keeping pace with that in other countries.[37]

Stroke

Where stroke is concerned, the UK is in a league of its own. When the OECD Age-Related Diseases (ARD) team reported in 2002 on in-hospital mortality and one-year case mortality for stroke patients, it found few differences between the countries, with the exception of the UK. Fatalities in the UK over the first seven days were approximately twice the average for all age groups, making it the only country in the study classed as having high death rates. These data do not, though, reflect the total continuum of care, which includes care outside the hospital setting: to do that, it is necessary also to account for non-hospital deaths by using case fatality rates. These rates were lowest in Denmark and highest by far in the UK, and the OECD observed that the UK stood out for its poor performance.[38] A year later, the OECD found that, relatively speaking, things had scarcely improved. The UK’s age-standardised mortality rate was high – behind only Hungary and Japan – and while the rate was decreasing it was not doing so as fast as elsewhere. Seven-day hospital mortality was substantially higher than in any other country in the survey for all age categories, both male and female, with the gap widening for 30-day hospital mortality,[39] and of the 11 countries included in the study, only the UK was labelled as exhibiting high fatality rates.[40]

Notes

[1] ONS, Public Service Productivity: Health, London: ONS, October 2004, pp. 1-2.

[2] ONS, 2004, pp. 21-25.

[3] World Health Organisation, The world health report 2000. Health systems: improving performance, Geneva: WHO, 2000. Cited in Nolte, E. and McKee, M., ‘Measuring the health of nations: analysis of mortality amenable to health care’, British Medical Journal, 327: 1129 (15 November 2003).

[4] Nolte, E. and McKee, M., ‘Measuring the health of nations: analysis of mortality amenable to health care’, British Medical Journal, 327: 1129-32 (15 November 2003).

[5] National Audit Office, Inappropriate Adjustments to NHS Waiting Lists, London: NAO, December 2001, p. 1.

[6] Audit Commission, Waiting List Accuracy, London: AC, 2003, p. 3.

[7] Waiting List Accuracy, p. 20.

[8] Waiting List Accuracy, p. 21.

[9] BBC ‘Your NHS’ Day 2004 briefing, p. 19.

[10] BBC ‘Your NHS’ Day 2004 briefing, p. 14.

[11] King’s Fund, Cutting NHS Waiting Times: Identifying strategies for sustainable solutions, London: King’s Fund, February 2005, p. 2.

[12] King’s Fund, Cutting NHS Waiting Times, p. 3.

[13] King’s Fund, Cutting NHS Waiting Times, p. 8.

[14] National Audit Office, NHS Executive: Inpatient Admissions and Bed Management in NHS acute hospitals, London: NAO, 24 February 2000, p. 22. Hereafter: ‘NHS Executive’.

[15] NHS Executive, p. 22.

[16] NAO, International Health Comparisons, 2003, p. 13.

[17] Beecham, L., ‘NHS cancels record number of operations in England’, British Medical Journal, 320: 599 (4 March 2000). http://bmj.bmjjournals.com/cgi/content/full/320/7235/599

[18] Waiting List Accuracy, p. 18.

[19] http://www.performance.doh.gov.uk/hospitalactivity/data_requests/cancelled_operations.htm (downloaded: 22/03/2005)

[20] Templeton, S.-K. and Carr-Brown, J., Sunday Times, 06/03/2005. http://www.timesonline.co.uk/printFriendly/0,,1-523-1513588,00.html

[21] NHS Executive, p. 20.

[22] NHS Executive, pp. 20-22.

[23] Sunday Times, 06/03/2005. http://www.timesonline.co.uk/printFriendly/0,,1-523-1513588,00.html

[24] Better or Worse? p. 12.

[25] The Management and Control of Hospital Acquired Infection in Acute NHS Trusts in England, London: NAO, February 2000, p. 10.

[26] Winning Ways: working together to reduce Healthcare Associated Infection in England: Report by the Chief Medical Officer, Department of Health, December 2003.

[27] BBC ‘Your NHS’ Day 2004 briefing, p. 25, p. 27.

[28] Improving patient care by reducing the risk of hospital acquired infection: A progress report, London: NAO, 14 July 2004, p. 2.

[29] NAO, International Health Comparisons, 2003, pp. 21-27.

[30] Faivre, J., Forman, D., Obradovic, M., Sant, M., and the EUROCARE Working Group, ‘Survival of patients with primary liver cancer, pancreatic cancer and biliary tract cancer in Europe’, European Journal of Cancer, Vol. 34, No. 14, 1998, pp. 2184-2190.

[31] Quinn, M.J., Martinez-Garcia, C., Berrino, F., and the EUROCARE Working Group, ‘Variation in survival from breast cancer in Europe by age and country, 1978-1989’, European Journal of Cancer, Vol. 34, No. 14, 1998, pp. 2204-2211.

[32] Janssen-Heijnen, M.L.G., Gatta, G., Forman, D., Capocaccia, R., Coebergh, J.W.W., and the EUROCARE Working Group, ‘Variation in survival of patients with lung cancer in Europe’, European Journal of Cancer, Vol. 34, No. 14, 1998, pp. 2191-2196.

[33] Post, P.N., Damhuis, R.A.M., Van der Meyden, A.P.M., and the EUROCARE Working Group, ‘Variation in survival of patients with prostate cancer in Europe since 1978’, European Journal of Cancer, Vol. 34, No. 14, 1998, pp. 2227-31.

[34] Coleman, M.P. et al., ‘EUROCARE-3 summary: cancer survival in Europe at the end of the 20th century’, Annals of Oncology, 14 (Supplement 5): v128-v149, 2003.

[35] NAO, International Health Comparisons, 2003, p. 21.

[36] http://www.heartstats.org/temp/Mortalityspchapter.pdf, p. 2. For example the death rate for men aged 35-74 fell by 39% between 1989 and 1999 in the UK, but it fell 47% in both Norway and Australia. For women the death rate fell by 41% in the UK, but in Australia, Finland and Ireland the rate fell by 52%, 46% and 44% respectively.

[37] Dobson, R., ‘Proportion of spending on care for older people falls’, British Medical Journal, 325:355 (17 August: 2002).

[38] OECD, ARD Team, ‘Summary of Stroke Disease Study’ (Draft), What is Best and at What Cost? OECD Study on Cross-National Differences of Ageing related Diseases, DEELSA/ELSA/WP1/ARD(2002)4. OECD Working Party on Social Policy, Ageing-Related Diseases, Concluding Workshop, Paris, 20-21 June 2002.

[39] Moon, L., Moise, P., Jacobzone, S., and the ARD-Stroke Experts Group, Stroke Care in OECD Countries: A Comparison of Treatment, Costs and Outcomes in 17 Countries, OECD Health Working Papers, No. 5, DELSA/ELSA/WD/HEA(2003)5, pp. 64-70.

[40] Moon et al, 2003, p. 86.

Civitas: the Institute for the Study of Civil Society