
Archive - Year: 2012

December 31, 2012

Private Eye: Medicine Balls 1330
Filed under: Private Eye — Dr. Phil @ 10:42 am

Will patients notice when NHS reforms go live in April 2013?

The biggest reforms in NHS history go live in April 2013, but will patients notice? The strap-line of Andrew Lansley’s baby was ‘no decision about me, without me’, but patients have had little say in the make-up or operation of the NHS Commissioning Board, the Health and Social Care Information Centre, Health Education England, the NHS Trust Development Authority, Healthwatch England, Local Healthwatch, Health and Wellbeing Boards and Clinical Commissioning Groups. Every year, the Health Service Journal produces a list of the people with ‘the greatest influence on health policy and the NHS’, and MD shows it to patients. This year, none could identify any of the top ten (hint: all white men, and four of the top six are called David).

At least Lansley became recognisable, but the current NHS is led by men you’ve never heard of who are miles away. For all the talk of devolving power to GPs, there are none in the top 20 and only one Clinical Commissioning Group chief in the top 100. Anna Bradley, the chair of Healthwatch England, ‘the consumer champion who will make the system listen to the patient voice’, is number 32. Despite fears that the NHS is being carved up for privatisation, the only private sector entries are Ali Parsa (58), who has just resigned as chief executive of Circle, and Richard Branson, who has snuck in at 98 by virtue of his 75% stake in Assura and expansion of Virgin health.

A bigger fear for the government is that their hugely disruptive, expensive and widely opposed reforms will make very little positive difference to patients come the election. On top of the ambitious £20 billion savings plan over the next two years, the Nuffield Trust is now predicting a decade of austerity for the NHS, with a ‘funding gap’ of up to £54 billion by 2021/22. If every patient in the UK stopped smoking, ate and drank sensibly, took 30 minutes exercise every day, used condoms, stayed mentally well and only bothered the NHS for vaccinations, it might just survive.

Alas there are lots of chronic diseases without a cure, and these patients take up most of the NHS budget. Lansley’s test of whether the reforms are working is that, if a patient has a good idea to improve his or her care and takes it to a GP, the GP has the power to make it happen. MD suspects the results will be quite variable, but there are plenty of enthusiastic commissioning GPs across England who are already treating more patients closer to or in their homes, getting quicker access to consultants, getting city centre stores to stop selling cheap alcohol, liaising with charities, social services, pharmacies and opticians, and cutting down on unnecessary referrals, A+E visits and prescriptions.

There are also plenty of GPs in Scotland and Wales collaborating in this way, without the added pressures of a competitive health market, and only time will tell which model works best. And there are demoralised GPs in England who strongly opposed the reforms, think they’ve been stitched up in their new contract and resent the extra work and regulation for less money and pension. They may well have to make redundancies or end up selling out to Virgin.

Even the enthusiasts for clinical commissioning worry that, come April, GPs will be targeted by the press and blamed for hospital mergers and closures, increasing waiting times and lack of access to expensive drugs that are bound to happen in such austere times. Ultimately, the care patients get may depend on whether their GP is ‘energised’ by the reforms, or demoralised. Occasional NHS users will notice little change other than their GP looks even more stressed, but a few will have a personal budget to ‘shop around in the health market’. Those with multiple illnesses and complex needs will find life toughest, unless they’re lucky enough to find a bullet-proof workaholic GP who relishes the extra effort and responsibility of buying them the best care.





December 14, 2012

Medicine Balls 1329
Filed under: Private Eye — Dr. Phil @ 3:00 pm

Will the BMA stand up for whistleblowers or shut them up?

IN JULY consultant paediatric surgeon Edwin Jesudason won a high court injunction with costs against Alder Hey Children’s Hospital (AHCH), which is seeking his “no fault” dismissal after certain surgical colleagues refused to work with him and surgeon Shiban Ahmed after they blew the whistle on malpractice and mistreatment of staff (Eye 1315). Next week, Jesudason hopes to make the injunction permanent. If successful, he may improve on the woeful statutory protections for whistleblowers by forcing trusts to follow their whistleblowing policies, or risk similar actions for breach of contract.

Jesudason, an award-winning surgeon who has never received a patient complaint or malpractice suit, has worked at AHCH since 1998 but since 2010 has been in the US on a Medical Research Council study. In 2009 he protested when Ahmed, who worked in AHCH and the University Hospital of North Staffordshire (UHNS), was suspended by UHNS after AHCH colleagues made the unsubstantiated claim that he was suicidal. The Eye has seen a 5.9.10 letter from surgeon Colin Baillie to AHCH which reads: “Shiban mentioned he had considered suicide. I have no doubt this was what was said because I asked him to repeat himself. I shared this with the clinical director Matthew Jones.” Ahmed knew nothing of this. The claim was made behind his back when a proper response to a genuinely suicidal colleague would have been to arrange an urgent mental health assessment. He was however suspended for 14 months pending an investigation which cleared him of being any risk to himself or his patients. He is still not back at work. This is a huge loss to AHCH as a Royal College of Surgeons (RCS) report found that “many members of the departments spontaneously described Jesudason and Ahmed as exceptionally skilled and talented surgeons”. Their crime has been to raise concerns about substandard care as the GMC obliges them to.

Jesudason led the petition to reinstate Ahmed and in 2009 made a confidential protected disclosure to AHCH which was circulated to his consultant colleagues, some of whom now refuse to work with him. Baillie’s 2010 letter is very revealing. ‘It is imperative that our legal position is solid should trust wish to terminate the employment of Jesudason… The allegations of patient harm go beyond the cases mentioned in this document, so we can expect more damaging revelations.  There are only two possible outcomes; major departmental restructuring (on the quiet) with Jesudason returning… or a very dirty fight, fully in the public eye, with the organisation’s chief weapon being to bring Jesudason (who remains a talented surgeon and researcher) before the GMC for sanction.’

The public interest disclosure act offers no real protection to whistleblowers against trusts with vast legal resources, and the CQC has shown no interest in policing trusts who break their own whistleblowing codes with impunity.  Represented by the BMA, Jesudason is arguing that AHCH is in breach of contract by failing to enforce the provisions in its whistleblowing policy. The trust now accepts Jesudason is a whistleblower, but argues that concerns regarding his working relationships with other surgeons have nothing to do with his protected disclosure in 2009, but ‘date back to 2004’, when he was a trainee. Odd then that colleagues now seeking his removal interviewed and appointed him to a consultant post in 2006.

Ahmed and Jesudason’s concerns have not been fully investigated, despite visits to AHCH from the CQC and RCS, whose report has been redacted. In 2010, Dr Alan Phillips, head of psychological services, interviewed over 50 members of theatre staff and found “a significant number of highly de-motivated and demoralised members of the theatre team across all professional disciplines, and some very serious health and safety concerns”. The full report remains a trust secret and Phillips refused to sign a two-page summary. He took retirement, with the customary gagging clause. The Eye has gone to the Information Tribunal to ask for the full report. Alder Hey still has skeletons in its cupboard and is fighting hard to keep them there.

 

M.D.





December 5, 2012

Medicine Balls 1327
Filed under: Private Eye — Dr. Phil @ 12:34 pm

Closure of Lewisham ICU – where’s the evidence?

 

Matthew Kershaw, the Trust Special Administrator for the now dissolved South London Healthcare Trust (SLT), is making recommendations under the ‘Unsustainable Providers Regime’ that will result in the closure of the Lewisham Intensive Care Unit (ICU). Some closures are inevitable, but is this one based on evidence or simply cost cutting?

 

Lewisham ICU expanded in December 2006 into a combined ICU and High Dependency Unit (HDU) in a state-of-the-art facility in the new Riverside building, providing up to 21 patients with their own bay. It has space for an additional 3 ICU and 3 HDU beds and could provide a significant proportion of the services currently provided within SLT.

 

The Borough of Lewisham contains some of the most deprived wards in England. Deprivation is known to make severe, complex illness more likely. Despite this, Lewisham ICU is one of the better performing ICUs in the country (www.ICNARC.org). The standardised mortality ratio (SMR) is used to measure performance and quality of care in ICUs in England, and results consistently show that a patient admitted to Lewisham ICU is significantly more likely to get better than a patient admitted to a unit representative of the national standard of care.
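For readers unfamiliar with the measure, the arithmetic behind an SMR is simple; it is the risk model feeding the ‘expected’ figure that does the heavy lifting. A minimal statement, with made-up numbers rather than Lewisham’s actual ICNARC returns:

```latex
% Standardised mortality ratio: observed deaths over the deaths a casemix
% model predicts for the same admissions, where \hat{p}_i is the model's
% predicted probability of death for admission i and d_i is 1 if that
% patient died.
\[
  \mathrm{SMR} \;=\; \frac{O}{E} \;=\; \frac{\sum_i d_i}{\sum_i \hat{p}_i}
\]
% Illustrative only: 60 observed deaths against 75 predicted gives
% SMR = 60/75 = 0.8, i.e. fewer deaths than the reference model expects.
```

An SMR below 1 means fewer deaths than the casemix predicts, which is what the Lewisham figures show.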

 

Lewisham’s ICU takes critically ill patients from all over London. In the last 12 months the ICU/HDU has looked after 772 patients at 94.9% capacity, with 34.8% on full life support and 12.6% requiring renal support. Kershaw’s current recommendations result in the net closure of 6 fully funded ICU and 8 fully funded HDU beds in South East London. No consultation with the critical care staff has taken place. Within Europe, the UK already has the smallest proportion of acute hospital beds allocated to critical care with 3.5 beds per 100,000 people. Germany has 24.6 per 100,000 and the US has 20 per 100,000.

 

Lewisham is the only DGH ICU in London that has been recognised by the Faculty of Intensive Care Medicine (FICM) as of sufficient quality to train the intensive care doctors of the future. It runs a consultant-intensivist-led outreach service that provides daily review, advice and expertise to all the other specialties, helping to recognise and promptly treat patients who may be deteriorating, in the hope of stopping them needing intensive care at all.

 

Lewisham ICU also conducts regular patient, relative and staff wellbeing surveys. The responses are universally positive and have been presented at international meetings. The physiotherapists, pharmacists, nutritionists, speech therapists, radiographers, clerks, cleaners, 66 nurses, 9 doctors in training and 7 consultant intensivists have worked hard to deliver a truly excellent service to such a deprived area, and understandably don’t want their service to be shut down.

 

The biggest challenge for those overseeing the current wave of NHS reorganisations is to provide robust evidence to those whose services are going to be disrupted, downsized or closed that the new service will be better. This has undoubtedly happened with the reorganisation of stroke care in London, partly because it was properly planned, consulted on and coordinated. SLT became a financial disaster in part due to two ridiculously unaffordable PFI developments, and the fear is that high quality services may now close. A final report will land on Jeremy Hunt’s desk in January.





November 14, 2012

Medicine Balls 1326
Filed under: Private Eye — Dr. Phil @ 1:05 pm

Show Me the Data

 

On October 17th, Tim Kelsey, the National Director for Patients and Information at the NHS Commissioning Board and founder of Dr Foster, said he ‘should be sacked’ if the NHS doesn’t undergo ‘a data revolution’ under his leadership. Both Kelsey and David Cameron are fond of citing the publication of outcome data for adult heart surgeons in England as proof of a more transparent, accountable NHS. Alas, as the Telegraph spotted, the scheme has stalled due to a lack of funding.

 

The publication of comparative clinical outcomes was one of the key recommendations of the Bristol heart inquiry and in 2004, heart surgeon Sir Bruce Keogh – now clinical director of the NHS – managed to persuade his 240 colleagues to publish the results of adult heart surgery. Dramatic improvements in survival rates followed. As Kelsey puts it: ‘In some procedures, more than a third of patients are living when they might previously have died and adult heart surgery in England is measurably, demonstrably and statistically better than anywhere else in Europe.’ Or at least it was until they stopped publishing the data.

 

Mortality ratios don’t give the full picture of how a surgeon, unit or hospital is performing, but if they’re high they warrant proper investigation and questions from patients and relatives. This methodology not only helped spot the Bristol heart scandal but guided the unit’s eventual turnaround. When an external investigation finally took place at Bristol, changes were implemented that saw the mortality ratio drop from 29% to 3.5% within three years. Mid Staffordshire hospital had a significantly high mortality ratio from 1998. The Francis report (due in January 2013) might finally tell us, 15 years later, why mortality ratios and other statistical alerts were ignored for so long. Julie Bailey, founder of Cure the NHS, has published a book – ‘From Ward to Whitehall’ – that shows how patients and relatives were ignored too. The truth of the statistics is often revealed by a visit to the ward.

Professor Brian Jarman’s analysis of child heart surgery mortality (Eye last) is not perfect, but it still identified Oxford as a significant outlier. The official Central Cardiac Audit Database (CCAD) did not, and it was left to a whistleblower to get surgery suspended there after four deaths in 2010 (Eyes passim). The Dr Foster Unit (DFU) and CCAD both want to find a fair way of comparing child heart surgery units in England, so it would make sense for them to share data and expertise. But when the DFU applied to have access to data and coding from CCAD, the request was declined because it didn’t ‘demonstrate value to patients.’
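As a rough sketch of how such a statistical alert works – invented unit names and figures, and not the DFU’s or CCAD’s actual methodology – you compare each unit’s observed deaths with the deaths its casemix predicts and ask how surprising the excess would be by chance:

```python
import math

def poisson_upper_tail(observed, expected):
    """P(X >= observed) for X ~ Poisson(expected): the chance of seeing at
    least this many deaths if the unit truly matched the expected rate."""
    cumulative = sum(math.exp(-expected) * expected**k / math.factorial(k)
                     for k in range(observed))
    return 1.0 - cumulative

# Invented figures for illustration only: (observed deaths, expected deaths).
units = {"Unit A": (12, 11.3), "Unit B": (21, 10.8), "Unit C": (7, 9.6)}

for name, (observed, expected) in units.items():
    smr = observed / expected
    p = poisson_upper_tail(observed, expected)
    flag = "investigate" if p < 0.05 else "as expected"
    print(f"{name}: SMR {smr:.2f}, p = {p:.3f} -> {flag}")
```

In practice a funnel plot with limits adjusted for over-dispersion would be used rather than a bare threshold, but the principle is the same: a flagged unit is a prompt for proper investigation, not a verdict.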

 

CCAD argues that risk adjusted mortality for child heart surgery isn’t yet perfect enough to publish comparative outcomes, but they’ve done it in New York since 1997. In the UK and Ireland, paediatric intensive care has used risk adjusted mortality predictions for over ten years, constantly modified to keep it up to date, and valued and trusted by all the units (www.picanet.org.uk).
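Risk adjustment itself is conceptually simple, even if building a model everyone trusts is not: each case gets a predicted probability of death from its characteristics, and the unit’s expected deaths are the sum of those probabilities. A deliberately toy sketch – the categories, weights and risk figures below are invented, standing in for whatever a validated model would actually use:

```python
# Toy risk adjustment: every category and value below is invented for illustration.
BASELINE_RISK = {"simple_defect": 0.01, "complex_single": 0.05, "multiple_defects": 0.15}

def predicted_risk(case):
    """Predicted probability of death for one case under the toy model."""
    risk = BASELINE_RISK[case["category"]]
    if case["weight_kg"] < 2.5:   # low weight raises risk in this toy model
        risk *= 2.0
    if case["reoperation"]:       # repeat procedures carry extra risk here
        risk *= 1.5
    return min(risk, 1.0)

cases = [
    {"category": "simple_defect", "weight_kg": 3.4, "reoperation": False, "died": False},
    {"category": "multiple_defects", "weight_kg": 2.1, "reoperation": True, "died": True},
    {"category": "complex_single", "weight_kg": 3.0, "reoperation": False, "died": False},
]

expected = sum(predicted_risk(c) for c in cases)
observed = sum(c["died"] for c in cases)
print(f"observed {observed}, expected {expected:.2f}, SMR {observed / expected:.2f}")
```

The argument between CCAD and the Dr Foster Unit is not about this arithmetic but about whether any current model assigns those per-case risks fairly across hundreds of permutations of congenital abnormality.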

 

Child heart surgery teams are under huge pressure in their understaffed, over-stretched units awaiting the outcome of Jeremy Hunt’s review of the Safe and Sustainable reorganization. Bristol Children’s Hospital has now been put on an official warning by the Care Quality Commission after a series of deaths were linked to understaffing on the cardiac ward. The sooner resources and expertise are pooled in fewer centres, the better. MD’s guess is that when the reorganization finally happens, official mortality ratios will start to be published.

The NHS needs not just to measure clinical outcomes in real time, but to act swiftly on them if they give cause for concern. Breast surgeon Ian Paterson is alleged to have performed over a thousand inappropriate cancer procedures since 1994. Who was monitoring his outcomes? The National Joint Registry has data on the comparative results of individual orthopaedic surgeons, and the success rates vary widely. Why can’t patients see them? And who has a clue how their GP compares? Good data costs money, but not nearly as much as the avoidable harm of secretive, substandard care. Kelsey knows this, but he’ll have a job persuading much of the NHS.





November 8, 2012

How to choose a child heart surgeon (continued)
Filed under: Private Eye — Dr. Phil @ 12:18 pm

Below are some strong arguments against and for publishing the adjusted mortalities of child heart surgery units, with certain caveats. Personally, I’m in favour of publishing. Mortality ratios are not perfect, but I think they can help spot problems. They pointed out the problems of child heart surgery in Bristol and Oxford, and the high death rates in Mid Staffordshire. The problem was that the medical and political establishments sought to discredit the data, rather than investigate swiftly to see if there was a problem and so prevent patients suffering avoidable harm.

Mortality ratios for child heart surgery have been published in New York since 1997 and the world hasn’t come to an end. So it can be done. The latest report was published in October 2011 (see http://www.health.ny.gov/statistics/diseases/cardiovascular/index.htm, and scroll down to ‘Pediatric Congenital Cardiac Surgery in New York State’ near the bottom).

What seems to have caused most offence is my statement that I would choose a unit with a below average mortality ratio for my child. There are clearly other complex factors involved but if my local unit had a high mortality ratio, I would want a good explanation as to why before I proceeded. The Bristol heart scandal taught me to look at the data, however imperfect, and ask awkward questions.

Please enter the debate below and let me know which side you come out on.

Thanks

Phil Hammond

 

How to choose a child heart surgeon

Sir

MD makes a bold statement that he would choose where his child with congenital heart disease went for treatment based upon Brian Jarman’s (Dr Foster’s) latest website data on standardised mortality ratios. Would he? Really? There’s little doubt that most of us, public or professional, would choose a centre with the lowest mortality for the procedure our child needed. It doesn’t necessarily follow that a centre good at one thing is as good at another.

It would be wonderfully convenient for all if there was a valid way of “scoring” a congenital cardiac centre’s “quality”, but there isn’t. Just isn’t. Not anywhere. Not anywhere in the world, despite years of international effort and numerous abandoned attempts to do so. The problem lies in the very nature of congenital heart disease, both in diagnosis and in treatment. Many things can go wrong with many parts of the heart, and it’s very common to have more than one thing wrong. When there is more than one thing wrong there are few fixed patterns of abnormality – there are literally hundreds of permutations of abnormalities. Often a procedure doesn’t address all the abnormalities at once, and repeated procedures of the same kind or a different kind may be required over many years, even into adulthood. This huge diversity of diagnoses and treatment makes it impossible to put many patients into convenient, neat groups to compare outcomes. Risk adjustment, therefore, whilst desirable, is hugely complex. To date, nobody in the world has come up with a validated and complete model for adjusting risk so that all patients can be pooled together to produce a nice, tidy “quality score” for overall outcomes. That’s not an excuse, it’s statistical reality.

The methodology used by Brian Jarman (Dr Foster) and cited by MD was based upon that used for the Bristol Inquiry. It is crude, outdated and invalid. As far as one can see from his website it did not involve any clinical congenital cardiac input – bizarre in such a clinical minefield. In correspondence with MD and between the interested parties following MD’s piece in the Eye, it was suggested that the government funded national audit, Congenital CCAD (Congenital Cardiac Audit Database, part of NICOR, National Institute of Cardiovascular Outcomes Research) had refused to collaborate with Dr Foster (DFI).

Parents and patients over 16 give consent for their data to be sent to congenital CCAD and it goes without saying that we should do our best to ensure their data is used responsibly. Dr Foster applied for CCAD’s data with the aim of comparing it with NHS HES (Hospital Episode Statistics) data using opinion (not data) based risk adjustment. CCAD rejected the application on the basis of flawed methodology and the application was also rejected by HQIP (the Health Quality Improvement Partnership) because of the lack of demonstrable benefit to patients.

Prior to Dr Foster’s request, CCAD and HQIP had approved a research application (which had been through the usual independent rigours of research funding and ethical review) from the Clinical Operational Research Unit (CORU) at UCL, for a new approach to risk adjustment on real data. Their team have expert statisticians with direct input from congenital heart disease specialists. The first stage of this peer reviewed, collaborative work is due to be published shortly.

In emotive situations it’s very easy to get the public to pay attention to sensational statistics, even when the stats are wrong. The importance of being fair to the public and being fair to doctors in such a complex field can’t be overemphasised. In the UK we (congenital CCAD) put more data on treatment for congenital heart disease in children and adults into the public domain than any other country in the world. Risk stratification still hasn’t been worked out by anyone, anywhere, so it’s not proper or fair to use it in any of its current forms.

MD says on his website “It seems churlish to complain that there isn’t a robust method of risk adjustment for PCS if you aren’t enabling your best statisticians to collaborate on it”. We are collaborating with an independent expert statistical research group and are disappointed to hear from MD that they are not the best. So who are the best and the worst statisticians? I think we should be told. How about an Eye article on “how to choose your statistician”? A league table with quality scores would be handy.

John Gibbs FRCP

Lead clinician for congenital heart disease

Central Cardiac Audit Database

NICOR

170 Tottenham Court Road

London W1T 7HA

Dear Phil,

Thank you for sending me your correspondence with John Gibbs, and also that with Leslie Hamilton on Monday.

The data that I put on my website is not a “quality score” for overall outcomes and I did not claim that it was. The data represents adjusted mortalities for PCS units. I acknowledged on my website that the casemix used is outdated because it is based on that used for the Bristol Inquiry. I also mentioned that we had applied for the more up-to-date CCAD data but been turned down by HQIP. John confirms that our application was partly “rejected by HQIP (the Health Quality Improvement Partnership) because of the lack of demonstrable benefit to patients.” As John probably knows, there was very extensive clinical congenital cardiac input into the methodology used for the analyses that our unit did for the Bristol Inquiry. They were approved for the Inquiry report, including by the congenital cardiac clinicians (http://www.bristol-inquiry.org.uk/).

I said in my reply to the email from Leslie Hamilton that you sent me on Monday:

“Thank you for copying me into your email exchange with Leslie Hamilton. He says that “[i]t would seem obvious to use outcome data to assess the quality of care when deciding which units to designate as the surgical units for the future. However the clinical Steering Group (who advised the JCPCT) was very clear that the data should not be used in this way.” I agree that it seems obvious that outcome measures should be one of the factors that should be used when deciding which units to designate as the surgical units for the future. I disagree with the clinical Steering Group that they should not be considered. As Leslie knows, at Bristol the adjusted mortality for open heart PCS for children under one was high for about a decade, significantly high for several years, and it was not until the results were published and an external investigation was carried out by Hunter and de Leval in 1995 that the extensive problems at that unit were confirmed. The 1988 internal report of the Regional Cardiac Strategy, Cardiology and Cardiac Surgery Medical Advisory Subcommittee, dated 1 November 1988, chaired by Dr Deirdre Hine (Bristol Inquiry reference UBHT 0174 0057) had come to conclusions similar to those that we reached in the Bristol Inquiry report regarding the problems at Bristol and had made some similar recommendations to those that we made in the Bristol Inquiry report, but little action seemed to have been taken to implement the Subcommittee’s recommendations. Once action was taken after the Hunter and de Leval external investigation the mortality ratio that I have referred to dropped at Bristol from 29% to 3.5% within three years.

I believe that the hundreds of parents who had taken their children to Bristol for PCS should have been able to see the significant differences between the adjusted mortality ratios of the English PCS units in order to allow them to make decisions as to where to take their child, just as much as that information should be one of the factors in deciding which units to designate for the future. I mentioned on my website the limitations of the data that we used and the need for more up-to-date information on case-mix, together with the fact that we (Paul Aylin) had applied for the CCAD data. Our application was turned down by HQIP. David Cunningham said in his email to me on 31 October “I think HQIP’s point was there seemed little in the application that demonstrated any value to patients.” I find HQIP’s attitude appalling: it doesn’t augur well for a re-application.

Surely it should be possible to share the CCAD data (our unit is not without experience regarding calculating adjusted hospital mortality ratios, including those for PCS). There could possibly be advantages for both the public and those making decisions about designating units.”

That is still my view. I believe, on balance, that it would be beneficial to get ‘good enough’ measures of adjusted mortalities for PCS units, using the best data available and being aware of the caveats.

Brian Jarman.

 


From: John Gibbs
Subject: Re: MD’s piece on choosing a child’s heart surgeon
Date: Thu, 8 Nov 2012 16:10:23 +0000
To: hamm82@msn.com

Hmm… would redress the balance a bit if you published in the Eye rather than bury it on the website.

 

You really don’t need to persuade me (or my colleagues at CCAD) of the benefits of putting good data in the public domain. I’ve worked my nuts off at CCAD for over 16 years trying to do that and trying to persuade surgeons that it isn’t just a witch hunt. And I didn’t say that calculating SMRs couldn’t be done. I said there was no validated way of doing it properly at present. If it can’t be done properly it’s not fair to the public or doctors to do it. You keep citing New York State. Yeah – nobody else in the world has adopted their risk adjustment, have they? Wonder why that is (it’s totally unfathomable on their website).

 

CCAD are very much in favour of sharing data, but only if data requests pass the usual reviews by CCAD and HQIP. We have shared data with a wide variety of researchers, clinicians and commissioners over the years. We’ve only rejected two applications ever – and one of those was because the data requested was already available on our public website.

 

Brian Jarman says he’s appalled at our (and HQIP’s) decision to reject Dr Foster’s application for data. I think it would have been appalling and irresponsible for us to agree to a collaborative study using outdated methodology which has been abandoned by everyone else in the field. The request was rejected because of its poor design which dictated that it was highly unlikely to come up with information of benefit to patients.

 

Come on Phil, be objective. Brian actually admits his SMR methodology is outdated, that it may not be suitable for today’s case mix, and that he has not had up to date input from specialists in the field (he just cites clinical input from the Bristol Inquiry). Is that reasonable? Does it sound like “the best” statistical approach? Does it really sound fair and in the public interest?

 

It’s worth bearing in mind that paediatric cardiac surgeons (and cardiologists) do incredibly complex and stressful stuff with crazy working hours and bugger all home life. Even the ones in the highest general regard feel hounded by repeated analyses of mortalities done by different bodies using different methods and coming up with different results, and yet another “scandal” baby murdering story in the ST. In the last few years at least four paediatric cardiac surgeons have either left the country or left the specialty prematurely – and it’s increasingly hard to find trainees who want to come into a specialty under such relentless criticism. If it carries on like this there won’t be anything like enough surgeons to cope with the workload, and then the patients will be really screwed. I’m not suggesting one should be soppy or should give them any special treatment – just that we should all be fair. And I can’t see any reason why it’s not possible to be fair to doctors and patients at the same time.

 

We’ve argued about the Safe & Sustainable decision making before, so prob not much point revisiting that, but wow, does that need some simple, objective thinking injected into it! (Not from you, judging from your previous logic free comments!!). Looks in a right mess just now – what a tragic waste of an opportunity!

 

Pip pip

John

 

Thanks John

Jarman’s analysis picked up Oxford as a significant outlier. As you know, surgery at Oxford was suspended in 2010, but largely because of a whistleblower. Did CCAD flag it up too? I know Paul Aylin from DFU flagged up Oxford in the BMJ in 2004 as an outlier and got referred to the GMC for his troubles. Mortality Ratios might be crude but they do point to where the outliers might be, and I’d rather have them published than not published. I’d also much prefer if you and DFU collaborated on casemix, coding etc to get as fair an analysis as you can. If you don’t, another analysis will come out through the FOI cracks (eg Spiegelhalter’s a few years ago).

First you tell me no one in the world has published mortality ratios for PCS, then I discover New York has been doing it since 1997, and you dismiss it as unintelligible. Would be more interesting to ask them how they did it, what the reaction has been, whether it has increased pressure on surgeons or improved outcomes as it did in adults.

With Tim Kelsey on the NHS CB I would imagine all outcomes – for GPs, nurses, orthopaedic surgeons – will be published, all imperfect but working towards transparency and accountability. You might as well stay ahead of the curve and work with DFU.

Phil


On 9 Nov 2012, at 00:18, Jarman, Brian wrote:

 

Phil,

 

I saw on your website John’s response to my reply to his letter below. Neither he nor Leslie has corresponded directly with me or copied me into their emails to you about my publication.

 

I wish John would quote me correctly. It is not correct to say “Brian Jarman says he’s appalled at our (and HQIP’s) decision to reject Dr Foster’s application for data.” I said that I was appalled that HQIP rejected our application “because of the lack of demonstrable benefit to patients.” That is very different.

 

Brian.

 

From: John Gibbs
Sent: 09 November 2012 11:23
To: Jarman, Brian
Subject: Re: MD’s piece on choosing a child’s heart surgeon

 

To be fair to HQIP, Brian, I think we share the responsibility for the rejection. HQIP made their comments having seen ours, so my interpretation was that they couldn’t see any benefit to patients from using outdated methodology. HQIP also knew (having approved the data request) of CORU’s ongoing research into risk adjustment. Clearly if there was validated risk adjustment then putting SMRs into the public domain would be in the public interest. If our collaborative research with CORU (or with any other group) comes up with validated risk adjustment that gets past the scrutiny of peer review, the SCTS and the BCCA, we would be mad not to publish SMRs. But we won’t until the methodology is right. It’s not fair to patients or doctors to do otherwise.

John

 

John Gibbs

 

November 9. From Brian Jarman to John Gibbs

Thanks John.

I’m grateful that you have corresponded with me directly.

 

I think you know that we have a fair amount of experience in calculating risk adjusted mortality ratios and would have used whatever we found was the best method, once we had the data to do so. The CCAD data has been available for years. I do believe that ‘good enough’ analyses of PCS data for units (as we produced for Bristol: the results will never be perfect but caveats can be given) should have been made available to the public since the days of the Bristol Inquiry. I find the current situation quite indefensible when one considers the reduction of the death rates that occurred at Bristol once Private Eye had published its articles in 1992 and Hunter and de Leval had done their inspection and made their recommendations in 1995, and that was using poorer data than is currently available.

 

Are you of the school that considers that only perfect adjusted data for units should be available to the public? In a somewhat analogous situation, the Care Quality Commission criticised the year-long investigation of Mid Staffs done by its predecessor organisation, the Healthcare Commission, for what were roughly similar reasons. I think it’s generally agreed that in fact the HCC did a pretty good job, but it might have taken a bit too long and they might have been more proactive after their 23 May 2008 letter; it was certainly good enough for action to be taken for patient safety (the matter is on my mind at the moment because I went to a lecture about the Mid Staffs Independent Inquiry by Robert Francis at the Medical Society of London last night). Even with adult cardiac surgery, fully adjusted data is not published, as far as I know. I have been involved with both the Bristol Inquiry and the Mid Staffs Public Inquiry and I am not totally convinced that the doubts of the HQIP experts about our ability to analyse the CCAD data were the sole reason for their rejecting our application.

 

I think it should be possible for those with considerable expertise to collaborate to produce the best results for patients.

 

I’ll copy this to Phil –  he has been involved in the debate.

 

Brian.

Update from Brian Jarman, May 13, 2015

Dear Phil,

I should mention that although we use HES data with the methodology and case-mix that were used for the Bristol Royal Infirmary Inquiry (and hence the case-mix recommended by the cardiac surgeons is now out of date), there could be advantages in our unit working with NICOR and CCAD data to update our case-mix to match theirs, because analyses based on HES data have advantages of their own:

1. Our HES-based analyses are done monthly and are available two or three months after the month of the surgery – probably at least a year more timely than the NICOR CCAD-based results.

2. Our analyses use HES administrative data, which is complete and does not depend on variables such as the patient’s weight (used by NICOR – I understand the Leeds unit had missing values for weight in one of the NICOR analyses). At Bristol we decided to use HES data and not the cardio thoracic surgical register (the precursor of CCAD), partly for that reason.

Regards,

Brian.




