
General Medicine

(Click on title to be directed to posting, most recent listed first)

Infectious Diseases Telemedicine to the Arizona Department of Corrections
   During SARS-CoV-2 Pandemic. A Short Report.
The Potential Dangers of Quality Assurance, Physician Credentialing and
   Solutions for Their Improvement (Review)
Results of the SWJPCC Healthcare Survey
Who Are the Medically Poor and Who Will Care for Them?
Tacrolimus-Associated Diabetic Ketoacidosis: A Case Report and Literature 
   Review
Nursing Magnet Hospitals Have Better CMS Hospital Compare Ratings
Publish or Perish: Tools for Survival
Is Quality of Healthcare Improving in the US?
Survey Shows Support for the Hospital Executive Compensation Act
The Disruptive Administrator: Tread with Care
A Qualitative Systematic Review of the Professionalization of the 
   Vice Chair for Education
Nurse Practitioners' Substitution for Physicians
National Health Expenditures: The Past, Present, Future and Solutions
Credibility and (Dis)Use of Feedback to Inform Teaching: A Qualitative
   Case Study of Physician-Faculty Perspectives
Special Article: Physician Burnout-The Experience of Three Physicians
Brief Review: Dangers of the Electronic Medical Record
Finding a Mentor: The Complete Examination of an Online Academic 
   Matchmaking Tool for Physician-Faculty
Make Your Own Mistakes
Professionalism: Capacity, Empathy, Humility and Overall Attitude
Professionalism: Secondary Goals 
Professionalism: Definition and Qualities
Professionalism: Introduction
The Unfulfilled Promise of the Quality Movement
A Comparison Between Hospital Rankings and Outcomes Data
Profiles in Medical Courage: John Snow and the Courage of
   Conviction
Comparisons between Medicare Mortality, Readmission and
   Complications
In Vitro Versus In Vivo Culture Sensitivities:
   An Unchecked Assumption?
Profiles in Medical Courage: Thomas Kummet and the Courage to
   Fight Bureaucracy
Profiles in Medical Courage: The Courage to Serve
   and Jamie Garcia
Profiles in Medical Courage: Women’s Rights and Sima Samar
Profiles in Medical Courage: Causation and Austin Bradford Hill
Profiles in Medical Courage: Evidence-Based
   Medicine and Archie Cochrane
Profiles in Medical Courage: The Courage to Experiment and 
   Barry Marshall
Profiles in Medical Courage: Joseph Goldberger,
   the Sharecropper’s Plague, Science and Prejudice
Profiles in Medical Courage: Peter Wilmshurst,
   the Physician Fugitive
Correlation between Patient Outcomes and Clinical Costs
   in the VA Healthcare System
Profiles in Medical Courage: Of Mice, Maggots 
   and Steve Klotz
Profiles in Medical Courage: Michael Wilkins
   and the Willowbrook School
Relationship Between The Veterans Healthcare Administration
   Hospital Performance Measures And Outcomes 

 

 

Although the Southwest Journal of Pulmonary and Critical Care was started as a pulmonary/critical care/sleep journal, we have received and continue to receive submissions that are of general medical interest. For this reason, a new section entitled General Medicine was created on 3/14/12. Some articles were moved from pulmonary to this new section since it was felt they fit better into this category.

-------------------------------------------------------------------------------------

Friday, April 6, 2012

Correlation between Patient Outcomes and Clinical Costs in the VA Healthcare System

Richard A. Robbins, M.D.1

Richard Gerkin, M.D.2

Clement U. Singarajah, M.D.1

1Phoenix Pulmonary and Critical Care Medicine Research and Education Foundation and 2Banner Good Samaritan Medical Center, Phoenix, AZ

 

Abstract

Introduction: Increased nurse staffing levels have previously been associated with improved patient outcomes. However, the effects of physician staffing and other clinical care costs on clinical outcomes are unknown.

Methods: Databases from the Department of Veterans Affairs were searched for clinical outcome data including 30-day standardized mortality rate (SMR), observed minus expected length of stay (OMELOS) and readmission rate. These were correlated with costs, including total, drug, lab, radiology, physician (MD), registered nurse (RN), and other clinical personnel costs, as well as non-direct care costs.

Results: Relevant data were obtained from 105 medical centers. Higher total costs correlated with lower intensive care unit (ICU) SMR (r=-0.2779, p<0.05) but not acute care (hospital) SMR. Higher costs for lab, radiology, MD and other direct care staff costs and total direct care costs correlated with lower ICU and acute care SMR (p<0.05, all comparisons). Higher RN costs correlated only with ICU SMR. None of the clinical care costs correlated with ICU or acute care OMELOS with the exception of higher MD costs correlating with longer OMELOS. Higher clinical costs correlated with higher readmission rates (p<0.05, all comparisons). Nonclinical care costs (total costs minus direct clinical care costs) did not correlate with any outcome.

Conclusions: Monies spent on clinical care generally improve SMR. Monies spent on nonclinical care generally do not correlate with outcomes.

Introduction

Previous studies have demonstrated that decreased nurse staffing adversely affects patient outcomes, including mortality in some studies (1-5). However, these studies have been criticized because they are typically cross-sectional in design and do not account for differences in patients’ requirements for nursing care. Other observers have asked whether differences in mortality are linked not to nursing but to unmeasured variables correlated with nurse staffing (6-9). In this context, we correlate mortality with costs associated with other clinical expenditures including drug, lab, radiology, physician (MD), and other clinical personnel costs.

The observed minus the expected length of stay (OMELOS) and readmission rates are two outcome measures that are thought to measure quality of care. It is often assumed that increased OMELOS or readmission rates are associated with increased expenditures (10,11). However, data demonstrating this association are scant. Therefore, we also examined clinical care costs with OMELOS and readmission rates.

Methods

The study was approved by the Western IRB.  

Hospital level of care. For descriptive purposes, hospitals were grouped into levels of care. These are classified into 4 levels: highly complex (level 1); complex (level 2); moderate (level 3); and basic (level 4). In general, level 1 facilities and some level 2 facilities represent large urban, academic teaching medical centers.

Clinical outcomes. SMR and OMELOS were obtained from the Inpatient Evaluation Center (IPEC) for fiscal year 2009 (12). Because this is a restricted website, the data for publication were obtained by a Freedom of Information Act (FOIA) request. SMR was calculated as the observed number of patients admitted to an acute care ward or ICU who died within 30 days divided by the number of predicted deaths for the acute care ward or ICU. Admissions to a VA nursing home, rehabilitation or psychiatry ward were excluded. OMELOS was determined by subtracting the predicted length of stay, derived from the risk-adjusted length of stay model (12), from the observed length of stay for the acute care ward or ICU. Readmission rate was expressed as the percentage of patients readmitted within 30 days.

Financial data. Financial data were obtained from the VSSC menu (formerly known as the KLF menu). Because this is also a restricted website, the data for publication were likewise obtained by a Freedom of Information Act (FOIA) request. In each case, data were expressed as costs per unique patient ("unique") in order to compare expenditures between groups. MD and RN costs reported on the VSSC menu were not expressed per unique but only per full-time equivalent employee (FTE), so the costs per FTE were converted to MD or RN cost per unique as below (MD illustrated):

Similarly, all other direct care personnel costs per unique were calculated as below:

Direct care costs were calculated as the sum of drug, lab, x-ray, MD, RN, and other direct care personnel costs. Non-direct care costs were calculated as total costs minus direct care costs.
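The cost constructions described above can be sketched as follows. This is a minimal illustration only: the original conversion formulas were presented as figures and are not reproduced here, so the per-unique conversion (total personnel cost divided by number of uniques) is an assumption inferred from the text, and all names are illustrative.

```python
def cost_per_unique(cost_per_fte, n_fte, n_uniques):
    # Assumed conversion: a personnel cost reported per FTE becomes a
    # cost per unique patient by multiplying by the number of FTEs and
    # dividing by the number of unique patients.
    return (cost_per_fte * n_fte) / n_uniques

def direct_care_cost(drug, lab, xray, md, rn, other_personnel):
    # Direct care costs: sum of drug, lab, x-ray, MD, RN, and other
    # direct care personnel costs (all expressed per unique).
    return drug + lab + xray + md + rn + other_personnel

def non_direct_care_cost(total, direct):
    # Non-direct care costs: total costs minus direct care costs.
    return total - direct
```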

Correlation of Outcomes with Costs. Pearson correlation coefficient was used to determine the relationship between outcomes and costs. Significance was defined as p<0.05.
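As a minimal sketch of the statistic used throughout the Results, the Pearson correlation coefficient can be computed in pure Python as below. This omits the p-value; significance at p<0.05, as reported by the authors, would in practice be assessed with a statistical package.

```python
from math import sqrt

def pearson_r(x, y):
    # Pearson correlation coefficient between two equal-length sequences:
    # covariance of x and y divided by the product of their standard
    # deviations (scale factors cancel, so unnormalized sums suffice).
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# A perfectly inverse relationship (e.g., costs up, SMR down) yields r ≈ -1.0.
```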

Results

Costs: The average cost per unique was $6058. Direct care costs accounted for 53% of the costs while non-direct costs accounted for 47% of the costs (Table 1 and Appendix 1).

Table 1. Average and percent of total costs/unique.

Hospital level. Data were available from 105 VA medical centers with acute care wards and 98 with ICUs. Consistent with previous data showing improved outcomes with larger medical centers, hospitals with higher levels of care (i.e. hospitals with lower level numbers) had decreased ICU SMR (Table 2). Higher levels of care also correlated with decreased ICU OMELOS and readmission rates (Table 2). For full data and other correlations see Appendix 1.

Table 2. Hospital level of care compared to outcomes. Lower hospital level numbers represent hospitals with higher levels of care.

 

*p<0.05

SMR. Increased total costs correlated with decreased intensive care unit (ICU) SMR (Table 3, r=-0.2779, p<0.05) but not acute care (hospital) SMR. Increased costs for lab, radiology, MD and other direct care staff, as well as total direct care costs, also correlated with decreased SMR in both the ICU and acute care (p<0.05, all comparisons). However, drug costs did not correlate with either acute care or ICU SMR. Increased RN costs correlated with improved ICU SMR but not acute care SMR. For full data and other correlations see Appendix 1.

Table 3. Correlation of SMR and costs.

*p<0.05

OMELOS. There was no correlation between SMR and OMELOS for either acute care (r = -0.0670) or ICU (r = -0.1553). There was no correlation between acute care or ICU OMELOS and clinical expenditures, other than higher MD costs, which positively correlated with increased OMELOS (Table 4, p<0.05, both comparisons).

Table 4. Correlation of OMELOS and costs

*p<0.05

Readmission rate. There was no correlation between readmission rates and acute care SMR (r = -0.0074) or ICU SMR (r = 0.0463). Total and all clinical care costs directly correlated with readmission rates while non-direct clinical care costs did not (Table 5).

Table 5. Correlation of readmission rates and costs.

*p<0.05

Discussion

The data in this manuscript demonstrate that most clinical costs are correlated with a decreased (i.e., improved) SMR. Only MD costs correlate with OMELOS, but all clinical costs directly correlate with increased readmission rates. However, non-direct care costs do not correlate with any clinical outcome.

A number of studies have examined nurse staffing. Increased nurse staffing levels are associated with improved outcomes, including mortality in some studies (1-5). The data in the present manuscript confirm those observations in the ICU but not for acute care (hospital). However, these data also demonstrate that higher lab, X-ray and MD costs correlate with improved SMR. Interestingly, the strongest correlation with both acute care and ICU mortality was MD costs. We speculate that these observations are potentially explained by the fact that, with rare exception, nearly all physicians in the VA system see patients. The same is not true for nurses: a number of nurses are employed in non-patient care roles such as administration, billing, and quality assurance. It is unclear to what extent nurses without patient care responsibilities were included in the RN costs.

These data support an association between readmission rates and higher costs, but not between increased OMELOS and higher costs, implying that efforts to decrease OMELOS may be largely wasted since OMELOS correlates with neither costs nor mortality. It is unclear whether the increased costs with readmissions arise because readmissions lead to higher costs or because higher clinical care costs cause the higher readmissions, although the former seems more likely.

These data are derived from the VA, the Nation’s largest healthcare system. The VA system has unique features and actual amounts spent on direct and non-direct clinical care may differ from other healthcare systems. There may be aspects of administrative costs that are unique to the VA system, although it is very likely there is applicability of these findings to other healthcare systems. 

A major weakness of these data is that they are self-reported. Data reported to central reporting agencies may be confusing, with overlapping cost centers. Furthermore, personnel or other costs might be assigned to inappropriate cost centers in order to meet certain administrative goals. For example, 5 nurses and 1 PhD scientist were assigned to the pulmonary clinic at the Phoenix VA Medical Center while none performed any services in that clinic (Robbins RA, unpublished observations). These types of errors could lead to inaccurate or inappropriate conclusions after data analysis.

A second weakness is that the observational data reported in this manuscript are analyzed by correlation.  Correlation of decreased clinical care spending with increased mortality does not necessarily imply causation (13). For example, clinical costs are increased with readmission rates. However, readmission rates may also be higher with sicker patients who require readmission more frequently. The increased costs could simply represent the higher costs of caring for sicker patients.

A third weakness is that non-direct care costs are poorly defined by these databases. These costs likely include such essential services as support service personnel, building maintenance, food preparation, utilities, etc. but also include administrative costs. Which of these services account for variation in non-direct clinical costs is unknown. However, administrative efficiency is known to be poor and declining in the US, with increasing numbers of administrators leading to increasing administrative costs (14).

A number of strategies to control medical expenditures have been initiated, although these have almost invariably been directed at clinical costs. Programs designed to limit clinical expenditures, such as utilization reviews of lab or X-ray expenditures or reductions in clinical MD or RN personnel, have become frequent. Even if costs are reduced, the present data imply that these programs may adversely affect patient mortality, suggesting that caution in limiting clinical expenses is needed. In addition, programs have been initiated to reduce both OMELOS and readmission rates. Since neither costs nor mortality correlate with OMELOS, these data imply that programs focusing on reducing OMELOS are unlikely to be successful in improving mortality or in reducing costs.

Non-direct patient care costs accounted for nearly half of the total healthcare costs in this study. It is unknown which cost centers account for variability in non-clinical areas. Since non-direct care costs do not correlate with outcomes, focus on administrative efficiency could be a reasonable performance measure to reduce costs. Such a performance measure has been developed by the Inpatient and Evaluation Center at the VA (15). This or similar measures should be available to policymakers to provide better care at lower costs and to incentivize administrators to adopt practices that lead to increased efficiency.

References

  1. Needleman J, Buerhaus P, Mattke S, Stewart M, Zelevinsky K. Nurse-staffing levels and the quality of care in hospitals. N Engl J Med 2002;346:1715-22.
  2. Aiken LH, Clarke SP, Sloane DM, Sochalski J, Silber JH. Hospital nurse staffing and patient mortality, nurse burnout, and job dissatisfaction. JAMA 2002;288:1987-93.
  3. Aiken LH, Cimiotti JP, Sloane DM, Smith HL, Flynn L, Neff DF. Effects of nurse staffing and nurse education on patient deaths in hospitals with different nurse work environments. Med Care 2011;49:1047-53.
  4. Diya L, Van den Heede K, Sermeus W, Lesaffre E. The relationship between in-hospital mortality, readmission into the intensive care nursing unit and/or operating theatre and nurse staffing levels. J Adv Nurs 2011 Aug 25. doi: 10.1111/j.1365-2648.2011.05812.x. [Epub ahead of print]
  5. Cho SH, Hwang JH, Kim J. Nurse staffing and patient mortality in intensive care units. Nurs Res 2008;57:322-30.
  6. Volpp KG, Rosen AK, Rosenbaum PR, Romano PS, Even-Shoshan O, Canamucio A, Bellini L, Behringer T, Silber JH. Mortality among patients in VA hospitals in the first 2 years following ACGME resident duty hour reform. JAMA 2007;298:984-92.
  7. Lagu T, Rothberg MB, Nathanson BH, Pekow PS, Steingrub JS, Lindenauer PK. The relationship between hospital spending and mortality in patients with sepsis. Arch Intern Med 2011;171:292-9.
  8. Cleverley WO, Cleverley JO. Is there a cost associated with higher quality? Healthc Financ Manage 2011;65:96-102.
  9. Chen LM, Jha AK, Guterman S, Ridgway AB, Orav EJ, Epstein AM. Hospital cost of care, quality of care, and readmission rates: penny wise and pound foolish? Arch Intern Med 2010;170:340-6.
  10. Render ML, Almenoff P. The veterans health affairs experience in measuring and reporting inpatient mortality. In Mortality Measurement. February 2009. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/qual/mortality/VAMort.htm
  11. Jencks SF, Williams MV, Coleman EA. Rehospitalizations among patients in the Medicare fee-for-service program. N Engl J Med 2009;360:1418-28.
  12. Render ML, Kim HM, Deddens J, Sivaganesin S, Welsh DE, Bickel K, Freyberg R, Timmons S, Johnston J, Connors AF Jr, Wagner D, Hofer TP. Variation in outcomes in Veterans Affairs intensive care units with a computerized severity measure. Crit Care Med 2005;33:930-9.
  13. Aldrich J. Correlations genuine and spurious in Pearson and Yule. Statistical Science 1995;10:364-76.
  14. Woolhandler S, Campbell T, Himmelstein DU. Health care administration in the United States and Canada: micromanagement, macro costs. Int J Health Serv. 2004;34:65-78.
  15. Gao J, Moran E, Almenoff PL, Render ML, Campbell J, Jha AK. Variations in efficiency and the relationship to quality of care in the Veterans health system. Health Aff (Millwood) 2011;30:655-63.

Click here for Appendix 1.

Reference as: Robbins RA, Gerkin R, Singarajah CU. Correlation between patient outcomes and clinical costs in the VA healthcare system. Southwest J Pulm Crit Care 2012;4:94-100. (Click here for a PDF version)

Friday, March 30, 2012

Profiles in Medical Courage: Of Mice, Maggots and Steve Klotz

“I never did give them hell. I just told the truth, and they thought it was hell.” -Harry S. Truman

Mice and maggots bring to mind visions of filth and decay with an accompanying sense of sickening revulsion, hardly the impression you want associated with a hospital. However, an infestation of mice and maggots did occur in a hospital, not in medieval Europe as you might expect, but in 1998 at the Kansas City Veterans Administration (VA) Hospital. Although the mice and maggots are the attention grabbers, the story is worth repeating because it illustrates how dysfunctional modern hospitals can become and how Steve Klotz tried to effect change by speaking out against the management that allowed the situation to occur.

To understand the story, we need to go back to 1995, ancient history to our fellows, residents and medical students but not so ancient to lots of us. Many Veterans Affairs (VA) hospitals were called Dean’s hospitals and had a special relationship with a local medical school (1). Each of these hospitals had a Dean’s committee, made up of officials from the VA and the local medical school. This committee approved physician hires and had a voice in most major decisions affecting the medical school faculty at the VA. The rationale for such an arrangement was that the VA would not be able to hire high-quality faculty unless it was associated with a medical school where the faculty held appointments. Overall this arrangement had served the VA well since shortly after World War II. The VA did get first-rate faculty, resulting in a level of care that could not be provided to Veterans by less qualified practitioners.

However, not everyone was happy with the arrangement, particularly the hospital administrators. At that time, the administrators were in charge of the Medical Administration Service (MAS). This service supervised the business functions of the hospital (fiscal, human resources, purchasing, etc.) and several of the non-medical services (food preparation, janitorial services, security, etc.). The medical functions were headed by the chief of staff, and the hospital director and the chief of staff had a shared and equal partnership.

The administrators argued that the arrangement was disadvantageous to the VA in several ways. First, it gave the medical school, and therefore the physicians, too much voice in hospital operations. Second, physicians were often hired to fill medical school needs rather than VA needs. Third, the physician hires were often subspecialists, and there was a move at the VA to emphasize primary care. Fourth, split responsibilities sometimes resulted in conflicts that were not easily resolved. Dr. Ken Kizer, then Under Secretary for Veterans Health Affairs in charge of all VA hospitals, was persuaded to dissolve the partnership and make the hospital director the ultimate authority at the hospitals in his Prescription for Change (2).

Against this background, Dr. Steve Klotz, an infectious disease specialist, was consulted on two patients in the Kansas City VA ICU with nasal myiasis (3). The first case occurred in July 1998. The myiasis was thought to be due to flies having direct access to the hospital through open windows during construction. However, after the second case in September 1998, Klotz called his brother, John Klotz, an entomologist at the University of California, Riverside. He advised sending some of the flies and maggots to Nancy Hinkle, an entomologist with expertise on flies. She identified the flies and maggots as the green blowfly and explained to Klotz that these flies prefer to lay their eggs in mouse carcasses. The presence of green blowfly maggots therefore suggested that the hospital had a mouse infestation.

In agreement with Dr. Hinkle’s speculation, the Kansas City VA was known to have a mouse problem preceding and coincident with the two cases of myiasis (3). In response to this problem, warfarin-based mouse bait had been scattered throughout the hospital by a pest control contractor. However, this approach was largely unsuccessful. Numerous mice were observed during daylight hours on all hospital floors. In some patient wards mice were being cared for as pets by the nursing personnel. Mice were so common in the building that they scampered over the feet of the associate hospital director during administrative morning report in the hospital director’s suite.

Once the egg-laying preferences of the blowfly were known, the warfarin baits and traps were replaced with live traps. The results of mouse captures showed the mice to be centered on the fourth floor of the hospital, where the canteen was located (3). During an infection control inspection of the hospital canteen, inspectors discovered mouse carcasses on glue boards in food storage rooms adjacent to the canteen, mouse nests behind boxes on food shelves in the canteen, live mice trapped in a large wastebasket, and mouse droppings covering the floor of the canteen work room.

However, the above did not explain why there was a mouse problem. All VA hospitals have canteens, many are in older buildings, and most do not have a mouse problem. It was apparent from the results of the live trappings and the canteen inspection that the mice were centered around the canteen (3). However, the real clue to the cause came when Klotz asked the janitors (S. Klotz, personal communication). They pointed out that a computer program had been purchased to schedule when rooms would be cleaned, but the canteen and its storage rooms were not on the cleaning schedule. The head of housekeeping had been removed during downsizing, along with the night-time janitor who cleaned the canteen. The janitors pointed out that these rooms had not been cleaned by housekeeping personnel for at least a year and that every canteen employee was aware of the magnitude of the mouse problem (3).

Given that this was an interesting chain of epidemiological events, Klotz published the results of his investigations on March 25, 2002 (3). Action by VA Central Office was swift. Anthony Principi, Secretary of Veterans Affairs, removed the regional network director, Patricia Crosetti, and ordered a full investigation. Principi would likely have removed the acting hospital director, Kent Hill, as well, except that he had just been hired. Hill had replaced the previous hospital director, Hugh Doran, who resigned after being filmed soliciting prostitution on “John TV”.

Even prior to publication of the article, Klotz received phone calls from Hill and was told “not to talk to anyone” (4). The VISN headquarters in Kansas City closed a research lab with six employees that Klotz had formerly supervised; the employees were keyed out of the lab. Crosetti, the VISN director who had been removed, was heard to say “Klotz ain’t gonna work for the VA anymore” (4).

The publicity prompted Congress to become involved, and a field hearing was held at the Kansas City VA in June 2002 (4). Klotz appeared as a witness and went first. Although he discussed the mice and maggot problem, he focused on five major root causes that he thought led to the incident:

  1. “The addition of an entire cadre of middle managers who embrace a business model of management. These managers have fiscal oversight in the clinical side of the organization and are neither sufficiently knowledgeable nor trained in the areas they supervise.
  2. The hospital director has more real power than the chief of staff. There is no equal partnership.
  3. A sundering of any meaningful relationship with local medical schools.
  4. Individuals in the organization with direct patient care, for example, physicians and nurses, have no meaningful influence in the organization of patient care.
  5. Supervisory positions are all too frequently held until retirement.”

To support his claims he showed that the number of doctors and nurses caring for the patients decreased in the VA while the number of support personnel increased (Table 1).

 

Yet the number of patients and expenditures had increased (Table 2).

While all of this was occurring, it appeared, according to Klotz, as if the possession of real credentials for a job position was grounds for immediate disqualification at the Kansas City VA (4). For example, “an engineer was given authority over pharmacy and housekeeping, disciplines for which he was untrained and had only superficial knowledge. Internists were placed in direct charge of subspecialty surgeons whose specific requirements often went unmet. A nonphysician was placed in charge of pathology and radiology”. The chief of pulmonary was asked to set a broken arm. When he refused, he was asked “You’re a doctor, aren’t you?” The position of Chief of Staff was eliminated because it was “obsolete”.

Testimony was given by a representative from the Office of Inspector General (IG). He stated that most of the environmental problems identified during the April 2002 inspection ordered by Principi fell into one of several categories: 1. an overall lack of cleanliness; 2. failure to maintain equipment, furniture, utilities, and hospital services; and 3. inadequate pest control (4). Not mentioned was that 10 months prior to this report the IG had visited the Kansas City VA on a routine visit (5). Although rodent problems were identified by the employees, the IG’s recommended action was to remove the rodent traps from patient areas and the canteen.

Officials from the VA management testified next. Although there were multiple administrators that testified, particularly revealing were the testimonies of Doran, the former director of the Kansas City VA, and Robert Roswell, then Under Secretary of Veterans Health Affairs.

Doran went first. He stated that, “The unfortunate incident involving the maggots was handled expeditiously and appropriately by our staff.” This was the last time he mentioned any of his staff in a positive light. He went on to say that his first priority had been patient care and that he had initiated a number of construction projects toward this end. According to Doran, the problems arose from the Kansas City VA being an older building with an inadequate budget. Attacking Klotz he said, “There is absolutely no evidence to establish a relationship between the two nasal myiasis cases and the alleged mouse problem. You have an obviously disgruntled former employee’s opinion who managed to get the article published.” Doran went on to tout his accomplishments at the Kansas City VA, particularly noting his Joint Commission on Accreditation of Healthcare Organizations (JCAHO) scores. He noted that the JCAHO had recently inspected the hospital and found no problems. However, he failed to acknowledge the nurses, doctors and support personnel who were responsible for the success.

The ranking minority member, Dr. Bob Filner (D-CA), was unconvinced. By background, Dr. Filner is a former academic from the University of San Diego whose PhD is in the history of science. After some intense questioning, Filner chastised Doran saying, “The vocabulary used and the tone you use to defend yourself makes your testimony suspect in my eyes and it is contradictory to everything that we have heard over the years about problems here [Kansas City VA]. So I will tell you if you had to have me vote on who I was going to believe here, I would vote for the employees on the first line and I would have to say, you, sir, are the weakest link.”

Doran responded by attacking Dr. Filner, saying that Filner’s personal attack was “unprofessional and totally uncalled for” and only done to embarrass him. He further accused Filner of “grandstanding for the cameras”.

Roswell, who had been confirmed as the Under Secretary for Veterans Health Affairs only a few months earlier, went next. Like Doran, he also attacked Klotz, noting that, “Despite the author’s assertions of a relationship between the rodents and the flies, there was (and is) no conclusive evidence that such a relationship existed”. Again like Doran, Roswell went on to blame the age of the facility but did acknowledge the “lack of effective supervision and leadership in the housekeeping department…Due to the lack of knowledgeable leadership and supervision, the infrastructure within the housekeeping department eroded”.

Representative Dr. Filner was again skeptical. He asked Roswell, “What is it about a system that requires a publication of a significant problem to direct the resources where they need to go?” Roswell responded, “I think what we are dealing with is a situation where there were competing priorities, limited resources, ineffectual communication between various levels of management, and less than ideal monitoring.” Filner asked Roswell to assure him that the VA would not retaliate against Klotz and was assured that the VA would not.

With that, the hearing and the controversy surrounding the cleanliness at the Kansas City VA ended. The Kansas City VA did receive a multi-million dollar facelift, but no changes occurred affecting the management problems that, according to Klotz, led to the incident. Central Office management became more concerned about employees publicly speaking, even through scientific publications. A mandate was issued that all scientific manuscripts needed to be submitted to the local Research and Development Committee for review prior to publication (6).

In the aftermath of the controversy, Patricia Crossetti was proven right: Klotz no longer works for the VA. After receiving poor reviews on his Merit Review grant, which he had held for nearly 20 years, he left the VA to become a full time university professor. He is currently Chief of Infectious Disease at the University of Arizona. Ken Kizer left the VA when it became apparent his appointment would not be renewed by Congress. Roswell resigned from the VA a couple of years after these events in a controversy about a failed computer system. Patricia Crossetti, the VA regional director, was subsequently dismissed. Kent Hill became the permanent director of the Kansas City VA, where he is today. Dr. Filner remains on the Veterans Affairs Committee and Hugh Doran remains retired.

Although Klotz’s five root causes of the mice and maggots incident have yet to result in substantial change, we should remember Klotz for his courage in speaking up and identifying the managerial problems that led to the infestation of mice and maggots.

Richard A. Robbins, M.D.*

References

  1. http://www1.va.gov/vhapublications/ViewPublication.asp?pub_ID=979 (accessed 3/18/2012).
  2. www.va.gov/HEALTHPOLICYPLANNING/rxweb.pdf (accessed 3/18/2012).
  3. Beckendorf R, Klotz SA, Hinkle N, Bartholomew W. Nasal myiasis in an intensive care unit linked to hospital-wide mouse infestation. Arch Intern Med 2002;162:638-40.
  4. http://veterans.house.gov/107th-congress-hearing-archives (accessed 3/18/2012).
  5. www.va.gov/oig/CAP/01-01515-40.pdf (accessed 3/18/2012).
  6. http://www.research.va.gov/resources/policies/pub_notice.cfm (accessed 3/26/2012).

*Dr. Robbins is a former employee of the Department of Veterans Affairs and was the Associate Chief of Staff for Research at the Southern Arizona VA when these events occurred in 2002.

Reference as: Robbins RA. Profiles in medical courage: of mice, maggots and Steve Klotz. Southwest J Pulm Crit Care 2012;4:71-7. (Click here for a PDF version of the manuscript)

Wednesday, March 14, 2012

Profiles in Medical Courage: Michael Wilkins and the Willowbrook School

“Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has. "- Margaret Mead

With this article we begin an intermittent series on physicians who displayed courage in trying to help their patients. Although there are many well-known examples, we hope to illustrate lesser-known doctors who identified problems and stood up to address them. Few remember the controversy surrounding the now-closed Willowbrook School and Dr. Michael Wilkins’ involvement. However, Wilkins’ courage in advocating for change not only resulted in substantial improvement in conditions at the school but also led to the Civil Rights of Institutionalized Persons Act (CRIPA) of 1980.

Wilkins was originally from Kansas City and graduated from the University of Missouri School of Medicine in 1967 (1). He left Missouri to do his pediatric internship and complete his military obligation in the United States Public Health Service on Staten Island. There he became familiar with the Willowbrook State School, which at the time was the largest institution for the mentally retarded in the world. After completing his time in the Public Health Service, Wilkins was persuaded by his friend, Dr. Bill Bronston, to join him at Willowbrook as a full time physician.

The Willowbrook School already had a bad reputation when Wilkins arrived (2). Built in the late 1930s, the campus was large, with 38 buildings on 375 acres in the Willowbrook section of Staten Island. Designed to hold 4000 patients, it held 6000 when Wilkins arrived. As a result of the overcrowding and unsanitary conditions, virtually all children developed hepatitis, primarily hepatitis A. This led to a highly controversial medical study carried out from the mid-1950s through the 1970s by the researchers Saul Krugman and Robert W. McCollum. Healthy children were intentionally inoculated, orally and by injection, with hepatitis virus, then monitored to gauge the effects of gamma globulin in combating the infection (2,3). Senator Robert Kennedy of New York, the younger brother of the slain President John Kennedy, had toured the institution in 1965 and proclaimed that individuals in the overcrowded facility were "living in filth and dirt, their clothing in rags, in rooms less comfortable and cheerful than the cages in which we put animals in a zoo" and offered a series of recommendations for improving conditions (4). However, the result was a public relations effort with the creation of a few “model” buildings (5). Visitors were escorted through these “model” buildings and the controversy subsided.

Wilkins was not assigned to one of the model buildings but to building 6. There he cared for about 70 severely mentally retarded clients along with 2 or 3 attendants. In a 2008 interview Wilkins said, “The first thing that assaults you when you walk in the building is the smell and that sets the tone for the whole experience. The smell is the smell of decay and a mixture of sweat and feces and lack of being cleansed …it permeated the building” (1). The clients spent their days in the day room. Few went to school. For the incontinent, the attendants would take 6-8 clients at a time to the shower room and hose them down, cleaning up after them between showers. The severely retarded would be on the floor in straightjackets so as not to scratch themselves or assault other patients. Some would rock on the floor, while others sat motionless. Each building had a supervisor, sometimes with a nursing degree, but in Wilkins’ building the supervisor had no nursing background. He was the person who would answer to the administration and carry out their orders.

Wilkins and Bronston began advocating for change, especially in the wake of Governor Nelson Rockefeller’s budget cuts, which decreased the number of employees from 3000 to 2000. Because parents and friends were not allowed in the buildings, the two held Sunday weenie roasts for the clients and their parents, where they would advocate for improvement in conditions. This eventually led to a conference for the parents. The keynote speaker, an expert on mental retardation, described conditions at Willowbrook as “primitive” and “outdated” (1). After trying to organize the parents to advocate change, Wilkins was fired by the School’s administrator, Dr. Jack Hammond.

Now unemployed, and not having completed his residency, Wilkins fought back. He contacted his friend, a local WABC-TV New York newsman, Geraldo Rivera. Fortunately, when Wilkins was fired, Willowbrook had not asked for his key and he used it to let Rivera come to Willowbrook and document the conditions. The State of New York attempted to prevent the release of the resultant film citing patients’ privacy, but Rivera, an attorney by education, persuaded his producer to make the film public. The exposé, entitled “Willowbrook: The Last Disgrace”, garnered national attention and won a Peabody Award for Rivera. Rivera and Wilkins later appeared on the nationally televised Dick Cavett Show with the film.

As a result of the conditions, a class-action lawsuit was filed against the State of New York and a settlement was reached mandating reforms. The publicity generated by the case was a major contributing factor to the passage of the Federal Civil Rights of Institutionalized Persons Act of 1980. This law enabled the US Department of Justice to protect the rights of individuals in the care of state institutions, including jails and prisons, juvenile correctional facilities, public nursing homes, mental health facilities and institutions for individuals with intellectual disabilities. The law allows the US Attorney General to intervene on behalf of institutionalized people whose rights may have been violated and to ensure the safety of those individuals who may feel uncomfortable reporting abuse in these government-run institutions.

After leaving Willowbrook, Wilkins returned to Kansas City and completed his residency. For many years he ran a clinic in the inner city serving Kansas City’s poor. We should remember Dr. Michael Wilkins and how his act of courage led to improvement at Willowbrook and a law protecting all institutionalized individuals.

Richard A. Robbins, M.D.

Editor, Southwest Journal of Pulmonary and Critical Care

References

  1. library.albany.edu/speccoll/findaids/apap015/WILKINS.pdf CSEA interview with Dr. Michael Wilkins. Accessed February 21, 2012.
  2. http://en.wikipedia.org/wiki/Willowbrook_State_School Accessed February 21, 2012.
  3. Giles JP, McCollum RW, Berndtson LW Jr, Krugman S. Viral hepatitis: relation of Australia/SH antigen to the Willowbrook MS-2 strain. N Engl J Med 1969;281:119-22.
  4. "Excerpts From Statement by Kennedy", The New York Times, September 10, 1965. Accessed February 21, 2012.
  5. library.albany.edu/speccoll/findaids/apap015/BRONSTON.pdf CSEA interview with Dr. William Bronston. Accessed February 21, 2012.

Reference as: Robbins RA. Profiles in medical courage: Michael Wilkins and the Willowbrook School. Southwest J Pulm Crit Care 2012;4:54-6. (Click here for a PDF version)

Wednesday, March 14, 2012

Relationship between the Veterans Healthcare Administration Hospital Performance Measures and Outcomes

Richard A. Robbins, M.D.1

Richard Gerkin, M.D.2

Clement U. Singarajah, M.D.1

1Phoenix Pulmonary and Critical Care Medicine Research and Education Foundation and 2Banner Good Samaritan Medical Center, Phoenix, AZ

Abstract

Health care organizations have been using performance measures to compare hospitals. However, it is unclear if compliance with these performance measures results in better healthcare outcomes. We examined compliance with acute myocardial infarction, congestive heart failure, pneumonia and surgical process of care measures against traditional outcome measures, including mortality rates, morbidity rates, length of stay and readmission rates, using the Veterans Healthcare Administration Quality and Safety report. Disappointingly, increased compliance with the performance measures was not correlated with better outcomes, with the single exception of improved mortality with higher rates of compliance with echocardiography. We also evaluated the hospital level of care and found that a higher level of complexity of care correlated with the acute myocardial infarction performance measure, but not with the congestive heart failure, pneumonia, or surgical process of care performance measures. However, level of complexity of care strongly correlated with all cause mortality (p<0.001), surgical mortality (p=0.037) and surgical morbidity (p=0.01). These data demonstrate that compliance with the performance measures is not correlated with improved healthcare outcomes, and suggest that if measures are used to compare hospitals, different measures need to be developed.

Introduction

The Joint Commission recently released “Improving America’s Hospitals: The Joint Commission’s Annual Report on Quality and Safety 2011” (1). In this report the results of hospital compliance with the Joint Commission’s performance measures are listed. The Joint Commission announced not only that compliance is improving but also identified 405 hospitals in its “Top Performers on Key Quality Measures” program. In a letter at the beginning of the report, Mark Chassin, President of the Joint Commission, said “This program is designed to be an incentive for better performance on accountability measures and to support organizations in their quest to do better”.

However, there have been several criticisms of the report. First, many hospitals that were recognized as top hospitals by US News & World Report, HealthGrades Top 50 Hospitals, or Thomson Reuters Top Cardiovascular Hospitals were not included (2). Small community hospitals were overrepresented and large academic medical centers were underrepresented in the report. Chassin commented that this should be "a wake-up call to larger hospitals to put more resources into these programs…”. This is surprising since teaching hospitals, which are usually large, urban hospitals, have previously been reported to have lower risk-adjusted mortality rates and lengths of stay (3). Second, it has been pointed out that many of the performance measures are not, or are only weakly, associated with traditional outcomes such as mortality (4-7). Therefore, we compared compliance with the Joint Commission performance measures to mortality rates, morbidity rates, length of stay and readmissions using the Nation’s largest healthcare system, the Department of Veterans Affairs. The results demonstrate that compliance with performance measures is not correlated with improved outcomes.

Methods

The study was approved by the Western IRB.

Process Performance Measures. We evaluated hospital performance based on publicly available data from the 2010 VHA Facility Quality and Safety Report (9). These measures evaluate quality of care for acute myocardial infarction, congestive heart failure, pneumonia and the surgical care improvement program (SCIP) during fiscal year 2009. For each of the measures, a hospital’s performance is calculated as the proportion of patients who received the indicated care out of all the patients who were eligible for the indicated care. The quality indicators are based on, and in most cases identical to, those used for the Joint Commission’s Hospital Compare (acute myocardial infarction-Appendix 1; congestive heart failure-Appendix 2; pneumonia-Appendix 3; surgical quality-Appendix 4). Data were also available for each component of the congestive heart failure quality measure (see Appendix 2), which was evaluated independently.

Disease specific mortality. Hospital-specific, risk-standardized rates of mortality within 30 days of discharge are reported for patients hospitalized with a principal diagnosis of heart attack, heart failure, and pneumonia. For each condition, the risk-standardized (also known as "adjusted" or "risk-adjusted") hospital mortality rates are calculated using mathematical models that use administrative data to adjust for differences in patient characteristics that affect expected mortality rates (10).

Surgical morbidity and mortality. VA’s Surgical Quality Improvement Program (VASQIP) monitors major surgical procedures performed at VHA facilities and tracks risk-adjusted surgical complication (morbidity) and mortality rates. Patient data are collected at each facility by a specially trained nurse and entered into the VA’s electronic health record: detailed preoperative patient characteristics (including chart-abstracted medical conditions, functional status, and recent laboratory tests), information about the surgical procedure performed, and 30-day outcomes data.

The VASQIP program analyzes these patient data using mathematical models to predict an individual patient’s expected outcome based on the patient’s preoperative characteristics and the type and nature of the surgical procedure. Overall patient outcomes for major surgical procedures are expressed by comparing observed rates of mortality and morbidity to the expected rates for those patients undergoing the procedure as observed-to-expected (O/E) ratios. For example, if, based on patient characteristics, a facility expected 5 deaths following major surgery, but only 4 patients died, the O/E ratio would be reported as 0.8.
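The O/E arithmetic in the example above can be sketched in a few lines; the function name and values here are illustrative, not part of VASQIP.

```python
# Minimal sketch of an observed-to-expected (O/E) ratio as described above:
# a facility expected 5 deaths following major surgery but observed 4,
# so the O/E ratio is 4/5 = 0.8 (ratios below 1.0 are better than expected).

def oe_ratio(observed: int, expected: float) -> float:
    """Return the observed-to-expected outcome ratio."""
    if expected == 0:
        raise ValueError("expected count must be nonzero")
    return observed / expected

print(oe_ratio(4, 5))  # 0.8
```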

Medical Surgical Length of Stay (LOS). These data are the VA hospital average length of stay for patients who were discharged from acute medicine or surgery bed sections. It does not include patients discharged from observation beds or discharged from other areas of the hospital such as mental health.

Readmission rates. A readmission was defined as a patient who had a recent hospital stay and re-entered the hospital within 30 days of discharge. These rates are not adjusted for patient characteristics that affect expected readmission rates, so comparisons among hospitals should be interpreted with caution.

CHF readmissions were reported separately. CHF readmission is defined by patients who had an initial hospitalization for CHF and were readmitted at least once to acute care in the hospital within 30 days following discharge for CHF.
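The 30-day rule described above can be sketched as follows; the records and function name are hypothetical, invented for illustration rather than drawn from VA data.

```python
from datetime import date

# Sketch of the 30-day readmission definition: a new admission counts as a
# readmission if it begins within 30 days of the previous discharge.
# `stays` is a list of (admit, discharge) date pairs for one patient,
# sorted by admission date. All dates below are invented.

def count_readmissions(stays):
    """Count admissions that begin within 30 days of the prior discharge."""
    count = 0
    for prev, curr in zip(stays, stays[1:]):
        if (curr[0] - prev[1]).days <= 30:
            count += 1
    return count

stays = [
    (date(2009, 1, 5), date(2009, 1, 10)),
    (date(2009, 2, 1), date(2009, 2, 4)),   # 22 days after discharge: readmission
    (date(2009, 6, 1), date(2009, 6, 3)),   # months later: not a readmission
]
print(count_readmissions(stays))  # 1
```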

Hospital level of care. For descriptive purposes, hospitals were grouped into levels of care. These are classified into 4 levels: highly complex (level 1); complex (level 2); moderate (level 3), and basic (level 4). In general, level 1 facilities and some level 2 facilities represent large urban, academic teaching medical centers.

Correlation with Outcomes. Pearson’s correlation coefficient was used to assess the correlation of compliance with the performance measures and outcomes. Significance was defined as p<0.05. For comparisons among hospital levels, ANOVA or Kruskal-Wallis testing was done, as appropriate.
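For readers unfamiliar with the statistic, Pearson’s r can be computed directly from paired hospital-level values; this is a generic sketch with invented numbers, not the authors’ analysis code.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    if n != len(y) or n < 2:
        raise ValueError("need two equal-length sequences of length >= 2")
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical compliance (%) and 30-day mortality (%) for five hospitals:
compliance = [88, 91, 95, 97, 99]
mortality = [4.1, 3.9, 4.3, 4.0, 4.2]
print(round(pearson_r(compliance, mortality), 3))
```

In practice a p-value would accompany each r (e.g., via a t-test on r with n-2 degrees of freedom, as statistical packages do); only the coefficient is shown here.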

Results

Disease specific and all cause mortality rates compared to performance measures. Hospital-specific, risk-standardized rates of mortality within 30 days of discharge for patients hospitalized with a principal diagnosis of heart attack, heart failure, and pneumonia were compared to performance measure compliance. There was no correlation for heart attack or heart failure (Table 1, p>0.05); pneumonia mortality actually weakly correlated positively with compliance with the pneumonia performance measures (Table 1, p=0.0411), i.e., higher compliance was associated with higher mortality. Furthermore, there was no correlation between all cause mortality and the average of the three compliance measures (Table 1, p>0.05). Because each table is large, only the correlation coefficients are presented in the text; the data on which the correlations are based are given at the end of the manuscript (N=the number of hospitals; NA=not available).

Table 1. Disease Specific Mortality Correlated with Performance Measure Compliance

  Correlation                                                         r value    N   p value
  Acute Myocardial Infarction Mortality and AMI Performance Measure    0.0266  103   0.7897
  Congestive Heart Failure Mortality and CHF Performance Measure       0.0992  123   0.2752
  Pneumonia Mortality and Pneumonia Performance Measure                0.1844  123   0.0411
  All Cause Mortality vs. Average of Performance Measures              0.1118  122   0.2202

Each component of the congestive heart failure performance measure was evaluated individually. Performance of echocardiography correlated with improved mortality (Table 2, p=0.0496), but there was no correlation with use of an angiotensin converting enzyme inhibitor (ACEI) or angiotensin receptor blocker (ARB) at discharge, discharge instructions, or smoking cessation advice (Table 2, p>0.05 all comparisons).

Table 2. Heart Failure Mortality Correlated with Compliance to Individual Heart Failure Performance Measures

  Measure                   r value    N   p
  ACEI or ARB               -0.1007  112   0.2908
  Smoking Cessation          0.0651  112   0.4953
  Discharge Instructions     0.1411  111   0.1396
  Echocardiography          -0.1860  112   0.0496

Surgical mortality and morbidity rates compared to surgical performance measures. There was no correlation between compliance with the surgical care improvement program (SCIP) and surgical mortality or morbidity (Table 3, p>0.05 both comparisons).

Table 3. Surgical Care Improvement Program (SCIP) Compliance Correlated with Observed/Expected (O/E) Morbidity/Mortality

  Outcome          r value   N   p value
  O/E Mortality     0.0943  99   0.3530
  O/E Morbidity     0.0031  99   0.9757

Length of Stay. None of the performance measures correlated with medical-surgical length of stay (Table 4, p>0.05 all comparisons).

Table 4. Length of Stay (LOS) Correlated with Performance Measure Compliance

  Comparison                   r value    N   p value
  LOS compared to AMI           0.1047  103   0.2926
  LOS compared to CHF          -0.0178  123   0.8451
  LOS compared to Pneumonia    -0.1679  123   0.0634
  LOS compared to SCIP         -0.0404  106   0.6809
  LOS compared to Average       0.0028  123   0.9755

Readmission rates. There was no correlation between all cause readmission rates and the acute myocardial infarction, congestive heart failure, pneumonia or surgical performance measures (Table 5, p>0.05 all comparisons). There was no correlation between heart failure readmission rate and the heart failure performance measure (data not shown, r=0.1525, p=0.0921).

Table 5. Readmission Rate Correlated with Performance Measure Compliance

  Measure     r value    N   p
  AMI          0.1688  103   0.0883
  CHF          0.1505  123   0.0966
  Pneumonia    0.0581  123   0.5233
  Average      0.1281  122   0.1597

Hospital level of care. Compliance with the acute myocardial infarction performance measure correlated with the hospital level of care, i.e., the more complex the hospital, the better the compliance (Table 6, p=0.004). However, there was no correlation between the congestive heart failure, pneumonia, or surgical care improvement program measures, or the average of the measures, and the hospital level of care (Table 6).

Table 6. Hospital Level Correlated with Performance Measure Compliance (ANOVA)

  Measure                                      N   p
  Acute Myocardial Infarction (AMI)           103   0.004
  Congestive Heart Failure (CHF)              120   0.782
  Community Acquired Pneumonia                120   0.296
  Surgical Care Improvement Program (SCIP)    106   0.801
  Average of Process of Care Measures         120   0.285

There was no correlation between the level of hospital care and acute myocardial infarction, congestive heart failure, or pneumonia mortality (Table 7, p>0.05 all comparisons). However, there was a strong correlation with all cause mortality (p<0.001), as well as correlations with surgical Observed/Expected mortality (Table 7, p=0.037) and surgical Observed/Expected morbidity (p=0.010).

Table 7. Hospital Level Correlated with Mortality and Surgical Morbidity (ANOVA)

  Outcome                                        N   p
  Acute Myocardial Infarction (AMI) Mortality   103   0.835
  Congestive Heart Failure (CHF) Mortality      120   0.493
  Pneumonia Mortality                           120   0.547
  All Cause Mortality                           106   <0.001
  Surgical O/E Mortality                         99   0.037
  Surgical O/E Morbidity                         99   0.010

Discussion

These data from the Nation’s largest healthcare system demonstrate that increasing compliance with the performance measures prescribed by the Joint Commission does not affect disease specific mortality, all cause mortality, surgical mortality, surgical morbidity, length of stay or readmissions, with the single exception of improved mortality correlating with increased performance of echocardiography. In contrast to the Joint Commission’s list of top hospitals, in which smaller and rural hospitals were overrepresented, we found that only the acute myocardial infarction performance measure correlated with a higher level of hospital care, which mostly represents large, urban hospitals. We did find that all cause mortality and surgical morbidity highly correlated with the level of care. This would appear to differ from the Joint Commission’s list of top hospitals, which tended to be small and rural, since VA hospitals with higher levels of care largely represent large urban, academic teaching medical centers.

There are multiple possible reasons for the lack of correlation between the performance measures and outcomes. Many of the performance measures are evidence based, but several are not. For example, there are no randomized, multi-center studies evaluating the efficacy of discharge instructions, smoking cessation advice or pneumococcal vaccination. Studies of discharge instructions are retrospective, observational studies and have largely not shown improved outcomes (11,12). Several meta-analyses have failed to demonstrate the efficacy of pneumococcal vaccine in adults (13-15). Advice to quit smoking without follow-up support or pharmacologic intervention has not been shown to improve smoking cessation rates (16). Mandating ineffective interventions such as these would not be expected to have a positive effect on outcomes. However, this is where most of the improvement in performance measure compliance has occurred (2).

Most of the interventions are grouped or bundled. Lack of compliance with any one element of a bundle is taken as noncompliance with the whole. However, if the only difference between hospitals is noncompliance with an ineffective performance measure, there would not be any expected improvement in outcomes.

Many of the strongly evidence-based measures have very high compliance, usually exceeding 95% (9). It is possible that improvements of 1 or 2% in effective performance measures have too small an impact on outcomes to be detected even in a database as large as the Veterans Administration’s, which included 485,774 acute medical/surgical discharges in 2009.

The performance measures appear to avoid highly technical or costly interventions and often omit interventions which have been shown to positively affect outcomes. For example, beta blockers and spironolactone have been shown to be effective in heart failure but are not included in the congestive heart failure performance measures (17,18). Furthermore, carvedilol has been shown to be superior to metoprolol in improving survival (19). Why the performance measures include use of an angiotensin converting enzyme inhibitor or angiotensin receptor blocker but not carvedilol and spironolactone is unclear.

Some of the performance measures may have caused inadvertent harm. For example, administration of antibiotics within 4 hours to patients with pneumonia was a previous performance measure. However, studies showed that this performance measure led to administration of antibiotics in many patients who proved not to have pneumonia or another infectious disease, and a systematic review concluded that “evidence from observational studies fails to confirm decreased mortality with early administration of antibiotics in stable patients with [community acquired pneumonia]” (20-22). The time has since been changed to 6 hours, but it is unclear whether this is any better than the initial 4-hour window (7).

We did not confirm the Joint Commission’s finding that top hospitals are overrepresented by small, rural hospitals. We found no correlation between hospital level of complexity of care and performance measure compliance, with the exception of the acute myocardial infarction measure, which was higher in hospitals with higher complexities of care. Although we found no correlation of the performance measures with any outcome measures, we did find a strong correlation between the hospital level of complexity of care and both overall survival and surgical morbidity, with the more complex hospitals having improved survival and decreased surgical morbidity. This would seem consistent with the concept that volume of care correlates with outcomes.

It seems surprising that performance measures undergo so little scrutiny before implementation. In a 2005 editorial, Angus and Abraham (23) addressed the issue of when there is sufficient evidence for a concept to be widely applied as a guideline or performance measure. Comparing guidelines to evaluation of novel pharmacologic therapies, they point out that promising phase II studies are insufficient for regulatory approval. Instead, one, and usually two, large multicenter phase III trials are necessary to confirm reliability. The same principle is echoed in evidence-based medicine, where grade A recommendations are based on two or more large, positive, randomized, multicenter trials. This seems a reasonable standard. Perhaps what is needed is an independent Federal or private agency to review and approve performance measures and, as Angus and Abraham suggest, require at least two randomized, multicenter trials before implementation.

The data presented in this manuscript do not support the usefulness of increasing compliance with the Veterans Administration’s (or the Joint Commission’s) performance measures in improving outcomes such as mortality, morbidity, length of stay or readmission rates. Until compliance with the performance measures results in improved outcomes, investment to improve these performance measures seems to be a poor utilization of resources. It suggests that oversight of regulatory agencies is needed in developing and implementing performance measures. If performance measures are to be used, new, clinically meaningful measures that correlate with outcomes need to be developed.

References

  1. Available at: http://www.jointcommission.org/accreditation/hospitals.aspx (accessed 9-25-11).
  2. Available at: http://www.forbes.com/sites/davidwhelan/2011/09/20/is-the-joint-commission-list-of-top-hospitals-worth-heeding/ (accessed 9-25-11).
  3. Rosenthal GE, Harper DL, Quinn LM. Severity-adjusted mortality and length of stay in teaching and nonteaching hospitals. JAMA 1997;278:485-90.
  4. Werner RM, Bradlow ET. Relationship between Medicare's hospital compare performance measures and mortality rates. JAMA 2006;296:2694-702.
  5. Peterson ED, Roe MT, Mulgund J, DeLong ER, Lytle BL, Brindis RG, Smith SC Jr, Pollack CV Jr, Newby LK, Harrington RA, Gibler WB, Ohman EM. Association between hospital process performance and outcomes among patients with acute coronary syndromes. JAMA 2006;295:1912-20.
  6. Fonarow GC, Yancy CW, Heywood JT; ADHERE Scientific Advisory Committee, Study Group, and Investigators. Adherence to heart failure quality-of-care indicators in US hosptials: analysis of the ADHERE Registry. Arch Int Med 2005;165: 1469-77.
  7. Wachter RM, Flanders SA, Fee C, Pronovost PJ. Public reporting of antibiotic timing in patients with pneumonia: lessons from a flawed performance measure. Ann Intern Med 2008;149:29-32.
  8. Stulberg JJ, Delaney CP, Neuhauser DV, Aron DC, Fu P, Koroukian SM. Adherence to surgical care improvement project measures and the association with postoperative infections. JAMA. 2010;303:2479-85.
  9. Available at: http://www.va.gov/health/docs/HospitalReportCard2010.pdf (accessed 9-28-11).
  10. Ross JS, Maynard C, Krumholz HM, Sun H, Rumsfeld JS, Normand SL, Wang Y, Fihn SD. Use of administrative claims models to assess 30 day mortality among Veterans Health Administration hospitals. Medical Care 2010; 48: 652-658.
  11. VanSuch M, Naessens JM, Stroebel RJ, Huddleston JM, Williams AR. Effect of discharge instructions on readmission of hospitalised patients with heart failure: do all of the Joint Commission on Accreditation of Healthcare Organizations heart failure core measures reflect better care? Qual Saf Health Care 2006;15:414-7.
  12. Fonarow GC, Abraham WT, Albert NM, Stough WG, Gheorghiade M, Greenberg BH, O'Connor CM, Pieper K, Sun JL, Yancy C, Young JB; OPTIMIZE-HF Investigators and Hospitals. Association between performance measures and clinical outcomes for patients hospitalized with heart failure. JAMA 2007;297:61-70.
  13. Fine MJ, Smith MA, Carson CA, Meffe F, Sankey SS, Weissfeld LA, Detsky AS, Kapoor WN. Efficacy of pneumococcal vaccination in adults. A meta-analysis of randomized controlled trials. Arch Int Med 1994;154:2666-77.
  14. Dear K, Holden J, Andrews R, Tatham D. Vaccines for preventing pneumococcal infection in adults. Cochrane Database Sys Rev 2003:CD000422.
  15. Huss A, Scott P, Stuck AE, Trotter C, Egger M. Efficacy of pneumococcal vaccination in adults: a meta-analysis. CMAJ 2009;180:48-58.
  16. Rigotti NA, Munafo MR, Stead LF. Smoking cessation interventions for hospitalized smokers: A systematic review. Arch Intern Med 2008;168:1950-1960.
  17. Gottlieb SS, McCarter RJ, Vogel RA. Effect of beta-blockade on mortality among high-risk and low-risk patients after myocardial infarction. N Engl J Med 1998;339:489-97.
  18. Pitt B, Zannad F, Remme WJ, Cody R, Castaigne A, Perez A, Palensky J, Wittes J for the Randomized Aldactone Evaluation Study Investigators. The effect of spironolactone on morbidity and mortality in patients with severe heart failure. N Engl J Med 1999;341:709-17.
  19. Poole-Wilson PA, Swedberg K, Cleland JG, Di Lenarda A, Hanrath P, Komajda M, Lubsen J, Lutiger B, Metra M, Remme WJ, Torp-Pedersen C, Scherhag A, Skene A. Carvedilol Or Metoprolol European Trial Investigators. Comparison of carvedilol and metoprolol on clinical outcomes in patients with chronic heart failure in the Carvedilol or Metoprolol European Trial (COMET): randomised controlled trial. Lancet. 2003;362:7-13.
  20. Kanwar M, Brar N, Khatib R, Fakih MG. Misdiagnosis of community acquired pneumonia and inappropriate utilization of antibiotics: side effects of the 4-h antibiotic administration rule. Chest 2007;131:1865-9.
  21. Welker JA, Huston M, McCue JD. Antibiotic timing and errors in diagnosing pneumonia. Arch Intern Med 2008;168:351-6.
  22. Yu KT, Wyer PC. Evidence-based emergency medicine/critically appraised topic. Evidence behind the 4-hour rule for initiation of antibiotic therapy in community-acquired pneumonia. Ann Emerg Med 2008;51:651-62.
  23. Angus DC, Abraham E. Intensive insulin therapy in critical illness: when is the evidence enough? Am J Respir Crit Care Med 2005;172:1358-9.

Click here for Excel version of Table 1

Click here for Excel version of Table 2

Click here for Excel version of Table 3

Click here for Excel version of Table 4

Click here for Excel version of Table 5

Click here for Excel version of Table 6

Click here for Excel version of Table 7

Click here for Appendix 1

Click here for Appendix 2

Click here for Appendix 3

Click here for Appendix 4

Reference as: Robbins RA, Gerkin R, Singarajah CU. Relationship between the Veterans Healthcare Administration Hospital Performance Measures and Outcomes. Southwest J Pulm Crit Care 2011;3:92-133. (Click here for PDF version of manuscript)
