
Risk-Adjusted Models for Measuring Hospital Quality of Care

Two essential components of creating patient value are the measurement of cost and quality. In a new article, Quantros’ Lindsey Klein, Vice President of Product, breaks down why risk-adjusted models for measuring hospital quality of care are essential.

Not all patients are created equal

By Lindsey M. Klein

Publicly available hospital quality and safety information has diversified and proliferated significantly during the past two decades. Consumers, commercial insurance providers, self-insured companies, and hospitals are relying on these measurements more and more to make better cost and quality decisions.

Today, hospitals are increasingly reimbursed based on quality performance data. They also use it to assess performance, set benchmarks, and drive quality improvement initiatives.

A history in healthcare

Quality measurements have a long history in healthcare. However, the past two decades have seen a dramatic increase in measurement initiatives for hospitals’ quality of care. In 1997, the Joint Commission on Accreditation of Healthcare Organizations launched the ORYX initiative to integrate performance measurement data into the hospital accreditation process as part of its Agenda for Change. By 2003, the Centers for Medicare & Medicaid Services (CMS) had already introduced a pay-for-performance pilot program to incentivize hospital quality performance. Today, CMS supports the National Quality Forum, which develops quality indicators for national reporting and measurement, as well as a wide range of other programs.

Numerous associations and commercial companies also collect and report on healthcare quality measures. A growing number of media, in particular, publish lists that rank hospitals on various quality criteria, bringing healthcare decision-making into the public consciousness. CMS rewards and penalties, as well as new pay-for-performance reimbursement models, provide significant motivation for hospitals to assess their performance and focus on quality improvements.

Despite the growing emphasis on and sophistication of quality and safety measurements, few measurement systems take into account one of the most important realities of healthcare: All patients are not created equal, even those with the same diagnosis. The result is a major shortcoming in these analyses, rendering them potentially imprecise or inaccurate. Clearly, demographic and clinical risk factors differ among patients as well as patient groups treated across providers. In fact, some hospitals (by virtue of their location, clinical specializations, areas of excellence, or even high visibility) may specifically attract patients with greater risks, while other types of hospitals may tend to attract those at lower risk. Yet most quality measure and ranking systems fail to meaningfully adjust for risks when assessing outcomes.

Reliable risk-adjusted models

This article presents a reliable and validated approach to account for differences in patient risk when measuring hospital and physician quality of care. Discussed below are four risk models developed to measure significant patient outcomes that should be assessed when considering care quality. These are a risk-adjusted patient mortality index (RAMI), risk-adjusted complications index (RACI), risk-adjusted readmissions index (RARI), and risk-adjusted patient safety index (RAPSI).

Notably, these risk measures are distinct from severity adjustment, which assesses a patient’s level of need for resource consumption and affects length of stay and costs, not clinical outcomes.

The risk models that follow utilize readily available administrative data covering all patient types. In all models, patient-specific predictive variables are considered along with clusters of Medicare Severity-Diagnosis Related Groups (MS-DRG) for more homogeneous patient types.

Each model was calculated using binary logistic regression, a widely used form of statistical analysis, to determine the probability for each outcome based on the history of patients with similar clinical and demographic attributes.

A nationally representative database of 26 million discharges from general, short-term, acute care, non-federal hospitals across all 50 states was used for the logistic regression estimates of the risk of each outcome for each patient.
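The modeling step described above can be sketched in a few lines. This is an illustrative example only: the column names, coefficients, and data below are hypothetical, while the article’s actual models are fit on national discharge data with MS-DRG clusters and thousands of diagnosis-code risk factors.

```python
# Minimal sketch of a binary logistic regression estimating each patient's
# probability of an outcome (e.g., in-hospital mortality) from demographic
# and clinical risk factors. All data here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
# Illustrative predictors: age, gender flag, major-chronic-condition flag
X = np.column_stack([
    rng.integers(18, 95, n),   # age
    rng.integers(0, 2, n),     # gender (0/1)
    rng.integers(0, 2, n),     # major chronic condition present (0/1)
])
# Synthetic outcome loosely tied to age and comorbidity
logit = -6 + 0.05 * X[:, 0] + 1.0 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
# Predicted probability of the outcome for each patient; summing these
# over a hospital's patients yields its "expected" number of events.
expected = model.predict_proba(X)[:, 1]
print(f"expected events: {expected.sum():.1f}, actual events: {int(y.sum())}")
```

Summing each hospital’s per-patient predicted probabilities produces the expected event count that the indices below compare against the hospital’s actual count.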

Risk factors considered

The patient risk factors considered for RAMI, RACI, and RARI are related to the patient’s age, gender, presence of major chronic conditions, and other significant comorbidities. The major chronic conditions considered cover 2,691 diagnosis codes and represent illnesses such as emphysema, diabetes, and cancer. The significant comorbidities comprise 1,193 diagnosis codes and illnesses such as acute appendicitis, bacterial pneumonia, and encephalitis.

For RAPSI, risk factors are MS-DRG cluster, age, gender, and many of the Agency for Healthcare Research and Quality’s (AHRQ) 3,417 specified comorbidities that play a role in patient safety events. Among them are such conditions as aortic valve disorders, endocarditis, and congestive heart failure.

For all measures, actual hospital rates are compared with national rates for patients with similar demographic and clinical characteristics. These were obtained from such databases as CMS Medicare Standard Analytical Files, public domain all-payer statewide databases, and proprietary universal billing data from individual hospitals and consortia.

Score calculation

A hospital’s risk-adjusted scores are calculated by taking the actual hospital rates and dividing them by the expected rates generated from the respective regression models described above. An index greater than 1.0 indicates that the actual rate is higher than expected (for example, a score of 1.20 means the actual rate is 20% higher than expected), while an index less than 1.0 indicates that the actual rate is lower than expected. A 95% confidence interval was also calculated for each index to determine whether the difference between a hospital’s performance and the national norm was statistically significant or merely due to normal variation in the data.
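The index arithmetic above can be sketched as follows. The confidence-interval formula here (a normal approximation on the log of the observed-to-expected ratio, treating the expected count as fixed) is one common choice for this kind of ratio; the article does not specify the exact method used.

```python
# Sketch of a risk-adjusted index: actual events divided by expected
# events from the regression model, with an approximate 95% CI.
import math

def risk_adjusted_index(actual_events, expected_events):
    index = actual_events / expected_events
    # Normal approximation on log(actual/expected), expected treated as fixed
    se_log = 1 / math.sqrt(actual_events)
    lo = index * math.exp(-1.96 * se_log)
    hi = index * math.exp(1.96 * se_log)
    return index, (lo, hi)

index, (lo, hi) = risk_adjusted_index(actual_events=60, expected_events=50)
print(f"index={index:.2f}, 95% CI=({lo:.2f}, {hi:.2f})")
# index > 1.0 means more events than expected (e.g., 1.20 = 20% higher),
# but the difference is statistically significant only if the CI excludes 1.0
```

In this hypothetical case the index is 1.20, yet the confidence interval spans 1.0, so the hospital’s rate would not be flagged as significantly different from the national norm.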

MS-DRG clusters with fewer than 300 cases nationally were excluded because a meaningful comparison could not be made.

Below is a summary of each index.

The indices

Risk-adjusted mortality index (RAMI)

RAMI is used to assess whether inpatient mortality across all medical and surgical patients deviates from the expected, taking risk factors into consideration. Care complications were excluded to focus on the patient’s risk for mortality on admission. All patients with do-not-resuscitate and palliative care codes were excluded.

Risk-adjusted complications index (RACI)

RACI shows whether a provider’s postsurgical, post-obstetrical, and medical complications deviate from the expected, given the patient’s risk factors. The model excluded newborns, as well as all patients who died or were transferred to other short-term hospitals. Clinicians identified 482 postsurgical, 81 post-obstetrical, and 3,861 medical conditions on CMS lists as complications and iatrogenic events. Examples include accidental operative laceration, postoperative infection, obstetrical shock, infected postoperative seroma, surgical complication-hypertension, iatrogenic pulmonary embolism/infarction, and postoperative respiratory failure.

Risk-adjusted readmissions index (RARI)

RARI examines the hospital readmissions rate for all medical or surgical patients relative to the expected, adjusted for patient risk factors. It considers only unanticipated readmissions to any inpatient facility within 30 days of discharge for the same MS-DRG or related service line as the first admission. Unavoidable readmissions, such as for AIDS or cancer treatments, were not considered. Also not considered were newborns or patients who died during the first admission or were transferred to another short-term hospital.

Risk-adjusted patient safety index (RAPSI)

RAPSI measures the rate of patient safety events for a specific diagnosis or procedure relative to expected frequencies, given patient population risk factors. It includes 11 patient safety events ranging from postoperative hematoma to postoperative sepsis and failure to rescue. RAPSI enables comparison across all at-risk Patient Safety Indicators (PSI) at both the MS-DRG and clinical category level (for example, cardiac and orthopedic services). By contrast, the AHRQ PSI indicator methodology focuses exclusively on an individual type of PSI across unrelated MS-DRG clusters.

Comparison and benchmarking

Benchmarking of scores on the above indices helps hospitals and physicians understand how they compare to similar facilities, and helps patients judge quality of care and make healthcare decisions. Accordingly, benchmarks were created by ranking all hospitals in one of the national databases cited above and identifying the score at the 75th percentile of performance. Performance at this upper level suggests that a facility leads the nation in the service area examined and provides a target for lower-performing hospitals. It provides more relevant information for consumer healthcare decision-making than measures that do not consider patient populations.
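The benchmarking step can be sketched as below. One assumption to note: because a lower index means fewer events than expected, the top quarter of performers corresponds to the bottom quarter of index values, so the 75th-percentile performance benchmark is taken at the 25th percentile of the index distribution. The hospital data here are made up for illustration.

```python
# Illustrative benchmarking: rank hospitals on a risk-adjusted index and
# take the top-quartile score as the benchmark. Lower index = better
# (fewer events than expected), so the 75th-percentile performer sits at
# the 25th percentile of the index values.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic indices for 200 hospitals, centered near the national norm of 1.0
hospital_indices = rng.normal(loc=1.0, scale=0.15, size=200).clip(min=0.4)

benchmark = np.percentile(hospital_indices, 25)
leaders = hospital_indices[hospital_indices <= benchmark]
print(f"benchmark index: {benchmark:.2f}; "
      f"{leaders.size} of 200 hospitals at or better")
```

Hospitals scoring at or below the benchmark index represent the upper quartile of performance for that service area; the benchmark itself gives lower-performing hospitals a concrete target.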

Conclusion

Today, quality measurement and improvement are necessities for hospitals to maximize reimbursements. For patients, quality indices can be useful tools for judging provider quality and value and making the best healthcare choices. As a result, scores also serve as a marketing tool for hospitals.

Given the above, providers would be well served to strive for the most accurate indices to assess their performance. Patient risk is a significant factor often overlooked in gauging these scores. The RAMI, RACI, RARI, and RAPSI models discussed above provide a statistically reliable and well-validated way to control for these factors in computing performance ratings.
