BIG DATA AND THE FUTURE OF MEDICARE AUDITS

Edward M. Roche, Ph.D.,J.D. — Barraclough NY LLC

Part I – DRG Downcoding in Hospitals

The number of Medicare audits is increasing. In the last five years, audits have grown by 936%. As reported previously in RACmonitor, this increase is overwhelming the appeals system: fewer than three percent (3%) of appeal decisions are issued within the statutory time frame.

It is peculiar that the number of audits has grown so rapidly without a corresponding growth in the number of RAC employees. How can this be? Have RAC workers become more than 900 percent more efficient? In a way, they have: they have learned to harness the power of Big Data.

Since 1986, the world’s ability to store digital data has grown from 0.02 exabytes to roughly 500 exabytes today. An exabyte is one quintillion (10^18) bytes. Every day, the equivalent of 30,000 Libraries of Congress is put into storage. Lots of data.

Auditing by RACs has morphed into using computerized techniques to pick targets for audits. An entire industry has grown up that specializes in processing Medicare claims data and finding “sweet spots” on which the RACs may focus their attention. In a recent audit, the provider was told that a “Focused Provider Analysis Report” had been obtained from a subcontractor. Based on that report, the auditor was able to target the provider.

A number of hospitals have been hit with a slew of Diagnosis-Related Group (DRG) downgrades from Internal Hospital RAC Teams camping out in their offices, continually combing through their claims data. DRG is a system that classifies any inpatient stay into groups for purposes of payment.

The question then becomes: How is this work done? How is so much data analyzed? Obviously these audits are not manual. They are Cyber Audits. But how?

An examination of patent data begins to shed light on the answer. For example, Optum, Inc. of Minnesota (associated with United Healthcare) has applied for a patent on “Computer implemented systems and methods of health care claim analysis.” (Application Number 14/449,461, Feb. 5, 2015) These are complex processes, but in essence they analyze claims by Diagnosis-Related Group (DRG).

The information system envisaged in this patent application appears to be specifically designed to downgrade codes. It works by running a simulation that swaps billed codes for cheaper codes, then measures whether the resulting code configuration falls within the statistical range averaged from other claims.

If it does, the DRG can be downcoded, and the hospital’s revenue is correspondingly reduced. The same algorithm can be applied to hundreds of thousands of claims in minutes, and it can be adjusted to work with different DRGs. This is only one of many patents in this area.
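The swap-and-test logic described above can be sketched in a few lines. This is a hypothetical simplification, not Optum’s actual implementation: the code tables, payment amounts, and the mean-plus-two-standard-deviations test are all illustrative assumptions.

```python
from statistics import mean, stdev

# Hypothetical substitution table: billed DRG -> cheaper candidate DRG.
# Real DRG grouping logic is far more complex; these codes are illustrative.
CHEAPER_CODE = {"DRG-871": "DRG-872", "DRG-193": "DRG-194"}
PAYMENT = {"DRG-871": 11000.0, "DRG-872": 6500.0,
           "DRG-193": 9800.0, "DRG-194": 7200.0}

def within_peer_range(amount, peer_amounts, k=2.0):
    """Is this payment within k standard deviations of comparable claims?"""
    mu, sigma = mean(peer_amounts), stdev(peer_amounts)
    return mu - k * sigma <= amount <= mu + k * sigma

def simulate_downcode(billed_code, peer_amounts):
    """Try the cheaper code; keep it only if it looks statistically 'normal'.

    Returns the final code and the 'overpayment' the auditor would claim."""
    candidate = CHEAPER_CODE.get(billed_code)
    if candidate and within_peer_range(PAYMENT[candidate], peer_amounts):
        return candidate, PAYMENT[billed_code] - PAYMENT[candidate]
    return billed_code, 0.0

# Screening a claim against a batch of comparable claims takes microseconds,
# which is how the same pass can cover hundreds of thousands of claims.
peers = [6400.0, 6600.0, 6550.0, 6700.0, 6300.0]
new_code, recovery = simulate_downcode("DRG-871", peers)
print(new_code, recovery)
```

Note that nothing in this loop consults the medical record; the decision turns entirely on whether the cheaper code’s payment looks statistically ordinary.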

When this happens, the hospital may face many thousands of downgraded claims. If it disagrees, it must appeal.

Medicare Audits as Asymmetric “Warfare”

Here there is a severe danger for the hospital. The problem is that running the audit costs the RAC thousands of times less than what the hospital must spend to refute the DRG coding downgrade.

This is the nature of asymmetric warfare.  In military terms, the cost of your enemy’s offense is always much smaller than the cost of your defense. That is why guerrilla warfare is successful against nation states. That is why the Soviet Union and United States decided to stop building Anti-Ballistic Missile (ABM) systems — the cost of defense is disproportionately greater than the cost of offense.

Hospitals face the same problem. Their claims data files are a giant forest in which these big data algorithms can wander around down-coding and picking up a substantial revenue stream for the auditor.

By using artificial intelligence (advanced statistical) methods to review Medicare claims, the RACs can bombard hospitals with so many DRG downgrades (or other claim rejections) that the provider’s defenses are quickly overwhelmed.

We should note that the use of these algorithms is not really an “audit.” It is a statistical analysis, and it is not performed by any doctor or health care professional. The algorithm might just as well be counting how many bags of potato chips are sold with cans of beer. It does not care.

If the patient is not an average patient, the disease is not an average disease, the treatment is not an average treatment, and everything else is not “average,” then the algorithm will flag the claim for the hospital to defend. This has everything to do with statistics and the correlation of variables, and very little to do with whether the patient was treated properly.

And that is the essence of the problem with Big Data audits. They are not what they say they are because they substitute mathematical algorithms for medical judgment.

In Part II we will examine the changing appeals landscape and what Big Data will mean for defense against these audits. In Part III we will look at future scenarios for the auditing industry and the corresponding Public Policy agenda that will face lawmakers.

Originally published in RACmonitor.

Audits Are a Failure, and Data Mining Carries a Presumption of Guilt

Senator Bill Nelson (D-FL), chairman of the Senate Special Committee on Aging, has criticized the Recovery Audit Contractor (RAC) process. His argument, in part, is that because the RACs receive between 9 and 12.5 percent of whatever they recover, there is an inherent incentive to keep improper payment rates high. Details regarding the types of audits are found in a slide presentation on the RAC program, “The Recovery Audit Program and Medicare: The Who, What, When, Where, How and Why?” There are three types of review: (1) automated review; (2) semi-automated review, which is described as “claims review using data and potential human review of a medical record or other documentation”; and (3) complex review, in which a medical record is required.
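The contingency-fee arithmetic behind Senator Nelson’s criticism is simple to state. A minimal sketch, using the 9–12.5 percent range from the source; the recovery amount is a made-up example:

```python
def rac_fee_range(recovered_dollars, low=0.09, high=0.125):
    """Contingency fee a RAC earns on recovered 'overpayments' (9%-12.5%).

    The fee scales directly with dollars recovered, which is the claimed
    incentive to maximize findings of improper payment."""
    return recovered_dollars * low, recovered_dollars * high

# On $10 million of recoveries, the RAC keeps roughly $0.9M-$1.25M.
lo, hi = rac_fee_range(10_000_000)
print(lo, hi)
```

Because the fee is a pure percentage of recoveries, every additional downgrade, however marginal, adds to the contractor’s revenue.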

What does “automated review” mean? In practice, it means the use of secretive, proprietary data mining over large sets of records to discover patterns, leading to the targeting of health care providers for audits. It is important to keep in mind that this is not really auditing; it is merely finding targets based on patterns in the data. So, for example, if a physician or practice works overtime and on weekends, and therefore produces more billing than other practices working a “normal” amount of time, it will be targeted.

Data mining works on the assumption that if the billing records are out of the ordinary, then something is wrong and the practice should be audited. There are two sides to this. On one side, deviation from the norm might indicate a problem; on the other, it might indicate honest providers working as hard as possible to serve a disadvantaged market. In the latter case, an audit based on nothing other than data mining simply burdens the health care system and cuts deserving patients off from the care they are entitled to. It is a disincentive for the hardest workers in the health care space.
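The targeting logic described above is, at bottom, outlier detection. A minimal sketch, assuming a simple z-score screen over annual billing totals; the provider IDs, dollar figures, and the two-standard-deviation cutoff are all hypothetical:

```python
from statistics import mean, stdev

def flag_outliers(billing_by_provider, z_cutoff=2.0):
    """Flag providers whose total billing deviates from the peer mean.

    This is pattern-based targeting in miniature: it knows nothing about
    medicine, only about distance from the average."""
    totals = list(billing_by_provider.values())
    mu, sigma = mean(totals), stdev(totals)
    return {pid: (total - mu) / sigma
            for pid, total in billing_by_provider.items()
            if abs(total - mu) > z_cutoff * sigma}

# Hypothetical annual billing totals. "P7" simply works nights and weekends,
# yet it is the only practice the screen marks for audit.
billing = {"P1": 410_000, "P2": 395_000, "P3": 420_000, "P4": 405_000,
           "P5": 398_000, "P6": 415_000, "P7": 690_000}
print(flag_outliers(billing))
```

The screen cannot distinguish fraud from diligence: any practice far from the mean, for any reason, lands on the audit list.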

The use of big data mining to control health care costs is the world’s greatest pressure toward average performance, that is, toward mediocrity.

The use of big data mining should be abolished or, at a minimum, investigated along with the presumption of guilt it carries.