By Russ Banham, Contributor
Anyone who has ever had surgery or extensive medical care is likely stunned upon scanning the healthcare provider’s multi-page bill for medical services — rows of procedures accompanied by esoteric billing codes with charges that add up to a princely sum. More jaw-dropping still is that a sizable share of the bill may be fraudulent.
Healthcare fraud costs the U.S. $68 billion annually, working out to about three percent of the nation’s $2.26 trillion in healthcare expenditures, according to a conservative estimate by the National Health Care Anti-Fraud Association.
For decades, identifying these scams to reduce their financial severity has been an elusive goal for health insurers and hospitals, given the sheer breadth of data populating a typical medical bill. Now, thanks to predictive data analytics and machine learning technologies, fraud investigators are able to dig through a mountain of billing data much more quickly to unearth anomalies that indicate potential fraud. By pinpointing which bills or line items are phony, investigators can narrow their examinations to these charges, making better use of their time.
The technology, many would concur, is long overdue. From 1980 through 2017, fraud investigators at health insurance provider Blue Cross Blue Shield of Michigan closed more than 50,000 cases of fraudulent claims, resulting in a recovery of more than $400 million. And that is just one insurer.
A Vast Ecosystem
A primary reason why healthcare fraud is so ubiquitous and costly is that fraudsters are aware of how tough it is to identify their scams in time to do much about them. The effort required for medical auditors to review every line item in bills from internal hospital staff and external third-party providers, such as physical therapists, radiologists, pharmacists and medical equipment suppliers, is overwhelming.
“Although medical procedures are codified and rules-based, detecting fraud in the bills sent by all the different parties in the healthcare ecosystem is a steep uphill climb without the use of automated technology,” said Laks Srinivasan, chief operating officer at Opera Solutions, a provider of healthcare data analytics software and services.
Given the large number of healthcare services and related medical codes, people make all-too-human mistakes. “We worked with a large health insurer recently, analyzing three years’ worth of medical data to detect evidence of fraudulent bills,” said Srinivasan. “The company received approximately 10 million insurance claims worth about $3 billion per month. Altogether, our analytics software examined half a billion records to detect possible anomalies.”
There’s no way even teams of people could conduct a similar analysis, which explains why swindlers perpetrate billing-related crimes — they know there’s a very good chance they won’t get caught. Those who are caught face huge civil fines and penalties, not to mention criminal charges.
A case in point is the September 2018 settlement between the U.S. Department of Justice and Health Management Associates (HMA), a former hospital chain based in Naples, Florida. The company must pay more than $260 million to resolve allegations that it knowingly billed government healthcare programs like Medicare and Medicaid for inpatient services that should have been billed on an outpatient basis, paid physicians for patient referrals, and submitted inflated claims for emergency department services.
Such cases are not shocking to medical professionals like David Levin, MD, chief medical officer at Sansoro Health, a provider of healthcare data integration services. “Clearly, we have been dealing for some time with Medicare and Medicaid ‘mills’ designed to intentionally defraud the payer,” said Levin, who previously led the clinical systems office of the Cleveland Clinic Health System. “In these cases, medical bills claim that certain patients were seen that weren’t, or services were provided that turned out not to be the case.”
In addition to fraudulent charges, healthcare bills are rife with unintentional service provider billing errors. “The billing rules are complex and byzantine, resulting in inaccurate coding,” Levin said. “Physicians and other service providers could be up-coding or down-coding bills, charging more for services or less, respectively, because it’s difficult to document the actual level of service provided.”
Both medical bill challenges — outright fraud and unintentional coding or billing errors — are addressable through the use of predictive data analytics and machine learning solutions.
Billing departments input data on customary hospital, physician, and other provider bills into the software, along with the charges for these services and the usual number of patient visits. The program then highlights bills that fall outside of these normal parameters, calling attention to unintended clerical errors.
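A minimal sketch of the kind of parameter-based screening described above, flagging charges that deviate sharply from the norm for the same billing code. The field names, the z-score approach, and the threshold are illustrative assumptions, not the vendor's actual method.

```python
from statistics import mean, stdev

def flag_outlier_bills(bills, threshold=3.0):
    """Flag bills whose charge deviates from the average for the same
    procedure code by more than `threshold` standard deviations."""
    # Group bills by procedure code so each code has its own baseline.
    by_code = {}
    for b in bills:
        by_code.setdefault(b["code"], []).append(b)

    flagged = []
    for code, group in by_code.items():
        charges = [b["charge"] for b in group]
        if len(charges) < 2:
            continue  # not enough data to establish a norm
        mu, sigma = mean(charges), stdev(charges)
        if sigma == 0:
            continue  # all charges identical; nothing stands out
        for b in group:
            if abs(b["charge"] - mu) / sigma > threshold:
                flagged.append(b)
    return flagged
```

In practice an insurer would screen on many more dimensions (visit counts, provider history, diagnosis mix), but the core idea is the same: establish a statistical baseline, then surface the line items that fall outside it.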
“The model is trained to spot unusual patterns in a very large data set, making it a better microscope to spot and filter out potential fraud,” said Levin. Yet in some cases, a flagged anomaly may turn out to be a false positive.
“The analytics may look at 10,000 rheumatologists’ bills and find that a particular code was used by only three rheumatologists,” Srinivasan explained. “That’s an anomaly that could be evidence of fraud, but on further investigation turns out to simply be a rarely used procedure in that physician specialty.”
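The rare-code screen Srinivasan describes can be sketched as a simple frequency check: flag any procedure code billed by only a tiny fraction of the providers in a specialty. The data layout and the one-percent cutoff here are hypothetical.

```python
from collections import defaultdict

def rare_codes(claims, min_provider_share=0.01):
    """Return procedure codes billed by no more than `min_provider_share`
    of the providers appearing in the claims data."""
    providers = set()
    providers_by_code = defaultdict(set)
    for c in claims:
        providers.add(c["provider_id"])
        providers_by_code[c["code"]].add(c["provider_id"])

    # A code used by very few providers is an anomaly worth reviewing.
    cutoff = max(1, min_provider_share * len(providers))
    return {code for code, provs in providers_by_code.items()
            if len(provs) <= cutoff}
```

As the article notes, a hit here is only a lead, not a verdict: the code may simply be a rarely used but legitimate procedure for that specialty.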
A medical non-necessity — a service performed by a provider that is not typically required — is clearer evidence of fraud, as is over-utilization of services. For instance, the algorithm can detect if a patient is receiving more frequent service than other patients with the same diagnosis. “If 99 percent of patients with the same heart condition see their cardiologists every six months and one percent see their cardiologists every two months, the machine will detect this as an anomaly,” Srinivasan said.
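The over-utilization check in Srinivasan's cardiology example amounts to comparing each patient's visit frequency against peers with the same diagnosis. This toy version, with hypothetical inputs and a hypothetical cutoff ratio, shows the shape of that comparison.

```python
from statistics import mean

def over_utilizers(visits_per_year, ratio=2.0):
    """visits_per_year maps patient_id -> annual visit count for patients
    sharing one diagnosis. Flags patients seen more than `ratio` times
    the peer-group average."""
    avg = mean(visits_per_year.values())
    return [p for p, n in visits_per_year.items() if n > ratio * avg]
```

With 99 patients seen twice a year and one seen six times, only the six-visit patient exceeds twice the group average and gets flagged, mirroring the article's example.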
Other anomalies involve the geographic distance of a patient to a provider, a provider with a high number of out-of-state patients, and the use of pharmacies that are dozens of miles away from a patient’s home when there are multiple pharmacies close by. The data analytics will cite these unusual circumstances as deserving further investigation.
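The pharmacy-distance anomaly above can be illustrated with a great-circle distance check: flag a claim when the billed pharmacy is far from the patient's home even though a much closer alternative exists. The coordinates, mileage thresholds, and function names are all assumptions for illustration.

```python
from math import radians, sin, cos, asin, sqrt

def miles(a, b):
    """Haversine great-circle distance between two (lat, lon) points,
    in miles (Earth radius ~3956 mi)."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3956 * 2 * asin(sqrt(h))

def suspicious_pharmacy(home, used_pharmacy, other_pharmacies,
                        far=25.0, near=5.0):
    """Flag when the billed pharmacy is over `far` miles from home while
    at least one alternative lies within `near` miles."""
    used_dist = miles(home, used_pharmacy)
    closest = min(miles(home, p) for p in other_pharmacies)
    return used_dist > far and closest < near
```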
“This is all about reducing the volume of false positives,” Srinivasan said.
The Human Hand
Despite the fraud-sniffing capabilities of machine learning algorithms, human beings are still necessary to determine whether an anomaly is just that, a mere deviation from the norm, or something that signals illegal behavior. “Machines can detect strange patterns that generally would elude people, but you still want someone to investigate whether or not these patterns represent actual fraud,” Srinivasan said.
Down the line, Levin projected that many hospitals and other healthcare entities will begin to use machine learning and other forms of artificial intelligence (AI) to prepare their medical bills. “I can see the clinician start with a template that automatically detects the possibility of a coding error as they complete the bill, thereby preventing them from making a mistake [in the first place],” he said.
Not only would this eliminate the possibility of a code blunder, it would make it extremely difficult for a fraudster to create a bogus bill. No matter how many times they inserted a fabricated service, the anomaly detection software in the AI tool would flag it for further investigation.
If that doesn’t steer a would-be swindler away from fraud, then they deserve to be caught — to the betterment of our healthcare system.