Can the world’s fastest supercomputer combat health care waste?

Energy computer scientist proposes using Oak Ridge's Jaguar for real-time analysis of claims

A computer scientist at Oak Ridge National Laboratory, one of the facilities behind the Manhattan Project and now the home of the world’s fastest supercomputer, has proposed putting those resources to use for government health care reform.

Combining and analyzing health care data in real time could save as much as $50 billion a year by eliminating waste and preventing fraud in government-run health care programs, and also could improve the quality of medical care, said Andrew Loebl, a senior researcher in the lab’s Computational Science and Engineering Division.

“We have never put all of this data together,” Loebl said. “My idea is to use the storage capacity of the supercomputers at Oak Ridge to analyze the data.”

Using existing programs to analyze claims data generated by dozens of government programs could proactively identify fraudulent or inappropriate claims before they are paid, rather than requiring contractors to seek reimbursement months or years after payments have been made, Loebl said.

“It’s not just a supercomputer exercise,” he said. “It’s using the technology to do holistic analysis of all the data all the time. It’s a perfect problem for supercomputing because it can be easily parallelized. I think that by year two of a program we would probably be processing the claims in real time.”
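The kind of screening Loebl describes can be sketched in miniature. The snippet below is an illustrative example only, not Oak Ridge's actual software: claims are routed by ID into independent partitions so that simple rules (duplicate submissions, amounts far above a procedure's norm) can be checked in parallel with no shared state. All record layouts, thresholds, and function names here are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical claim records: (claim_id, provider_id, procedure_code, amount)
CLAIMS = [
    ("C1", "P9", "99213", 120.0),
    ("C2", "P9", "99213", 120.0),
    ("C2", "P9", "99213", 120.0),    # duplicate submission of C2
    ("C3", "P4", "99499", 25000.0),  # amount far above the procedure norm
]

# Illustrative per-procedure billing thresholds (invented for this sketch)
AMOUNT_LIMITS = {"99213": 500.0, "99499": 1000.0}

def screen_partition(claims):
    """Apply simple rules to one independent partition of claims."""
    seen, flagged = set(), []
    for claim_id, provider_id, code, amount in claims:
        if claim_id in seen:
            flagged.append((claim_id, "duplicate"))
        elif amount > AMOUNT_LIMITS.get(code, float("inf")):
            flagged.append((claim_id, "amount exceeds norm"))
        seen.add(claim_id)
    return flagged

def screen_parallel(claims, n_workers=2):
    # Route each claim by its ID so all copies of a claim land in the
    # same partition; the partitions then share no state, which is what
    # makes the problem "easily parallelized."
    partitions = [[] for _ in range(n_workers)]
    for claim in claims:
        partitions[hash(claim[0]) % n_workers].append(claim)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(screen_partition, partitions)
    return [flag for part in results for flag in part]
```

On a machine like Jaguar the partitions would be spread across thousands of nodes rather than two worker threads, but the shape of the computation is the same.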

Loebl made the proposal to the Centers for Medicare and Medicaid Services at the Health and Human Services Department, and he admits it sounds too good to be true.

“The idea is unbelievable to the decisionmakers,” he said.

Currently, the government uses five regional contractors to process claims for a variety of health care programs, including Medicare, Medicaid, and programs run by the Defense and Veterans Affairs departments, the Indian Health Service, the Federal Employees Health Benefits Program and others. The data is disaggregated for processing, broken down geographically and into pieces small enough for limited-capacity computers. Consequently, no one sees or understands all of the data.

“The only thing the government is able to do now is pay claims and chase down mispayments” later, Loebl said.

Estimates of those mispayments vary, but are substantial. The FBI estimates that 10 percent of payments are improper, amounting to $150 billion a year. A study by the PricewaterhouseCoopers Health Research Institute puts the figure as high as 30 percent, or nearly $500 billion wasted annually. Contractors currently recover about $1 billion a year in improper payments by chasing them down after the fact.
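A quick back-of-the-envelope check ties the two estimates together: if $150 billion represents 10 percent of payments, total annual payments are about $1.5 trillion, and 30 percent of that same base is $450 billion, the "nearly $500 billion" figure.

```python
# Figures from the article; the total-payments base is inferred from them.
total_payments = 150e9 * 10           # FBI: $150B is 10% of total payments
assert total_payments == 1.5e12       # ~ $1.5 trillion paid out per year

pwc_waste = 0.30 * total_payments     # PwC: waste may reach 30% of that base
assert round(pwc_waste / 1e9) == 450  # ~$450B, i.e. "nearly $500 billion"
```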

Loebl estimates he could save $50 billion by looking at all of the data in real time. The ability to combine and analyze data also would allow tracking the outcome of health care practices, so that outcomes could be optimized over time and the quality of health care improved, he said.

The technology for the task is available. Oak Ridge houses Jaguar, a recently upgraded Cray XT5-based supercomputer now rated the fastest in the world, with a processing speed of 2.3 petaflops (2.3 quadrillion floating-point operations per second). Its 362 terabytes of memory could easily handle the health care claims data and could process a year’s worth of data in minutes. The processing would not interrupt the climate modeling or other advanced research being done on Jaguar, Loebl said.

“There’s nothing rocket science about this,” he said. “None of this will take any sophisticated software. What is complicated is doing it for all of the data all of the time, as we receive it.”

That is the hurdle faced by the proposed program. Convincing agencies to combine traditionally siloed data into a single flow for processing on a single computer could be tough. Another challenge is enforcement. Identifying waste, fraud and abuse on the supercomputer is one thing; enforcement is a separate issue that does not have anything to do with Oak Ridge. It would be up to Congress and the individual agencies to make use of what the supercomputer can find.

“We’re a long way from persuading people that this is practical,” Loebl said.

About the Author

William Jackson is a Maryland-based freelance writer.


Reader Comments

Mon, Jan 18, 2010

What a great proposal. I hope the President sees it and acts on it. I have said this for at least 3 years now and have told people I meet that I only hope health care reform can be implemented properly. If not, we are in real trouble. Technology is a key component.

Thu, Jan 7, 2010 Miguel New Hampshire

I think this is a good idea. If there is $150 billion in fraud, Medicare doesn't have that good a fraud detection system, and if you can avoid paying improper claims at all instead of having to recover them later, all the better. Besides, it looks like it could run using excess capacity, better utilizing the supercomputer.

Thu, Jan 7, 2010 darwin

Having 10% to 30% of the payments being improper makes it seem that the current system is broken. If so, then the fraud/abuse detection is a joke.

Thu, Jan 7, 2010

You're all idiots!
Mr. Loebl's idea is a good one with no buts about it. It's a simple solution to a major problem. As far as convincing agencies that they have to do this goes, that part should be a non-issue assuming the President has the nerve to simply issue the order.

Thu, Jan 7, 2010 Washington DC

Mr. Loebl's idea is a good one. But the higher priority might be matching real-time terrorist data from different government agencies, including foreign intelligence data, to prevent terrorist attacks on the United States and on U.S. soldiers and citizens overseas.
