Data prioritization, hybrid cloud help HHS auditors uncover fraud

The Department of Health and Human Services’ inspector general is tasked with finding fraud, waste and abuse, but outdated methods of data collection have created problems. To comb through data from the Centers for Medicare and Medicaid Services, for example, OIG employees would need to visit CMS monthly to pick up datasets and take the information back to the IG office for different audits and studies.

With the IG’s move into the cloud, however, remote data exchange is simpler and new analytical capabilities are within reach. And with a data lake, the IG can store its unstructured data while still keeping it searchable.

When CTO Evan Lee joined the IG’s office in 2016 and realized the average age of its more than 90 legacy applications was 12 years, he knew a traditional data center wasn’t going to be able to support the office’s data analytics needs. The team needed help from the cloud.

“A hybrid infrastructure creates connectivity between our data center and a cloud service provider,” Lee told GCN. It lets OIG use the existing resources in the data center “and take advantage of the scalability and elasticity of the cloud,” he said.
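Lee doesn’t spell out the mechanics of that connectivity, but a typical first step in this kind of hybrid setup is staging on-premises extracts into cloud object storage, where cloud analytics services can reach them without a physical data handoff. Below is a minimal, hypothetical sketch using boto3 and Amazon S3; the bucket, path and file names are illustrative assumptions, not OIG’s actual configuration.

```python
# Illustrative sketch only: pushing a locally staged dataset from an
# on-premises data center into Amazon S3. All names are hypothetical.
import boto3

s3 = boto3.client("s3")

# Upload a staged extract to cloud storage so cloud-side analytics
# can use it -- replacing the old monthly in-person data pickup.
s3.upload_file(
    Filename="/data/exports/cms_claims_extract.csv",  # hypothetical local path
    Bucket="oig-analytics-staging",                   # hypothetical bucket
    Key="raw/cms_claims_extract.csv",
)
```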

Working with Excella Consulting, Lee’s team tackled the pain points whose resolution would deliver the biggest benefit to its investigative work. They started by creating a central dashboard through the Looker analytics platform so Lee and his managers could set access controls for specific datasets.
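The article doesn’t describe how Looker was configured, so the sketch below illustrates the general idea of per-dataset access control in plain Python rather than Looker’s actual mechanism; every group and dataset name here is made up for illustration.

```python
# Generic illustration (not Looker's API): gate dashboard queries on
# per-dataset group membership. All names are hypothetical.
DATASET_PERMISSIONS = {
    "audits": {"audit-team", "managers"},
    "investigations": {"investigations-team", "managers"},
}

def can_access(user_groups, dataset):
    """Allow access only if the user belongs to a group granted this dataset."""
    return bool(set(user_groups) & DATASET_PERMISSIONS.get(dataset, set()))

# An auditor can see audit data but not investigation records.
print(can_access({"audit-team"}, "audits"))          # True
print(can_access({"audit-team"}, "investigations"))  # False
```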

The Amazon Redshift data warehouse provided the foundational structure of the database, which included operational information on audits, evaluations and investigations.  Through agile development and data governance policies, the team created strategic roadmaps for different data components, technologies and tools.
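Redshift is PostgreSQL-compatible, so a warehouse like the one described can be queried with standard PostgreSQL tooling. A minimal sketch with psycopg2 follows; the endpoint, credentials and table schema are assumptions for illustration, not details from the article.

```python
# A minimal sketch of querying a Redshift warehouse over the
# PostgreSQL wire protocol. Connection details and schema are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="oig-warehouse.example.us-east-1.redshift.amazonaws.com",
    port=5439,              # Redshift's default port
    dbname="operations",
    user="analyst",
    password="...",         # use a secrets manager in practice
)

with conn, conn.cursor() as cur:
    # Pull operational records on open audits -- the kind of
    # audit/evaluation/investigation data the warehouse holds.
    cur.execute(
        "SELECT audit_id, status, opened_date FROM audits WHERE status = %s",
        ("open",),
    )
    for row in cur.fetchall():
        print(row)
```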

“We are looking into … the data that they use to audit providers and patient information, to analyze and determine the different subsets,” said Claire Walsh, Excella’s data and analytics practice lead. “We are targeting our work to look at the common data elements and the most engaged users in specific areas.”

As more data is moved into the platform, Lee said his office will be able to build a more complex fraud analytics model with more “storage, processing and computing power” to improve accuracy.

“The fraud models are looking for outliers, but we know that there are a lot of fraud perpetrators out there as well as well-educated doctors and pharmacists who have access to CMS for their services,” Lee said.  The more accurate the model, the easier it will be to find the fraud outliers.
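The article doesn’t name OIG’s modeling technique, but isolation forests are a common choice for exactly this kind of outlier hunting. The self-contained sketch below uses scikit-learn on made-up provider billing features to show how a few anomalous billing profiles stand out from the bulk of normal ones.

```python
# Hedged sketch of outlier detection on simulated provider billing data;
# the features and numbers are hypothetical, not OIG's actual model.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Simulated provider profiles: [claims_per_month, avg_billed_usd]
normal = rng.normal(loc=[120, 85.0], scale=[30, 20.0], size=(500, 2))
suspect = np.array([[900, 85.0], [130, 950.0]])  # unusually high volume/amounts
X = np.vstack([normal, suspect])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = model.predict(X)  # -1 flags outliers, 1 flags inliers

print("flagged rows:", np.where(labels == -1)[0])
```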

Once investigators have a better idea of their data processing needs, Lee said he expects machine learning to play a role in the investigation process. But for now, the priority is categorizing and prioritizing data to create a foundation for the future.

About the Author

Sara Friedman is a reporter/producer for GCN, covering cloud, cybersecurity and a wide range of other public-sector IT topics.

Before joining GCN, Friedman was a reporter for Gambling Compliance, where she covered state issues related to casinos, lotteries and fantasy sports. She has also written for Communications Daily and Washington Internet Daily on state telecom and cloud computing. Friedman is a graduate of Ithaca College, where she studied journalism, politics and international communications.

Friedman can be contacted at sfriedman@gcn.com or follow her on Twitter @SaraEFriedman.
