Oct 4, 2023

A New Way of Looking at Payer Data Integration

Making sure payer data goes where it’s needed is one of the hidden costs that health plan leadership faces. I say “hidden” because data integration usually isn’t on payer and TPA leaders’ agendas every day. When I was Chief Strategy Officer of a regional Blues plan, I saw firsthand the impact of mounting data integration needs on our cost structure. Non-technically inclined leaders were often perplexed as to why data exchange was so challenging and, thus, so expensive: myself included.

Without tracking the time and resources needed to solve data integration problems, it’s easy to overlook the costs. Purchasing and configuring multiple data tools is expensive. Building a solution yourself is expensive. Tying up engineering talent on either approach is expensive.

IT leaders and teams know the truth: data integration between legacy payer systems and the new crop of point solutions and vendors that make up the modern healthcare supply chain is a real headache. Establishing one-off data feeds gets the work done for now, but it doesn’t scale, and that becomes critical if you believe this modern supply chain will keep expanding and demanding more payer-related data: eligibility, medical claims, engagement, accumulators, and more.
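To see why one-off feeds become untenable, consider the arithmetic: point-to-point feeds grow with the product of systems and vendors, while a shared integration layer grows with their sum. A minimal sketch in Python, using purely illustrative counts (these are hypothetical figures, not Flume data):

```python
# Hypothetical counts, for illustration only.
systems = 5    # legacy payer systems (eligibility, claims, accumulators, ...)
vendors = 40   # point solutions in the modern healthcare supply chain

# One bespoke feed per system-vendor pair vs. one integration per party.
point_to_point = systems * vendors   # 200 feeds to build and maintain
shared_layer = systems + vendors     # 45 integrations through a common layer

print(f"One-off feeds to maintain:      {point_to_point}")
print(f"Via a shared integration layer: {shared_layer}")
```

Every new vendor adds another full set of feeds in the first model, but only a single integration in the second.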

There’s a better way to look at this problem. When CIOs/CTOs take a strategic approach and follow a known path, they’re more likely to solve the problem efficiently, not to mention make their IT organizations more future-proof and better prepared for whatever comes next.

At Flume Health, we’ve broken the challenges down into a decision framework for payer CIOs and CTOs, which we cover in our new white paper. The framework maps out an approach based on what payers and TPAs are trying to achieve through improved data integration. It includes:

  • Single-source network. Many aggregators and platforms come with the promise of reducing complexity and the sheer volume of integrations. This is a great solution for key product segments. However, that simplicity comes with limitations.
  • Optimized labor model. Outsourcing parts of the data integration function is a way to save money, but it carries multiple risks, including the reputational risk of relying heavily on offshore labor, which is what generates the best business case.
  • Consolidated integration tool. This approach allows flexibility and lets payers adopt virtually any point solution or vendor (a minimal sketch of the pattern follows this list). Payers that are wedded to existing workflows might hesitate to adopt a consolidated tool.
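To make the consolidated-tool idea concrete, here is a minimal sketch of the pattern in Python. This is not how Flume Relay is implemented; the schema and mapper names (CanonicalEligibility, from_legacy_834, to_vendor_json) are hypothetical, chosen only to show how one canonical record lets any source system feed any vendor:

```python
from dataclasses import dataclass

# One canonical record type sits in the middle of all integrations.
@dataclass
class CanonicalEligibility:
    member_id: str
    plan_code: str
    effective_date: str  # ISO 8601

# One mapper per source system, written once.
def from_legacy_834(row: dict) -> CanonicalEligibility:
    return CanonicalEligibility(row["SUBSCRIBER_ID"], row["PLAN"], row["EFF_DT"])

# One mapper per vendor, written once; any vendor can consume any source.
def to_vendor_json(rec: CanonicalEligibility) -> dict:
    return {"memberId": rec.member_id,
            "plan": rec.plan_code,
            "effectiveDate": rec.effective_date}

record = from_legacy_834(
    {"SUBSCRIBER_ID": "A123", "PLAN": "PPO-1", "EFF_DT": "2023-10-01"})
print(to_vendor_json(record))
```

The design choice that matters is the canonical middle layer: each source system and each vendor is mapped once, rather than once per pairing.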

Over the past few years, we’ve taken this strategic view ourselves, developing and testing integration approaches that we believe can solve the problem. The result of this work and validation is Flume Relay, a single platform for scalable integrations.

In the above decision framework, Relay sits in the consolidated integration tool category, but it also meets the goals associated with the other two approaches, including simplification and reduced labor costs.

We see Relay as a breakthrough in data exchange for payers looking for a simple, effective means of achieving integration with the thousands of possible point solutions, vendors and systems in the health plan marketplace. You can learn more through our white paper.
