With Confluent, we’ve achieved even more than what we’d hoped for just by being able to organize our information and make it flow without interruption.
Reynaldo Perez
R&D Leader, Solutions Architecture, SulAmérica
One of the largest insurance companies in Brazil, the nearly 130-year-old SulAmérica has more than 6.3 million customers, a large portion of which are hospitals and other types of healthcare facilities.
The company runs its business through a broad, diversified distribution network that includes more than 29,000 active brokers, employers, joint ventures, and strategic alliances. Amid rapid changes in the insurance industry itself, in the way the Brazilian government regulates insurance, and in the technologies behind insurance, SulAmérica saw a need to rethink how it handles the massive volumes of incoming data produced by more than five million daily events, so that it could scale with its data needs and take full advantage of streaming data to better serve its customers.
This transformation would entail completely rethinking its IT infrastructure and moving a large community of internal developers away from the tools they had used for many years, in addition to training IT staff to re-imagine their approach to data and message brokering.
“With Confluent, we’ve achieved even more than what we’d hoped for just by being able to organize our information and make it flow without interruption. That was something mind-blowing for us, because our reality before was just the opposite—everything was difficult and manual and everything eventually broke. So we’ve gained a marvelous tool to solve our problems and we’re now trying to demonstrate the power and value of Confluent to other teams in our organization.”
— Reynaldo Perez, R&D Leader, Solutions Architecture at SulAmérica
Technical Solution
As a long-established company, SulAmérica faces the same challenges many older, entrenched companies face: legacy technology that grows less effective and harder to scale to meet new customer demands, even as modern, cloud-native competitors jump in to disrupt the market.
The company relied on batch processing and handled events through databases, but this approach couldn't keep up with the rapidly growing, continuous streams of data generated by the doctors and patients served by SulAmérica's clients, which are primarily hospitals. Data flows broke and were interrupted constantly, and the data wasn't reaching the CRM fast enough to retain its value.
In mid-2019, SulAmérica started working with Confluent Cloud to improve its IT architecture and make the most of its data. The company had been working with brokers such as Apache ActiveMQ for a long time but needed something more powerful.
The company needed to create an interoperability platform as a new architecture standard and use that platform both to tap into the full potential of streaming data and to eliminate its reliance on databases and batch processing.
For the technological backbone of this new platform, SulAmérica chose Confluent Cloud combined with Google Cloud Platform and AWS. “We based our decision on our long and broad relationship with those partners, which allows us to use the expertise we have accumulated in cloud computing, event-driven architecture, and cloud-based products,” says Reynaldo Perez, R&D Leader, Solutions Architecture at SulAmérica.
Business Results
Rapid, seamless setup of a new interoperability platform.
“We started with Confluent Cloud right from the start,” says Perez. “It was a good decision because we didn’t have any problems trying to organize the functions or the governance or deal with all the usual work you need to set up a tool of such complexity. Things like Apache ZooKeeper don’t exist for us and that’s good because we need to be users and don’t want to be dealing with installations. We’ve been using the interoperability platform since April, and it has worked almost flawlessly, even with the requirement of not losing a single event.”
Increased flexibility, scalability, and responsiveness.
“We needed an event-driven solution because our architecture was batch-based and our batch-based solution wasn’t working well because when the batches break you have to reprocess everything and it was a constant mess,” says Perez. “So we needed a more flexible and scalable solution that could put the databases to rest. Now, we can study the data sources and schema and guarantee that the topic will be clean, stable, and described. So if anyone needs to access the data from our customers, the connection to the topic is clean, fast, and guaranteed, and now we’ve begun to see multiple teams consuming information from the topics. Also, the information from the broker can scale automatically without issues. The events get processed automatically without a problem and without losing a single event. And we have managed to achieve that without any effort.”
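In the Kafka ecosystem, one common way to keep a topic “clean, stable, and described” in the way Perez describes is to register a schema for it, for example with Confluent Schema Registry. As a hypothetical sketch (the record and field names below are illustrative, not SulAmérica’s actual data model), an Avro schema for a patient-visit event might look like:

```json
{
  "type": "record",
  "name": "PatientVisitEvent",
  "namespace": "com.example.health",
  "fields": [
    {"name": "event_id", "type": "string"},
    {"name": "patient_id", "type": "string"},
    {"name": "facility_id", "type": "string"},
    {"name": "visit_timestamp",
     "type": {"type": "long", "logicalType": "timestamp-millis"}},
    {"name": "visit_type",
     "type": {"type": "enum", "name": "VisitType",
              "symbols": ["CONSULTATION", "EXAM", "PRESCRIPTION"]}}
  ]
}
```

With a registered schema, every team consuming from the topic relies on the same contract, which is what makes the connection “clean, fast, and guaranteed.”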
Better performance for better customer experience.
“On average it takes only 3 to 5 minutes to get the events into the CRM now. Previously, it took almost 24 hours to get event information into our CRM using batch processes and there were a lot of problems like having to restart and reprocess data without knowing what had already been processed and what hadn’t. Now, the data gets to the CRM very quickly and can be used by the doctors to understand crucial information about the patient, including prior visits, the timeline of illness, prescriptions, exams, and much more.”
Single source of truth.
“We’ve achieved even more than what we’d hoped just by organizing the information and making it flow without interruption. Everything was really slow. Now the database is gone and we’re focused on integrating with new partners and on formatting and mapping new information. Every time we discover a usable piece of information to be incorporated into our standard, we organize a connector to that data source and put that information into a topic, and that way we guarantee that the information will be flowing from that single source of truth.”
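The pattern Perez describes, organizing a connector per data source and landing the data in a topic, is typically done with Kafka Connect. A minimal sketch of a JDBC source connector configuration is shown below; the connector name, database URL, and column names are hypothetical examples, not SulAmérica’s actual configuration:

```json
{
  "name": "partner-visits-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://partner-db:5432/visits",
    "mode": "timestamp",
    "timestamp.column.name": "updated_at",
    "topic.prefix": "partner-visits-"
  }
}
```

Once a connector like this is running, the source system’s changes flow continuously into the topic, which then serves as the single source of truth for downstream consumers.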
What's Next?
Moving forward, Perez says he’d like to begin to use Confluent Cloud and ksqlDB to drive home the importance and value of streaming data to the company’s 700 developers. “Now we are trying to demonstrate the value of Confluent Cloud to the various teams,” says Perez. “We have 10 teams now starting to use Kafka as part of their solutions because they are seeing the advantages. We’re also working to improve data observability and using different tools to manage and monitor the Kafka environment.”
“A lot of things can be solved without having to manually create business rules,” Perez continues. “Just putting it into ksqlDB and only having to use the stream processing makes everything so much easier. With that, Confluent Cloud will be solving real problems for us in a very efficient and organized way, and at the beginning of the chain.”
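As a hedged sketch of the approach Perez describes, a ksqlDB persistent query can express a business rule as a continuous stream transformation instead of hand-written service code. The topic, stream, and field names below are illustrative assumptions:

```sql
-- Declare a stream over an existing events topic (names are illustrative)
CREATE STREAM patient_events (
  patient_id VARCHAR KEY,
  event_type VARCHAR,
  facility_id VARCHAR
) WITH (KAFKA_TOPIC = 'patient-events', VALUE_FORMAT = 'AVRO');

-- Continuously route only exam-result events to their own stream,
-- replacing a manually coded business rule
CREATE STREAM exam_results AS
  SELECT patient_id, facility_id
  FROM patient_events
  WHERE event_type = 'EXAM_RESULT'
  EMIT CHANGES;
```

Because the query runs continuously inside ksqlDB, the filtering happens “at the beginning of the chain,” before any downstream application has to implement the rule itself.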
Another thing changing the market is the advent of open insurance and open banking in Brazil, which means the playing field is being leveled and things are going to become much more competitive, Perez says.
“Only with an event brokering tool can we have immediate scalability to provide full functionality quickly for any type of campaign,” says Perez. “So with Confluent, we will be able to stay competitive in the open insurance and open banking market. Now we can fight.”