
GEP Boosts AI-Powered Supply Chain, Processing 1B Events per Month

Scaled from 500K to 1 billion events per month

0% data loss while streaming 1 billion events

Connected 500 microservices

Improved predictive performance by 1,400%

“When we're using services like Confluent for data delivery, we don't need to think about it. It just gets delivered.”

Nithin Prasad

Senior Engineering Manager, GEP Worldwide

As an AI-first organization specializing in supply chain and procurement solutions, GEP depends on real-time data to drive its operations. Previously, the company relied on quarterly batch processing, which delayed reporting and decision-making for its customers. 

“For a supply-chain company like GEP, offering insights with a day-long delay isn’t an option. Many decisions need to be made quickly to ensure on-time data delivery,” said Nithin Prasad, Senior Engineering Manager at GEP Worldwide.

A Technical Solution That Met 5 Key Business Requirements 

To solve its data infrastructure challenges and the high cost of delayed processing, GEP needed an alternative to its legacy data delivery pipelines and self-managed Apache Kafka® environment. Prasad’s team decided to adopt Confluent’s fully managed, cloud-native data streaming platform to connect operational, analytical, and AI systems across its data architecture.

The shift to Confluent was driven by five key business requirements: reliability, scalability, pricing, storage, and the human element of customer engagement. “Confluent is the best fit for us across all the five points,” Prasad said. “With every customer engagement we’ve experienced, we’ve always had positive feedback.”

Of particular interest to Prasad were Confluent’s pre-built, fully managed connectors, which eliminated the need to custom-build integrations. As a result, Prasad’s team was able to reduce pipeline sprawl and point-to-point connections, simplifying data pipeline management and freeing up developers to focus on building core features rather than micro-managing data logistics.
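To make the contrast with custom-built integrations concrete, here is a minimal sketch of registering a pre-built MongoDB source connector through a Kafka Connect-style REST API. The endpoint, credentials, database, and topic names are illustrative placeholders, not GEP’s actual configuration.

```python
import json

import requests  # third-party HTTP client, assumed installed

# Illustrative only: a hypothetical MongoDB source connector definition.
# The connector class is the official MongoDB Kafka connector; everything
# else (endpoint, URI, database, collection) is a placeholder.
connector = {
    "name": "mongodb-procurement-source",
    "config": {
        "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
        "connection.uri": "mongodb+srv://user:pass@example-cluster.mongodb.net",
        "database": "procurement",
        "collection": "orders",
        "topic.prefix": "mongo",  # events land on e.g. mongo.procurement.orders
    },
}

# Registering the connector is a single REST call; no custom integration
# code needs to be written or maintained.
resp = requests.post(
    "http://connect.example.com:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
resp.raise_for_status()
```

Once registered, a connector like this streams change events continuously, which is what replaces the point-to-point pipelines described above.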

Confluent’s continuous data delivery further enabled GEP to build reliable data pipelines connecting all of its systems, including key data storage systems.

Seamless Scaling and Integration Across Systems

GEP started by processing 500K events per month in a single cluster. After adopting Confluent, the company quickly scaled to handling a billion events per month across multiple clusters—all with zero data loss. This leap in scalability ensured the seamless operation of GEP’s 500 microservices. “Every piece of data that flows through GEP passes through Confluent,” Prasad said.
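A zero-data-loss claim at this scale generally rests on producer-side delivery guarantees. The sketch below shows that pattern with the confluent-kafka Python client; the broker address, topic name, and payload are assumptions for illustration, not GEP’s setup.

```python
from confluent_kafka import Producer  # assumes confluent-kafka is installed

# acks=all plus idempotence is the standard Kafka recipe for avoiding
# silent producer-side data loss; the broker address is a placeholder.
producer = Producer({
    "bootstrap.servers": "broker.example.com:9092",
    "acks": "all",               # wait for all in-sync replicas to confirm
    "enable.idempotence": True,  # broker deduplicates retried sends
})

def on_delivery(err, msg):
    # Failing loudly on delivery errors is what makes
    # "every event accounted for" verifiable in practice.
    if err is not None:
        raise RuntimeError(f"delivery failed: {err}")

producer.produce(
    "supply-chain.events",                   # hypothetical topic
    value=b'{"event": "shipment_updated"}',  # hypothetical payload
    callback=on_delivery,
)
producer.flush()  # block until the broker acknowledges delivery
```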

With Confluent’s multi-cloud support, GEP integrated real-time data from multiple systems, including SQL databases, MongoDB, and Elasticsearch, directly into its AI stack. These real-time data streams enabled GEP to predict outages 30 minutes in advance—1,400% earlier than its previous two-minute lead time. This proactive approach to outage prevention was key to minimizing downtime and enhancing clients’ access to data, boosting GEP’s customer satisfaction (CSAT) scores.
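As a rough illustration of that outage-prediction flow, the sketch below polls a hypothetical telemetry topic and alerts when a placeholder model scores an outage as likely. The topic, consumer group, threshold, and model are invented for the example.

```python
import json

from confluent_kafka import Consumer

# Hypothetical consumer configuration; all names are illustrative.
consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9092",
    "group.id": "outage-predictor",
    "auto.offset.reset": "latest",  # predictions only need fresh data
})
consumer.subscribe(["infra.telemetry"])

def predict_outage(metrics: dict) -> float:
    """Stand-in for a trained model returning an outage probability."""
    return 0.0

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        metrics = json.loads(msg.value())
        if predict_outage(metrics) > 0.9:  # illustrative threshold
            print("Outage likely within ~30 minutes; alerting on-call.")
finally:
    consumer.close()
```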

By adopting Confluent’s fully managed data streaming platform, GEP reduced both maintenance time and costs. Automating tasks like infrastructure monitoring also allowed GEP’s teams to focus on building new features, leading to optimized resource allocation.

Empowering GenAI Innovation With Real-Time Data Streaming

Prasad explained that GEP’s AI-first solutions rely on real-time data, saying, “Confluent fits seamlessly into our data stack. It’s natively integrated with Azure OpenAI services and other GPT models.”

GEP has successfully embedded AI into its daily operations. What began as internal, AI-driven initiatives across teams and leadership has now evolved into a generative AI chatbot that provides document summaries and insights, and flags risks in procurement and supply chain operations for customers.

GEP also offers its customers the ability to build their own tools using generative AI, simply by typing in natural language. With Confluent’s real-time streaming data pipelines in place, GEP can feed models with real-time context, allowing them to respond to the specific, in-the-moment needs of users and solve their unique challenges. “It’s a huge plus for GEP’s customers, greatly boosting user experience, customer experience, and CSAT scores,” Prasad emphasized.
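To give a sense of what feeding real-time context into a model can look like, here is a minimal sketch using the Azure OpenAI Python SDK. The endpoint, deployment name, and stream-sourced events are placeholders; in practice the events would come from a Kafka consumer like the one sketched above.

```python
from openai import AzureOpenAI  # assumes the openai package is installed

# All connection details are placeholders, not GEP's configuration.
client = AzureOpenAI(
    azure_endpoint="https://example.openai.azure.com",
    api_key="<api-key>",
    api_version="2024-02-01",
)

# In practice this context would be read from a Kafka topic moments ago.
recent_events = '[{"supplier": "Acme", "status": "shipment delayed 2 days"}]'

response = client.chat.completions.create(
    model="gpt-4o",  # the Azure deployment name, not a literal model id
    messages=[
        {"role": "system",
         "content": "Summarize procurement risk from the live events provided."},
        {"role": "user",
         "content": f"Latest events from the stream: {recent_events}"},
    ],
)
print(response.choices[0].message.content)
```

The point of the pattern is that the prompt reflects the state of the business seconds ago, not the last batch run.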

Having the right data in the right format, at the right time, is critical. As Prasad put it, “Data delivery is as important as machine learning. Everything we’re doing with AI is only possible if we have the right data in place. And that’s where Confluent really helps us.”

When advising other companies on selecting a data streaming platform, Prasad recommended choosing “a partner who will accompany you on a multi-year journey,” emphasizing the importance of aligning on core values and customer engagement.

By transforming its data management capabilities with Confluent, GEP now delivers faster, more reliable, and scalable AI-powered solutions to its clients.
