The Data Streaming Awards help recognize and celebrate organizations that are harnessing the power of this innovative technology to transform their businesses and provide increased value to their customers and communities.
Hosting the awards gives us, and you, a glimpse into the cool new things companies, teams, and people are doing with data streaming, whether they're building early-stage prototypes or driving massively scalable use cases.
Let’s look at our five winners this year—and the incredible use cases and business outcomes they are driving with data streaming.
Ever wondered how Uber delivers such a seamless customer experience?
The answer is Apache Kafka®. Uber uses Kafka to process 10 trillion messages a day, powering fast, reliable, real-time data processing and messaging for ride and delivery services, data ingestion, logging, change data capture, and pub/sub.
But what problem were they looking to solve with data streaming technology?
Uber needed a reliable (>99.99%), fast (<10 ms), and scalable (10 trillion messages a day) data processing and messaging platform to power daily business functions like rides, deliveries, operations, offline data and ML workloads, analytics, and more.
Uber used Kafka to build a mission-critical data processing and messaging platform. Today, Uber relies on Apache Kafka for uninterrupted data flow and business continuity. Using Kafka's messaging platform, Uber has achieved:
Faster, more reliable data processing, enabling real-time decisions and dynamic pricing
Timely updates that improve communication between passengers and drivers
Valuable insights from data analysis that optimize operations and enhance the user experience
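To make the pub/sub pattern concrete, here is a minimal Kafka producer sketch in Java. The topic name (ride-events), event payload, and broker address are illustrative assumptions for this post, not details of Uber's actual implementation.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class RideEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder address; a real deployment points at the production cluster.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // acks=all trades a little latency for the durability a ride platform needs.
        props.put("acks", "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by ride ID keeps all events for one ride ordered on one partition.
            String rideId = "ride-42";
            String event = "{\"rideId\":\"ride-42\",\"status\":\"DRIVER_ARRIVED\"}";
            producer.send(new ProducerRecord<>("ride-events", rideId, event));
        }
    }
}
```

Keying each event by ride ID keeps all updates for a given ride in order on a single partition, which is the property that makes timely, consistent passenger-driver updates possible.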
Penske is more than the yellow trucks you see on the roads. Penske delivers a full range of innovative transportation and logistics solutions that are vital to the success of the businesses and customers it serves.
And given that vehicle uptime is the number one thing Penske's customers rely on them for, use cases like predictive maintenance and remote vehicle health monitoring have become mission critical for Penske.
Today, Penske processes around 190 million IoT messages a day using Confluent as its messaging queue. It runs those messages through a proactive diagnostics AI engine to help predict when a truck is going to fail. With that information, plus knowledge of where the truck is right now, Penske can call the customer and ask them to bring the truck in, or schedule the maintenance that's needed.
With technologies built around data streaming, Penske estimates it will prevent over 90,000 roadside breakdowns this year alone.
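To sketch what this kind of pipeline can look like, here is a hypothetical Kafka consumer in Java that scores incoming truck telemetry for failure risk. The topic name, message format, and the scoreFailureRisk placeholder are assumptions standing in for Penske's proactive diagnostics AI engine, not its actual code.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class TruckDiagnostics {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("group.id", "diagnostics-engine");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // One telemetry message per truck event, keyed by truck ID.
            consumer.subscribe(List.of("truck-telemetry"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    double risk = scoreFailureRisk(record.value());
                    if (risk > 0.9) {
                        // In production this could open a maintenance ticket or trigger a customer call.
                        System.out.printf("High failure risk for truck %s: %.2f%n", record.key(), risk);
                    }
                }
            }
        }
    }

    // Placeholder for the diagnostics engine; a real system applies a trained ML model here.
    private static double scoreFailureRisk(String telemetryJson) {
        return telemetryJson.contains("\"engineTempC\":120") ? 0.95 : 0.1;
    }
}
```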
Cross-border payments fintech company Wise makes transferring money a breeze for its customers. It provides customers with the ability to send money over wires, hold money in different currencies, spend money with a debit card, and invest.
What’s powering this? Use of technologies like Apache Kafka® and Apache Flink®.
Wise uses Kafka Streams and Flink-based engines in its stream processing platform to enable real-time aggregations, data replication, and enrichment of its financial data.
Wise implemented a stream processing platform: a distributed, scalable system designed to handle large amounts of streaming data in real time. The platform is designed to be self-service, which means teams can access and use it without extensive technical knowledge.
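As an illustration of the kind of real-time aggregation such a platform supports, here is a minimal Kafka Streams topology in Java that keeps a running total of transfer amounts per currency. The topic names, keying scheme, and use of minor currency units are assumptions for the sketch, not Wise's actual design.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class TransferTotals {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "transfer-totals");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

        StreamsBuilder builder = new StreamsBuilder();

        // Input: key = currency code (e.g., "GBP"), value = transfer amount in minor units.
        KTable<String, Long> totals = builder
                .stream("transfers", Consumed.with(Serdes.String(), Serdes.Long()))
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Long()))
                // Continuously updated running total per currency.
                .reduce(Long::sum);

        // Every update to a total is emitted downstream as a new event.
        totals.toStream().to("transfer-totals", Produced.with(Serdes.String(), Serdes.Long()));

        new KafkaStreams(builder.build(), props).start();
    }
}
```

Because the aggregation is maintained incrementally as events arrive, downstream consumers always see up-to-date totals instead of waiting for a batch job to finish.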
The stream processing platform enabled Wise to:
Serve tens of millions of customers across the world and move around £27 billion in a single quarter
Provide customers with more efficient, secure money transfers, without long waits for transfers to be processed
The United States Postal Service's mission is to provide the nation with reliable, affordable, universal mail service. And it's the use of technologies like data streaming that helps USPS power its real-time mail operations.
When batch-based legacy systems were holding them back, USPS implemented data streaming technology to transform operations with a focus on customer service, reliability, and innovation. Today, use of data streaming helps USPS track 421.4 million pieces of mail daily, including 23.8 million packages.
USPS implemented event-driven architecture and data streaming in its Informed Visibility program to determine where flat mail, packages, and barcoded mail pieces are throughout the system, and to help protect revenue. USPS is also adding new features like average delivery time between ZIP codes. Event streaming technology is also used in other USPS programs, including the initiative to provide free COVID-19 test kits to Americans, an end-to-end solution of ordering, kitting, staging, distribution, and delivery that was deployed in just three weeks.
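As a sketch of how a feature like average delivery time between ZIP codes could be computed with stream processing, here is a hypothetical Kafka Streams aggregation in Java. The topic, key format, and the count|sum state encoding are illustrative assumptions only, not USPS's implementation.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.Materialized;

public class ZipPairDeliveryTimes {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "zip-pair-delivery-times");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

        StreamsBuilder builder = new StreamsBuilder();

        // Input: key = "originZip->destZip", value = delivery time in hours for one mail piece.
        builder.stream("delivery-times", Consumed.with(Serdes.String(), Serdes.Double()))
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
                // Keep a "count|sum" pair per ZIP pair so a running average can be derived.
                .aggregate(
                        () -> "0|0.0",
                        (zipPair, hours, agg) -> {
                            String[] parts = agg.split("\\|");
                            long count = Long.parseLong(parts[0]) + 1;
                            double sum = Double.parseDouble(parts[1]) + hours;
                            return count + "|" + sum;
                        },
                        Materialized.with(Serdes.String(), Serdes.String()))
                .toStream()
                .foreach((zipPair, agg) -> {
                    String[] parts = agg.split("\\|");
                    double avg = Double.parseDouble(parts[1]) / Long.parseLong(parts[0]);
                    System.out.printf("%s average delivery time: %.1f hours%n", zipPair, avg);
                });

        new KafkaStreams(builder.build(), props).start();
    }
}
```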
PAUL is changing the world of building technology through digitalization and AI. PAUL uses data streaming to process data from sensors and actuators in buildings, optimizing energy consumption and making buildings more efficient.
PAUL's goal is to make buildings more efficient and reduce their energy consumption by up to 40%, with minimal physical modifications to the existing infrastructure.
How did they make it happen? By integrating IoT devices with Kafka via Sparkplug Edge of Network (EoN) nodes and Kafka Connect. Processing data from sensors and actuators with PAUL's ML models helped optimize energy consumption and make buildings more efficient. The result? Reduced energy costs and sustainable use of resources, benefiting both business operations and the environment.
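To give a sense of what this integration can look like, here is a hypothetical Kafka Connect configuration sketch, assuming Confluent's MQTT Source Connector as the bridge to the Sparkplug broker. The broker URI, MQTT topic filter, and Kafka topic name are illustrative assumptions, not PAUL's actual setup.

```properties
# Hypothetical standalone-mode connector config; property names follow
# Confluent's MQTT Source Connector, but all URIs, topics, and names here
# are illustrative assumptions.
name=sparkplug-mqtt-source
connector.class=io.confluent.connect.mqtt.MqttSourceConnector
tasks.max=1
# MQTT broker that the Sparkplug Edge of Network (EoN) nodes publish to.
mqtt.server.uri=tcp://edge-broker:1883
# Sparkplug B topic namespace for the building fleet.
mqtt.topics=spBv1.0/buildings/#
kafka.topic=building-sensor-data
# Sparkplug payloads are protobuf-encoded, so pass raw bytes through and
# decode them in the stream processing layer.
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
```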
Interested in learning how you can use data streaming to power innovation and drive unparalleled customer experiences? Explore how Confluent can help kickstart your data streaming journey.