Save 25% (or even more) on your Kafka costs | Take the Confluent Kafka savings challenge
Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
At Treehouse Software, when we speak with customers who are planning to modernize their enterprise mainframe systems, there’s a common theme: they are faced with decades of mission-critical and historical legacy mainframe data in disparate databases...
Building data streaming applications and growing them beyond a single team is challenging: data silos develop easily and can be hard to eliminate. The tools provided by Confluent’s Stream Governance platform can help break down those walls and make your data accessible to those who need it.
Change data capture (CDC) converts all the changes that occur inside your database into events and publishes them to an event stream. You can then use these events to power analytics, drive operational use cases, hydrate databases, and more. The pattern is enjoying wider adoption than ever before.
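To make the pattern concrete, here is a minimal sketch of the consuming side, assuming a Debezium-style CDC topic named dbserver1.inventory.orders, a local broker, and string-encoded events (all hypothetical); each record the consumer sees is one row-level change event:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class CdcEventReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "cdc-analytics");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Topic name follows the Debezium convention <server>.<schema>.<table>;
            // "dbserver1.inventory.orders" is a hypothetical example.
            consumer.subscribe(List.of("dbserver1.inventory.orders"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Each record is one row-level change event (insert, update, or delete).
                    System.out.printf("key=%s change=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```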
I'm excited to share our intent to acquire Immerok! Together, we’ll build a cloud-native service for Apache Flink that delivers the same simplicity, security, and scalability that you expect from Confluent for Kafka.
Capturing tech trends has become a bit tricky these days: whatever industry you’re in, uncertainty abounds. That’s made planning more difficult, but businesses are finding new ways to innovate with emerging technology and respond quickly to fast-changing market conditions.
In this post, we show how to use .NET Kafka clients along with the Task Parallel Library to build a robust, high-throughput event streaming application...
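The post itself works in C# with the TPL; purely as a rough analog of the same consume-then-process-in-parallel idea, here is a hedged Java sketch using an ExecutorService (topic name, pool size, and the process() body are placeholders, not the post's code):

```java
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ParallelProcessingSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "parallel-demo");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        ExecutorService pool = Executors.newFixedThreadPool(8);
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                // Fan the batch out to the pool, then wait for the whole batch
                // before committing, so offsets never move past unprocessed records.
                List<CompletableFuture<Void>> futures = new ArrayList<>();
                for (ConsumerRecord<String, String> record : records) {
                    futures.add(CompletableFuture.runAsync(() -> process(record), pool));
                }
                CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).join();
                consumer.commitSync();
            }
        }
    }

    static void process(ConsumerRecord<String, String> record) {
        // Placeholder for per-record work (deserialization, enrichment, I/O, ...).
    }
}
```

Note the trade-off: fanning a batch out to a pool gives up per-partition ordering within that batch, which is why the sketch waits for the whole batch to finish before committing offsets.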
Learn what a Kafka consumer group ID is and how assigning one to Kafka consumers during configuration helps with detecting new data, work sharing, and data recovery.
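For reference, the group ID is a single consumer configuration property; a minimal sketch, assuming a local broker and a hypothetical orders topic:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class GroupedConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Every consumer started with this same group.id joins one group:
        // the topic's partitions are divided among the members, and if a
        // member dies, its partitions (and committed offsets) move to the rest.
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-processors");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders")); // hypothetical topic
            while (true) {
                consumer.poll(Duration.ofSeconds(1)).forEach(r ->
                        System.out.printf("partition %d: %s%n", r.partition(), r.value()));
            }
        }
    }
}
```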
Self-managing connectors brings major time and resource challenges and unnecessary risks of downtime, shifting your team’s focus away from more strategic projects and innovations...
Today, 92% of the world’s top 100 banks and 72% of the top 25 retailers use mainframes to deliver secure, highly reliable data for their customers. Citigroup even estimates that while banks spend over $200 billion a year on IT, nearly 80% of that money goes towards maintaining mainframe-dependent...
Over the last decade, there’s been a massive movement toward digitization. Enterprises are defining their business models, products, and services to innovate, thrive, and compete by being able to quickly discover, understand, and apply their data assets to power real-time use cases.
If you’ve used Kafka for any amount of time, you’ve likely heard about connections; the most common place they come up is with clients. Sure, producer and consumer clients connect to the cluster to do their jobs, but it doesn’t stop there. Nearly all interactions across a cluster...
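Even operations that feel read-only open connections. A small sketch with the Java AdminClient (the bootstrap address is a placeholder): it connects first to a bootstrap server, then to whichever brokers the returned metadata points at:

```java
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.DescribeClusterResult;

public class ClusterConnections {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (AdminClient admin = AdminClient.create(props)) {
            // Even this metadata-only call opens network connections to the cluster.
            DescribeClusterResult cluster = admin.describeCluster();
            System.out.println("cluster id: " + cluster.clusterId().get());
            cluster.nodes().get().forEach(node ->
                    System.out.printf("broker %d at %s:%d%n", node.id(), node.host(), node.port()));
        }
    }
}
```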
Setting up proactive, synthetic monitoring is critical for complex, distributed systems like Apache Kafka®, especially when they’re deployed on Kubernetes and the end-user experience is at stake, and it’s paramount for healthy real-time data pipelines...
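One common form of synthetic monitoring is a heartbeat probe: produce a timestamped message and measure how long it takes to come back. A minimal sketch, assuming a dedicated healthcheck topic and a local broker (both hypothetical):

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.*;

public class SyntheticProbe {
    public static void main(String[] args) throws Exception {
        String bootstrap = "localhost:9092"; // placeholder address

        Properties prodProps = new Properties();
        prodProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
        prodProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        prodProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        Properties consProps = new Properties();
        consProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
        consProps.put(ConsumerConfig.GROUP_ID_CONFIG, "synthetic-probe");
        consProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(prodProps);
             KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consProps)) {
            consumer.subscribe(List.of("healthcheck")); // hypothetical probe topic
            consumer.poll(Duration.ofSeconds(1)); // give the group a moment to join first

            long sentAt = System.currentTimeMillis();
            producer.send(new ProducerRecord<>("healthcheck", "probe", Long.toString(sentAt))).get();

            // Poll until the probe message comes back; the round trip approximates
            // the end-to-end latency an application would see.
            long deadline = System.currentTimeMillis() + 10_000;
            while (System.currentTimeMillis() < deadline) {
                for (ConsumerRecord<String, String> r : consumer.poll(Duration.ofMillis(200))) {
                    if ("probe".equals(r.key())) {
                        System.out.printf("end-to-end latency: %d ms%n",
                                System.currentTimeMillis() - Long.parseLong(r.value()));
                        return;
                    }
                }
            }
            System.err.println("probe message not received within 10s");
        }
    }
}
```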
Businesses are generating more data than ever on a daily basis. As a result, many enterprises are undergoing a digital transformation that centers on their ability to contextualize and harness the value of their data in real time.
This Thanksgiving-themed blog post walks through a brand-new stream processing use case recipe for analyzing survey responses in real time and gives ideas for how to spice it up and make the recipe your own!
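The recipe lives in the post itself; purely as a flavor of what real-time survey analysis can look like, here is a hypothetical Kafka Streams sketch that keeps a running count of responses per answer (topic names and the response encoding are assumptions, not the recipe's actual pipeline):

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class SurveyCounts {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "survey-counts");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Hypothetical topic "survey-responses": key = respondent id, value = chosen answer.
        KStream<String, String> responses = builder.stream("survey-responses");
        responses
                .groupBy((respondent, answer) -> answer) // re-key by answer
                .count()                                  // running tally per answer
                .toStream()
                .to("survey-results", Produced.with(Serdes.String(), Serdes.Long()));

        new KafkaStreams(builder.build(), props).start();
    }
}
```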