Save 25% (or even more) on your Kafka costs | Take Confluent's Kafka savings challenge
Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
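To make the pattern concrete, here is a minimal sketch of consuming a Debezium-style CDC change event from Kafka. The broker address, the orders.cdc topic, and the field layout are illustrative assumptions, not the post's actual code.

```python
# Hypothetical sketch: reading Debezium-style CDC change events from Kafka.
# Topic name, consumer config, and envelope fields are assumptions.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed local broker
    "group.id": "cdc-orders-reader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders.cdc"])           # hypothetical CDC topic

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    envelope = json.loads(msg.value())
    # Debezium envelopes carry before/after row images plus an "op" code
    # (c = create, u = update, d = delete).
    op = envelope.get("op")
    if op in ("c", "u"):
        row = envelope["after"]
        print(f"upsert order {row['id']}: {row}")
    elif op == "d":
        print(f"delete order {envelope['before']['id']}")
```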
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
In the past, technology served a supporting function for the business. Over time, it has become the business itself. A similar shift is happening with data streaming: it is now a critical foundation of modern business, and this year is an inflection point for data streaming platforms.
In part 1 of this series, we’ll build an app, powered by Kafka and Flink SQL in Confluent Cloud and visualized with Streamlit, that lets a user select a stock, in this case SPY (the SPDR S&P 500 ETF Trust). Upon selection, a live chart of the stock’s bid prices, calculated every five seconds...
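As a taste of what part 1 builds, here is a hedged Streamlit sketch of the live-chart idea. The windowed_bids topic, record schema, and local broker are placeholders standing in for the Confluent Cloud and Flink SQL setup the post walks through.

```python
# Illustrative sketch only: chart windowed bid prices arriving on a Kafka topic.
# Topic name, message schema, and connection details are assumptions.
import json
import pandas as pd
import streamlit as st
from confluent_kafka import Consumer

st.title("SPY bid price (5-second windows)")
symbol = st.selectbox("Symbol", ["SPY"])     # the post uses SPY as its example

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # Confluent Cloud credentials would go here
    "group.id": "streamlit-viewer",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["windowed_bids"])        # hypothetical Flink SQL output topic

placeholder = st.empty()
rows = []
while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    rec = json.loads(msg.value())            # e.g. {"symbol": ..., "window_end": ..., "avg_bid": ...}
    if rec.get("symbol") == symbol:
        rows.append(rec)
        placeholder.line_chart(pd.DataFrame(rows), x="window_end", y="avg_bid")
```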
Businesses that are best able to leverage data have a significant competitive advantage. This is especially true in financial services, an industry in which leading organizations are in constant competition to develop the most responsive, personalized customer experiences.
The post discusses the Dual-Write Problem in distributed systems, where atomic updates across multiple systems like databases and messaging systems (e.g., Apache Kafka) are challenging, leading to potential inconsistencies. It outlines common anti-patterns that fail to address the issue...
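A commonly cited remedy for the dual-write problem is the transactional outbox: the business row and the event land in one database transaction, and a separate relay publishes outbox rows to Kafka afterward. The sketch below illustrates that idea under stated assumptions; sqlite3 stands in for a production database, and all table and topic names are hypothetical.

```python
# Hedged sketch of the transactional-outbox pattern: one atomic DB transaction,
# so no Kafka write can be "half done" relative to the business data.
import json
import sqlite3
import uuid

conn = sqlite3.connect("shop.db")            # sqlite3 stands in for a real database
conn.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT PRIMARY KEY, total REAL)")
conn.execute("CREATE TABLE IF NOT EXISTS outbox (id TEXT PRIMARY KEY, topic TEXT, payload TEXT)")

def place_order(total: float) -> str:
    order_id = str(uuid.uuid4())
    with conn:  # both inserts commit together or not at all
        conn.execute("INSERT INTO orders VALUES (?, ?)", (order_id, total))
        conn.execute(
            "INSERT INTO outbox VALUES (?, ?, ?)",
            (str(uuid.uuid4()), "orders", json.dumps({"order_id": order_id, "total": total})),
        )
    return order_id

place_order(99.50)
# A relay process (not shown), often CDC on the outbox table, reads new rows
# and produces them to Kafka, keeping the DB transaction the source of truth.
```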
Find out how Zhibo handles tricky conversations with customers who aren’t quite sure what data problems they have and how Confluent can help.
If you know me, you know that I’m always looking for any excuse to bring the data streaming community together.
The blog post delves into best practices and recommendations for utilizing the Confluent Terraform Provider. It offers insights on efficiently provisioning resources within Confluent Cloud infrastructure while ensuring adherence to industry standards. Additionally, it provides a GitHub repository...
The Data Streaming Awards is back for its third year! Designed to bring the data streaming community together, this one-of-a-kind industry award event recognizes organizations that are harnessing the power of this revolutionary technology to drive business and customer experience transformation.
Analyzing Confluent Cloud audit logs is good, but being proactively alerted when something suspicious happens is better. This article provides a conceptual guide for developing a pipeline that transfers Confluent Cloud audit logs into Splunk and defines automatic alerts based on certain events.
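A minimal sketch of that pipeline idea follows, assuming the documented confluent-audit-log-events topic; the suspicious-event filter, Splunk HEC endpoint, and credentials are placeholders, not the article's actual implementation.

```python
# Conceptual sketch only: filter Confluent Cloud audit-log events and forward
# suspicious ones to Splunk's HTTP Event Collector (HEC).
import json
import requests
from confluent_kafka import Consumer

SPLUNK_HEC = "https://splunk.example.com:8088/services/collector/event"  # assumed
HEC_TOKEN = "REPLACE_ME"
SUSPICIOUS = {"io.confluent.kafka.server/authentication"}  # example event type

consumer = Consumer({
    "bootstrap.servers": "<audit-log-cluster>",  # audit logs live in a dedicated cluster
    "group.id": "audit-to-splunk",
    "auto.offset.reset": "earliest",
    # SASL credentials for the audit-log cluster omitted for brevity
})
consumer.subscribe(["confluent-audit-log-events"])

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())          # audit logs are CloudEvents with a "type"
    if event.get("type") in SUSPICIOUS:
        requests.post(
            SPLUNK_HEC,
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
            json={"event": event, "sourcetype": "confluent:audit"},
            timeout=5,
        )
```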
Apache Kafka® has become the de facto standard for data streaming, used by organizations everywhere to anchor event-driven architectures and power mission-critical real-time applications.
The payments industry is evolving rapidly, fueled by technological advancements, changing consumer behaviors, and a growing appetite for real-time transactions. As this transformation unfolds, new standards have been introduced to ensure the payments ecosystem's safety, security, and efficiency.
Confluent’s OpenSearch Sink Connector lets you easily send events to AWS OpenSearch and other destinations, enabling fraud detection, log analytics, social media monitoring, and generative AI with retrieval-augmented generation (RAG).
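For the self-managed flavor of this idea, a sink connector can be registered through the Kafka Connect REST API; the sketch below is illustrative, and the config keys shown are assumptions modeled on typical sink-connector configs, so check them against the connector's documentation.

```python
# Illustrative sketch: registering a sink connector on a self-managed
# Kafka Connect cluster via its REST API. Property values are placeholders.
import requests

connector = {
    "name": "opensearch-sink-demo",
    "config": {
        "connector.class": "...OpenSearchSinkConnector",  # exact class per the docs
        "topics": "transactions",                          # hypothetical source topic
        "connection.url": "https://opensearch.example.com:9200",  # assumed key/value
        "tasks.max": "1",
    },
}

resp = requests.post(
    "http://localhost:8083/connectors",   # default Connect REST endpoint
    json=connector,
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```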
In Nigeria, small and medium-sized businesses (SMBs) account for 48% of the national GDP. Moniepoint provides financial solutions that power the aspirations of these SMBs, from payments and credit to business management and banking services.