Save 25% (or even more) on your Kafka costs | Take the Kafka savings challenge with Confluent
Change data capture is a popular method to connect database tables to data streams, but it comes with drawbacks. The next evolution of the CDC pattern, first-class data products, provides resilient pipelines that support both real-time and batch processing while isolating upstream systems...
Learn how the latest innovations in Kora enable us to introduce new Confluent Cloud Freight clusters, which can save you up to 90% at GBps+ scale. Confluent Cloud Freight clusters are now available in Early Access.
Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.
How Confluent’s new Google Cloud Ready - Cloud SQL designation can help you accelerate business transformation
Senior Software Engineer Yash Mayya shares his path to Confluent, his work on Kafka Connect, and how he plans to keep growing his career.
Versioned key-value state stores, introduced in Kafka Streams 3.5, enhance stateful processing capabilities by allowing users to store multiple record versions per key, rather than only the single latest version per key, as is the case for existing key-value stores...
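As a minimal sketch of how a versioned store can be wired into a topology (assuming Kafka Streams 3.5+; the topic name, store name, and retention period below are illustrative placeholders):

```java
import java.time.Duration;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.Stores;

public class VersionedStoreExample {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Materialize a table into a versioned key-value store that retains
        // 24 hours of history per key, so older record versions remain
        // queryable (e.g., via VersionedKeyValueStore#get(key, asOfTimestamp)).
        builder.table(
            "prices",
            Consumed.with(Serdes.String(), Serdes.Double()),
            Materialized.<String, Double>as(
                Stores.persistentVersionedKeyValueStore("prices-store", Duration.ofHours(24))));

        // The resulting topology can be passed to a KafkaStreams instance as usual.
        System.out.println(builder.build().describe());
    }
}
```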
See how Powerledger uses data streaming to facilitate peer-to-peer trading of renewable electricity.
Capturing and using streaming data in real time is essential today. Industry expert Sumit Pal explains why, and where managed services fit in.
Most people are not jumping for joy at the prospect of taking out a loan, or even going to the bank. If anything, banking is a chore (and not a very exciting one). But what if banking were fast, simple, and easier to understand?
Learn why stream processing is such a critical component of the data streaming stack, why developers are choosing Apache Flink as their stream processing framework, and how to use Flink with Kafka.
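For context, here is a minimal sketch of a Flink DataStream job reading from a Kafka topic, assuming the flink-connector-kafka dependency is on the classpath; the broker address, topic, and group id are placeholders:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FlinkKafkaExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Broker address, topic, and consumer group are illustrative only.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("clicks")
                .setGroupId("flink-demo")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // Consume the Kafka topic as an unbounded stream and print each record.
        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source")
           .print();

        env.execute("flink-kafka-demo");
    }
}
```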
For Niki Kapsi, commercial account executive at Confluent, it’s the “entrepreneurial” aspect of her role that she’s most excited about.
Let’s learn more about how Niki got to Confluent—and how the company fosters a culture of learning and growth that keeps her driven and motivated.
Confluent Cloud has chosen Let’s Encrypt as its Certificate Authority and leverages the CA’s automation features to spend less time managing certificates and more time building private networking features.
When I say summer, you say… beach, popsicles, and camping.
And this summer, I went camping (albeit virtually) with Camp Confluent—a three-week (July 10-28) immersive learning experience focused on all things data streaming.
I’m thrilled to announce that Confluent has received the Financial Services Competency from AWS. The AWS Competency Program recognizes and promotes AWS Partners who exhibit technical expertise and customer success, enabling them to market and differentiate their businesses to AWS customers...
Real-time AI is the future, and AI/ML models have demonstrated incredible potential for predicting and generating media in various business domains. For the best results, these models must be informed by relevant data.
Learn the basics of what an Apache Kafka cluster is and how it works, from brokers and partitions to load balancing, replication, and the handling of leader and replica failures.
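As a small illustration of those concepts, the sketch below creates a topic with the Java AdminClient, assuming a reachable cluster; the topic name, partition count, and broker address are placeholders:

```java
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopicExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Point this at your own cluster; localhost:9092 is illustrative.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Six partitions spread load across brokers; replication factor 3
            // keeps follower replicas of each partition on other brokers, so a
            // new leader can be elected if the current leader's broker fails.
            NewTopic topic = new NewTopic("orders", 6, (short) 3);
            admin.createTopics(Collections.singleton(topic)).all().get();
        }
    }
}
```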
Confluent recently released its 2024 Data Streaming Report: Breaking Down the Barriers to Business Agility & Innovation. The report found that data streaming is delivering business value, with 41% of IT leaders reporting returns of 5x or more on their data streaming investments.