Category: Database, Data, Kubernetes, Docker, shell

In Kafka Connect on Kubernetes, the easy way!, I demonstrated Kafka Connect on Kubernetes using Strimzi along with the File source and sink connectors. This blog will showcase how to build a simple data pipeline with MongoDB and Kafka using the MongoDB Kafka connectors, deployed on Kubernetes with Strimzi.
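To make the moving parts concrete, here is a rough sketch (not the exact manifests from the walkthrough) of how the source and sink connectors could be declared as Strimzi KafkaConnector resources. The Connect cluster name my-connect-cluster (configured further down), the kafkademo database, the source/sink collections, the topic prefix, and the connection string are all placeholders.

```shell
# Sketch: MongoDB source and sink connectors as Strimzi KafkaConnector resources.
# All names and the connection string are placeholders.
cat <<'EOF' | kubectl apply -f -
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: mongodb-source-connector
  labels:
    strimzi.io/cluster: my-connect-cluster   # must match the KafkaConnect cluster name
spec:
  class: com.mongodb.kafka.connect.MongoSourceConnector
  tasksMax: 1
  config:
    connection.uri: "mongodb://<account>:<key>@<account>.mongo.cosmos.azure.com:10255/?ssl=true"
    database: "kafkademo"
    collection: "source"
    topic.prefix: "mongo"                # records land on topic mongo.kafkademo.source
    copy.existing: "true"                # also copy documents already in the collection
    publish.full.document.only: "true"   # emit just the document, not the full change event
---
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: mongodb-sink-connector
  labels:
    strimzi.io/cluster: my-connect-cluster
spec:
  class: com.mongodb.kafka.connect.MongoSinkConnector
  tasksMax: 1
  config:
    connection.uri: "mongodb://<account>:<key>@<account>.mongo.cosmos.azure.com:10255/?ssl=true"
    topics: "mongo.kafkademo.source"     # topic written by the source connector above
    database: "kafkademo"
    collection: "sink"
EOF
```

Declaring connectors this way means the Strimzi operator, rather than direct calls to the Kafka Connect REST API, creates and manages them.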

Azure Kubernetes Service (AKS) reduces the complexity and operational overhead of managing Kubernetes by offloading much of that responsibility to Azure.

Kafka Connect will need to reference an existing Kafka cluster (which in this case is Azure Event Hubs).
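One way to wire that up is sketched below, with assumed names (the eventhubs-secret Secret, its eventhubs-connection-string key, and the Connect image are placeholders): store the Event Hubs connection string in a Kubernetes Secret, then point the Strimzi KafkaConnect resource at the Event Hubs Kafka endpoint on port 9093, authenticating with SASL PLAIN where the username is the literal $ConnectionString and the password is the connection string itself.

```shell
# Store the Event Hubs connection string in a Kubernetes Secret
# (the value shown is a placeholder for the real connection string)
kubectl create secret generic eventhubs-secret \
  --from-literal=eventhubs-connection-string='Endpoint=sb://<eventhubs-namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...'

# Kafka Connect cluster managed by Strimzi, pointed at the Event Hubs Kafka endpoint.
# The image is assumed to be a custom Kafka Connect image with the MongoDB connector
# plugin baked in.
cat <<'EOF' | kubectl apply -f -
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnect
metadata:
  name: my-connect-cluster
  annotations:
    strimzi.io/use-connector-resources: "true"   # manage connectors as KafkaConnector resources
spec:
  replicas: 1
  image: "<registry>/kafka-connect-mongodb:latest"
  bootstrapServers: "<eventhubs-namespace>.servicebus.windows.net:9093"   # Event Hubs Kafka endpoint
  tls:
    trustedCertificates: []        # trust the default CA store for the Event Hubs TLS certificate
  authentication:
    type: plain                    # SASL PLAIN
    username: $ConnectionString    # literal username expected by Event Hubs
    passwordSecret:
      secretName: eventhubs-secret
      password: eventhubs-connection-string
  config:
    group.id: mongodb-connect-cluster
    offset.storage.topic: mongodb-connect-cluster-offsets
    config.storage.topic: mongodb-connect-cluster-configs
    status.storage.topic: mongodb-connect-cluster-status
    offset.storage.replication.factor: 1
    config.storage.replication.factor: 1
    status.storage.replication.factor: 1
    key.converter: org.apache.kafka.connect.json.JsonConverter
    value.converter: org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable: false
    value.converter.schemas.enable: false
EOF
```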

If you had initially created items in the source Azure Cosmos DB collection, they should have been copied over to the Kafka topic by the source connector and subsequently persisted to the sink Azure Cosmos DB collection by the sink connector. To confirm this, query Azure Cosmos DB using any of the methods mentioned previously.
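For instance, since the account exposes a MongoDB-compatible endpoint, one quick check (connection string, database, and collection names are placeholders matching the sketches above) is to count the documents in the sink collection with mongosh:

```shell
# Count documents in the sink collection via the Cosmos DB API for MongoDB endpoint
# (connection string, database, and collection names are placeholders)
mongosh "mongodb://<account>:<key>@<account>.mongo.cosmos.azure.com:10255/?ssl=true" \
  --eval 'db.getSiblingDB("kafkademo").getCollection("sink").countDocuments({})'
```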
