Category: Database, Data, Docker, Container, GitHub

In this blog, we will see this in action using an example. With the help of a sample app, you will see how to combine a real-time data ingestion component (Azure Event Hubs) with a serverless processing layer (Azure Functions).

The producer component is pretty straightforward: it is a Go app which uses the Sarama Kafka client to send (simulated) "orders" to Azure Event Hubs over its Kafka endpoint (a Kafka topic).

The processing layer leverages the Azure Functions Event Hubs trigger, which allows the Azure Functions logic to get invoked whenever an order event is sent to Azure Event Hubs.
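For a function written in a language that uses `function.json` (such as JavaScript or Python), the Event Hubs trigger, along with a Cosmos DB output binding for persistence, could be declared roughly as follows. This is only a sketch: names such as `EventHubConnection`, `CosmosDBConnection`, `ordersdb`, and the hub/container names are placeholders, not taken from the sample app.

```json
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "name": "orders",
      "direction": "in",
      "eventHubName": "orders",
      "connection": "EventHubConnection",
      "cardinality": "many"
    },
    {
      "type": "cosmosDB",
      "name": "outputDocument",
      "direction": "out",
      "databaseName": "ordersdb",
      "collectionName": "orders",
      "connectionStringSetting": "CosmosDBConnection",
      "createIfNotExists": true
    }
  ]
}
```

With bindings like these, the platform handles event delivery and persistence, leaving only the function body to write.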

All that's left for us to build is the business logic, which in this case has been kept pretty simple: on receiving the order data from Azure Event Hubs, the function enriches it with additional info (the customer and product names in this case) and persists it in an Azure Cosmos DB container.
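The enrichment step can be sketched in Go along these lines. The lookup maps and field names below are hypothetical stand-ins; the actual function resolves the names from whatever data source it uses and then writes the enriched record to Cosmos DB.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// EnrichedOrder is a hypothetical enriched shape: the incoming order
// fields plus the resolved customer and product names.
type EnrichedOrder struct {
	OrderID      string `json:"orderId"`
	CustomerID   string `json:"customerId"`
	CustomerName string `json:"customerName"`
	ProductID    string `json:"productId"`
	ProductName  string `json:"productName"`
}

// Stand-in lookup tables; the real function would resolve names
// from an actual reference data source.
var customers = map[string]string{"customer-1": "Jane Doe"}
var products = map[string]string{"product-1": "Notebook"}

// enrich decodes the raw event payload, attaches the customer and
// product names, and returns the record to be persisted in Cosmos DB.
func enrich(raw []byte) (EnrichedOrder, error) {
	var in struct {
		OrderID    string `json:"orderId"`
		CustomerID string `json:"customerId"`
		ProductID  string `json:"productId"`
	}
	if err := json.Unmarshal(raw, &in); err != nil {
		return EnrichedOrder{}, err
	}
	return EnrichedOrder{
		OrderID:      in.OrderID,
		CustomerID:   in.CustomerID,
		CustomerName: customers[in.CustomerID],
		ProductID:    in.ProductID,
		ProductName:  products[in.ProductID],
	}, nil
}

func main() {
	raw := []byte(`{"orderId":"order-42","customerId":"customer-1","productId":"product-1"}`)
	out, err := enrich(raw)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", out)
}
```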
