Integrations using MuleSoft Anypoint and Apache Kafka

MuleSoft’s Anypoint Platform in the cloud has its own Anypoint MQ as a convenience messaging layer, but what if you want to use a streaming platform like Apache Kafka? MuleSoft has documented how to use its Kafka Connector, but here is an example of how to use it in four easy steps with Kafka running on Docker.

1. Set Up Kafka

For my developer Kafka setup I used Confluent’s single-node Docker image, which I find rather convenient for running Kafka. Since I use a Windows laptop, I needed to have Docker Desktop installed first.

Using Windows PowerShell, I change directory into the examples\kafka-single-node folder and start up the ZooKeeper and Kafka containers using the docker-compose up command:

PS C:\Users\vikram\Documents\GitHub\cp-docker-images\examples\kafka-single-node> docker-compose up -d                                                            

Starting kafka-single-node_zookeeper_1 ... done                                                                                                                  
Starting kafka-single-node_kafka_1 ... done

Using Windows PowerShell to list the running containers, I can see that both are up and running:

PS C:\Users\vikram\Documents\GitHub\cp-docker-images\examples\kafka-single-node> docker container ps                                                             

4d7f3a2acad6        confluentinc/cp-kafka:latest       "/etc/confluent/dock…" 2 hours ago         Up 2 minutes 0.0.0.0:9092->9092/tcp         kafka-single-node_kafka_1

b889d26803cb        confluentinc/cp-zookeeper:latest   "/etc/confluent/dock…" 2 hours ago         Up 2 minutes 2181/tcp, 2888/tcp, 3888/tcp   kafka-single-node_zookeeper_1

2. Create Kafka Topic

Using Windows PowerShell, I now need to create the topic that I will use to publish and consume messages. Since Kafka is running inside a Docker container, I have to run my commands inside the container using docker exec with bash. Once I am inside the container shell, it is as if I am working on a Linux server that already has Kafka installed and running.

PS C:\Users\vikram\Documents\GitHub\cp-docker-images\examples\kafka-single-node> docker exec -it  kafka-single-node_kafka_1 bash

Now I can proceed to use Linux-style Kafka commands to create the topic called “topic1”. The assumption is that I am going to have three consumers for this topic, hence I use three partitions.

root@4d7f3a2acad6:/# kafka-topics --bootstrap-server localhost:9092 --create --topic topic1 --replication-factor 1 --partitions 3
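Before wiring anything else up, it is worth confirming the topic really has three partitions. A quick check, still from inside the container shell, is to describe the topic:

```shell
# Confirm the topic exists with the expected partition and replication settings
kafka-topics --bootstrap-server localhost:9092 --describe --topic topic1
```

The output should list three partitions (0, 1 and 2), each with replication factor 1.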

I will also run a console consumer to see messages as they are published to the topic:

root@4d7f3a2acad6:/# kafka-console-consumer --bootstrap-server localhost:9092 --topic topic1 --from-beginning
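As a quick sanity check of the Kafka setup itself, before involving Mule at all, you can open a second shell into the same container and run a console producer; anything typed there should show up in the console consumer above. (The container name is the one shown in the docker ps output earlier.)

```shell
# From the host: open another shell into the Kafka container and start a console producer;
# each line typed is published as one message to topic1
docker exec -it kafka-single-node_kafka_1 \
  kafka-console-producer --broker-list localhost:9092 --topic topic1
```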

3. Create Mule Flow

Using Anypoint Studio, create a New Mule Project

Create Mule Project

We will create a Mule flow that publishes messages to the Kafka topic. Drag and drop the HTTP Listener component from the Mule Palette.

Add HTTP Listener to the Mule Flow

Next, we add the Kafka Connector for Mule 4 by searching Exchange from the Mule Palette.

Add Kafka Connector on Anypoint Studio

Next, drag and drop the Kafka > Publish Message component and create the connector configuration using the details from the Kafka setup above (bootstrap server localhost:9092).

Kafka Producer Configuration

The Mule flow is now ready to publish messages to Kafka.

Mule Flow with Kafka Publisher
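For orientation, the flow that Studio generates looks roughly like the XML sketch below. This is an assumption-laden sketch rather than the exact generated code: element and attribute names vary by connector version, and the config-ref names are placeholders for whatever you named your configurations.

```xml
<!-- Rough sketch only; connector element names and config-refs are assumptions -->
<flow name="mule-kafkaFlow1">
    <!-- HTTP Listener receives the REST payload -->
    <http:listener config-ref="HTTP_Listener_config" path="/publish"/>
    <!-- Publish the incoming payload to the Kafka topic -->
    <kafka:publish config-ref="Kafka_Producer_configuration" topic="topic1"/>
</flow>
```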

For demonstration purposes only, we will now create a Kafka consumer on the same Mule flow (ideally you would want this on a separate flow).

Drag and drop the Apache Kafka > Message Consumer onto the Mule flow, then set up the consumer configuration.

Kafka Consumer Configuration

The Mule flow can now consume messages published to “topic1” on Apache Kafka. We will log the message payload to make sure messages reach the consumer.

Mule Flow with Consumer that logs messages

Run the Mule project in Anypoint Studio. This creates an HTTP endpoint that can be invoked using Postman.

On starting up the Mule flow, we can see in the console output that the consumer has been assigned all three partitions of the topic:

Adding newly assigned partitions: topic1-1, topic1-0, topic1-2

4. Publish a Message to the Kafka Topic from the Mule Flow

Using Postman, you can now send a REST payload to the HTTP Listener in the Mule flow, which will forward the payload to the Kafka topic. The payload appears on the Kafka console consumer that we ran earlier as well as on the consumer in the Mule flow.
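If you prefer the command line to Postman, an equivalent request can be made with curl. The port and path here are assumptions: use whatever host, port and path you configured on the HTTP Listener (8081 is the Anypoint Studio default port).

```shell
# Send a small JSON payload to the Mule flow's HTTP listener;
# localhost:8081 and /publish are assumed values from a typical listener config
curl -X POST http://localhost:8081/publish \
  -H "Content-Type: application/json" \
  -d '{"test": "one"}'
```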

Postman call to Mule Flow

Kafka Console Consumer Output

The console output shows us the message that was sent on the topic:

root@4d7f3a2acad6:/# kafka-console-consumer --bootstrap-server localhost:9092 --topic topic1 --from-beginning

{'test': 'one'}

Mule Flow Kafka Consumer Console Output

INFO  2020-01-03 17:15:44,839 [[MuleRuntime].cpuLight.12: [mule-kafka].mule-kafkaFlow1.CPU_LITE @78c9c963] 
[event: 961ef011-2e76-11ea-9a10-e4a7a094075f] org.mule.runtime.core.internal.processor.LoggerMessageProcessor: {'test': 'one'}

As we can see, it is relatively easy to get started using Kafka as the messaging layer from MuleSoft and take advantage of the strengths of both platforms.