MuleSoft’s Anypoint Platform includes its own cloud messaging layer, Anypoint MQ, as a convenience — but what if you want to use a streaming platform like Apache Kafka? MuleSoft has documented its Kafka Connector, and here is an example of how to use it in four easy steps with Kafka running on Docker.
1. Setup Kafka
For my developer Kafka setup I use Confluent’s single-node Docker image, which I find a convenient way to run Kafka. Since I work on a Windows laptop, I first needed Docker Desktop installed.
Using Windows PowerShell, I change directory into the examples\kafka-single-node folder and start the ZooKeeper and Kafka containers with the docker-compose "up" command.
PS C:\Users\vikram\Documents\GitHub\cp-docker-images\examples\kafka-single-node> docker-compose up -d
Starting kafka-single-node_zookeeper_1 ... done
Starting kafka-single-node_kafka_1 ... done
Listing the running containers in PowerShell shows that both are up and running.
PS C:\Users\vikram\Documents\GitHub\cp-docker-images\examples\kafka-single-node> docker container ps
CONTAINER ID   IMAGE                              COMMAND                  CREATED       STATUS         PORTS                          NAMES
4d7f3a2acad6   confluentinc/cp-kafka:latest       "/etc/confluent/dock…"   2 hours ago   Up 2 minutes   0.0.0.0:9092->9092/tcp         kafka-single-node_kafka_1
b889d26803cb   confluentinc/cp-zookeeper:latest   "/etc/confluent/dock…"   2 hours ago   Up 2 minutes   2181/tcp, 2888/tcp, 3888/tcp   kafka-single-node_zookeeper_1
2. Create Kafka Topic
Now I need to create the topic that I will use to publish and consume messages. Since Kafka is running inside a Docker container, I have to run my commands inside the container by starting a bash session with docker exec. Once inside the container's shell, it is as if I am working on a Linux server that already has Kafka installed and running.
PS C:\Users\vikram\Documents\GitHub\cp-docker-images\examples\kafka-single-node> docker exec -it kafka-single-node_kafka_1 bash
root@4d7f3a2acad6:/#
Now I can use the standard Kafka command-line tools to create a topic called “topic1”. Since I plan to have three consumers for this topic, I create it with three partitions.
root@4d7f3a2acad6:/# kafka-topics --bootstrap-server localhost:9092 --create --topic topic1 --replication-factor 1 --partitions 3
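To see why the partition count matters: Kafka's default partitioner routes a keyed record to a partition by hashing the key modulo the partition count, so records with the same key always land on the same partition (and therefore preserve per-key ordering for whichever consumer owns it). A rough sketch of the idea in Python — note that Kafka actually uses a murmur2 hash; crc32 here is only for illustration:

```python
import zlib

NUM_PARTITIONS = 3  # matches the --partitions 3 used when creating topic1

def pick_partition(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    # Hash the key and take it modulo the partition count; Kafka's real
    # default partitioner does the same thing with murmur2 instead of crc32.
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# The same key always maps to the same partition.
print(pick_partition("order-42") == pick_partition("order-42"))  # → True
```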
I will also run a console consumer to watch messages as they are published to the topic:
root@4d7f3a2acad6:/# kafka-console-consumer --bootstrap-server localhost:9092 --topic topic1 --from-beginning
3. Create Mule Flow
Using Anypoint Studio, create a New Mule Project
We will create a Mule flow that publishes messages to the Kafka topic. Drag and drop the HTTP Listener component from the Mule Palette.
Next, add the Kafka Connector for Mule 4 by searching Exchange from the Mule Palette.
Then drag and drop the Apache Kafka > Publish Message component and create the connector configuration using the settings from the Kafka setup above.
The Mule Flow is now ready to publish messages to Kafka
We will now create a Kafka consumer in the same Mule flow, for demonstration only (ideally you would put it in a separate Mule flow).
Drag and drop the Apache Kafka > Message Consumer onto the Mule flow, then set up its consumer configuration.
The Mule flow can now consume messages published to “topic1” on Apache Kafka. We will log the message payload to make sure the messages reach the consumer.
Run the Mule project in Anypoint Studio. This creates an HTTP endpoint that can be invoked using Postman.
On starting the Mule flow, the console output shows that the consumer has been assigned all three partitions of the topic:
Adding newly assigned partitions: topic1-1, topic1-0, topic1-2
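That log line reflects partition assignment: a consumer group spreads a topic's partitions across its members, and since the Mule flow is the only member of its group, it receives all three. A simplified round-robin sketch of the idea (Kafka's actual assignor strategies are more involved):

```python
def assign(partitions, consumers):
    # Distribute partitions round-robin across the group's members;
    # with a single consumer, it simply receives every partition.
    out = {c: [] for c in consumers}
    for i, p in enumerate(sorted(partitions)):
        out[consumers[i % len(consumers)]].append(p)
    return out

print(assign(["topic1-1", "topic1-0", "topic1-2"], ["mule-consumer"]))
# → {'mule-consumer': ['topic1-0', 'topic1-1', 'topic1-2']}
```

Adding a second consumer to the same group would trigger a rebalance, splitting the partitions between the two members.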
4. Publish a Message to the Kafka Topic from the Mule Flow
Using Postman, you can now send a REST payload to the HTTP Listener in the Mule flow, which sends it on to the Kafka topic. The payload appears on the Kafka console consumer we started earlier, as well as on the consumer in the Mule flow.
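If you prefer to script the request rather than use Postman, the same call can be built in a few lines. A minimal sketch using Python's urllib — the port and path here (8081, /publish) are assumptions; match them to whatever you configured on the HTTP Listener:

```python
import json
import urllib.request

# The endpoint is an assumption for illustration; use your HTTP Listener's
# actual host, port, and path from the Mule flow configuration.
url = "http://localhost:8081/publish"
payload = json.dumps({"test": "one"}).encode("utf-8")

req = urllib.request.Request(
    url,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment with the Mule flow running
print(req.get_method(), req.full_url)  # → POST http://localhost:8081/publish
```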
Kafka Console Consumer Output
The console output shows the message that was sent on the topic:
root@4d7f3a2acad6:/# kafka-console-consumer --bootstrap-server localhost:9092 --topic topic1 --from-beginning
{'test': 'one'}
Mule Flow Kafka Consumer Console Output
INFO 2020-01-03 17:15:44,839 [[MuleRuntime].cpuLight.12: [mule-kafka].mule-kafkaFlow1.CPU_LITE @78c9c963] [event: 961ef011-2e76-11ea-9a10-e4a7a094075f] org.mule.runtime.core.internal.processor.LoggerMessageProcessor: {'test': 'one'}
As we can see, it is relatively easy to get started with Kafka as the messaging layer for MuleSoft and take advantage of the strengths of both platforms.
- Integrations using MuleSoft Anypoint and Apache Kafka - January 3, 2020