Connect Flow Director to Confluent Cloud

22 Mar 2020 | Andreas Müller

Prerequisites

Make sure you have completed the following steps before you continue:

  • Create a Confluent Cloud account.
  • Create a cluster.
  • Create a topic "test".
  • Create an API key and save the credentials.

In the Cloud Management Console, switch to your cluster, open "CLI & Client Configuration" and then "Java". You'll find the client configuration template there; you don't need the Schema Registry properties it includes.
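At the time of writing, the Kafka-related part of that template looks roughly like this (the console fills in your broker endpoint and leaves placeholders for the credentials; the exact keys may vary slightly between console versions):

  # Connection settings generated by the Confluent Cloud console
  bootstrap.servers={{ BROKER_ENDPOINT }}
  security.protocol=SASL_SSL
  sasl.mechanism=PLAIN
  sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='{{ CLUSTER_API_KEY }}' password='{{ CLUSTER_API_SECRET }}';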

Substitute the server name, cluster API key, and cluster secret, then save the file.

The last step is to download the latest Apache Kafka client jar file (org.apache.kafka:kafka-clients on Maven Central) to your local disk.

Kafka Flow Components

Our Kafka flow components are located here:

Send to Confluent Cloud

Attach the Client jar File

Create a new flow "KafkaOutbound". Then attach the Apache Kafka Client jar file to this flow:

Click on the "Attach" icon in the toolbar:

Drop the jar file here:

Configure the Kafka Producer

Drag the Kafka Producer component into the flow and click on it to configure:

Click on "Additional Properties" and add all properties you have saved from your prerequisite step:

Outbound Flow

Add flow components to send messages at an interval, as I did here (a plain Java equivalent of this flow is sketched after the list):

  • Use an Interval Timer that fires every 20 seconds.
  • Convert the timer event into a message.
  • Create a Text Message.
  • Set the body to some text.
  • Use a Property Setter Sequence to generate a sequence number and store it in the "key" property.
  • Pass the message to the Kafka Producer.
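If you want to cross-check the outbound side without Flow Director, the same logic with the plain Java Kafka client looks roughly like this. This is only a sketch: it assumes you saved the connection template above as client.config, and the class name and message text are made up for illustration.

  import java.io.FileInputStream;
  import java.util.Properties;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.ProducerRecord;
  import org.apache.kafka.common.serialization.StringSerializer;

  public class OutboundSketch {
      public static void main(String[] args) throws Exception {
          // Load the connection properties saved from the Confluent Cloud template
          Properties props = new Properties();
          try (FileInputStream in = new FileInputStream("client.config")) {
              props.load(in);
          }
          // String key (the sequence number) and string value (the text body)
          props.put("key.serializer", StringSerializer.class.getName());
          props.put("value.serializer", StringSerializer.class.getName());

          try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
              long sequence = 0;
              while (true) {
                  // Sequence number as record key, some text as record value
                  producer.send(new ProducerRecord<>("test",
                          String.valueOf(++sequence), "Hello from the outbound flow"));
                  producer.flush();
                  Thread.sleep(20_000); // fire every 20 seconds, like the Interval Timer
              }
          }
      }
  }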

Receive from Confluent Cloud

To receive from Confluent Cloud, you use the Kafka Consumer component instead. Its configuration regarding "Additional Properties" is exactly the same, so I won't repeat it here.

Inbound Flow

You can create a new flow or use the same flow to receive messages from Confluent Cloud. I created a new flow and attached the Apache Kafka client jar file to it.

The flow uses an Interval Timer to initiate a poll every 20 seconds, calls the Kafka Consumer, and sends the received messages to the flow's log file:
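For comparison, the same poll loop with the plain Java client could look roughly like this. Note that a standalone consumer also needs a group.id and deserializers on top of the connection template; the file name, group id, and class name below are made up for illustration.

  import java.io.FileInputStream;
  import java.time.Duration;
  import java.util.Collections;
  import java.util.Properties;
  import org.apache.kafka.clients.consumer.ConsumerRecord;
  import org.apache.kafka.clients.consumer.ConsumerRecords;
  import org.apache.kafka.clients.consumer.KafkaConsumer;
  import org.apache.kafka.common.serialization.StringDeserializer;

  public class InboundSketch {
      public static void main(String[] args) throws Exception {
          // Load the connection properties saved from the Confluent Cloud template
          Properties props = new Properties();
          try (FileInputStream in = new FileInputStream("client.config")) {
              props.load(in);
          }
          // Consumer-specific settings on top of the connection template
          props.put("group.id", "flowdirector-test");
          props.put("auto.offset.reset", "earliest");
          props.put("key.deserializer", StringDeserializer.class.getName());
          props.put("value.deserializer", StringDeserializer.class.getName());

          try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
              consumer.subscribe(Collections.singletonList("test"));
              while (true) {
                  // Wait up to 20 seconds for new records, then log what arrived
                  ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(20));
                  for (ConsumerRecord<String, String> record : records) {
                      System.out.println("key=" + record.key() + " value=" + record.value());
                  }
              }
          }
      }
  }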

Summary

Producing and consuming messages to and from Confluent Cloud works exactly the same as with a local broker. The only differences are the hostname and the "Additional Properties" you need to define. You don't need to handle any TLS certificates yourself, because Confluent Cloud presents a certificate issued by a well-known CA that the Java runtime verifies automatically.

Andreas Müller, CEO & CTO

Andreas is a well-known messaging expert, the creator of SwiftMQ, and, on the side, CEO of IIT Software GmbH. He leads the overall development of Flow Director and keeps an eye on how every piece fits into the whole picture.
