22 Mar 2020 | Andreas Müller
Ensure that you have finished the following steps before you continue here:
In the Cloud Management Console, switch to your cluster, open "CLI & Client Configuration" and select "Java". You'll find a client configuration template there (you don't need the schema properties it mentions):
Substitute the server name, cluster API key, and cluster secret, and save the file.
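For reference, the connection part of that template looks roughly like this. This is a sketch from memory, not a copy of the console output; the exact template may differ slightly, and the placeholders stand for the values shown in your own console:

```properties
# Connection to your Confluent Cloud cluster (placeholders are assumptions --
# use the actual values from your Cloud Management Console)
bootstrap.servers={{ BOOTSTRAP_SERVERS }}
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username='{{ CLUSTER_API_KEY }}' password='{{ CLUSTER_API_SECRET }}';
```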
The last step is to download the latest Apache Kafka Client jar file to your local disk.
Our Kafka flow components are located here:
Create a new flow "KafkaOutbound". Then attach the Apache Kafka Client jar file to this flow:
Click on the "Attach" icon in the toolbar:
Drop the jar file here:
Drag the Kafka Producer component into the flow and click on it to configure:
Click on "Additional Properties" and add all properties you have saved from your prerequisite step:
Add flow components to send messages at an interval, as I did here:
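Outside Flow Director, the same interval-driven send pattern can be sketched in plain Java. This is only an illustration of the Timer → Kafka Producer wiring: the `send` callback is a stand-in for the Kafka Producer component, and the interval length is an arbitrary choice for the demo.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

public class IntervalSender {
    // Schedules a message send at a fixed interval, mirroring the
    // Timer -> Kafka Producer wiring of the flow. The caller supplies
    // the actual send action (in the flow, that is the Producer component).
    public static ScheduledExecutorService start(long intervalMillis, Consumer<String> send) {
        ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        timer.scheduleAtFixedRate(
            () -> send.accept("msg @" + System.currentTimeMillis()),
            0, intervalMillis, TimeUnit.MILLISECONDS);
        return timer;
    }

    public static void main(String[] args) throws InterruptedException {
        List<String> sent = new CopyOnWriteArrayList<>();
        // 100 ms interval for the demo; the flow would use seconds
        ScheduledExecutorService timer = start(100, sent::add);
        Thread.sleep(350); // let a few intervals pass
        timer.shutdown();
        System.out.println("sent " + sent.size() + " messages");
    }
}
```

In the real flow the callback body would hand the message to the Kafka Producer component instead of collecting it in a list.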
To receive from Confluent Cloud, use the Kafka Consumer component instead. Its "Additional Properties" configuration is exactly the same, so I won't repeat it here.
You can create a new flow or use the same one to receive messages from Confluent Cloud. I created a new flow and attached the Apache Kafka Client jar file to it.
The flow uses an Interval Timer to initiate a poll every 20 seconds, calls the Kafka Consumer, and sends the received messages to the flow's log file:
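The consumer side's configuration amounts to the same connection properties as the producer, plus the usual consumer settings. A minimal sketch in plain Java, building them as a `java.util.Properties` object: the property names are the standard Kafka consumer configs, the placeholder values are assumptions, and the Flow Director component may expose them under slightly different labels.

```java
import java.util.Properties;

public class ConsumerConfigSketch {
    // Builds the consumer configuration: the same Confluent Cloud connection
    // properties as the producer's "Additional Properties", plus the
    // consumer-specific settings (group id, offset reset, deserializers).
    public static Properties build(String bootstrap, String apiKey,
                                   String apiSecret, String groupId) {
        Properties p = new Properties();
        // Connection/auth -- identical to the producer side
        p.setProperty("bootstrap.servers", bootstrap);
        p.setProperty("security.protocol", "SASL_SSL");
        p.setProperty("sasl.mechanism", "PLAIN");
        p.setProperty("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username='" + apiKey + "' password='" + apiSecret + "';");
        // Consumer-specific settings
        p.setProperty("group.id", groupId);
        p.setProperty("auto.offset.reset", "earliest");
        p.setProperty("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        p.setProperty("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        return p;
    }
}
```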
Producing and consuming messages to/from Confluent Cloud works exactly as it does with a local broker. The only differences are the hostname and the "Additional Properties" you need to define. You don't need to handle any TLS certificates because Confluent Cloud uses a CA certificate that is verified automatically.
Andreas is a well-known messaging expert, creator of SwiftMQ and, in his side job, CEO of IIT Software GmbH. He leads the overall development of Flow Director and keeps an eye on how every piece fits into the whole picture.