Thursday, February 7, 2019

Using the Apache Camel Kafka component with Kerberos

Apache Camel is a well-known integration framework available at the Apache Software Foundation. It comes with a huge number of components to integrate with pretty much anything you can think of. Naturally, it has a dedicated component to communicate with the popular Apache Kafka project. In this blog entry, we'll first show how to use Apache Camel as a consumer for a Kafka topic. Then we will show how to configure things when the Kafka broker is secured with Kerberos, something that often causes problems.

1) Setting up Apache Kafka

First let's set up Apache Kafka. Download and install it (this blog post uses Kafka 2.0.0), then start up ZooKeeper and the broker, create a "test" topic, and start a console producer for that topic as follows:
  • bin/zookeeper-server-start.sh config/zookeeper.properties
  • bin/kafka-server-start.sh config/server.properties
  • bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
  • bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test --producer.config config/producer.properties
Type a few messages into the producer console to make sure that it is working.

2) Consuming from Kafka using Apache Camel

Now we'll look at how to set up Apache Camel to consume from Kafka. I put a project up on GitHub here for this purpose. The Camel route is defined in Spring; it uses the Camel Kafka component to retrieve messages from the broker and writes them out to the "target/results" folder:
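The route in the project is the definitive version, but a minimal sketch of what it might look like is shown below (the topic name, broker address and output directory are taken from the setup above; the group id is a placeholder, and the exact endpoint syntax depends on the Camel version):

    <beans xmlns="http://www.springframework.org/schema/beans"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="http://www.springframework.org/schema/beans
                               http://www.springframework.org/schema/beans/spring-beans.xsd
                               http://camel.apache.org/schema/spring
                               http://camel.apache.org/schema/spring/camel-spring.xsd">

        <camelContext xmlns="http://camel.apache.org/schema/spring">
            <route>
                <!-- Consume from the "test" topic on the local broker -->
                <from uri="kafka:test?brokers=localhost:9092&amp;groupId=kafka_group&amp;autoOffsetReset=earliest"/>
                <!-- Write each message body out as a file under target/results -->
                <to uri="file:target/results"/>
            </route>
        </camelContext>

    </beans>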
Simply run "mvn clean install" and observe the logs indicating that Camel has retrieved the messages you put into the topic with the producer above. Then check "target/results" to see the files containing the message bodies.

3) Securing Apache Kafka with Kerberos

So far so good. Now let's look at securing the Kafka broker using Kerberos. I wrote a previous blog post to show how to use Apache Kerby as a KDC with Kafka, so please follow the steps outlined here, skipping the parts about configuring the consumer.
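Recall from that post that the Kafka client authenticates to the broker via a JAAS login configuration. As a reminder, the "client.jaas" file referenced in the next section might look roughly as follows (a sketch only: the keytab path and principal are placeholders for whatever your KDC setup uses):

    // Login context used by the Kafka client to obtain Kerberos credentials
    KafkaClient {
        com.sun.security.auth.module.Krb5LoginModule required
        useKeyTab=true
        storeKey=true
        keyTab="/path.to.kerby.project/target/client.keytab"
        principal="client";
    };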

4) Consuming from Kafka using Apache Camel and Kerberos

To make our Camel route work with Kafka and Kerberos, a few changes are required. Just as we did for the Kafka producer, we need to set the "java.security.auth.login.config" and "java.security.krb5.conf" system properties for Camel. You can do this in the example by editing the "pom.xml" and adding something like the following under "systemPropertyVariables" in the surefire plugin configuration (a sketch of the surrounding configuration follows the list):
  • <java.security.auth.login.config>/path.to.kafka.project/config/client.jaas</java.security.auth.login.config>
  • <java.security.krb5.conf>/path.to.kerby.project/target/krb5.conf</java.security.krb5.conf>
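For context, in the example project these would sit inside the maven-surefire-plugin configuration in the "pom.xml", roughly like this (a sketch; the paths are placeholders to be replaced as described below):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
            <systemPropertyVariables>
                <!-- Paths are placeholders: point these at your Kafka and Kerby checkouts -->
                <java.security.auth.login.config>/path.to.kafka.project/config/client.jaas</java.security.auth.login.config>
                <java.security.krb5.conf>/path.to.kerby.project/target/krb5.conf</java.security.krb5.conf>
            </systemPropertyVariables>
        </configuration>
    </plugin>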
Replace the paths to the Kafka and Kerby projects as appropriate (refer to the previous blog post on Kafka + Kerberos if this does not make sense). Next we need to make some changes to the Camel route itself. Append the following options to the Kafka endpoint URI in the Camel configuration (the ampersands are XML-escaped because the route is defined in Spring XML):
  • &amp;saslKerberosServiceName=kafka&amp;securityProtocol=SASL_PLAINTEXT
Camel uses "GSSAPI" as the default SASL mechanism, and so we don't have to configure that. Now re-run "mvn clean install" and you will see the Camel route get a ticket from the Kerby KDC and consuming messages successfully from the Kafka topic.
