Security Configurations With Kafka Connect and ElasticSearch

How to specify SSL certificates and passwords in Kafka (local and Docker)

Akash Agrawal
Jul 3, 2020

Don’t you think the available documents and articles are filled with setup configurations for the simplest use cases, while rarely does anyone talk about security?

While many great resources on Kafka, Connect and ElasticSearch are available from the creators themselves as well as from avid writers across the globe, rarely does anyone talk about security. Security can be overlooked on the first day of bringing up the infrastructure for local testing; however, it needs to be considered and tested diligently before your POC (proof of concept) goes into production.

In this article, I would like to place a magnifying lens on the configuration setup and explain how the realm changes, or how a simple config file bloats, when security parameters for servers and clients come into play. Security and encryption are required when publishing to or consuming from the Kafka broker, and when shipping the data to ElasticSearch. To cover more generic scenarios, I am going to use a slightly unusual setup where any API event on the application is published to the Kafka broker by the application itself. This message is received by Kafka Connect, which has a worker (sink consumer) shipping the data to the ElasticSearch database. Any communication with the Kafka broker is ONLY allowed on port 9093.

High Level Design to focus on SSL

You can have your entire data flow pipeline running locally, or better yet in Docker, which allows easy sharing and quicker deployment. We will touch on both wherever necessary.

App publishing to Kafka Broker on SSL port 9093

App pseudocode for Publishing to Kafka written in Golang
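
A minimal sketch of such a producer, assuming the confluent-kafka-go client; the broker address, file paths and topic name below are placeholders you would replace with your own:

package main

import (
    "log"

    "github.com/confluentinc/confluent-kafka-go/kafka"
)

func main() {
    // SSL listener on 9093 instead of the plaintext 9092, plus the
    // client key, client certificate and CA bundle for verification.
    producer, err := kafka.NewProducer(&kafka.ConfigMap{
        "bootstrap.servers":        "broker.example.com:9093",
        "security.protocol":        "ssl",
        "ssl.key.location":         "/etc/kafka/secrets/app.key",
        "ssl.certificate.location": "/etc/kafka/secrets/app.crt",
        "ssl.ca.location":          "/etc/kafka/secrets/root-ca-chain.pem",
    })
    if err != nil {
        log.Fatalf("failed to create producer: %v", err)
    }
    defer producer.Close()

    topic := "api-events"
    // Publish one API event; delivery reports are ignored for brevity.
    err = producer.Produce(&kafka.Message{
        TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
        Value:          []byte(`{"event":"api-call"}`),
    }, nil)
    if err != nil {
        log.Fatalf("failed to enqueue message: %v", err)
    }

    // Wait up to 15 seconds for outstanding messages to be delivered.
    producer.Flush(15 * 1000)
}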

You will need the client’s private key and certificate file for your application, along with a CA file to use for certificate validation. These are all provided to you by your Kafka administrator. In the above code snippet, do notice the change in port number from 9092 to 9093 for SSL, and the extra SSL parameters: security protocol, ssl key location, ssl ca location and ssl certificate location. The exact parameter naming will differ based on the programming language.

Congratulations, at this point your data has securely made it to Kafka.

ElasticSearch Sink Connector

Packaging your Certificates as Java Keystore

Java Keystore files are a more standard way of packaging your certificate and any private key into one password-secured file, earning the extension .jks.

Kafka Client and Kafka Server 2-way authentication

A keystore is used to store the private key and identity certificate that a client application like ours will present to the server for verification. The below commands, when run in a terminal, will generate keystore.jks encrypted by the password you specify at the prompt.

$ openssl pkcs12 -export -out bundle.p12 -in app.crt -inkey app.key
$ keytool -destkeystore keystore.jks -importkeystore -srckeystore bundle.p12 -srcstoretype PKCS12

A truststore is used to store certificates from Certificate Authorities (CAs) that verify the certificate presented by the server in an SSL connection. Similar to the keystore, generate the truststore.jks file as below.

$ keytool -keystore truststore.jks -import -file root-ca-chain.pem -alias cacert

Kafka Connect (Admin)

Kafka Connect configuration is used by the Connect Workers for cluster group coordination and to read and write to the internal topics that are used to track the cluster’s state (for example, configs and offsets).

In your connect.properties configuration file for Connect, you will specify the below SSL-related parameters to secure all kinds of read and write operations. Since, in our case, we have a consumer associated via Connect, we need to explicitly specify the same SSL properties for it as well by adding the consumer. prefix, as seen below.

connect.properties configuration file
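
A minimal sketch of such a connect.properties, with placeholder paths and passwords (the usual non-SSL worker settings such as group.id, the storage topics and converters are omitted here):

bootstrap.servers=broker.example.com:9093

security.protocol=SSL
ssl.truststore.location=/etc/kafka/secrets/truststore.jks
ssl.truststore.password=<truststore-password>
ssl.keystore.location=/etc/kafka/secrets/keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>

# The sink's consumer needs the same settings, under the consumer. prefix
consumer.security.protocol=SSL
consumer.ssl.truststore.location=/etc/kafka/secrets/truststore.jks
consumer.ssl.truststore.password=<truststore-password>
consumer.ssl.keystore.location=/etc/kafka/secrets/keystore.jks
consumer.ssl.keystore.password=<keystore-password>
consumer.ssl.key.password=<key-password>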

In case you are running your Kafka Connect on Docker, your docker-compose file will declare all of the above-mentioned SSL config parameters as environment variables.

Docker SSL parameters described in compose.yml file
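
A sketch assuming the Confluent cp-kafka-connect image, which maps CONNECT_-prefixed environment variables onto worker properties; the image tag, paths and passwords are placeholders, and the non-SSL worker settings are again omitted:

version: "3"
services:
  kafka-connect:
    image: confluentinc/cp-kafka-connect:5.5.0
    environment:
      CONNECT_BOOTSTRAP_SERVERS: "broker.example.com:9093"
      CONNECT_SECURITY_PROTOCOL: "SSL"
      CONNECT_SSL_TRUSTSTORE_LOCATION: "/etc/kafka/secrets/truststore.jks"
      CONNECT_SSL_TRUSTSTORE_PASSWORD: "<truststore-password>"
      CONNECT_SSL_KEYSTORE_LOCATION: "/etc/kafka/secrets/keystore.jks"
      CONNECT_SSL_KEYSTORE_PASSWORD: "<keystore-password>"
      CONNECT_SSL_KEY_PASSWORD: "<key-password>"
      # Same settings again for the sink's consumer
      CONNECT_CONSUMER_SECURITY_PROTOCOL: "SSL"
      CONNECT_CONSUMER_SSL_TRUSTSTORE_LOCATION: "/etc/kafka/secrets/truststore.jks"
      CONNECT_CONSUMER_SSL_TRUSTSTORE_PASSWORD: "<truststore-password>"
      CONNECT_CONSUMER_SSL_KEYSTORE_LOCATION: "/etc/kafka/secrets/keystore.jks"
      CONNECT_CONSUMER_SSL_KEYSTORE_PASSWORD: "<keystore-password>"
      CONNECT_CONSUMER_SSL_KEY_PASSWORD: "<key-password>"
    volumes:
      - ./secrets:/etc/kafka/secrets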

ES Sink Connector (Worker)

A Connect worker, in a nutshell, runs a predefined plugin (a connector) written for a specific purpose:

  • Produce data from a specific source to Kafka OR consume from Kafka to be shipped to a specific sink
  • (Optional) Allow primitive data manipulation and transformation
  • (Optional) Schema Validation

In this case, we are using an ElasticSearch Sink Connector, which consumes the messages from Kafka and writes them to the ES database. In the configuration, we also pass all the security parameters for interacting with the DB: the user credentials for read/write and the certificates for client-server validation.

ElasticSearch Sink Connector SSL Configuration
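
A minimal sketch, assuming the Confluent ElasticSearch sink connector (whose elastic.https.ssl.* properties carry the keystore/truststore side) with placeholder topic, URL and credentials:

{
  "name": "es-sink-connector",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "api-events",
    "connection.url": "https://elasticsearch.example.com:9200",
    "connection.username": "es-writer",
    "connection.password": "<es-password>",
    "elastic.security.protocol": "SSL",
    "elastic.https.ssl.keystore.location": "/etc/kafka/secrets/keystore.jks",
    "elastic.https.ssl.keystore.password": "<keystore-password>",
    "elastic.https.ssl.key.password": "<key-password>",
    "elastic.https.ssl.truststore.location": "/etc/kafka/secrets/truststore.jks",
    "elastic.https.ssl.truststore.password": "<truststore-password>",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}

You would typically register this by POSTing the JSON to the Connect REST API, for example: curl -X POST -H "Content-Type: application/json" --data @es-sink.json http://localhost:8083/connectors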

That’s it! We looked specifically at all the SSL-related configuration at the level of the application, Kafka Connect (local and Docker) and ElasticSearch. I hope it helps clear out the confusion lurking around SSL and lets you transcend to the next stage of your software development lifecycle.
