Configuring SASL

Many of the concepts applied here come from the Kafka Security documentation. Reading through and understanding that documentation will be useful when configuring Control Center for SASL.

The following assumes a development setup that generally follows the Control Center quickstart guide. While the specifics are for development purposes only, securing a production cluster follows the same concepts.

ZooKeeper

For the purposes of this guide, ZooKeeper will not be secured. The guide is targeted at securing the pieces that Control Center depends on directly. If you would like to secure ZooKeeper as well, see the ZooKeeper security documentation.

Kafka Broker

Create a new file and put the KafkaServer configuration into it. The KafkaServer section configures authentication on the brokers. For this example, create it at /tmp/kafka_server_jaas.conf.

KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="admin"
  password="admin-secret"
  user_admin="admin-secret"
  user_confluent="confluent-secret"
  user_metricsreporter="metricsreporter-secret";
};

KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="metricsreporter"
  password="metricsreporter-secret";
};

This configures several users on the server; each user_<name>="<password>" entry defines a credential that the broker will accept for PLAIN authentication:
  • an admin user for internal inter-broker traffic
  • a confluent user, for Confluent Control Center, Kafka Connect, and Schema Registry
  • a metricsreporter user for Metrics Reporter to publish Kafka metrics

In this example, the Metrics Reporter publishes metrics to the same cluster it runs on, so the corresponding KafkaClient configuration must also be included in the same file.

It is possible to pass the JAAS config file location to the broker JVM as a JVM parameter:

-Djava.security.auth.login.config=/tmp/kafka_server_jaas.conf
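
For example, if you start the broker directly with the kafka-server-start script rather than through the Confluent CLI (as is done below), the same parameter can be supplied via KAFKA_OPTS. A sketch, assuming the default script and config locations:

$ KAFKA_OPTS=-Djava.security.auth.login.config=/tmp/kafka_server_jaas.conf \
<path-to-confluent>/bin/kafka-server-start <path-to-confluent>/etc/kafka/server.properties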

Next, secure the Kafka broker, the monitoring interceptors, and the metrics reporter. There are more options for security, but in this example the broker is secured using SASL_PLAINTEXT.

Note

These values should be added to or updated in your Kafka broker properties file, <path-to-confluent>/etc/kafka/server.properties.

############# Broker Security ##############
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN
listeners=SASL_PLAINTEXT://:9092

############# Confluent Metrics Reporter Security ##############
confluent.metrics.reporter.sasl.mechanism=PLAIN
confluent.metrics.reporter.security.protocol=SASL_PLAINTEXT

This example assumes the Control Center quick start, so the metrics reporter and the monitoring interceptors are already configured.

Note

Anything that is set with the confluent.metrics.reporter. prefix must be set explicitly; the metrics reporter does not inherit the broker's unprefixed client settings. For more information on the available options, see the metrics reporter configuration documentation.
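
For example, a broker configured as in the quickstart already carries entries along these lines for the metrics reporter itself; the values are illustrative and assume a single-broker development cluster:

metric.reporters=io.confluent.metrics.reporter.ConfluentMetricsReporter
confluent.metrics.reporter.bootstrap.servers=localhost:9092
confluent.metrics.reporter.topic.replicas=1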

Start the broker with the updated configuration, with KAFKA_OPTS pointing to the JAAS file.

$ KAFKA_OPTS=-Djava.security.auth.login.config=/tmp/kafka_server_jaas.conf \
<path-to-confluent>/bin/confluent start kafka
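
Once the broker is up, a quick sanity check is to look for the topic that the metrics reporter publishes to (the topic name assumes default settings, and ZooKeeper is unsecured in this setup, so no JAAS file is needed here):

$ <path-to-confluent>/bin/kafka-topics --zookeeper localhost:2181 --describe --topic _confluent-metrics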

Control Center Configuration

Create a file with a KafkaClient entry at /tmp/kafka_client_jaas.conf. The KafkaClient section is where the principal for the client is specified. It will be used later to authenticate Control Center and Kafka Connect.

KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="confluent"
  password="confluent-secret";
};

It is possible to pass the JAAS config file location as a JVM parameter to each client JVM:

-Djava.security.auth.login.config=/tmp/kafka_client_jaas.conf

This allows clients configured with the confluent.monitoring.interceptor. or confluent.metrics.reporter. prefixes to communicate with the secured Kafka broker. Any process that uses the monitoring interceptors or the metrics reporter needs a valid KafkaClient section in its JAAS config.
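
As a sketch, a standalone producer that reports monitoring data to Control Center would carry both its own SASL settings and the interceptor's prefixed settings; the file name below is hypothetical:

# /tmp/producer-sasl.properties (hypothetical example)
bootstrap.servers=localhost:9092
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN

# Confluent Monitoring Interceptor and its own security settings
interceptor.classes=io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor
confluent.monitoring.interceptor.security.protocol=SASL_PLAINTEXT
confluent.monitoring.interceptor.sasl.mechanism=PLAIN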

Control Center needs to know that security is enabled. Internally, Control Center uses Kafka Streams as a state store, so when the broker is secured, these Streams clients need to be secured as well.

Edit the <path-to-confluent>/etc/confluent-control-center/control-center.properties:

########### Control Center security ###########
confluent.controlcenter.streams.sasl.mechanism=PLAIN
confluent.controlcenter.streams.security.protocol=SASL_PLAINTEXT
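
Depending on the Confluent Platform version, the credentials can instead be embedded directly in the properties file with sasl.jaas.config, avoiding the separate JAAS file. A sketch using the streams prefix:

confluent.controlcenter.streams.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="confluent" password="confluent-secret";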

Control Center can now be started:

$ CONTROL_CENTER_OPTS=-Djava.security.auth.login.config=/tmp/kafka_client_jaas.conf \
<path-to-confluent>/bin/control-center-start <path-to-confluent>/etc/confluent-control-center/control-center.properties

Schema Registry Configuration

In the quickstart setup, Connect relies on Schema Registry, so Schema Registry first needs to be updated to use SASL authentication.

Edit the Schema Registry configuration (<path-to-confluent>/etc/schema-registry/schema-registry.properties) and add the following settings.

kafkastore.security.protocol=SASL_PLAINTEXT
kafkastore.sasl.mechanism=PLAIN
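
If Schema Registry connects through kafkastore.bootstrap.servers rather than ZooKeeper, the listener given there should also carry the matching protocol; the value below is illustrative:

kafkastore.bootstrap.servers=SASL_PLAINTEXT://localhost:9092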

Start Schema Registry with the additional SCHEMA_REGISTRY_OPTS parameter pointing to the JAAS file created earlier.

$ SCHEMA_REGISTRY_OPTS=-Djava.security.auth.login.config=/tmp/kafka_client_jaas.conf \
<path-to-confluent>/bin/confluent start schema-registry

Connect Configuration

The Connect properties file (<path-to-confluent>/etc/schema-registry/connect-avro-distributed.properties) needs to be set up with the same security protocol as the broker. In this example, it is set up to use SASL_PLAINTEXT for the Connect producer, the Connect consumer, the producer's monitoring interceptor, and the consumer's monitoring interceptor.

#### Base connect security ####
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN

#### Connect producer ####
producer.sasl.mechanism=PLAIN
producer.security.protocol=SASL_PLAINTEXT

#### Connect consumer ####
consumer.sasl.mechanism=PLAIN
consumer.security.protocol=SASL_PLAINTEXT

#### Monitoring producer interceptor ####
producer.confluent.monitoring.interceptor.sasl.mechanism=PLAIN
producer.confluent.monitoring.interceptor.security.protocol=SASL_PLAINTEXT

#### Monitoring consumer interceptor ####
consumer.confluent.monitoring.interceptor.sasl.mechanism=PLAIN
consumer.confluent.monitoring.interceptor.security.protocol=SASL_PLAINTEXT
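
The prefixed interceptor settings above assume the worker already has the Confluent Monitoring Interceptors enabled, as the quickstart configuration does. If not, they are enabled with entries such as:

producer.interceptor.classes=io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor
consumer.interceptor.classes=io.confluent.monitoring.clients.interceptor.MonitoringConsumerInterceptor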

Note

For any custom clients monitored by Control Center, the same settings must be set in that producer's or consumer's own configuration, using the confluent.monitoring.interceptor. prefix for the interceptor.

We can now start Connect with the CONNECT_OPTS parameter pointing to the JAAS file created earlier.

$ CONNECT_OPTS=-Djava.security.auth.login.config=/tmp/kafka_client_jaas.conf \
<path-to-confluent>/bin/confluent start connect