How to configure clients to connect to Apache Kafka Clusters securely – Part 3: PAM authentication

In the previous posts in this series, we have discussed Kerberos and LDAP authentication for Kafka. In this post, we will look into how to configure a Kafka cluster to use a PAM backend instead of an LDAP one.

The examples shown here will highlight the authentication-related properties in bold font to differentiate them from other required security properties, as in the example below. TLS is assumed to be enabled for the Apache Kafka cluster, as it should be for every secure cluster.

security.protocol=SASL_SSL

ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks

We use the kafka-console-consumer for all the examples below. All the concepts and configurations apply to other applications as well.

PAM Authentication

When a Kafka cluster is configured to perform PAM (Pluggable Authentication Modules) authentication, Kafka will delegate the authentication of clients to the PAM modules configured for the Operating System where it is running. 

The Kafka client configuration is identical to the one we used for LDAP authentication, as we have seen in the previous post:

# Uses SASL/PLAIN over a TLS encrypted connection
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
# LDAP credentials
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="alice" password="supersecret1";
# TLS truststore
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks

The configuration above uses SASL/PLAIN for authentication and TLS (SSL) for data encryption. The choice of PAM authentication is configured on the server-side handler for SASL/PLAIN, as we will cover later in this section.

Ensure TLS/SSL encryption is being used

Similarly to the LDAP authentication case, since usernames and passwords are sent over the network for client authentication, it is very important that TLS encryption is enabled and enforced for all communication between Kafka clients and brokers. This ensures that the credentials are always encrypted over the wire and cannot be compromised.

All the Kafka brokers must be configured to use the SASL_SSL security protocol for their SASL endpoints. The SASL_PLAINTEXT security protocol must not be used, since it would send credentials across the network in plaintext.
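For illustration, a broker's listener configuration in such a deployment would look roughly like the fragment below. The hostname and port are placeholders, and in a Cloudera Manager-managed cluster these properties are generated for you:

```properties
# SASL endpoint exposed only over TLS
listeners=SASL_SSL://host-1.example.com:9093
security.inter.broker.protocol=SASL_SSL
```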

Additional requirements

Depending on the PAM modules configured in the system there may be some other requirements that need to be configured correctly for the PAM authentication to work. The exact configurations are dependent on the modules used and are out of the scope of this document.

The following are two simple examples of additional configuration that may be required if certain PAM modules are used:

  • If the pam_unix module of the login service is to be used, the kafka user, which is the user running the Kafka brokers, must have access to the /etc/shadow file for the authentication to work.

The commands below are just a simple example on how to achieve this on a single node. There may be better ways to ensure this requirement is met across the entire cluster.

# Add the kafka user to the shadow group (append, rather than replace, its groups)
usermod -aG shadow kafka
# Make /etc/shadow readable by the shadow group only
chgrp shadow /etc/shadow
chmod 440 /etc/shadow
  • If the pam_nologin module is being used, the existence of the file /var/run/nologin on the brokers will prevent PAM authentication for Kafka from working correctly. For PAM authentication to work, either remove the file /var/run/nologin from all the brokers or disable the pam_nologin module.
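As an illustration of how these modules fit together, a minimal PAM service definition that authenticates against the local password database might look like the following. This is a hypothetical /etc/pam.d/kafka file; real service definitions vary by distribution and by the PAM service name you configure:

```
# /etc/pam.d/kafka -- hypothetical PAM service used by the brokers
auth     required   pam_unix.so
account  required   pam_unix.so
```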

Enabling PAM authentication on the Kafka Broker

PAM authentication is not enabled by default for the Kafka brokers when the Kafka service is installed, but it is straightforward to configure through Cloudera Manager:

  1. In Cloudera Manager, set the following properties in the Kafka service configuration to match your environment:
    [Screenshot: SASL/PLAIN Authentication property]
    By selecting PAM as the SASL/PLAIN Authentication option above, Cloudera Manager configures Kafka to use the following SASL/PLAIN Callback Handler:

    org.apache.kafka.common.security.pam.internals.PamPlainServerCallbackHandler
  2. Configure the PAM service that you want to be used for authentication:
    [Screenshot: PAM Service property]
  3. Click on Kafka > Actions > Restart to restart the Kafka service and make the changes effective.
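Behind the scenes, these settings correspond to broker properties along the lines of the sketch below, assuming a listener named SASL_SSL. Cloudera Manager generates the exact values for you, so this is only to show where the PAM callback handler plugs in:

```properties
# Server-side SASL/PLAIN configuration that delegates authentication to PAM
listener.name.sasl_ssl.plain.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required;
listener.name.sasl_ssl.plain.sasl.server.callback.handler.class=org.apache.kafka.common.security.pam.internals.PamPlainServerCallbackHandler
```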

Example

NOTE: The information below contains sensitive credentials. When storing this configuration in a file, ensure that the file's permissions are set so that only the file owner can read it.
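One way to restrict the permissions on Linux, assuming the configuration is saved as pam-client.properties as in the example below:

```shell
# Allow only the file owner to read and write the client configuration
chmod 600 pam-client.properties
```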

The following is an example using the Kafka console consumer to read from a topic using PAM authentication. Note that this example’s configuration is identical to the LDAP example in the previous section.

$ cat pam-client.properties
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="alice" password="supersecret1";
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks

$ kafka-console-consumer \
    --bootstrap-server host-1.example.com:9093 \
    --topic test \
    --consumer.config ./pam-client.properties
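The same settings carry over to client libraries. The sketch below builds the equivalent configuration for a librdkafka-based client such as the confluent-kafka Python package; the broker address, credentials, and CA bundle path are placeholders from the example above, and note that these clients take a PEM CA file rather than a JKS truststore:

```python
# Build the same PAM/SASL-PLAIN client settings as pam-client.properties.
def pam_client_config(username, password, ca_pem_path):
    return {
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": username,
        "sasl.password": password,
        # librdkafka clients use a PEM CA bundle instead of a JKS truststore
        "ssl.ca.location": ca_pem_path,
    }

conf = pam_client_config("alice", "supersecret1",
                         "/opt/cloudera/security/pem/ca-cert.pem")

# With confluent-kafka installed, a consumer would then be created like:
# from confluent_kafka import Consumer
# consumer = Consumer({**conf,
#                      "bootstrap.servers": "host-1.example.com:9093",
#                      "group.id": "pam-demo"})
```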

More to come

All the authentication methods that we are reviewing in this blog series give you the flexibility to integrate your Kafka cluster with the authentication mechanisms already in place in your environment.

In the next post, we will continue to explore other alternatives and take a look at Mutual TLS authentication for Kafka. In the meantime, if you are interested in understanding Cloudera’s Kafka offering, download this white paper.

The post How to configure clients to connect to Apache Kafka Clusters securely – Part 3: PAM authentication appeared first on Cloudera Blog.
