Kafka Security
SSL Encryption
The keyStore provides credentials. It contains private keys and is required only if you run a server over SSL or have enabled client authentication on the server side. An SSL server uses its private key during the key exchange and sends the client the certificates corresponding to its public key; those certificates come from the keyStore.
The trustStore verifies credentials. It stores public keys or certificates from Certificate Authorities (CAs) that are used to trust the remote party of an SSL connection. On the SSL client side (a Java client, for example), the certificates stored in the trustStore are used to verify the identity of the server.
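The trust relationship can be sketched with plain openssl: a certificate signed by a CA verifies against that CA's certificate, which is exactly the check a trustStore enables. (All file names below are throwaway placeholders, not the files used in the rest of this post.)

```shell
# Work in a throwaway directory so we don't touch the real ssl/ files.
mkdir -p /tmp/ssl-demo && cd /tmp/ssl-demo

# Create a demo CA (private key + self-signed certificate).
openssl req -new -newkey rsa:2048 -days 1 -x509 -subj '/CN=Demo-CA' \
  -keyout demo-ca-key -out demo-ca-cert -nodes

# Create a server key pair and a certificate signing request.
openssl req -new -newkey rsa:2048 -subj '/CN=demo-server' \
  -keyout server-key -out server-csr -nodes

# Sign the server certificate with the demo CA.
openssl x509 -req -CA demo-ca-cert -CAkey demo-ca-key -in server-csr \
  -out server-cert -days 1 -CAcreateserial

# Anyone holding only demo-ca-cert (the trustStore's role) can now verify
# the server certificate.
openssl verify -CAfile demo-ca-cert server-cert   # prints: server-cert: OK
```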
Set up a demo CA (Certificate Authority)
mkdir ssl
cd ssl
openssl req -new -newkey rsa:4096 -days 365 -x509 -subj '/CN=Kafka-Security-CA' -keyout ca-key -out ca-cert -nodes
This generates two files: ca-key (the CA's private key) and ca-cert (the CA's public certificate).
Setup on all Kafka brokers:
Generate a keyStore to hold the server's private and public keys:
keytool -genkey -keystore kafka.server.keystore.jks -validity 365 -storepass password -keypass password -dname "CN=broker,OU=pan" -storetype pkcs12
Inspect the keyStore:
keytool -list -v -keystore kafka.server.keystore.jks
Enter keystore password:
Keystore type: jks
Keystore provider: SUN
Your keystore contains 1 entry
Alias name: mykey
Creation date: Oct 9, 2018
Entry type: PrivateKeyEntry
Certificate chain length: 1
Certificate[1]:
Owner: CN=broker, OU=pan
Issuer: CN=broker, OU=pan
Serial number: 89ddb4a
Valid from: Tue Oct 09 23:53:29 UTC 2018 until: Wed Oct 09 23:53:29 UTC 2019
Certificate fingerprints:
MD5: E2:5D:08:C2:7B:A3:EF:38:87:36:0C:CA:BA:3E:67:CE
SHA1: FC:2A:CF:E5:8F:48:80:9D:09:A9:3D:33:B3:44:C1:3C:CF:B9:AF:07
SHA256: 27:3C:AF:65:0C:FE:F9:8F:47:F4:62:B0:60:43:8A:5F:81:99:48:EC:DA:1E:51:FD:7F:33:48:99:A5:A7:5A:2E
Signature algorithm name: SHA256withDSA
Subject Public Key Algorithm: 2048-bit DSA key
Version: 3
Extensions:
#1: ObjectId: 2.5.29.14 Criticality=false
SubjectKeyIdentifier [
KeyIdentifier [
0000: 0B D9 6C B1 91 31 EF 52 49 11 32 43 08 81 B6 CD  ..l..1.RI.2C....
0010: 7A 28 AA 9E                                      z(..
]
]
Generate a certificate signing request:
root@broker:/ssl# keytool -keystore kafka.server.keystore.jks -certreq -file cert-file -storepass password -keypass password
root@broker:/ssl# cat cert-file
-----BEGIN NEW CERTIFICATE REQUEST-----
MIID8TCCA50CAQAwHzEMMAoGA1UECxMDcGFuMQ8wDQYDVQQDEwZicm9rZXIwggND
MIICNQYHKoZIzjgEATCCAigCggEBAI95Ndm5qum/q+2Ies9JUbbzLsWeO683GOjq
xJYfPv02BudDUanEGDM5uAnnwq4cU5unR1uF0BGtuLR5h3VJhGlcrA6PFLM2CCii
L/onEQo9YqmTRTQJoP5pbEZY+EvdIIGcNwmgEFexla3NACM9ulSEtikfnWSO+INE
hneXnOwEtDSmrC516Zhd4j2wKS/BEYyf+p2BgeczjbeStzDXueNJWS9oCZhyFTkV
6j1ri0ZTxjNFj4A7MqTC4PJykCVuTj+KOwg4ocRQ5OGMGimjfd9eoUPeS2b/BJA+
1c8WI+FY1IfGCOl/IRzYHcojy244B2X4IuNCvkhMBXY5OWAc1mcCHQC69pamhXj3
397n+mfJd8eF7zKyM7rlgMC81WldAoIBABamXFggSFBwTnUCo5dXBA002jo0eMFU
1OSlwC0kLuBPluYeS9CQSr2sjzfuseCfMYLSPJBDy2QviABBYO35ygmzIHannDKm
J/JHPpGHm6LE50S9IIFUTLVbgCw2jR+oPtSJ6U4PoGiOMkKKXHjEeMaNBSe3HJo6
uwsL4SxEaJY559POdNsQGmWqK4f2TGgm2z7HL0tVmYNLtO2wL3yQ6aSW06VdU1vr
/EXU9hn2Pz3tu4c5JcLyJOB3MSltqIfsHkdI+H77X963VIQxayIy3uVT3a8CESsN
HwLaMJcyJP4nrtqLnUspItm6i+Oe2eEDpjxSgQvGiLfi7UMW4e8X294DggEGAAKC
AQEAhBeKJv/8vB/wFkEIHghJDeE7fyM0A/xoVuNliZ77U2NVajt1swiKXmj++TXo
884AUYBJVVJm+VlgvzcXgRaZeqFvQNqXJfgW9Lxr8zu44/liTDs7ps0b1qHEdnOj
hxAT/zOIkyCkXEaA9gOAWR2/hlP/MGk1aVTOqvfO8Xc5FvMdG1GutwVFRt0/0wuT
mP5K6ksew89Kt9OOBioFOw3RP+E/xAarFozlPOzoHohmjGFOfnmbNJbBtqSlQ9N6
T54LE127aFW9fku2x1Owj3VjcsoZKR/cwiF14xM4xCrweP0oRVtEG5h8R/55LNVu
T7xiUqZhGvKVT5i9X0SDTEzNGaAwMC4GCSqGSIb3DQEJDjEhMB8wHQYDVR0OBBYE
FAvZbLGRMe9SSREyQwiBts16KKqeMA0GCWCGSAFlAwQDAgUAAz8AMDwCHHN/3Em+
k4k0doTR+sa8B6wJic1pxKB6Wo1QRuUCHCsrWtUmfgcSIjD9PNKjHtXfZc8k/Uwe
1ARvhh0=
-----END NEW CERTIFICATE REQUEST-----
Send the file above to the CA admin to get a signed certificate. In our case, we use the demo CA:
root@broker:/ssl# openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days 365 -CAcreateserial -passin pass:password
Signature ok
subject=/OU=pan/CN=broker
Getting CA Private Key
root@broker:/ssl# cat cert-signed
-----BEGIN CERTIFICATE-----
MIIF2DCCA8ACCQCP+HYD3QJRhjANBgkqhkiG9w0BAQsFADAcMRowGAYDVQQDDBFL
YWZrYS1TZWN1cml0eS1DQTAeFw0xODEwMTAwMDA3NDdaFw0xOTEwMTAwMDA3NDda
MB8xDDAKBgNVBAsTA3BhbjEPMA0GA1UEAxMGYnJva2VyMIIDQzCCAjUGByqGSM44
BAEwggIoAoIBAQCPeTXZuarpv6vtiHrPSVG28y7FnjuvNxjo6sSWHz79NgbnQ1Gp
xBgzObgJ58KuHFObp0dbhdARrbi0eYd1SYRpXKwOjxSzNggooi/6JxEKPWKpk0U0
CaD+aWxGWPhL3SCBnDcJoBBXsZWtzQAjPbpUhLYpH51kjviDRIZ3l5zsBLQ0pqwu
demYXeI9sCkvwRGMn/qdgYHnM423krcw17njSVkvaAmYchU5Feo9a4tGU8YzRY+A
OzKkwuDycpAlbk4/ijsIOKHEUOThjBopo33fXqFD3ktm/wSQPtXPFiPhWNSHxgjp
fyEc2B3KI8tuOAdl+CLjQr5ITAV2OTlgHNZnAh0AuvaWpoV499/e5/pnyXfHhe8y
sjO65YDAvNVpXQKCAQAWplxYIEhQcE51AqOXVwQNNNo6NHjBVNTkpcAtJC7gT5bm
HkvQkEq9rI837rHgnzGC0jyQQ8tkL4gAQWDt+coJsyB2p5wypifyRz6Rh5uixOdE
vSCBVEy1W4AsNo0fqD7UielOD6BojjJCilx4xHjGjQUntxyaOrsLC+EsRGiWOefT
znTbEBplqiuH9kxoJts+xy9LVZmDS7TtsC98kOmkltOlXVNb6/xF1PYZ9j897buH
OSXC8iTgdzEpbaiH7B5HSPh++1/et1SEMWsiMt7lU92vAhErDR8C2jCXMiT+J67a
i51LKSLZuovjntnhA6Y8UoELxoi34u1DFuHvF9veA4IBBgACggEBAIQXiib//Lwf
8BZBCB4ISQ3hO38jNAP8aFbjZYme+1NjVWo7dbMIil5o/vk16PPOAFGASVVSZvlZ
YL83F4EWmXqhb0DalyX4FvS8a/M7uOP5Ykw7O6bNG9ahxHZzo4cQE/8ziJMgpFxG
gPYDgFkdv4ZT/zBpNWlUzqr3zvF3ORbzHRtRrrcFRUbdP9MLk5j+SupLHsPPSrfT
jgYqBTsN0T/hP8QGqxaM5Tzs6B6IZoxhTn55mzSWwbakpUPTek+eCxNdu2hVvX5L
tsdTsI91Y3LKGSkf3MIhdeMTOMQq8Hj9KEVbRBuYfEf+eSzVbk+8YlKmYRrylU+Y
vV9Eg0xMzRkwDQYJKoZIhvcNAQELBQADggIBAJ3NvbNLvqkUS6JcolhIqunLovbY
vpCdylBNfPUvSQQz760MB0Uj4YWg0Z/GJ+OKhXWxm4qae+qRIJsscfakZuBhp114
O7QJwx+hLWrKx57NACnxVsqtMGB87fLtQtGeMpyow8GHSvM9Z4Xhx2b6q4Wkszvb
T1w+sWPt222bT/3BqRYiQYjjw3dlS3PrsnzsF2mU/0KBHB9H/eLB56D6FcC2SXf3
ayQ1nFZESQq4ex+V/xIfrBZhBxWf9gsjYKkFQwtF9dxLTdQQgGENaEMN9w74tMup
bCfJOaDmotN/l/OfOwGOJV2z5i9Sl7jcBlejYUYiAX5stMv9fx8X1ifR9086XI89
4OmP0hBN8cuTz+HogNstXmu/zgZ4ikcyrXOWDnwWDwFet1Z66jhmXlvVhX4fQLFj
/FJuq57x3U4msBwzbXfcn9OKPsr0m9jMcR+rzL2Tpvie5GyJLO9iv/Vwe9M2iQOW
HPxN2s9tcBZ/Ua5ol3Iv46//EwiGgwMfZprrLVUUNoJpUeBmljbox0f2B3HQoFF8
z3WTGz0JvDMMz7V5xJsHEvcFy392ZFl0tWLuHQrzvoYXOZcDAl8fby8gJ2072b8D
qKtTaz/p/xeBeT/zP1JHhmeR68hKmJ5NR2s5bOPPConAkOrpRYlg8tsW6tiCUsjc
0XKf3HuYN1MSiJsC
-----END CERTIFICATE-----
root@broker:/ssl# keytool -printcert -v -file cert-signed
Owner: CN=broker, OU=pan
Issuer: CN=Kafka-Security-CA
Serial number: 8ff87603dd025186
Valid from: Wed Oct 10 00:07:47 UTC 2018 until: Thu Oct 10 00:07:47 UTC 2019
Certificate fingerprints:
MD5: 87:A0:38:A9:AF:31:42:BE:E1:4C:B6:6D:1A:D2:6A:02
SHA1: F6:C9:EC:87:4D:02:62:40:3E:6D:8F:76:E3:3F:57:F6:81:8F:AC:53
SHA256: B2:D0:E7:52:9D:EB:0A:E2:69:60:EC:00:86:F9:B2:24:78:07:91:1E:52:3A:64:D2:25:5F:06:0B:F4:2D:26:B8
Signature algorithm name: SHA256withRSA
Subject Public Key Algorithm: 2048-bit DSA key
Version: 1
Create a trustStore on the Kafka broker and import the demo CA's public certificate, so that all certificates signed by our demo CA are trusted:
root@broker:/ssl# keytool -keystore kafka.server.truststore.jks -alias CARoot -import -file ca-cert -storepass password -keypass password -noprompt
Certificate was added to keystore
Import both the CA certificate and the signed Kafka server certificate into the keyStore:
root@broker:/ssl# keytool -keystore kafka.server.keystore.jks -alias CARoot -import -file ca-cert -storepass password -keypass password -noprompt
Certificate was added to keystore
root@broker:/ssl# keytool -keystore kafka.server.keystore.jks -import -file cert-signed -storepass password -keypass password -noprompt
Certificate reply was installed in keystore
Configure the Kafka server to use them by editing /etc/kafka/server.properties:
listeners=PLAINTEXT://:9092,SSL://:9093
advertised.listeners=PLAINTEXT://your.host.name:9092,SSL://your.host.name:9093
ssl.keystore.location=/ssl/kafka.server.keystore.jks
ssl.keystore.password=password
ssl.key.password=password
ssl.truststore.location=/ssl/kafka.server.truststore.jks
ssl.truststore.password=password
Restart Kafka: systemctl restart kafka
Verify that SSL is working:
root@broker:/ssl# openssl s_client -connect localhost:9093
Files created under the ssl folder:
root@broker:/ssl# ls -l
total 32
-rw-r--r-- 1 root root 1809 Oct  9 23:42 ca-cert
-rw-r--r-- 1 root root   17 Oct 10 00:07 ca-cert.srl                  <- can be removed
-rw-r--r-- 1 root root 3272 Oct  9 23:42 ca-key                       <- never share with others
-rw-r--r-- 1 root root 1473 Oct 10 00:00 cert-file                    <- can be removed
-rw-r--r-- 1 root root 2086 Oct 10 00:07 cert-signed
-rw-r--r-- 1 root root 5373 Oct 10 16:19 kafka.server.keystore.jks    <- never share it
-rw-r--r-- 1 root root 1358 Oct 10 16:03 kafka.server.truststore.jks
Setup on Kafka client:
Create a trustStore to store Kafka server certificates. There are two ways to do this:
1. Store every Kafka server certificate individually.
2. Store only the CA certificate, so that all certificates signed by that CA are trusted. This is the preferred way.
Copy (scp) the ca-cert file to the Kafka client machine.
Create trustStore for each Kafka client:
keytool -keystore kafka.client.truststore.jks -alias CARoot -import -file ca-cert -storepass password -keypass password -noprompt
Create a new file called client.properties with the following lines:
security.protocol=SSL
ssl.truststore.location=/ssl/kafka.client.truststore.jks
ssl.truststore.password=password
Use these properties with each Kafka client:
kafka-console-producer.sh --broker-list {kafka-server1:9093} --topic kafka-topic --producer.config /ssl/client.properties
Kafka client and Kafka server handshake:
- The Kafka client connects to a Kafka server (broker) secured with SSL and requests that the server identify itself.
- The Kafka server sends a copy of its SSL certificate, including the server's public key.
- The Kafka client checks the certificate root against its list of trusted CAs, and checks that the certificate is unexpired, unrevoked, and that its common name is valid for the Kafka server it is connecting to. If the client trusts the certificate, it creates a symmetric session key, encrypts it with the Kafka server's public key, and sends it back.
- The Kafka server decrypts the symmetric session key using its private key and sends back an acknowledgement encrypted with the session key to start the encrypted session.
- The Kafka server and client now encrypt all transmitted data with the session key.
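These steps can be watched locally with openssl's test server and client. This is a hedged sketch using a throwaway self-signed certificate; port 14433 and all file names are arbitrary placeholders, not part of the Kafka setup above.

```shell
# Create a throwaway self-signed certificate for localhost.
cd /tmp
openssl req -new -newkey rsa:2048 -days 1 -x509 -subj '/CN=localhost' \
  -keyout tls-key -out tls-cert -nodes

# Start a minimal TLS server in the background.
openssl s_server -accept 14433 -cert tls-cert -key tls-key -quiet &
SERVER_PID=$!
sleep 1

# Connect as a client, trusting only our own certificate (the trustStore
# role); the output records the certificate exchange and verification result.
echo Q | openssl s_client -connect localhost:14433 -CAfile tls-cert \
  > /tmp/handshake.log 2>&1
kill $SERVER_PID
grep "Verify return code" /tmp/handshake.log
```

A successful handshake reports "Verify return code: 0 (ok)".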
Enabling SSL breaks zero-copy transfer on the Kafka broker. It affects latency, CPU utilization, and RAM usage for both the Kafka client and the Kafka server.
SSL Client Authentication
For encryption alone, only server certificates are needed. With SSL, clients can also have certificates. If a client certificate is validated by the broker, the client is authenticated and has an identity.
Generate a keyStore for each Kafka client to hold the client's private key and certificate:
keytool -genkey -keystore kafka.client.keystore.jks -validity 365 -storepass password -keypass password -dname "CN=mylaptop" -alias my-local-pc -storetype pkcs12
Generate a certificate signing request:
root@broker:/ssl# keytool -keystore kafka.client.keystore.jks -certreq -file client-cert-sign-request -alias my-local-pc -storepass password -keypass password
Send the file above to the CA admin to get a signed certificate. In our case, we use the demo CA:
scp the client-cert-sign-request file to the demo CA machine and run the following command:
openssl x509 -req -CA ca-cert -CAkey ca-key -in client-cert-sign-request -out client-cert-signed -days 365 -CAcreateserial -passin pass:password
Import both the CA certificate and the signed Kafka client certificate into the client keyStore:
keytool -keystore kafka.client.keystore.jks -alias CARoot -import -file ca-cert -storepass password -keypass password -noprompt
keytool -keystore kafka.client.keystore.jks -import -file client-cert-signed -alias my-local-pc -storepass password -keypass password -noprompt
Edit the Kafka server.properties file to enable SSL client authentication:
ssl.client.auth=required
Then restart the Kafka server.
Create a new file called client-ssl-auth.properties with the following lines:
security.protocol=SSL
ssl.truststore.location=/ssl/kafka.client.truststore.jks
ssl.truststore.password=password
ssl.keystore.location=/ssl/kafka.client.keystore.jks
ssl.keystore.password=password
ssl.key.password=password
Use these properties with each Kafka client:
kafka-console-producer.sh --broker-list {kafka-server1:9093} --topic kafka-topic --producer.config /ssl/client-ssl-auth.properties
SASL Authentication – Kerberos
There are two ways to authenticate your Kafka clients to your brokers: SSL and SASL.
SSL Authentication
SSL authentication leverages an SSL capability called two-way authentication. The idea is to also issue certificates to your clients, signed by a certificate authority, which allows your Kafka brokers to verify the identity of the clients.
This is the most common setup when you are using a managed Kafka cluster from a provider such as Heroku, Confluent Cloud, or CloudKarafka.
SASL Authentication
SASL stands for Simple Authentication and Security Layer and, trust me, the name is deceptive: things are not simple. The idea is that the authentication mechanism is separated from the Kafka protocol (which is a nice idea). It is very popular with Big Data systems, and most likely your Hadoop setup already leverages it.
SASL in Kafka currently implements the following mechanisms:
· PLAIN (simple username/password)
· SCRAM (modern username/password with challenge)
· GSSAPI (Kerberos / Active Directory authentication)
· OAUTHBEARER for OAuth 2 (work in progress)
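As an illustration of how a client selects one of these mechanisms, a SASL/PLAIN client configuration might look like the following sketch (the property names are standard Kafka client settings; the username and password values are placeholders):

```properties
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="alice" \
  password="alice-secret";
```

The GSSAPI (Kerberos) variant of this configuration is built step by step in the sections below.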
Key Characteristics of Kerberos:
· All communication is encrypted
· No passwords are sent over the wire
· Tickets do expire
· Your clients automatically renew tickets as long as their credentials are valid
· Only clients interact with the KDC; the target service/server never talks to the KDC
Install and Configure Kerberos server
Install the Kerberos server on a new CentOS machine:
yum install -y krb5-server
Configure Kerberos server:
vi /var/kerberos/krb5kdc/kdc.conf
[kdcdefaults]
kdc_ports = 88
kdc_tcp_ports = 88
default_realm = KAFKA.SECURE

[realms]
KAFKA.SECURE = {
  acl_file = /var/kerberos/krb5kdc/kadm5.acl
  dict_file = /usr/share/dict/words
  admin_keytab = /var/kerberos/krb5kdc/kadm5.keytab
  supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal camellia256-cts:normal camellia128-cts:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
}
vi /var/kerberos/krb5kdc/kadm5.acl
*/admin@KAFKA.SECURE *
vi /etc/krb5.conf
[logging]
default = FILE:/var/log/krb5libs.log
kdc = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind.log

[libdefaults]
default_realm = KAFKA.SECURE
kdc_timesync = 1
ticket_lifetime = 24h

[realms]
KAFKA.SECURE = {
  admin_server = {kdc_hostname}
  kdc = {kdc_hostname}
}
Create Kerberos database:
sudo /usr/sbin/kdb5_util create -s -r KAFKA.SECURE -P this-is-unsecure
Create admin principal:
sudo kadmin.local -q "add_principal -pw this-is-unsecure admin/admin"
Start Kerberos service:
sudo systemctl restart krb5kdc
sudo systemctl restart kadmin
Create Kafka Principals & Keytabs
Create principals:
sudo kadmin.local -q "add_principal -randkey reader@KAFKA.SECURE"
sudo kadmin.local -q "add_principal -randkey writer@KAFKA.SECURE"
sudo kadmin.local -q "add_principal -randkey admin@KAFKA.SECURE"
sudo kadmin.local -q "add_principal -randkey kafka/{kafka_server_hostname}@KAFKA.SECURE"
Export these principals to keytab files:
sudo kadmin.local -q "xst -kt /tmp/reader.user.keytab reader@KAFKA.SECURE"
sudo kadmin.local -q "xst -kt /tmp/writer.user.keytab writer@KAFKA.SECURE"
sudo kadmin.local -q "xst -kt /tmp/admin.user.keytab admin@KAFKA.SECURE"
sudo kadmin.local -q "xst -kt /tmp/kafka.service.keytab kafka/{kafka_server_hostname}@KAFKA.SECURE"
Copy these keytab files to the Kafka client and server machines:
scp centos@{kerberos_hostname}:/tmp/kafka.* ubuntu@{kafka_server_hostname}:/tmp/.
scp centos@{kerberos_hostname}:/tmp/admin.* ubuntu@{kafka_client_hostname}:/tmp/.
Install and configure Kerberos on Kafka client
Install Kerberos client:
sudo apt-get install -y krb5-user
Configure Kafka client:
vi /etc/krb5.conf
[logging]
default = FILE:/var/log/krb5libs.log
kdc = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind.log

[libdefaults]
default_realm = KAFKA.SECURE
kdc_timesync = 1
ticket_lifetime = 24h

[realms]
KAFKA.SECURE = {
  admin_server = {kdc_hostname}
  kdc = {kdc_hostname}
}
Obtain and cache an initial ticket-granting ticket for the principal:
kinit -kt /tmp/admin.user.keytab admin
Use klist to inspect the ticket cache:
klist
Ticket cache: FILE:/tmp/krb5cc_1000
Default principal: admin@KAFKA.SECURE

Valid starting       Expires              Service principal
02/22/2018 14:06:03  02/23/2018 14:06:03  krbtgt/KAFKA.SECURE@KAFKA.SECURE
Install and configure Kerberos on Kafka server
Install Kerberos client:
sudo apt-get install -y krb5-user
Configure Kafka server:
vi /etc/krb5.conf
[logging]
default = FILE:/var/log/krb5libs.log
kdc = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind.log

[libdefaults]
default_realm = KAFKA.SECURE
kdc_timesync = 1
ticket_lifetime = 24h

[realms]
KAFKA.SECURE = {
  admin_server = {kdc_hostname}
  kdc = {kdc_hostname}
}
Obtain and cache an initial ticket-granting ticket for the principal:
kinit -kt /tmp/kafka.service.keytab kafka/{kafka_server_hostname}@KAFKA.SECURE
Use klist to inspect the ticket cache:
klist
Ticket cache: FILE:/tmp/krb5cc_1000
Default principal: kafka/{kafka_server_hostname}@KAFKA.SECURE

Valid starting       Expires              Service principal
02/22/2018 14:06:03  02/23/2018 14:06:03  krbtgt/KAFKA.SECURE@KAFKA.SECURE
Kafka Broker Configuration
Configure Kerberos authentication in Kafka
Edit /etc/kafka/server.properties file
listeners=PLAINTEXT://:9092,SSL://:9093,SASL_SSL://:9094
advertised.listeners=PLAINTEXT://your.host.name:9092,SSL://your.host.name:9093,SASL_SSL://your.host.name:9094
ssl.keystore.location=/ssl/kafka.server.keystore.jks
ssl.keystore.password=password
ssl.key.password=password
ssl.truststore.location=/ssl/kafka.server.truststore.jks
ssl.truststore.password=password
sasl.enabled.mechanisms=GSSAPI
sasl.kerberos.service.name=kafka
Create/edit /etc/kafka/kafka_server_jaas.conf:
KafkaServer {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/tmp/kafka.service.keytab"
  principal="kafka/{kafka_server_hostname}@KAFKA.SECURE";
};
Extend the systemd unit by putting the following line into the [Service] section of the kafka.service file:
Environment="KAFKA_OPTS=-Djava.security.auth.login.config=/…kafka_server_jaas.conf"
Restart Kafka
Kafka Client Configuration
Create a JAAS file kafka_client_jaas.conf:
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true;
};
Create a file /tmp/kafka_client_kerberos.properties:
security.protocol=SASL_SSL
sasl.kerberos.service.name=kafka
ssl.truststore.location=/ssl/kafka.client.truststore.jks
ssl.truststore.password=password
Use these properties with each Kafka client:
export KAFKA_OPTS="-Djava.security.auth.login.config=/tmp/kafka_client_jaas.conf"
kafka-console-producer.sh --broker-list {kafka-server1:9094} --topic kafka-topic --producer.config /tmp/kafka_client_kerberos.properties
Authorization
ACLs come in three flavors:
· Topics: restrict which clients can read/write data
· Consumer Groups: restrict which clients can use a specific consumer group
· Cluster: restrict which clients can create/delete topics or apply settings
There is no concept of user groups in Kafka so far; each ACL has to be written per client or user.
ACLs are stored in Zookeeper and added through the command line. You need to restrict who can access your Zookeeper cluster through security or network rules.
Only Kafka admins should have the right to create topics and create ACLs.
Configure Kafka Broker to support ACLs
Edit server.properties file
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
super.users=User:admin;User:kafka
allow.everyone.if.no.acl.found=false
security.inter.broker.protocol=SASL_SSL
Restart Kafka
Create ACLs
kafka/bin/kafka-acls.sh --authorizer-properties zookeeper.connect={zookeeper_hostname}:2181 --add --allow-principal User:reader --group=* --operation Read --topic {topic_name}
(The command above gives user 'reader' READ permission for all consumer groups.)
kafka/bin/kafka-acls.sh --authorizer-properties zookeeper.connect={zookeeper_hostname}:2181 --remove --allow-principal User:reader --operation Read --topic {topic_name}
(The command above removes the READ permission from user 'reader'.)
kafka/bin/kafka-acls.sh --authorizer-properties zookeeper.connect={zookeeper_hostname}:2181 --list --topic {topic_name}
(The command above lists the current ACLs for the given topic.)
Verify ACLs
Use kinit to switch to a different user and check whether producing messages succeeds.
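As a hedged sketch of that check (it assumes the running cluster, keytabs, principals, exported KAFKA_OPTS, and client properties file built in the sections above; hostnames and the topic name are placeholders):

```shell
# Authenticate as 'writer' and try to produce; this should succeed if
# 'writer' holds a Write ACL on the topic.
kinit -kt /tmp/writer.user.keytab writer
echo "test-message" | kafka-console-producer.sh \
  --broker-list {kafka-server1:9094} --topic kafka-topic \
  --producer.config /tmp/kafka_client_kerberos.properties

# Authenticate as 'reader' and try the same; without a Write ACL the
# producer should fail with a TopicAuthorizationException.
kinit -kt /tmp/reader.user.keytab reader
echo "test-message" | kafka-console-producer.sh \
  --broker-list {kafka-server1:9094} --topic kafka-topic \
  --producer.config /tmp/kafka_client_kerberos.properties
```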
Additional Security:
Broker-to-broker authentication and encryption: add the following property to server.properties:
security.inter.broker.protocol=SASL_SSL
Broker-to-Zookeeper authentication and encryption: similar to the above.
Using the network: restrict which machines can access Zookeeper using network rules. This is the preferred approach.