Keycloak Authorization Services based authz prototype (#24)
* Keycloak Authz Services based authz prototype

Signed-off-by: Marko Strukelj <[email protected]>

* Address spotbugs issues

Signed-off-by: Marko Strukelj <[email protected]>

* Address PR comments

Signed-off-by: Marko Strukelj <[email protected]>

* Added initial authz test to testsuite

Signed-off-by: Marko Strukelj <[email protected]>

* Fix SpotBugs failure on Java 11

Signed-off-by: Marko Strukelj <[email protected]>

* Add missing resource and action scope

Signed-off-by: Marko Strukelj <[email protected]>

* A more comprehensive test

Signed-off-by: Marko Strukelj <[email protected]>

* Fix JSONUtil.asListOfString() to convert non-textual values to text

Signed-off-by: Marko Strukelj <[email protected]>

* Fixes to examples/README-authz.md

Signed-off-by: Marko Strukelj <[email protected]>

* Rebase on master + improve README-authz.md + add DelegationToken resource for completeness

Signed-off-by: Marko Strukelj <[email protected]>

* Fix equals() method

Signed-off-by: Marko Strukelj <[email protected]>

* Fix UserSpec.of() method

Signed-off-by: Marko Strukelj <[email protected]>

* Throw exception for ACL methods if delegation to simple kafka ACL is not enabled

Signed-off-by: Marko Strukelj <[email protected]>

* Fix issues identified when testing with Kafka 2.4 and mutual TLS listeners

Signed-off-by: Marko Strukelj <[email protected]>

* Improve handling of non-oauth users and delegation to simple ACL

Signed-off-by: Marko Strukelj <[email protected]>

* Upgrade examples to Kafka 2.4.0 + oauthz example improvements

Signed-off-by: Marko Strukelj <[email protected]>

* Suggestions and comments from @tombentley

Signed-off-by: Marko Strukelj <[email protected]>

* Suggestions and comments from @tombentley

Signed-off-by: Marko Strukelj <[email protected]>

* Change strimzi.authz.* to strimzi.authorization.*

Signed-off-by: Marko Strukelj <[email protected]>

* Document authorization-scopes.json

Signed-off-by: Marko Strukelj <[email protected]>
mstruk authored and scholzj committed Jan 28, 2020
1 parent 20e8342 commit 8d4def3
Showing 45 changed files with 4,022 additions and 44 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -72,6 +72,7 @@ Copy the following jars into your Kafka libs directory:

 oauth-common/target/kafka-oauth-common-*.jar
 oauth-server/target/kafka-oauth-server-*.jar
+oauth-keycloak-authorizer/target/kafka-oauth-keycloak-authorizer-*.jar
 oauth-client/target/kafka-oauth-client-*.jar
 oauth-client/target/lib/keycloak-common-*.jar
 oauth-client/target/lib/keycloak-core-*.jar
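A minimal sketch of that copy step, assuming the repo has been built with `mvn clean install` and that `copy_oauth_jars` and its two path arguments are placeholders of our own, not names from the README:

```shell
# Copy the Strimzi OAuth jars from a built checkout into a Kafka libs dir.
copy_oauth_jars() {
  src="$1"   # root of the strimzi-kafka-oauth checkout
  dest="$2"  # Kafka libs directory
  mkdir -p "$dest"
  for jar in "$src"/oauth-common/target/kafka-oauth-common-*.jar \
             "$src"/oauth-server/target/kafka-oauth-server-*.jar \
             "$src"/oauth-keycloak-authorizer/target/kafka-oauth-keycloak-authorizer-*.jar \
             "$src"/oauth-client/target/kafka-oauth-client-*.jar; do
    [ -f "$jar" ] && cp "$jar" "$dest"
  done
  return 0
}
```

The `keycloak-common` and `keycloak-core` client-side jars from the list above would be handled the same way.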
419 changes: 419 additions & 0 deletions examples/README-authz.md

Large diffs are not rendered by default.

@@ -22,7 +22,7 @@ public class ExampleConsumer {

public static void main(String[] args) {

-String topic = "Topic1";
+String topic = "a_Topic1";

Properties defaults = new Properties();
Config external = new Config();
@@ -50,8 +50,8 @@ public static void main(String[] args) {
final String accessToken = external.getValue(ClientConfig.OAUTH_ACCESS_TOKEN, null);

if (accessToken == null) {
-defaults.setProperty(Config.OAUTH_CLIENT_ID, "kafka-producer-client");
-defaults.setProperty(Config.OAUTH_CLIENT_SECRET, "kafka-producer-client-secret");
+defaults.setProperty(Config.OAUTH_CLIENT_ID, "kafka-consumer-client");
+defaults.setProperty(Config.OAUTH_CLIENT_SECRET, "kafka-consumer-client-secret");
}

// Use 'preferred_username' rather than 'sub' for principal name
@@ -94,7 +94,7 @@ private static Properties buildConsumerConfig() {
p.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
p.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

-p.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "consumer-group");
+p.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "a_consumer-group");
p.setProperty(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "10");
p.setProperty(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");

8 changes: 7 additions & 1 deletion examples/docker/README.md
@@ -124,14 +124,20 @@ To regenerate Root CA run the following:

You also have to regenerate keycloak and hydra server certificates otherwise clients won't be able to connect any more.

-cd /opt/jboss/keycloak/certificates
+cd keycloak/certificates
rm *.srl *.p12 cert-*
./gen-keycloak-certs.sh

cd ../hydra/certificates
rm *.srl *.crt *.key *.csr
./gen-hydra-certs.sh

+And if CA has changed, then kafka broker certificates have to be regenerated as well:
+
+    cd kafka-oauth-strimzi/kafka/certificates
+    rm *.p12
+    ./gen-kafka-certs.sh

And finally make sure to rebuild the docker module again and re-run `docker-compose` to ensure new keys and certificates are used everywhere.

mvn clean install
86 changes: 86 additions & 0 deletions examples/docker/kafka-oauth-strimzi/compose-authz.yml
@@ -0,0 +1,86 @@
version: '3.5'

services:

#################################### KAFKA BROKER ####################################
kafka:
image: strimzi/example-kafka
build: kafka-oauth-strimzi/kafka/target
container_name: kafka
ports:
- 9092:9092

# javaagent debug port
- 5006:5006

environment:

# Java Debug
KAFKA_DEBUG: y
DEBUG_SUSPEND_FLAG: y
JAVA_DEBUG_PORT: 5006

#
# KAFKA Configuration
#
LOG_DIR: /home/kafka/logs

KAFKA_BROKER_ID: 1
KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
KAFKA_LISTENERS: REPLICATION://kafka:9091,CLIENT://kafka:9092
KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: REPLICATION:SSL,CLIENT:SASL_PLAINTEXT
KAFKA_SASL_ENABLED_MECHANISMS: OAUTHBEARER
KAFKA_INTER_BROKER_LISTENER_NAME: REPLICATION
KAFKA_SSL_SECURE_RANDOM_IMPLEMENTATION: SHA1PRNG
KAFKA_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: ""

KAFKA_LISTENER_NAME_REPLICATION_SSL_KEYSTORE_LOCATION: /tmp/kafka/cluster.keystore.p12
KAFKA_LISTENER_NAME_REPLICATION_SSL_KEYSTORE_PASSWORD: Z_pkTh9xgZovK4t34cGB2o6afT4zZg0L
KAFKA_LISTENER_NAME_REPLICATION_SSL_KEYSTORE_TYPE: PKCS12
KAFKA_LISTENER_NAME_REPLICATION_SSL_TRUSTSTORE_LOCATION: /tmp/kafka/cluster.truststore.p12
KAFKA_LISTENER_NAME_REPLICATION_SSL_TRUSTSTORE_PASSWORD: Z_pkTh9xgZovK4t34cGB2o6afT4zZg0L
KAFKA_LISTENER_NAME_REPLICATION_SSL_TRUSTSTORE_TYPE: PKCS12
KAFKA_LISTENER_NAME_REPLICATION_SSL_CLIENT_AUTH: required

KAFKA_LISTENER_NAME_CLIENT_OAUTHBEARER_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;"
KAFKA_LISTENER_NAME_CLIENT_OAUTHBEARER_SASL_LOGIN_CALLBACK_HANDLER_CLASS: io.strimzi.kafka.oauth.client.JaasClientOauthLoginCallbackHandler
KAFKA_LISTENER_NAME_CLIENT_OAUTHBEARER_SASL_SERVER_CALLBACK_HANDLER_CLASS: io.strimzi.kafka.oauth.server.JaasServerOauthValidatorCallbackHandler

KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

KAFKA_AUTHORIZER_CLASS_NAME: io.strimzi.kafka.oauth.server.authorizer.KeycloakRBACAuthorizer
KAFKA_PRINCIPAL_BUILDER_CLASS: io.strimzi.kafka.oauth.server.authorizer.JwtKafkaPrincipalBuilder

KAFKA_STRIMZI_AUTHORIZATION_KAFKA_CLUSTER_NAME: cluster2
KAFKA_STRIMZI_AUTHORIZATION_DELEGATE_TO_KAFKA_ACL: "true"
KAFKA_SUPER_USERS: User:CN=my-cluster-kafka,O=io.strimzi;User:CN=my-cluster-entity-operator,O=io.strimzi;User:CN=my-cluster-kafka-exporter,O=io.strimzi;User:service-account-kafka

#
# Strimzi OAuth Configuration
#

# Authentication config
OAUTH_CLIENT_ID: "kafka"
OAUTH_CLIENT_SECRET: "kafka-secret"
OAUTH_TOKEN_ENDPOINT_URI: "http://${KEYCLOAK_HOST:-keycloak}:8080/auth/realms/${REALM:-kafka-authz}/protocol/openid-connect/token"

# Validation config
OAUTH_VALID_ISSUER_URI: "http://${KEYCLOAK_HOST:-keycloak}:8080/auth/realms/${REALM:-kafka-authz}"
OAUTH_JWKS_ENDPOINT_URI: "http://${KEYCLOAK_HOST:-keycloak}:8080/auth/realms/${REALM:-kafka-authz}/protocol/openid-connect/certs"
#OAUTH_INTROSPECTION_ENDPOINT_URI: "http://${KEYCLOAK_HOST}:8080/auth/realms/${REALM:-demo}/protocol/openid-connect/token/introspect"

# username extraction from JWT token claim
OAUTH_USERNAME_CLAIM: preferred_username

# For start.sh script to know where the keycloak is listening
KEYCLOAK_HOST: ${KEYCLOAK_HOST:-keycloak}
REALM: ${REALM:-kafka-authz}

zookeeper:
image: strimzi/example-zookeeper
build: kafka-oauth-strimzi/zookeeper/target
container_name: zookeeper
ports:
- 2181:2181
environment:
LOG_DIR: /home/kafka/logs
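The compose file above leans on docker-compose's `${VAR:-default}` substitution for the Keycloak host and realm. The same defaulting can be reproduced in plain shell to see which token endpoint the broker will end up using — a sketch only; the URL shape is taken from the `OAUTH_TOKEN_ENDPOINT_URI` value above:

```shell
# Reproduce the ${VAR:-default} defaulting used in compose-authz.yml.
# With neither variable exported, the defaults 'keycloak' and 'kafka-authz' apply.
KEYCLOAK_HOST="${KEYCLOAK_HOST:-keycloak}"
REALM="${REALM:-kafka-authz}"
TOKEN_ENDPOINT="http://${KEYCLOAK_HOST}:8080/auth/realms/${REALM}/protocol/openid-connect/token"
echo "$TOKEN_ENDPOINT"
```

Exporting `KEYCLOAK_HOST` or `REALM` before running `docker-compose` overrides the defaults in the same way.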
3 changes: 2 additions & 1 deletion examples/docker/kafka-oauth-strimzi/kafka/Dockerfile
@@ -1,8 +1,9 @@
-FROM strimzi/kafka:latest-kafka-2.3.0
+FROM strimzi/kafka:latest-kafka-2.4.0

COPY libs/* /opt/kafka/libs/strimzi/
COPY config/* /opt/kafka/config/
COPY *.sh /opt/kafka/
COPY certificates/*.p12 /tmp/kafka/

USER root
RUN chmod +x /opt/kafka/*.sh
Binary file not shown.
Binary file not shown.
21 changes: 21 additions & 0 deletions examples/docker/kafka-oauth-strimzi/kafka/certificates/gen-kafka-certs.sh
@@ -0,0 +1,21 @@
#!/bin/sh

set -e

STOREPASS=Z_pkTh9xgZovK4t34cGB2o6afT4zZg0L

echo "#### Generate broker keystore"
keytool -keystore cluster.keystore.p12 -alias localhost -validity 380 -genkey -keyalg RSA -ext SAN=DNS:kafka -dname "CN=my-cluster-kafka,O=io.strimzi" -deststoretype pkcs12 -storepass $STOREPASS -keypass $STOREPASS

echo "#### Add the CA to the brokers’ truststore"
keytool -keystore cluster.truststore.p12 -deststoretype pkcs12 -storepass $STOREPASS -alias CARoot -importcert -file ../../../certificates/ca.crt -noprompt

echo "#### Export the certificate from the keystore"
keytool -keystore cluster.keystore.p12 -storetype pkcs12 -alias localhost -certreq -file cert-file -storepass $STOREPASS

echo "#### Sign the certificate with the CA"
openssl x509 -req -CA ../../../certificates/ca.crt -CAkey ../../../certificates/ca.key -in cert-file -out cert-signed -days 400 -CAcreateserial -passin pass:$STOREPASS

echo "#### Import the CA and the signed certificate into the broker keystore"
keytool -keystore cluster.keystore.p12 -deststoretype pkcs12 -alias CARoot -import -file ../../../certificates/ca.crt -storepass $STOREPASS -noprompt
keytool -keystore cluster.keystore.p12 -deststoretype pkcs12 -alias localhost -import -file cert-signed -storepass $STOREPASS -noprompt
14 changes: 14 additions & 0 deletions examples/docker/kafka-oauth-strimzi/kafka/jwt.sh
@@ -0,0 +1,14 @@
#!/bin/bash

if [ "$1" == "" ] || [ "$1" == "--help" ]; then
echo "Usage: $0 [JSON_WEB_TOKEN]"
exit 1
fi

IFS='.' read -r -a PARTS <<< "$1"

echo "Head: "
echo $(echo -n "${PARTS[0]}" | base64 -d 2>/dev/null)
echo
echo "Payload: "
echo $(echo -n "${PARTS[1]}" | base64 -d 2>/dev/null)
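jwt.sh pipes the token segments straight through `base64 -d`, which can fail because JWT segments are base64url-encoded and usually carry no `=` padding (the `2>/dev/null` hides exactly that failure). A padding-aware decoder might look like this — a sketch; `b64url_decode` is our own name, not part of the commit:

```shell
# Decode a base64url string: map the URL-safe alphabet back to standard
# base64 and restore the '=' padding before calling base64 -d.
b64url_decode() {
  s=$(printf '%s' "$1" | tr '_-' '/+')
  case $(( ${#s} % 4 )) in
    2) s="${s}==" ;;
    3) s="${s}=" ;;
  esac
  printf '%s' "$s" | base64 -d
}
```

With this in place, `b64url_decode "${PARTS[1]}"` would print the payload without relying on the segment happening to be a multiple of four characters long.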
121 changes: 121 additions & 0 deletions examples/docker/kafka-oauth-strimzi/kafka/oauth.sh
@@ -0,0 +1,121 @@
#!/bin/bash

usage() {
echo "Usage: $0 [USERNAME] [PASSWORD] [ARGUMENTS] ..."
echo
echo "$0 is a tool for obtaining an access token or a refresh token for the user or the client."
echo
echo " USERNAME The username for user authentication"
echo " PASSWORD The password for user authentication (prompted for if not specified)"
echo
echo " If USERNAME and PASSWORD are not specified, client credentials as specified by --client-id and --secret will be used for authentication."
echo
echo " ARGUMENTS:"
echo " --quiet, -q No informational outputs"
echo " --insecure Allow http:// in token endpoint url"
echo " --access Return access_token rather than refresh_token"
echo " --endpoint TOKEN_ENDPOINT_URL Authorization server token endpoint"
echo " --client-id CLIENT_ID Client id for client authentication - must be configured on authorization server"
echo " --secret CLIENT_SECRET Secret to authenticate the client"
echo " --scopes SCOPES Space separated list of scopes to request - default value: offline_access"
}


CLAIM=refresh_token
GRANT_TYPE=password
DEFAULT_SCOPES=offline_access

while [ $# -gt 0 ]
do
case "$1" in
"-q" | "--quiet")
QUIET=1
;;
--endpoint)
shift
TOKEN_ENDPOINT="$1"
;;
--insecure)
INSECURE=1
;;
--access)
CLAIM=access_token
DEFAULT_SCOPES=""
;;
--client-id)
shift
CLIENT_ID="$1"
;;
--secret)
shift
CLIENT_SECRET="$1"
;;
--scopes)
shift
SCOPES="$1"
;;
--help)
usage
exit 1
;;
*)
if [ "$UNAME" == "" ]; then
UNAME="$1"
elif [ "$PASS" == "" ]; then
PASS="$1"
else
>&2 echo "Unexpected argument!"
exit 1
fi
;;
esac
shift
done

if [ "$TOKEN_ENDPOINT" == "" ]; then
>&2 echo "ENV variable TOKEN_ENDPOINT not set."
exit 1
fi

if [ "$UNAME" != "" ] && [ "$PASS" == "" ]; then
>&2 read -s -p "Password: " PASS
>&2 echo
fi

if [ "$UNAME" == "" ] && [ "$CLIENT_ID" == "" ]; then
echo "USERNAME not specified. Use --client-id and --secret to authenticate with client credentials."
exit 1
fi

if [ "$CLIENT_ID" == "" ]; then
[ "$QUIET" == "" ] && >&2 echo "ENV var CLIENT_ID not set. Using default value: kafka-cli"
CLIENT_ID=kafka-cli
fi

if [ "$UNAME" == "" ]; then
GRANT_TYPE=client_credentials
else
USER_PASS_CLIENT="&username=${UNAME}&password=${PASS}&client_id=${CLIENT_ID}"
fi

if [ "$SCOPES" == "" ] && [ "$DEFAULT_SCOPES" != "" ]; then
[ "$QUIET" == "" ] && >&2 echo "ENV var SCOPES not set. Using default value: ${DEFAULT_SCOPES}"
SCOPES="${DEFAULT_SCOPES}"
fi

if [ "$CLIENT_SECRET" != "" ]; then
AUTH_VALUE=$(echo -n "$CLIENT_ID:$CLIENT_SECRET" | base64)
AUTHORIZATION="-H 'Authorization: Basic ""$AUTH_VALUE'"
fi

[ "$QUIET" == "" ] && >&2 echo curl -s -X POST $TOKEN_ENDPOINT \
$AUTHORIZATION \
-H 'Content-Type: application/x-www-form-urlencoded' \
-d "grant_type=${GRANT_TYPE}${USER_PASS_CLIENT}&scope=${SCOPES}"

result=$(curl -s -X POST $TOKEN_ENDPOINT \
$AUTHORIZATION \
-H 'Content-Type: application/x-www-form-urlencoded' \
-d "grant_type=${GRANT_TYPE}${USER_PASS_CLIENT}&scope=${SCOPES}")

echo $result | awk -F "$CLAIM\":\"" '{printf $2}' | awk -F "\"" '{printf $1}'
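The last line pulls the requested claim out of the JSON response with two `awk` passes rather than `jq`. The same extraction, isolated as a function for illustration (`extract_claim` is hypothetical, not part of the script):

```shell
# Split on  <claim>":"  to drop everything before the value, then split on
# the closing quote to drop everything after it.
extract_claim() {
  claim="$1"; json="$2"
  printf '%s' "$json" | awk -F "${claim}\":\"" '{printf $2}' | awk -F '"' '{printf $1}'
}
```

Note this only works while the claim value contains no escaped quotes; where `jq` is available, `jq -r ".${claim}"` would be the more robust choice.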
9 changes: 8 additions & 1 deletion examples/docker/kafka-oauth-strimzi/kafka/pom.xml
@@ -33,9 +33,12 @@
<include>functions.sh</include>
<include>start.sh</include>
<include>start_with_hydra.sh</include>
+<include>jwt.sh</include>
+<include>oauth.sh</include>
<include>simple_kafka_config.sh</include>
<include>Dockerfile</include>
<include>config/</include>
+<include>certificates/</include>
</includes>
<filtering>false</filtering>
</resource>
@@ -70,6 +73,10 @@
<groupId>io.strimzi</groupId>
<artifactId>kafka-oauth-common</artifactId>
</artifactItem>
+<artifactItem>
+<groupId>io.strimzi</groupId>
+<artifactId>kafka-oauth-keycloak-authorizer</artifactId>
+</artifactItem>
<artifactItem>
<groupId>org.keycloak</groupId>
<artifactId>keycloak-core</artifactId>
@@ -90,4 +97,4 @@
</plugin>
</plugins>
</build>
-</project>
\ No newline at end of file
+</project>
8 changes: 7 additions & 1 deletion examples/docker/kafka-oauth-strimzi/kafka/start.sh
@@ -12,7 +12,13 @@ wait_for_url $URI "Waiting for Keycloak to start"

wait_for_url "$URI/realms/${REALM:-demo}" "Waiting for realm '${REALM}' to be available"

-./simple_kafka_config.sh | tee /tmp/strimzi.properties
+if [ "$SERVER_PROPERTIES_FILE" == "" ]; then
+  echo "Generating a new strimzi.properties file using ENV vars"
+  ./simple_kafka_config.sh | tee /tmp/strimzi.properties
+else
+  echo "Using provided server.properties file: $SERVER_PROPERTIES_FILE"
+  cp $SERVER_PROPERTIES_FILE /tmp/strimzi.properties
+fi

# add Strimzi kafka-oauth-* jars and their dependencies to classpath
export CLASSPATH="/opt/kafka/libs/strimzi/*:$CLASSPATH"
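The new branch in start.sh can be isolated as a small function for testing: use the file named by `SERVER_PROPERTIES_FILE` when it is set, otherwise fall back to generating one. A sketch; `select_server_properties` and the `:` stand-in for the generator are illustrative only:

```shell
# Mirror of the config-selection logic added to start.sh.
select_server_properties() {
  target="$1"
  if [ "$SERVER_PROPERTIES_FILE" = "" ]; then
    echo "Generating a new strimzi.properties file using ENV vars" >&2
    : > "$target"   # stand-in for: ./simple_kafka_config.sh | tee "$target"
  else
    echo "Using provided server.properties file: $SERVER_PROPERTIES_FILE" >&2
    cp "$SERVER_PROPERTIES_FILE" "$target"
  fi
}
```

This lets an operator mount a complete server.properties into the container instead of driving every setting through `KAFKA_*` environment variables.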
2 changes: 1 addition & 1 deletion examples/docker/kafka-oauth-strimzi/zookeeper/Dockerfile
@@ -1,4 +1,4 @@
-FROM strimzi/zookeeper:0.11.4-kafka-2.1.0
+FROM strimzi/kafka:latest-kafka-2.4.0

COPY start.sh /opt/kafka/
COPY simple_zk_config.sh /opt/kafka/