Authentication to Confluent Cloud Using OAuth/OIDC with Auth0 as Identity Provider

Configuring OAuth/OIDC for Confluent Cloud using Auth0

In the early days of the internet, sharing information between services was straightforward but risky. Users often had to share their usernames and passwords, exposing their credentials to potential misuse. Today, we have secure standards like OAuth 2.0 and OpenID Connect (OIDC) that make this process safer and more efficient.

The widespread adoption of OAuth and OIDC for authentication and authorization speaks to how important these technologies have become. If you see a login page on a modern website, it is most likely built on OAuth 2.0 and OpenID Connect (OIDC). These protocols have become the standard for secure, robust authentication and authorization across applications and services.

The Problem with Sharing Credentials

Back in the day, if you wanted a service to access your information on another platform, you had to share your username and password. This practice was insecure for several reasons:

  • No guarantee that the service would keep your credentials safe.

  • No control over how much of your personal information the service could access.

Thankfully, standards like OAuth 2.0 have been developed to address these security concerns.

Understanding OAuth and OIDC: A Simple Example

OAuth and OIDC work together to make secure logins easy and seamless. Imagine you want to log in to a new app using your Google account. When you click “Log in with Google,” OAuth handles the authorization by asking Google for permission to share your info with the new app. If you agree, OIDC comes into play, providing the app with your identity details, such as your name and email address, so you can log in without creating a new account. This way, OAuth ensures your data stays secure, and OIDC confirms your identity.

Image1

For more background on OAuth and OIDC, see Okta's well-illustrated blog post: An Illustrated Guide to OAuth and OpenID Connect

Scope of this blog

In this blog post, we will walk through the steps of configuring Auth0 as an identity provider for Confluent Cloud. Auth0 is a flexible, drop-in solution to add authentication and authorization services to your applications. Confluent Cloud is a fully managed event streaming platform based on Apache Kafka. By integrating Auth0 with Confluent Cloud, you can enhance the security of your Kafka clusters and streamline the user authentication process.

Configuring Auth0 as an identity provider in Confluent Cloud

Prerequisites:

  1. A Confluent Cloud account.

  2. An Auth0 account.

  3. Administrative access to both platforms.

Steps:

1. Create a New Application in Auth0

  1. Log in to Auth0:
    1. Go to your Auth0 dashboard.
  2. Create a New Application:
    1. You can see a Create Application button on the getting started page, as shown in the picture below. Alternatively, go to the Applications section in the left-hand menu and create an application from there.

      Image2

  3. Both options should land you on this page, where you provide your application details. Enter your application name and select Machine to Machine as the application type.

  4. Below this, there is an option to select an API. I will use the default Auth0 Management API in this example, but you can create your own API from the APIs section of the menu.

    Image3

  5. Once we’ve selected the API, we can choose the permissions (scopes) we want this application to request from it. In this case, I am selecting all permissions.

  6. Click Continue to create the application.

    Image4

  7. Check Application Settings:
    1. In the application settings, take note of the Client ID and Client Secret.

      Image5

2. Configure Auth0 as an Identity Provider in Confluent Cloud

  1. Log in to Confluent Cloud:
    1. Go to your Confluent Cloud dashboard.
    2. Go to the Accounts & access section.

    3. Navigate to the Workload Identities tab.

      Image6

  2. Add a New Identity Provider:
    1. Click on Add Identity Providers.

    2. Choose Other OIDC Provider as the provider type.

      Image7

  3. Fill in the Identity Provider Details:
    1. Fill in the Name, Description, OIDC Discovery URL, Issuer URI, and JWKS URI as shown in the picture below.

      Image8

  4. You might be wondering where to find these details. The OIDC Discovery URL is your Auth0 application's Domain, listed under the application settings.

  5. The Issuer URI and JWKS URI can be found in the OpenID Connect (OIDC) discovery document, available at: https://{yourDomain}/.well-known/openid-configuration

    1. In my case, it looks something like the image shown below

      Image9

  6. Once these fields are filled in, click Validate and Save.
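If you prefer to script this step, the values the Confluent Cloud form asks for map directly onto fields of the discovery document. The sketch below uses plain Python with a hypothetical domain and an abridged document; in practice you would fetch the JSON from the well-known URL instead of embedding it:

```python
import json

# Abridged example of a discovery document. The real one can be fetched from
# https://{yourDomain}/.well-known/openid-configuration (the domain below is hypothetical).
discovery_json = """
{
  "issuer": "https://dev-example.us.auth0.com/",
  "jwks_uri": "https://dev-example.us.auth0.com/.well-known/jwks.json",
  "token_endpoint": "https://dev-example.us.auth0.com/oauth/token"
}
"""

def extract_oidc_endpoints(doc: str) -> dict:
    """Pull out the fields the Confluent Cloud identity provider form asks for."""
    data = json.loads(doc)
    return {
        "issuer_uri": data["issuer"],
        "jwks_uri": data["jwks_uri"],
        "token_endpoint": data["token_endpoint"],
    }

endpoints = extract_oidc_endpoints(discovery_json)
print(endpoints["issuer_uri"])  # -> https://dev-example.us.auth0.com/
```

The token_endpoint is not needed by this form, but you will reuse it later as sasl.oauthbearer.token.endpoint.url in the Kafka client configuration.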

3. Add Identity Pool

  1. Once the identity provider is created, we need to create an identity pool. Click on the Add Identity Pool button as shown below.

    Image10

  2. Fill in the details as per your requirements. I have filled them in as shown in the picture.

    Image11

  3. You can use an identity pool to provide granular control over your applications' access to your Confluent Cloud resources. An identity pool is a group of external application identities that are assigned a certain level of access based on a claims-based policy.

    For details on identity pools and how to use them, check out Use Identity Pools with Your OAuth/OIDC Identity Provider on Confluent Cloud.

  4. Click Next once you are done populating the fields.

  5. Since we are creating everything from scratch, we will select Add new permissions as shown below and hit Next.

    Image6

  6. On the next page, select the cluster you want to grant access to, assign the required permissions, and finish the setup.
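To see which claims an identity pool's filter can match against, you can decode the payload segment of an access token. Below is a minimal sketch using a dummy, unsigned token with hypothetical claim values; a real Auth0 token is signed and should never be hand-built like this:

```python
import base64
import json

def decode_claims(jwt: str) -> dict:
    """Decode the base64url-encoded payload segment of a JWT without verifying it."""
    seg = jwt.split(".")[1]
    seg += "=" * (-len(seg) % 4)  # restore padding stripped by base64url encoding
    return json.loads(base64.urlsafe_b64decode(seg))

# Hypothetical claims, shaped like a client-credentials token from Auth0:
payload = {
    "iss": "https://dev-example.us.auth0.com/",
    "sub": "YOUR_CLIENT_ID@clients",
    "aud": "https://dev-example.us.auth0.com/api/v2/",
    "scope": "read:users",
}

# Assemble a dummy header.payload.signature token purely for demonstration:
encoded = base64.urlsafe_b64encode(json.dumps(payload).encode()).decode().rstrip("=")
token = "eyJhbGciOiJSUzI1NiJ9." + encoded + ".dummy-signature"

claims = decode_claims(token)
# An identity pool filter such as claims.iss == "https://dev-example.us.auth0.com/"
# would admit tokens carrying these claims.
print(claims["scope"])  # -> read:users
```

Inspecting claims this way is handy when a pool's filter expression rejects tokens and you need to see exactly what the provider is issuing.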

4. Configuring the Kafka Client

  1. Open a terminal on the machine from which you access Confluent Cloud and where your Kafka client is installed.

  2. Create a client configuration file; in my case I have named it client.properties.

  3. Use the following template and fill in your details:

# Kafka Broker Connection
bootstrap.servers=YOUR_BOOTSTRAP_SERVER

# Security Protocol
security.protocol=SASL_SSL

# OAuth2 Token Endpoint URL
sasl.oauthbearer.token.endpoint.url=YOUR_TOKEN_ENDPOINT_URL

# Login Callback Handler Class
sasl.login.callback.handler.class=org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler

# SASL Mechanism
sasl.mechanism=OAUTHBEARER

# JAAS Configuration
sasl.jaas.config= \
  org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
    clientId='YOUR_CLIENT_ID' \
    clientSecret='YOUR_CLIENT_SECRET' \
    extension_logicalCluster='YOUR_LOGICAL_CLUSTER' \
    extension_identityPoolId='YOUR_IDENTITY_POOL_ID' \
    scope='YOUR_SCOPES' \
    audience='YOUR_API_IDENTIFIER';
  4. In my case, I have used the following config:
bootstrap.servers=pkc-p11xm.us-east-1.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.oauthbearer.token.endpoint.url=https://dev-jpup1hj0aphkbijm.us.auth0.com/oauth/token
sasl.login.callback.handler.class=org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler
sasl.mechanism=OAUTHBEARER
sasl.jaas.config= \
  org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
    clientId='8GJmNPXY5UV1yrSNgc1ggiKkYcHiifFM' \
    clientSecret='wSQzDaI-MRB80w2HyzKbV-JjPS4Ijd5zUu10LdisgEDR7_LRoC98ruBGgnLd_Lha' \
    extension_logicalCluster='lkc-37vn0o' \
    extension_identityPoolId='pool-AXqR' \
    scope='read:users';

Note that you should never share your client credentials, especially the client secret. I have since deleted this application, which is why I have not redacted the secret here.

  5. We can list as many permissions as we want under the scope parameter, but they only take effect if the corresponding access has been allowed in the identity pool.

  6. We can now use this client.properties file to access the Kafka cluster, subject to the permissions granted.
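Before pointing a Kafka client at the cluster, it can help to confirm that the token endpoint accepts your credentials. The Kafka login callback handler performs a client-credentials exchange much like the one sketched below (the domain and credentials are placeholders; the actual network call is left commented out since it needs live credentials):

```python
from urllib.parse import urlencode
from urllib.request import Request, urlopen  # urlopen is only needed to actually send it

# Placeholder values; substitute your Auth0 domain, client ID/secret,
# and the API identifier you chose as the audience.
token_url = "https://dev-example.us.auth0.com/oauth/token"
body = urlencode({
    "grant_type": "client_credentials",
    "client_id": "YOUR_CLIENT_ID",
    "client_secret": "YOUR_CLIENT_SECRET",
    "audience": "https://dev-example.us.auth0.com/api/v2/",
    "scope": "read:users",
}).encode()

request = Request(
    token_url,
    data=body,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
# A successful exchange returns a JSON body containing an access_token:
# with urlopen(request) as resp:
#     print(resp.read())
```

If this request returns a token but the Kafka client is still rejected, the problem usually lies in the identity pool's filter or permissions rather than in the Auth0 application.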

Conclusion

Integrating Auth0 as an identity provider for Confluent Cloud significantly enhances the security and efficiency of your Kafka clusters. By leveraging OAuth 2.0 and OpenID Connect, you can ensure that credentials are protected and access is precisely controlled. This setup not only streamlines the authentication process but also provides a robust framework for managing permissions and identities. As you follow the steps outlined in this guide, you’ll find that configuring Auth0 with Confluent Cloud is straightforward and highly beneficial for maintaining a secure, scalable, and user-friendly environment for your event streaming applications.

About The Author

Vikhyat Shetty
DevOps/SRE at Platformatory