From b5a94be006e7d35466bea1eedbf8d8556faa6e8b Mon Sep 17 00:00:00 2001
From: "claude[bot]" <41898282+claude[bot]@users.noreply.github.com>
Date: Wed, 6 May 2026 13:10:41 +0000
Subject: [PATCH 1/2] docs(kafka-triggers): document SASL_SSL_OAUTHBEARER
 security mode

Adds the new OAuth 2.0 / OIDC client_credentials authentication option to
the Kafka trigger security table, with a dedicated section covering
Confluent Cloud, Aiven, Azure Event Hubs and AWS MSK with external IdPs.

Source PR: windmill-labs/windmill#9054

Co-Authored-By: Claude Opus 4.7
---
 .../core_concepts/41_kafka_triggers/index.mdx | 47 +++++++++++++++++++
 1 file changed, 47 insertions(+)

diff --git a/docs/core_concepts/41_kafka_triggers/index.mdx b/docs/core_concepts/41_kafka_triggers/index.mdx
index 9746ec43d..457553601 100644
--- a/docs/core_concepts/41_kafka_triggers/index.mdx
+++ b/docs/core_concepts/41_kafka_triggers/index.mdx
@@ -31,6 +31,7 @@ The resource requires:
 | SASL_SSL | Username/password authentication with TLS encryption. |
 | SASL_GSSAPI | Kerberos (GSSAPI) authentication without encryption. |
 | SASL_SSL_GSSAPI | Kerberos (GSSAPI) authentication with TLS encryption. |
+| SASL_SSL_OAUTHBEARER | OAuth 2.0 / OIDC `client_credentials` authentication with TLS encryption. |
 
 ### Kerberos (GSSAPI) authentication
 
@@ -145,6 +146,52 @@ In containerized environments (Docker, Kubernetes) where reverse DNS may return
 This tells GSSAPI to use the hostname as configured in your broker list without reverse DNS canonicalization.
 :::
 
+### OAUTHBEARER (OIDC) authentication
+
+For brokers that delegate authentication to an external identity provider (IdP), select `SASL_SSL_OAUTHBEARER`. Windmill performs the OAuth 2.0 `client_credentials` grant against the IdP's token endpoint and forwards the resulting bearer token to the broker on every connection. This is the recommended mechanism for managed Kafka services that expose OIDC, including:
+
+- Confluent Cloud OAuth (with an external IdP such as Okta, Auth0, Azure AD, or Keycloak)
+- Aiven for Apache Kafka with OIDC
+- Azure Event Hubs (Kafka surface) with OAuth 2.0
+- AWS MSK configured to delegate authentication to an external IdP
+
+| Property | Description | Required |
+|----------|-------------|----------|
+| oauthbearer_client_id | OAuth client ID registered with the IdP | Yes |
+| oauthbearer_client_secret | OAuth client secret registered with the IdP | Yes |
+| oauthbearer_token_endpoint_url | Full URL of the IdP token endpoint (e.g., `https://login.example.com/realms/kafka/protocol/openid-connect/token`) | Yes |
+| oauthbearer_scope | Space-separated list of OAuth scopes to request from the IdP | No |
+| oauthbearer_extensions | Comma-separated `key=value` pairs sent as SASL/OAUTHBEARER extensions (e.g., `logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx` for Confluent Cloud) | No |
+
+:::info OIDC token flow
+Token acquisition happens on the **server** (`windmill-app` pod), not on workers. Tokens are cached and refreshed automatically by `librdkafka` before they expire, so the trigger maintains a long-lived consumer connection without re-issuing credentials on every message.
+:::
+
+#### Confluent Cloud example
+
+When using Confluent Cloud OAuth with an external IdP, set the broker URL to the cluster bootstrap server, point the token endpoint at your IdP, and pass the cluster ID and identity pool through `oauthbearer_extensions`:
+
+```
+oauthbearer_token_endpoint_url = https://idp.example.com/oauth/token
+oauthbearer_client_id = my-confluent-client
+oauthbearer_client_secret = <client-secret>
+oauthbearer_scope = api://kafka/.default
+oauthbearer_extensions = logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx
+```
+
+#### Azure Event Hubs example
+
+```
+oauthbearer_token_endpoint_url = https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token
+oauthbearer_client_id = <client-id>
+oauthbearer_client_secret = <client-secret>
+oauthbearer_scope = https://<namespace>.servicebus.windows.net/.default
+```
+
+:::warning TLS is required
+OAUTHBEARER must be paired with TLS (`SASL_SSL_OAUTHBEARER`). Sending bearer tokens over a plaintext connection would expose them on the wire, so the unencrypted variant is intentionally not offered.
+:::
+
 ## How to use
 
 Create a new trigger on the Kafka triggers page.

From 47344fe023c8b8f84922a894627d62ad9cb228a5 Mon Sep 17 00:00:00 2001
From: hugocasa
Date: Fri, 8 May 2026 17:08:43 +0200
Subject: [PATCH 2/2] docs(kafka-triggers): fix OAUTHBEARER field names, add
 TLS fields, trim unverified examples

Co-Authored-By: Claude Opus 4.7 (1M context)
---
 .../core_concepts/41_kafka_triggers/index.mdx | 43 +++++++------------
 1 file changed, 15 insertions(+), 28 deletions(-)

diff --git a/docs/core_concepts/41_kafka_triggers/index.mdx b/docs/core_concepts/41_kafka_triggers/index.mdx
index 457553601..faac4bfe9 100644
--- a/docs/core_concepts/41_kafka_triggers/index.mdx
+++ b/docs/core_concepts/41_kafka_triggers/index.mdx
@@ -148,44 +148,31 @@ This tells GSSAPI to use the hostname as configured in your broker list without
 
 ### OAUTHBEARER (OIDC) authentication
 
-For brokers that delegate authentication to an external identity provider (IdP), select `SASL_SSL_OAUTHBEARER`. Windmill performs the OAuth 2.0 `client_credentials` grant against the IdP's token endpoint and forwards the resulting bearer token to the broker on every connection. This is the recommended mechanism for managed Kafka services that expose OIDC, including:
-
-- Confluent Cloud OAuth (with an external IdP such as Okta, Auth0, Azure AD, or Keycloak)
-- Aiven for Apache Kafka with OIDC
-- Azure Event Hubs (Kafka surface) with OAuth 2.0
-- AWS MSK configured to delegate authentication to an external IdP
+For brokers that delegate authentication to an external identity provider (IdP), select `SASL_SSL_OAUTHBEARER`. Windmill performs the OAuth 2.0 `client_credentials` grant against the IdP's token endpoint and forwards the resulting bearer token to the broker.
 
 | Property | Description | Required |
 |----------|-------------|----------|
-| oauthbearer_client_id | OAuth client ID registered with the IdP | Yes |
-| oauthbearer_client_secret | OAuth client secret registered with the IdP | Yes |
-| oauthbearer_token_endpoint_url | Full URL of the IdP token endpoint (e.g., `https://login.example.com/realms/kafka/protocol/openid-connect/token`) | Yes |
-| oauthbearer_scope | Space-separated list of OAuth scopes to request from the IdP | No |
-| oauthbearer_extensions | Comma-separated `key=value` pairs sent as SASL/OAUTHBEARER extensions (e.g., `logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx` for Confluent Cloud) | No |
+| client_id | OAuth client ID registered with the IdP | Yes |
+| client_secret | OAuth client secret registered with the IdP | Yes |
+| token_endpoint_url | Full URL of the IdP token endpoint (e.g., `https://login.example.com/realms/kafka/protocol/openid-connect/token`) | Yes |
+| scope | Space-separated list of OAuth scopes to request from the IdP | No |
+| extensions | Comma-separated `key=value` pairs sent as SASL/OAUTHBEARER extensions | No |
+| ca | PEM-encoded CA certificate for verifying the broker | No |
+| certificate | PEM-encoded client certificate for mutual TLS | No |
+| key | PEM-encoded client private key | No |
+| key_password | Password for the client private key, if encrypted | No |
 
 :::info OIDC token flow
 Token acquisition happens on the **server** (`windmill-app` pod), not on workers. Tokens are cached and refreshed automatically by `librdkafka` before they expire, so the trigger maintains a long-lived consumer connection without re-issuing credentials on every message.
 :::
 
-#### Confluent Cloud example
-
-When using Confluent Cloud OAuth with an external IdP, set the broker URL to the cluster bootstrap server, point the token endpoint at your IdP, and pass the cluster ID and identity pool through `oauthbearer_extensions`:
-
-```
-oauthbearer_token_endpoint_url = https://idp.example.com/oauth/token
-oauthbearer_client_id = my-confluent-client
-oauthbearer_client_secret = <client-secret>
-oauthbearer_scope = api://kafka/.default
-oauthbearer_extensions = logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx
-```
-
-#### Azure Event Hubs example
+#### Example
 
 ```
-oauthbearer_token_endpoint_url = https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token
-oauthbearer_client_id = <client-id>
-oauthbearer_client_secret = <client-secret>
-oauthbearer_scope = https://<namespace>.servicebus.windows.net/.default
+client_id = my-kafka-client
+client_secret = <client-secret>
+token_endpoint_url = https://login.example.com/realms/kafka/protocol/openid-connect/token
+scope = kafka
 ```
 
 :::warning TLS is required
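
---

Note for reviewers: the `client_credentials` exchange that the docs describe reduces to a single form-encoded POST against `token_endpoint_url`. A minimal sketch of that request, for reference only — the endpoint, client ID, and secret below are hypothetical placeholders, and this is not Windmill's actual implementation (which delegates the flow to `librdkafka` on the server):

```python
import urllib.parse

def build_token_request(token_endpoint_url, client_id, client_secret, scope=None):
    """Build the OAuth 2.0 client_credentials token request.

    Returns (url, body). POSTing `body` to `url` with
    Content-Type: application/x-www-form-urlencoded yields a JSON response
    whose `access_token` field is the bearer token presented to the broker
    over SASL/OAUTHBEARER.
    """
    fields = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }
    if scope:
        # `scope` is optional in the resource; only sent when configured.
        fields["scope"] = scope
    return token_endpoint_url, urllib.parse.urlencode(fields)

# Hypothetical values mirroring the example resource above.
url, body = build_token_request(
    "https://login.example.com/realms/kafka/protocol/openid-connect/token",
    "my-kafka-client",
    "s3cret",
    scope="kafka",
)
print(body)
# grant_type=client_credentials&client_id=my-kafka-client&client_secret=s3cret&scope=kafka
```

This is also a handy way to smoke-test IdP credentials with `curl` before wiring them into the trigger resource, since a misconfigured client or scope fails at the token endpoint long before the broker is involved.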