docs/core_concepts/41_kafka_triggers/index.mdx

The resource requires:

| Security protocol | Description |
|-------------------|-------------|
| SASL_SSL | Username/password authentication with TLS encryption. |
| SASL_GSSAPI | Kerberos (GSSAPI) authentication without encryption. |
| SASL_SSL_GSSAPI | Kerberos (GSSAPI) authentication with TLS encryption. |
| SASL_SSL_OAUTHBEARER | OAuth 2.0 / OIDC `client_credentials` authentication with TLS encryption. |

### Kerberos (GSSAPI) authentication

In containerized environments (Docker, Kubernetes) where reverse DNS may return an unexpected hostname, disable reverse DNS hostname canonicalization.
This tells GSSAPI to use the hostname as configured in your broker list without reverse DNS canonicalization.
:::

### OAUTHBEARER (OIDC) authentication

For brokers that delegate authentication to an external identity provider (IdP), select `SASL_SSL_OAUTHBEARER`. Windmill performs the OAuth 2.0 `client_credentials` grant against the IdP's token endpoint and forwards the resulting bearer token to the broker on every connection. This is the recommended mechanism for managed Kafka services that expose OIDC, including:

- Confluent Cloud OAuth (with an external IdP such as Okta, Auth0, Azure AD, or Keycloak)
- Aiven for Apache Kafka with OIDC
- Azure Event Hubs (Kafka endpoint) with OAuth 2.0
- AWS MSK configured to delegate authentication to an external IdP
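The `client_credentials` grant Windmill performs is a single form-encoded POST to the IdP's token endpoint. A minimal sketch of the request shape (the endpoint, client ID, and scope below are placeholder values, not part of Windmill's API):

```python
from urllib.parse import urlencode

def build_token_request(token_endpoint_url, client_id, client_secret, scope=None):
    """Build the OAuth 2.0 client_credentials request issued against
    the IdP token endpoint (illustrative sketch only)."""
    form = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }
    if scope:
        form["scope"] = scope
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    return token_endpoint_url, headers, urlencode(form)

url, headers, body = build_token_request(
    "https://idp.example.com/oauth/token",  # placeholder IdP endpoint
    "my-client",
    "my-secret",
    scope="api://kafka/.default",
)
# The IdP responds with JSON containing "access_token" and "expires_in";
# that access token is what gets forwarded to the broker as the bearer token.
```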

| Property | Description | Required |
|----------|-------------|----------|
| `oauthbearer_client_id` | OAuth client ID registered with the IdP | Yes |
| `oauthbearer_client_secret` | OAuth client secret registered with the IdP | Yes |
| `oauthbearer_token_endpoint_url` | Full URL of the IdP token endpoint (e.g., `https://login.example.com/realms/kafka/protocol/openid-connect/token`) | Yes |
| `oauthbearer_scope` | Space-separated list of OAuth scopes to request from the IdP | No |
| `oauthbearer_extensions` | Comma-separated `key=value` pairs sent as SASL/OAUTHBEARER extensions (e.g., `logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx` for Confluent Cloud) | No |

:::info OIDC token flow
Token acquisition happens on the **server** (`windmill-app` pod), not on workers. Tokens are cached and refreshed automatically by `librdkafka` before they expire, so the trigger maintains a long-lived consumer connection without re-issuing credentials on every message.
:::
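These resource fields map closely onto librdkafka's OIDC configuration properties. A rough sketch of the assumed equivalent client configuration, written as the dict you would pass to a `confluent_kafka.Consumer` (Windmill builds this internally on the server; the bootstrap server and credentials are placeholders):

```python
# Assumed librdkafka-level equivalent of the Windmill resource fields.
# "sasl.oauthbearer.method": "oidc" tells librdkafka to fetch and refresh
# tokens itself via the client_credentials grant.
conf = {
    "bootstrap.servers": "broker.example.com:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "OAUTHBEARER",
    "sasl.oauthbearer.method": "oidc",
    "sasl.oauthbearer.client.id": "my-client",
    "sasl.oauthbearer.client.secret": "<secret>",
    "sasl.oauthbearer.token.endpoint.url": "https://idp.example.com/oauth/token",
    "sasl.oauthbearer.scope": "api://kafka/.default",
    "sasl.oauthbearer.extensions": "logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx",
}
```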

#### Confluent Cloud example

When using Confluent Cloud OAuth with an external IdP, set the broker URL to the cluster bootstrap server, point the token endpoint at your IdP, and pass the cluster ID and identity pool through `oauthbearer_extensions`:

```
oauthbearer_token_endpoint_url = https://idp.example.com/oauth/token
oauthbearer_client_id = my-confluent-client
oauthbearer_client_secret = <secret>
oauthbearer_scope = api://kafka/.default
oauthbearer_extensions = logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx
```
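The `oauthbearer_extensions` value is a flat comma-separated list of `key=value` pairs; each pair is forwarded as one SASL/OAUTHBEARER extension. A minimal sketch of a parser for that format (the helper name is illustrative, not Windmill's implementation):

```python
def parse_extensions(raw):
    """Split a 'k1=v1,k2=v2' extensions string into a dict.
    Values must not contain commas; keys and values are trimmed."""
    pairs = {}
    for item in raw.split(","):
        if not item.strip():
            continue  # tolerate trailing commas
        key, sep, value = item.partition("=")
        if not sep:
            raise ValueError(f"extension {item!r} is not of the form key=value")
        pairs[key.strip()] = value.strip()
    return pairs

exts = parse_extensions("logicalCluster=lkc-xxxxx,identityPoolId=pool-xxxxx")
# → {'logicalCluster': 'lkc-xxxxx', 'identityPoolId': 'pool-xxxxx'}
```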

#### Azure Event Hubs example

```
oauthbearer_token_endpoint_url = https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token
oauthbearer_client_id = <azure-app-client-id>
oauthbearer_client_secret = <azure-app-secret>
oauthbearer_scope = https://<namespace>.servicebus.windows.net/.default
```
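To sanity-check that the IdP issued a token for the right resource, you can decode the JWT payload (without verifying the signature) and inspect the `aud` claim, which for Event Hubs should be the namespace URI. A standard-library-only sketch; the token below is fabricated for illustration:

```python
import base64
import json

def jwt_payload(token):
    """Decode a JWT's payload WITHOUT verifying its signature.
    For debugging only, never for authentication decisions."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Fabricated header.payload.signature token for demonstration:
header = base64.urlsafe_b64encode(b'{"alg":"RS256"}').rstrip(b"=").decode()
claims = {"aud": "https://mynamespace.servicebus.windows.net"}
payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).rstrip(b"=").decode()
token = f"{header}.{payload}.sig"

audience = jwt_payload(token)["aud"]
# audience should match the Event Hubs namespace you put in oauthbearer_scope
```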

:::warning TLS is required
OAUTHBEARER must be paired with TLS (`SASL_SSL_OAUTHBEARER`). Sending bearer tokens over a plaintext connection would expose them on the wire, so the unencrypted variant is intentionally not offered.
:::

## How to use

Create a new trigger on the Kafka triggers page.