Kafka
Produce events to Kafka topics via Confluent REST Proxy v3.
The kafka provider produces events to Kafka topics via the Confluent REST Proxy v3. Each event becomes a JSON record, with the hookstream event metadata attached as record headers.
When to use this: you're already running Confluent Cloud (or self-hosted Confluent Platform) and want hookstream events to land as Kafka records.
Cloudflare Workers can't speak the native Kafka wire protocol, which requires raw TCP sockets. hookstream therefore produces over the Confluent REST Proxy v3 HTTP API instead. Your cluster must have the REST Proxy enabled (it's on by default in Confluent Cloud).
Configuration
- `bootstrap_servers`: Confluent bootstrap server, e.g. `pkc-xxxxx.us-east-1.aws.confluent.cloud:9092`. The REST endpoint is derived from this by switching to HTTPS on port 443.
- `cluster_id`: Confluent cluster ID, e.g. `lkc-xxxxx`. Found in the Confluent Cloud console under cluster settings.
- `topic`: Target Kafka topic name. The topic must already exist.
- `api_key`: Confluent API key with produce permissions on the topic.
- `api_secret`: Confluent API secret. Stored encrypted and masked in GET responses.
The full REST URL hookstream hits is:
```text
POST https://{cluster_host}/kafka/v3/clusters/{cluster_id}/topics/{topic}/records
```
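As an illustration, the REST URL can be derived from the bootstrap server with plain shell string handling. The host, cluster ID, and topic below are placeholders, not real values:

```shell
# Placeholder values -- substitute your own cluster details.
BOOTSTRAP="pkc-xxxxx.us-east-1.aws.confluent.cloud:9092"
CLUSTER_ID="lkc-xxxxx"
TOPIC="webhook-events"

# Drop the Kafka port; the REST Proxy is reached over HTTPS on 443.
CLUSTER_HOST="${BOOTSTRAP%:*}"
URL="https://$CLUSTER_HOST/kafka/v3/clusters/$CLUSTER_ID/topics/$TOPIC/records"
echo "$URL"
```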
Authentication
- In the Confluent Cloud console, open your cluster and create a new API key.
- Grant it `DeveloperWrite` on the target topic (or `ResourceOwner` if you're setting up quickly).
- Paste `api_key` and `api_secret` into the destination config.
hookstream uses HTTP Basic auth (api_key:api_secret) on every produce request.
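The Authorization header value is just the base64 encoding of `api_key:api_secret`. A quick shell sketch, using obviously fake credentials:

```shell
# Fake placeholder credentials, for illustration only.
API_KEY="key"
API_SECRET="secret"

# HTTP Basic auth: base64-encode "api_key:api_secret".
AUTH=$(printf '%s' "$API_KEY:$API_SECRET" | base64)
echo "Authorization: Basic $AUTH"   # Authorization: Basic a2V5OnNlY3JldA==
```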
Message Format
Each produce request is a single record:
```json
{
  "value": {
    "type": "JSON",
    "data": { "...webhook body...": "..." }
  },
  "headers": [
    { "name": "hookstream-event-id", "value": "<base64>" },
    { "name": "hookstream-destination-id", "value": "<base64>" }
  ]
}
```
Headers are base64-encoded per the REST Proxy v3 contract. The full webhook body lives in `value.data`.
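A consumer reading these records must base64-decode the header values to recover the original strings. A minimal sketch, using a hypothetical event id:

```shell
# "ZXZ0XzEyMw==" is base64 for the hypothetical event id "evt_123".
DECODED=$(printf '%s' 'ZXZ0XzEyMw==' | base64 -d)
echo "$DECODED"   # evt_123
```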
Example
```bash
curl -X POST https://hookstream.io/v1/destinations \
  -H "X-API-Key: $HOOKSTREAM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Event Stream",
    "type": "kafka",
    "config": {
      "bootstrap_servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
      "cluster_id": "lkc-xxxxx",
      "topic": "webhook-events",
      "api_key": "ABCDEFGHIJKLMNOP",
      "api_secret": "ABCdef123..."
    }
  }'
```
Gotchas
hookstream cannot connect via kafka:// bootstrap servers directly. Confluent Cloud clusters expose REST Proxy by default; self-hosted clusters need the Confluent REST Proxy running in front of them.
hookstream does not create topics. Create the topic via the Confluent UI, Terraform, or `confluent kafka topic create`.
If you're running your own REST Proxy, set bootstrap_servers to the proxy host:port (HTTPS on 443). The auth model still expects api_key/api_secret in Basic auth headers.
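For example, a self-hosted setup's config block might look like the following. The proxy hostname and credentials here are made up for illustration:

```json
{
  "bootstrap_servers": "rest-proxy.internal.example.com:443",
  "cluster_id": "my-cluster-id",
  "topic": "webhook-events",
  "api_key": "PROXY_USER",
  "api_secret": "PROXY_PASSWORD"
}
```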