Kafka

Produce events to Kafka topics via Confluent REST Proxy v3.

The kafka provider produces events to Kafka topics via the Confluent REST Proxy v3. Events are delivered as JSON records, with the hookstream event metadata attached as record headers.

When to use this: you're already running Confluent Cloud (or self-hosted Confluent Platform) and want hookstream events to land as Kafka records.

Cloudflare Workers can't speak the native Kafka wire protocol, which requires raw TCP sockets. Instead, hookstream produces over the Confluent REST Proxy v3 HTTP API. Your cluster must have the REST Proxy enabled (it's on by default in Confluent Cloud).

Configuration

bootstrap_servers string required

Confluent bootstrap server, e.g. pkc-xxxxx.us-east-1.aws.confluent.cloud:9092. The REST endpoint is derived from this by switching to HTTPS on port 443.
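As a sketch of that derivation (illustrative only; the function name is not part of hookstream's API), the REST host keeps the bootstrap hostname and swaps the port for HTTPS on 443:

```typescript
// Sketch: derive the REST Proxy base URL from bootstrap_servers by
// keeping the host and switching to HTTPS on port 443.
function restEndpoint(bootstrapServers: string): string {
  const host = bootstrapServers.split(":")[0];
  return `https://${host}:443`;
}

// restEndpoint("pkc-xxxxx.us-east-1.aws.confluent.cloud:9092")
//   → "https://pkc-xxxxx.us-east-1.aws.confluent.cloud:443"
```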

cluster_id string required

Confluent cluster ID, e.g. lkc-xxxxx. Found in the Confluent Cloud console under cluster settings.

topic string required

Target Kafka topic name. The topic must already exist.

api_key string required

Confluent API key with produce permissions on the topic.

api_secret string required

Confluent API secret. Stored encrypted and masked in GET responses.

The full REST URL hookstream hits is:

text
POST https://{cluster_host}/kafka/v3/clusters/{cluster_id}/topics/{topic}/records

Authentication

  1. In the Confluent Cloud console, open your cluster and create a new API key.
  2. Grant it DeveloperWrite on the target topic (or ResourceOwner if you're setting up quickly).
  3. Paste api_key and api_secret into the destination config.

hookstream uses HTTP Basic auth (api_key:api_secret) on every produce request.
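Putting the URL template and Basic auth together, a produce call could be assembled like the sketch below. The type and function names are hypothetical, not hookstream internals; only the URL shape and auth scheme come from this page.

```typescript
// Sketch: build a REST Proxy v3 produce request with HTTP Basic auth
// (api_key:api_secret). Illustrative names, not hookstream's API.
interface KafkaConfig {
  clusterHost: string; // e.g. "pkc-xxxxx.us-east-1.aws.confluent.cloud"
  clusterId: string;   // e.g. "lkc-xxxxx"
  topic: string;
  apiKey: string;
  apiSecret: string;
}

function buildProduceRequest(cfg: KafkaConfig): { url: string; headers: Record<string, string> } {
  return {
    url: `https://${cfg.clusterHost}/kafka/v3/clusters/${cfg.clusterId}/topics/${cfg.topic}/records`,
    headers: {
      "Authorization": "Basic " + btoa(`${cfg.apiKey}:${cfg.apiSecret}`),
      "Content-Type": "application/json",
    },
  };
}

// Usage:
// const { url, headers } = buildProduceRequest(cfg);
// await fetch(url, { method: "POST", headers, body: JSON.stringify(record) });
```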

Message Format

Each produce request is a single record:

json
{
  "value": {
    "type": "JSON",
    "data": { "...webhook body...": "..." }
  },
  "headers": [
    { "name": "hookstream-event-id", "value": "<base64>" },
    { "name": "hookstream-destination-id", "value": "<base64>" }
  ]
}

Headers are base64-encoded per the REST Proxy v3 contract. The full webhook body lives in value.data.
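A consumer reading these records back through the REST Proxy sees the header values still base64-encoded. A minimal sketch of decoding them (the function name is illustrative):

```typescript
// Sketch: REST Proxy v3 header values are base64 strings; decode to
// recover the original metadata. atob works in Workers and browsers;
// in Node, Buffer.from(value, "base64").toString() is equivalent.
function decodeHeader(value: string): string {
  return atob(value);
}

// Encoding side, per the v3 contract:
// btoa("evt_123") → "ZXZ0XzEyMw=="
```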

Example

bash
curl -X POST https://hookstream.io/v1/destinations \
  -H "X-API-Key: $HOOKSTREAM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Event Stream",
    "type": "kafka",
    "config": {
      "bootstrap_servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
      "cluster_id": "lkc-xxxxx",
      "topic": "webhook-events",
      "api_key": "ABCDEFGHIJKLMNOP",
      "api_secret": "ABCdef123..."
    }
  }'

Gotchas

hookstream cannot connect via kafka:// bootstrap servers directly. Confluent Cloud clusters expose REST Proxy by default; self-hosted clusters need the Confluent REST Proxy running in front of them.

hookstream does not create topics. Create the topic yourself via the Confluent UI, Terraform, or confluent kafka topic create.

If you're running your own REST Proxy, set bootstrap_servers to the proxy host:port (HTTPS on 443). The auth model still expects api_key/api_secret in Basic auth headers.

Next Steps

RabbitMQ

Publish via RabbitMQ's HTTP Management API.

GCP Pub/Sub

Managed pub/sub without the Kafka overhead.

AWS SQS

Simple queue-based delivery.

Destinations API

Full API reference.
