
Connect from Cloudflare Workers

Preview
This feature is in preview and is subject to change.

Cloudflare Workers do not support native DuckDB bindings, but they can connect to MotherDuck through the Postgres endpoint using the pg npm package. This gives you a thin-client path to query MotherDuck from edge functions without any DuckDB dependencies.

This guide walks through building a Worker that queries NYC taxi data from MotherDuck's built-in sample_data database. The full source code is available in the motherduck-examples repository.

Prerequisites

You'll need Node.js and npm installed, a Cloudflare account, and a MotherDuck account with an access token.

Project setup

Create a new directory and install dependencies:

mkdir motherduck-worker && cd motherduck-worker
npm init -y
npm install pg@^8.16.3
npm install --save-dev wrangler @types/pg

Configure wrangler.toml

name = "motherduck-taxi-stats"
main = "src/index.ts"
compatibility_date = "2026-04-02"
compatibility_flags = ["nodejs_compat"]

[vars]
MOTHERDUCK_HOST = "pg.us-east-1-aws.motherduck.com"
MOTHERDUCK_DB = "sample_data"

The nodejs_compat flag is required — it enables the node:net module that the pg package uses for TCP connections. Use a compatibility_date on or after 2024-09-23; in practice, set it to today's date when you create the project.

Generate the Worker binding types after you save wrangler.toml:

npx wrangler types

Store your token as a secret

npx wrangler secret put MOTHERDUCK_TOKEN

This prompts you to paste your MotherDuck token. It's stored encrypted and injected as an environment variable at runtime — it never appears in your source code or wrangler.toml.

For local development, create a .dev.vars file (add this to .gitignore):

MOTHERDUCK_TOKEN="your_token_here"
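
Since the token exists only as a runtime secret, it can help to fail loudly if it was never set. A minimal sketch — `assertToken` is a hypothetical helper, not part of this guide's source:

```typescript
// Hypothetical guard: fail fast with a clear message if the secret was never
// set via `wrangler secret put` (or .dev.vars in local development).
function assertToken(env: { MOTHERDUCK_TOKEN?: string }): string {
  if (!env.MOTHERDUCK_TOKEN) {
    throw new Error("MOTHERDUCK_TOKEN is not configured");
  }
  return env.MOTHERDUCK_TOKEN;
}
```

Calling this at the top of the fetch handler turns a confusing downstream connection failure into an explicit configuration error.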

Write the Worker

Create src/index.ts. We'll build this in two parts: first the connection and routing, then the route handlers.

Connect and route requests

import { Client } from "pg";

interface Env {
  MOTHERDUCK_HOST: string;
  MOTHERDUCK_DB: string;
  MOTHERDUCK_TOKEN: string;
}

function createClient(env: Env): Client {
  return new Client({
    connectionString:
      `postgresql://user:${env.MOTHERDUCK_TOKEN}@${env.MOTHERDUCK_HOST}:5432/${env.MOTHERDUCK_DB}?sslmode=require`,
  });
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    if (url.pathname === "/stats") {
      return handleStats(env, url);
    }

    return handleDefault(env);
  },
};

The connection string is assembled from the environment variables defined in wrangler.toml and the secret token. The ?sslmode=require parameter tells pg to open a TLS connection, and the Workers runtime performs certificate verification.
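
If a token ever contains URL-reserved characters, embedding it raw would corrupt the connection string. A defensive sketch — `buildConnectionString` is a hypothetical variant of the `createClient` logic above, not part of this guide's source:

```typescript
// Hypothetical variant of the connection-string assembly that URL-encodes the
// token before embedding it, so reserved characters cannot break URL parsing.
function buildConnectionString(host: string, db: string, token: string): string {
  const encoded = encodeURIComponent(token);
  return `postgresql://user:${encoded}@${host}:5432/${db}?sslmode=require`;
}
```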

The fetch handler routes requests before touching the database; connections are opened only inside the route handlers. This way, invalid requests to /stats fail fast with a 400 rather than depending on database connectivity.

Handle route logic

Add the two handler functions to the same file. The /stats route accepts date range parameters and returns aggregated fare data. It validates inputs before querying and uses parameterized queries ($1, $2) to prevent SQL injection — never interpolate user input directly into SQL strings.

async function handleStats(env: Env, url: URL): Promise<Response> {
  const startDate = url.searchParams.get("start");
  const endDate = url.searchParams.get("end");

  if (!startDate || !endDate) {
    return Response.json(
      { error: "Both 'start' and 'end' query parameters are required. Use YYYY-MM-DD format." },
      { status: 400 }
    );
  }

  const datePattern = /^\d{4}-\d{2}-\d{2}$/;
  if (!datePattern.test(startDate) || !datePattern.test(endDate)) {
    return Response.json(
      { error: "Invalid date format. Use YYYY-MM-DD." },
      { status: 400 }
    );
  }

  const client = createClient(env);

  try {
    await client.connect();

    const result = await client.query(
      `SELECT
        sum(passenger_count)::INTEGER AS total_passengers,
        round(sum(fare_amount), 2) AS total_fare
      FROM nyc.taxi
      WHERE tpep_pickup_datetime >= $1
        AND tpep_pickup_datetime < $2`,
      [`${startDate} 00:00:00`, `${endDate} 00:00:00`]
    );

    return Response.json({
      start: startDate,
      end: endDate,
      ...result.rows[0],
    });
  } finally {
    await client.end();
  }
}

The default route returns a sample of recent taxi trips — no user input needed:

async function handleDefault(env: Env): Promise<Response> {
  const client = createClient(env);

  try {
    await client.connect();

    const result = await client.query(
      `SELECT
        tpep_pickup_datetime AS pickup,
        tpep_dropoff_datetime AS dropoff,
        passenger_count,
        trip_distance,
        fare_amount,
        tip_amount,
        total_amount
      FROM nyc.taxi
      ORDER BY tpep_pickup_datetime DESC
      LIMIT 20`
    );

    return Response.json(result.rows);
  } finally {
    await client.end();
  }
}
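
Both handlers let database errors propagate as uncaught exceptions, which Workers surface as generic 500s. If you'd rather return structured JSON errors, one approach is a wrapper — `withErrorHandling` is a hypothetical helper, not part of this guide's source:

```typescript
// Hypothetical wrapper: catch handler failures and return a JSON 500 instead
// of an uncaught exception. Be careful not to echo connection details or
// tokens back to the caller in error messages.
async function withErrorHandling(
  handler: () => Promise<Response>
): Promise<Response> {
  try {
    return await handler();
  } catch (err) {
    const message = err instanceof Error ? err.message : "Unknown error";
    return Response.json({ error: message }, { status: 500 });
  }
}
```

In the fetch handler, you'd then write `return withErrorHandling(() => handleStats(env, url));`.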

Test locally

npx wrangler dev

Then open http://localhost:8787/ or try the stats endpoint with a date range:

http://localhost:8787/stats?start=2022-11-01&end=2022-12-01
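
Based on the SQL aliases in handleStats, a successful /stats response has roughly this shape — a sketch, with field types inferred from the casts in the query (depending on pg's type parsing, some numeric columns may arrive as strings):

```typescript
// Inferred shape of a successful /stats response: the two date parameters are
// echoed back alongside the aggregated columns from the query.
interface StatsResponse {
  start: string;            // YYYY-MM-DD, echoed from the request
  end: string;              // YYYY-MM-DD, echoed from the request
  total_passengers: number; // sum(passenger_count)::INTEGER
  total_fare: number;       // round(sum(fare_amount), 2)
}
```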

If wrangler dev starts successfully but direct Postgres queries fail locally with Connection terminated, switch to the Hyperdrive setup below and use a localConnectionString for local testing, or run npx wrangler dev --remote to exercise the Cloudflare runtime directly.

Deploy

npx wrangler deploy

Using Hyperdrive for connection pooling

For production workloads, Cloudflare Hyperdrive provides built-in connection pooling. This reduces latency by reusing connections across Worker invocations instead of opening a new connection per request.

1. Create a Hyperdrive configuration

npx wrangler hyperdrive create motherduck-db \
  --connection-string="postgresql://user:$MOTHERDUCK_TOKEN@pg.us-east-1-aws.motherduck.com:5432/sample_data?sslmode=require"

2. Update wrangler.toml

name = "motherduck-taxi-stats"
main = "src/index.ts"
compatibility_date = "2026-04-02"
compatibility_flags = ["nodejs_compat"]

[[hyperdrive]]
binding = "MD_HYPERDRIVE"
id = "<your-hyperdrive-config-id>"

3. Update the connection code

Replace the connection string construction with:

const client = new Client({
  connectionString: env.MD_HYPERDRIVE.connectionString,
});

Hyperdrive handles connection pooling and credential injection automatically.
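
The Env interface also needs the new binding declared. Running npx wrangler types generates the real Hyperdrive binding type; the minimal structural stand-in below is a sketch for illustration only:

```typescript
// Sketch of the updated Env. The generated Hyperdrive type has more fields;
// connectionString is the only one this guide uses.
interface Env {
  MD_HYPERDRIVE: { connectionString: string };
  MOTHERDUCK_HOST: string;
  MOTHERDUCK_DB: string;
  MOTHERDUCK_TOKEN: string;
}
```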

For local development with Hyperdrive, configure a direct connection string for wrangler dev:

export CLOUDFLARE_HYPERDRIVE_LOCAL_CONNECTION_STRING_MD_HYPERDRIVE="postgresql://user:$MOTHERDUCK_TOKEN@pg.us-east-1-aws.motherduck.com:5432/sample_data?sslmode=require"
npx wrangler dev

SSL notes

Cloudflare Workers use pg-cloudflare for socket connections, which delegates TLS to the Workers runtime through cloudflare:sockets. The runtime encrypts the connection and verifies the server certificate against Cloudflare's trust store, but those verification settings are not exposed through the pg client. In this environment, application code uses the runtime-managed TLS configuration rather than supplying rejectUnauthorized, custom CA certificates, or sslmode=verify-full.

Use ?sslmode=require in the connection string. This tells pg to negotiate TLS during connection startup (Postgres's STARTTLS-style SSLRequest handshake), and the Workers runtime handles the actual certificate verification at the socket level.

For standard Node.js environments where you can configure certificate verification directly, see Connect from Node.js.