Talk to your Cloudflare Workers from Claude Desktop!
This is a proof-of-concept of writing a Model Context Protocol (MCP) Server in a Cloudflare Worker. It lets you extend Claude Desktop (among other MCP clients) by invoking functions using Cloudflare Workers' new RPC syntax, with access to any Cloudflare or third-party binding.
You write worker code that looks like this:
```ts
export class ExampleWorkerMCP extends WorkerEntrypoint<Env> {
  /**
   * Generates a random number. This is extra random because it had to travel all the way to
   * your nearest Cloudflare PoP to be calculated which... something something lava lamps?
   *
   * @return {string} A message containing a super duper random number
   * */
  async getRandomNumber() {
    return `Your random number is ${Math.random()}`
  }
}
```
And, using the provided MCP proxy, your Claude Desktop can see & invoke these methods:

Yes, I know that `Math.random()` works the same on a Worker as it does on your local machine, but don't tell Claude 🤫
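For context on the RPC piece: methods on a `WorkerEntrypoint` can be invoked directly from another Worker over a service binding. Here is a minimal sketch, assuming a binding named `MY_MCP` that points at the `ExampleWorkerMCP` entrypoint (the binding name and import path are illustrative, not part of this repo):

```ts
// Hypothetical caller Worker. `MY_MCP` is an assumed service binding configured
// to target the ExampleWorkerMCP entrypoint; `Service` comes from @cloudflare/workers-types.
import type { ExampleWorkerMCP } from './index' // assumed path to the entrypoint's module

interface CallerEnv {
  MY_MCP: Service<ExampleWorkerMCP>
}

export default {
  async fetch(request: Request, env: CallerEnv): Promise<Response> {
    // Workers RPC lets us call the entrypoint's method as if it were local
    const message = await env.MY_MCP.getRandomNumber()
    return new Response(message)
  },
}
```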
To get started:

1. `pnpm install`
2. Check the values in `wrangler.json` or your deploy will fail.
3. `pnpm deploy:worker`. This builds your `src/index.ts` file, generates `dist/docs.json` from it, then deploys it using Wrangler.
4. `npx workers-mcp secret generate && npx workers-mcp secret upload`. This generates a shared secret in `.dev.vars` and uploads it using `wrangler secret put`. You only need to do this once.
5. `npx workers-mcp install <server-alias> <worker-url>`
To iterate on your server, do the following:

1. Make your changes to `src/index.ts`.
2. Run `pnpm deploy:worker`.
3. If you've changed your methods, their parameters, or their documentation, restart Claude Desktop; if you've only changed a method's implementation, `pnpm deploy:worker` is enough.

These are replaced with the relevant sections in the Workers MCP package.
Separate from your MCP code inside `src/index.ts`, there are three pieces required to make this work:
`scripts/generate-docs.ts`
The MCP specification separates the `tools/list` and `tools/call` operations, and most MCP servers have naturally followed suit, separating their schema definition from the implementation. However, combining them provides a much better DX for the author.
I'm using ts-blank-space and jsdoc-api to parse the TS and emit the schema, slightly tweaked. This gives you LLM-friendly documentation at build time:
```ts
/**
 * Send a text or HTML email to an arbitrary recipient.
 *
 * @param {string} recipient - The email address of the recipient.
 * @param {string} subject - The subject of the email.
 * @param {string} contentType - The content type of the email. Can be text/plain or text/html
 * @param {string} body - The body of the email. Must match the provided contentType parameter
 * @return {Promise<string>} A success message.
 * @throws {Error} If the email fails to send, or if that destination email address hasn't been verified.
 */
async sendEmail(recipient: string, subject: string, contentType: string, body: string) {
  // ...
}
```
The corresponding entry in `docs.json` looks like this:

```json
{
  "ExampleWorkerMCP": {
    "exported_as": "ExampleWorkerMCP",
    "description": null,
    "methods": [
      {
        "name": "sendEmail",
        "description": "Send a text or HTML email to an arbitrary recipient.",
        "params": [
          {
            "description": "The email address of the recipient.",
            "name": "recipient",
            "type": "string"
          },
          {
            "description": "The subject of the email.",
            "name": "subject",
            "type": "string"
          },
          {
            "description": "The content type of the email. Can be text/plain or text/html",
            "name": "contentType",
            "type": "string"
          },
          {
            "description": "The body of the email. Must match the provided contentType parameter",
            "name": "body",
            "type": "string"
          }
        ],
        "returns": {
          "description": "A success message.",
          "type": "Promise.<string>"
        }
      }
    ]
  }
}
```
This list of methods is very similar to the required MCP format for `tools/list`, but also gives us a list of the `WorkerEntrypoint` export names to look up our service bindings later.
To iterate on your docs, run `pnpm generate:docs:watch` and you'll see the output change as you tweak the JSDoc in your `src/index.ts` (you'll need watchexec installed).
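For a feel of how that works, here's a condensed sketch of the approach (not the actual `scripts/generate-docs.ts`; it assumes `ts-blank-space`'s default export and `jsdoc-api`'s `explainSync({ source })` behave as documented, and it emits a simplified shape rather than the full per-class structure shown above):

```ts
// Condensed sketch of the docs-generation idea. The real script emits the
// richer per-class structure (exported_as, description, methods) shown above.
import fs from 'node:fs'
import tsBlankSpace from 'ts-blank-space'
import jsdoc from 'jsdoc-api'

const source = fs.readFileSync('src/index.ts', 'utf8')

// Blank out the TypeScript-only syntax (comments and positions are preserved),
// then let JSDoc parse the resulting JavaScript into doclets.
const doclets = jsdoc.explainSync({ source: tsBlankSpace(source) })

// Keep documented methods and reshape them into a docs.json-style list.
const methods = doclets
  .filter((d: any) => d.kind === 'function' && !d.undocumented)
  .map((d: any) => ({
    name: d.name,
    description: d.description,
    params: (d.params ?? []).map((p: any) => ({
      description: p.description,
      name: p.name,
      type: p.type?.names?.[0],
    })),
    returns: d.returns?.[0] && {
      description: d.returns[0].description,
      type: d.returns[0].type?.names?.[0],
    },
  }))

fs.mkdirSync('dist', { recursive: true })
fs.writeFileSync('dist/docs.json', JSON.stringify(methods, null, 2))
```

The real generator also records the `exported_as` name of each `WorkerEntrypoint`, which is what the proxy later uses to pick the right entrypoint.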
`lib/WorkerMCP.ts`
Since our `WorkerEntrypoint` is not directly accessible, we need something that defines a default export with a `fetch()` handler. This is what `lib/WorkerMCP.ts` does.
This exposes a single endpoint, `/rpc`, which takes a JSON payload of `{ method: string, args?: any[] }`, then calls that method on your `WorkerEntrypoint`.
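To make that concrete, here is a minimal sketch of what such a handler can look like. The `SELF` service binding and `SHARED_SECRET` names are assumptions for illustration, not the actual `lib/WorkerMCP.ts`:

```ts
// Illustrative /rpc handler; the SELF and SHARED_SECRET names are assumed,
// not the real lib/WorkerMCP.ts implementation.
interface Env {
  SELF: any // service binding pointing at the WorkerEntrypoint
  SHARED_SECRET: string
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url)
    if (url.pathname !== '/rpc') return new Response('Not found', { status: 404 })

    // Only requests carrying the shared secret as a Bearer token are allowed
    const auth = request.headers.get('Authorization') ?? ''
    if (auth !== `Bearer ${env.SHARED_SECRET}`) {
      return new Response('Unauthorized', { status: 401 })
    }

    // Payload shape: { method: string, args?: any[] }
    const { method, args = [] } = (await request.json()) as { method: string; args?: any[] }

    // Dispatch to the named method on the entrypoint and return its result as text
    // (a real implementation would handle richer return types)
    const result = await env.SELF[method](...args)
    return new Response(String(result))
  },
}
```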
`scripts/local-proxy.ts`
This file uses the `@modelcontextprotocol/sdk` library to build up a normal, local MCP server. This responds to `tools/list` by producing the data from `docs.json` for the specified entrypoint.
On `tools/call`, a `.fetch` call is made to the remote Worker on the `/rpc` route, providing a `Bearer` token with the contents of `generated/.shared-secret`. The responses are then piped back to Claude.
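As a rough sketch of that flow (not the actual `scripts/local-proxy.ts`): a stdio MCP server built with `@modelcontextprotocol/sdk` answers `tools/list` from `docs.json` and forwards `tools/call` to the Worker's `/rpc` route. File paths, argument ordering, and error handling are simplified here:

```ts
// Simplified local proxy: answers tools/list from docs.json and forwards
// tools/call to the remote Worker's /rpc endpoint with the shared secret.
import { Server } from '@modelcontextprotocol/sdk/server/index.js'
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js'
import { ListToolsRequestSchema, CallToolRequestSchema } from '@modelcontextprotocol/sdk/types.js'
import { readFileSync } from 'node:fs'

// The claude_desktop_config.json entry passes these as the trailing arguments
const [serverAlias, workerUrl, entrypoint] = process.argv.slice(-3)

const docs = JSON.parse(readFileSync('dist/docs.json', 'utf8'))
const sharedSecret = readFileSync('generated/.shared-secret', 'utf8').trim()

const server = new Server({ name: serverAlias, version: '0.1.0' }, { capabilities: { tools: {} } })

// tools/list: reshape the docs.json methods for the chosen entrypoint into MCP tool definitions
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: docs[entrypoint].methods.map((method: any) => ({
    name: method.name,
    description: method.description,
    inputSchema: {
      type: 'object',
      properties: Object.fromEntries(
        method.params.map((p: any) => [p.name, { type: p.type, description: p.description }]),
      ),
      required: method.params.map((p: any) => p.name),
    },
  })),
}))

// tools/call: POST { method, args } to the Worker's /rpc route with a Bearer token
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { name } = request.params
  const args = (request.params.arguments ?? {}) as Record<string, unknown>
  const method = docs[entrypoint].methods.find((m: any) => m.name === name)

  const response = await fetch(`${workerUrl}/rpc`, {
    method: 'POST',
    headers: { Authorization: `Bearer ${sharedSecret}`, 'Content-Type': 'application/json' },
    // Positional args in the order the method documents its parameters
    body: JSON.stringify({ method: name, args: method.params.map((p: any) => args[p.name]) }),
  })
  return { content: [{ type: 'text', text: await response.text() }] }
})

await server.connect(new StdioServerTransport())
```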
Calling `pnpm install:claude <server-alias> <worker-url>` adds a server definition that points to this file in your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "<server-alias>": {
      "command": "<absolute-path-to>/node",
      "args": [
        "<project-dir>/node_modules/tsx/dist/cli.mjs",
        "<project-dir>/scripts/local-proxy.ts",
        "<server-alias>",
        "<worker-url>",
        "<entrypoint-name>"
      ]
    }
  }
}
```
In this way you can install as many of these as you like, as long as they each have a distinct `<server-alias>`.
There are lots of limitations. This pizza is straight out of the oven. You may well burn your mouth.
- `docs.json` is only generated from `src/index.ts`. It doesn't currently crawl imports like a bundler, because no bundler I could find preserved comments in-place in order for me to run the docs generator afterwards.
- It doesn't handle exports like `class X {}; export { X as Y }`, but in general most people do `export default class X {}` anyway so this is fine for now.
- No `wrangler dev` support yet, but `wrangler dev --remote` should be possible so you don't have to deploy so often.
- The MCP spec includes a `notifications/tools/list_changed` notification that should trigger Claude to refresh its list of the tools available, meaning fewer restarts of Claude Desktop. But I haven't implemented that yet.
- Parameter types and descriptions come solely from the `@param` blocks in the JSDoc.

Obviously, having Claude Desktop talk directly to the Worker would be ideal. Also, `wrangler dev --remote` support would be great: you could iterate on your worker without redeploying, but still access your production bindings.
The docs generator needs to be extracted into a library so we can publish changes, as it needs to grow in scope to be really useful, and likely incorporate other sources of data (`d.ts` files, zod schemas, etc).
Give it a try! Then, raise an issue or send a PR. This is all very new, so it could really go in a lot of different directions. We'd love to hear from you!