A TypeScript implementation that connects local LLMs (via Ollama) to Model Context Protocol (MCP) servers. This bridge allows open-source models to use the same tools and capabilities as Claude, enabling powerful local AI assistants.
This project bridges local Large Language Models with MCP servers that provide capabilities such as filesystem access, web search, GitHub integration, persistent memory, image generation, and email. The bridge translates between the LLM's outputs and the MCP servers' JSON-RPC protocol, allowing any Ollama-compatible model to use these tools just like Claude does.
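Conceptually, that translation can be sketched as follows. The type and function names here are illustrative, not the bridge's actual API; the request shape follows MCP's JSON-RPC 2.0 `tools/call` method:

```typescript
// Hypothetical sketch: turning a model's tool-call output into an
// MCP JSON-RPC 2.0 "tools/call" request.
interface LlmToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params: { name: string; arguments: Record<string, unknown> };
}

let nextId = 0;

function toMcpRequest(call: LlmToolCall): JsonRpcRequest {
  return {
    jsonrpc: "2.0",
    id: ++nextId, // each request needs a unique id for response matching
    method: "tools/call",
    params: { name: call.name, arguments: call.arguments },
  };
}

const req = toMcpRequest({
  name: "read_file",
  arguments: { path: "notes.txt" },
});
console.log(JSON.stringify(req));
```

The bridge's job is then to write such requests to the MCP server's stdin and feed the responses back into the model's context.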
Supported MCP servers:

- `@modelcontextprotocol/server-filesystem`
- `@modelcontextprotocol/server-brave-search`
- `@modelcontextprotocol/server-github`
- `@modelcontextprotocol/server-memory`
- `@patruff/server-flux`
- `@patruff/server-gmail-drive`

Pull the model with Ollama:

```shell
ollama pull qwen2.5-coder:7b-instruct
```
Install the MCP servers globally:

```shell
npm install -g @modelcontextprotocol/server-filesystem
npm install -g @modelcontextprotocol/server-brave-search
npm install -g @modelcontextprotocol/server-github
npm install -g @modelcontextprotocol/server-memory
npm install -g @patruff/server-flux
npm install -g @patruff/server-gmail-drive
```
Set the required credentials:

- `BRAVE_API_KEY` for Brave Search
- `GITHUB_PERSONAL_ACCESS_TOKEN` for GitHub
- `REPLICATE_API_TOKEN` for Flux

For Gmail/Drive, authenticate by running `node path/to/gmail-drive/index.js auth`.
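These credentials can be supplied as environment variables before launching the bridge, for example (the values below are placeholders):

```shell
# Placeholder values — substitute your own keys.
export BRAVE_API_KEY="your-brave-api-key"
export GITHUB_PERSONAL_ACCESS_TOKEN="your-github-token"
export REPLICATE_API_TOKEN="your-replicate-token"
```

Exporting them in your shell profile keeps them available across sessions.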
The bridge is configured through `bridge_config.json`.
Example:
```json
{
  "mcpServers": {
    "filesystem": {
      "command": "node",
      "args": ["path/to/server-filesystem/dist/index.js"],
      "allowedDirectory": "workspace/path"
    }
    // ... other MCP configurations
  },
  "llm": {
    "model": "qwen2.5-coder:7b-instruct",
    "baseUrl": "http://localhost:11434"
  }
}
```
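A minimal sketch of how such a config might be loaded and validated in TypeScript. The field names are taken from the example above; the bridge's actual loader may differ:

```typescript
import { readFileSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Shapes assumed from the example bridge_config.json above.
interface McpServerConfig {
  command: string;
  args: string[];
  allowedDirectory?: string;
}

interface BridgeConfig {
  mcpServers: Record<string, McpServerConfig>;
  llm: { model: string; baseUrl: string };
}

function loadConfig(path: string): BridgeConfig {
  const config = JSON.parse(readFileSync(path, "utf8")) as BridgeConfig;
  if (!config.llm?.model) {
    throw new Error("bridge_config.json: llm.model is required");
  }
  return config;
}

// Demo: write a sample config to a temp file and load it back.
const samplePath = join(tmpdir(), "bridge_config_sample.json");
writeFileSync(
  samplePath,
  JSON.stringify({
    mcpServers: {
      filesystem: { command: "node", args: ["dist/index.js"] },
    },
    llm: { model: "qwen2.5-coder:7b-instruct", baseUrl: "http://localhost:11434" },
  })
);
const cfg = loadConfig(samplePath);
console.log(cfg.llm.model);
```

Note that strict `JSON.parse` rejects `//` comments, so a real config file should omit the comment shown in the example above.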
Start the bridge:

```shell
npm run start
```

Available commands:

- `list-tools`: Show available tools
- `quit`: Exit the program

Example interactions:
```
> Search the web for "latest TypeScript features"
[Uses Brave Search MCP to find results]

> Create a new folder called "project-docs"
[Uses Filesystem MCP to create directory]

> Send an email to user@example.com
[Uses Gmail MCP to compose and send email]
```
The bridge includes smart tool detection based on user input, and responses are processed through multiple stages before being returned to the user.
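One simple way such detection could work is keyword matching against the available tools. The sketch below is illustrative only, not the bridge's actual heuristics; the keyword table is an assumption:

```typescript
// Illustrative keyword-based tool detection (assumed mapping, not the
// bridge's real implementation).
const toolKeywords: Record<string, string[]> = {
  "brave-search": ["search", "look up", "find online"],
  filesystem: ["file", "folder", "directory"],
  "gmail-drive": ["email", "send", "drive"],
};

function detectTool(input: string): string | null {
  const text = input.toLowerCase();
  for (const [tool, keywords] of Object.entries(toolKeywords)) {
    if (keywords.some((k) => text.includes(k))) return tool;
  }
  return null; // no tool matched: fall back to a plain LLM response
}

console.log(detectTool('Search the web for "latest TypeScript features"'));
// → brave-search
```

A production bridge would more likely pass the tool schemas to the model and let it emit structured tool calls, using keyword hints only as a fallback.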
This bridge effectively brings Claude's tool capabilities to local models, all while running completely locally with open-source models.
This bridge integrates with the broader Claude ecosystem:
The result is a powerful local AI assistant that can match many of Claude's capabilities while running entirely on your own hardware.