An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.
To install Ollama MCP Server for Claude Desktop automatically via Smithery:
```bash
npx -y @smithery/cli install @rawveg/ollama-mcp --client claude
```
Install globally via npm:
```bash
npm install -g @rawveg/ollama-mcp
```
To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:
```json
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}
```
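Most MCP clients also accept an `env` field on a server entry, which is a convenient place to set the environment variables documented below (such as `PORT`). A minimal sketch, assuming your client supports the standard `env` key:

```json
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": ["-y", "@rawveg/ollama-mcp"],
      "env": {
        "PORT": "3457"
      }
    }
  }
}
```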
The settings file location varies by application:
- `claude_desktop_config.json` in the Claude app data directory
- `cline_mcp_settings.json` in the VS Code global storage
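For example, on macOS the Claude Desktop file typically lives at `~/Library/Application Support/Claude/claude_desktop_config.json`; the exact path varies by operating system and version.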
To start the server, simply run:
```bash
ollama-mcp
```
The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:
```bash
PORT=3457 ollama-mcp
```
- `PORT`: Server port (default: 3456). Can be used both when running directly and during Smithery installation:
```bash
# When running directly
PORT=3457 ollama-mcp

# When installing via Smithery
PORT=3457 npx -y @smithery/cli install @rawveg/ollama-mcp --client claude
```
- `OLLAMA_API`: Ollama API endpoint (default: `http://localhost:11434`)
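For example, to point the server at an Ollama instance running on another machine (the host address here is illustrative):

```bash
# Use a remote Ollama instance instead of the local default
OLLAMA_API=http://192.168.1.50:11434 ollama-mcp
```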
The server exposes the following HTTP endpoints:

- `GET /models` - List available models
- `POST /models/pull` - Pull a new model
- `POST /chat` - Chat with a model
- `GET /models/:name` - Get model details
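With the server running locally, the GET endpoints can be exercised directly with curl. The model name below is only an example, and the JSON body formats expected by the POST endpoints are not shown here:

```bash
# List available models (server on the default port 3456)
curl http://localhost:3456/models

# Get details for a specific model (example model name)
curl http://localhost:3456/models/llama3
```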
To build and run from source:

```bash
git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
npm install
npm run build
npm start
```
Contributions are welcome! Please feel free to submit a Pull Request.
Licensed under the MIT License.