This project demonstrates integration between Model Context Protocol (MCP) servers and Ollama, allowing AI models to interact with various tools through a unified interface.
Tools are configured through mcp-config.json, which uses a format similar to claude_desktop_config.json.
Prerequisites:
Node.js (version 18 or higher)
Ollama installed and running
Install the MCP tools you want to use globally:
# For filesystem operations
npm install -g @modelcontextprotocol/server-filesystem
# For web research
npm install -g @mzxrai/mcp-webresearch
Clone and install:
git clone https://github.com/ausboss/mcp-ollama-agent.git
cd mcp-ollama-agent
npm install
Configure your tools and an Ollama model that supports tool calling in mcp-config.json:
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["@modelcontextprotocol/server-filesystem", "./"]
},
"webresearch": {
"command": "npx",
"args": ["-y", "@mzxrai/mcp-webresearch"]
}
},
"ollama": {
"host": "http://localhost:11434",
"model": "qwen2.5:latest"
}
}
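Each entry under mcpServers tells the agent how to spawn a server as a child process speaking MCP over stdio. A minimal sketch of how such an entry can be turned into a connected client using the official TypeScript SDK (the connectServers helper is an illustration, not this repo's actual code):

import { readFileSync } from "node:fs";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Hypothetical helper: connect one MCP client per server entry in mcp-config.json.
async function connectServers(configPath = "./mcp-config.json") {
  const config = JSON.parse(readFileSync(configPath, "utf8"));
  const clients: Record<string, Client> = {};
  for (const [name, server] of Object.entries<any>(config.mcpServers)) {
    // Spawn the server as a child process and talk MCP over stdio.
    const transport = new StdioClientTransport({
      command: server.command,
      args: server.args,
    });
    const client = new Client(
      { name: `mcp-ollama-agent-${name}`, version: "1.0.0" },
      { capabilities: {} }
    );
    await client.connect(transport);
    clients[name] = client;
  }
  return clients;
}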
Run the demo to test filesystem and webresearch tools without an LLM:
npx tsx ./src/demo.ts
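With a connected client, tools can be exercised directly, which is roughly what the demo does. A hedged sketch (list_directory and its path argument come from the filesystem server; the surrounding wiring is assumed):

// Assuming `client` is a connected MCP Client for the filesystem server.
const { tools } = await client.listTools();
console.log("Available tools:", tools.map((t) => t.name));

// Call a tool directly, with no LLM involved.
const result = await client.callTool({
  name: "list_directory",
  arguments: { path: "./" },
});
console.log(result);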
Or start the chat interface with Ollama:
npm start
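The chat interface bridges Ollama's tool-calling API to the MCP clients: the MCP tool schemas are advertised to the model, and any tool_calls the model emits are routed back to the matching server. A simplified sketch of one turn, assuming the ollama npm package and an executeTool function that dispatches to the right MCP client (both names here are illustrative, not the repo's actual code):

import ollama from "ollama";

// One hypothetical conversation turn with tool support.
async function chatTurn(
  messages: any[],
  mcpTools: any[], // MCP schemas converted to Ollama's function-tool format
  executeTool: (name: string, args: any) => Promise<unknown>
) {
  const response = await ollama.chat({
    model: "qwen2.5:latest",
    messages,
    tools: mcpTools,
  });
  if (response.message.tool_calls?.length) {
    messages.push(response.message);
    for (const call of response.message.tool_calls) {
      // Route the call to the MCP server that owns this tool.
      const result = await executeTool(call.function.name, call.function.arguments);
      messages.push({ role: "tool", content: JSON.stringify(result) });
    }
    // Let the model read the tool results and produce the final answer.
    const followUp = await ollama.chat({ model: "qwen2.5:latest", messages });
    return followUp.message.content;
  }
  return response.message.content;
}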
You can add more tools to the mcpServers section. This example used the model qwen2.5:latest:
Chat started. Type "exit" to end the conversation.
You: can you use your list directory tool to see whats in test-directory then use your read file tool to read it to me?
Model is using tools to help answer...
Using tool: list_directory
With arguments: { path: 'test-directory' }
Tool result: [ { type: 'text', text: '[FILE] test.txt' } ]
Assistant:
Model is using tools to help answer...
Using tool: read_file
With arguments: { path: 'test-directory/test.txt' }
Tool result: [ { type: 'text', text: 'rosebud' } ]
Assistant: The content of the file `test.txt` in the `test-directory` is:
rosebud
You: thanks
Assistant: You're welcome! If you have any other requests or need further assistance, feel free to ask.
Some local models may need help with tool selection. Customize the system prompt in ChatManager.ts
to improve tool usage.
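For example, a more directive system message can steer smaller models toward calling tools instead of answering from memory. A hypothetical prompt of that shape (the exact wording in ChatManager.ts will differ):

// Illustrative system prompt; tune the wording for your model.
const SYSTEM_PROMPT = `You are a helpful assistant with access to tools.
When a question involves files, directories, or live web content, call the
appropriate tool instead of answering from memory, and use exactly the
argument names defined in each tool's schema.`;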
Contributions welcome! Feel free to submit issues or pull requests.