A Model Context Protocol (MCP) server for enhanced Qdrant vector database functionality. It provides tools for managing Qdrant collections, adding documents, and performing semantic searches.
## Installation

Install the package globally with npm:

```bash
npm install -g better-qdrant-mcp-server
```

Or use it directly with npx:

```bash
npx better-qdrant-mcp-server
```
## Configuration

The server uses environment variables for configuration. You can set these in a `.env` file in your project root:
```bash
# Qdrant Configuration
QDRANT_URL=http://localhost:6333
QDRANT_API_KEY=your_api_key_if_needed

# Embedding Service API Keys
OPENAI_API_KEY=your_openai_api_key
OPENROUTER_API_KEY=your_openrouter_api_key
OLLAMA_ENDPOINT=http://localhost:11434
```
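For reference, here is a minimal sketch of how a Node.js process typically picks up these variables at startup, assuming the `dotenv` package is installed. The fallback values are illustrative defaults, not the server's actual internals:

```typescript
// Illustrative only: shows how the environment variables above are typically
// consumed at startup (assumes the `dotenv` package is installed).
import "dotenv/config";

const config = {
  qdrantUrl: process.env.QDRANT_URL ?? "http://localhost:6333",
  qdrantApiKey: process.env.QDRANT_API_KEY,          // optional
  openaiApiKey: process.env.OPENAI_API_KEY,          // needed for the "openai" embedding service
  openrouterApiKey: process.env.OPENROUTER_API_KEY,  // needed for the "openrouter" embedding service
  ollamaEndpoint: process.env.OLLAMA_ENDPOINT ?? "http://localhost:11434",
};

console.log("Using Qdrant at", config.qdrantUrl);
```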
## Usage with Claude

To use this MCP server with Claude, add it to your MCP settings configuration file:
```json
{
  "mcpServers": {
    "better-qdrant": {
      "command": "npx",
      "args": ["better-qdrant-mcp-server"],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "QDRANT_API_KEY": "your_api_key_if_needed",
        "DEFAULT_EMBEDDING_SERVICE": "ollama",
        "OPENAI_API_KEY": "your_openai_api_key",
        "OPENAI_ENDPOINT": "https://api.openai.com/v1",
        "OPENROUTER_API_KEY": "your_openrouter_api_key",
        "OPENROUTER_ENDPOINT": "https://api.openrouter.com/v1",
        "OLLAMA_ENDPOINT": "http://localhost:11434",
        "OLLAMA_MODEL": "nomic-embed-text"
      }
    }
  }
}
```
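Outside of Claude, any MCP client can talk to the server over stdio. Below is a minimal sketch using the official TypeScript SDK (`@modelcontextprotocol/sdk`); the import paths and call shapes follow recent versions of that SDK, so treat it as a sketch rather than a definitive client:

```typescript
// Minimal MCP client sketch (assumes @modelcontextprotocol/sdk is installed).
// It spawns the server via npx over stdio and lists the tools it exposes.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["better-qdrant-mcp-server"],
  // The environment variables from the Configuration section should already
  // be set in the parent process (or in a .env file the server can read).
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Discover the tools described in the sections below.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();
```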
## Available Tools

### List Collections

```
use_mcp_tool
server_name: better-qdrant
tool_name: list_collections
arguments: {}
```
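To sanity-check the result against Qdrant directly, you can call Qdrant's own `GET /collections` endpoint. A small sketch, assuming the URL and optional API key match your `.env`:

```typescript
// Lists collections straight from Qdrant's REST API for comparison.
// Assumes QDRANT_URL (and optionally QDRANT_API_KEY) match your .env file.
const qdrantUrl = process.env.QDRANT_URL ?? "http://localhost:6333";

const res = await fetch(`${qdrantUrl}/collections`, {
  headers: process.env.QDRANT_API_KEY
    ? { "api-key": process.env.QDRANT_API_KEY }
    : {},
});
const body = await res.json();
console.log(body.result.collections.map((c: { name: string }) => c.name));
```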
### Add Documents

```
use_mcp_tool
server_name: better-qdrant
tool_name: add_documents
arguments: {
  "filePath": "/path/to/your/document.pdf",
  "collection": "my-collection",
  "embeddingService": "openai",
  "chunkSize": 1000,
  "chunkOverlap": 200
}
```
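`chunkSize` and `chunkOverlap` control how the document text is split before it is embedded and stored. The server's exact splitter is not shown here, but the idea is a sliding window over the text; a character-based sketch:

```typescript
// Character-based sliding-window chunking sketch (illustrative, not the
// server's actual splitter): each chunk is up to `chunkSize` characters and
// repeats the last `chunkOverlap` characters of the previous chunk.
function chunkText(text: string, chunkSize = 1000, chunkOverlap = 200): string[] {
  const chunks: string[] = [];
  const step = chunkSize - chunkOverlap;
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
  }
  return chunks;
}

// Example: 2,500 characters -> chunks starting at 0, 800, 1600.
console.log(chunkText("x".repeat(2500)).map((c) => c.length)); // [1000, 1000, 900]
```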
### Search

```
use_mcp_tool
server_name: better-qdrant
tool_name: search
arguments: {
  "query": "your search query",
  "collection": "my-collection",
  "embeddingService": "openai",
  "limit": 5
}
```
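Conceptually, the search tool embeds the query with the selected embedding service and then runs a vector search against the collection. Below is a sketch of the equivalent raw calls, assuming the OpenAI embeddings endpoint and Qdrant's points search API; the model name and payload handling are illustrative assumptions, not the server's exact behavior:

```typescript
// Illustrative equivalent of the search tool: embed the query, then query Qdrant.
// Endpoint shapes follow the public OpenAI and Qdrant REST APIs; the model name
// is an assumption, not necessarily what the server uses.
const query = "your search query";

// 1. Embed the query text.
const embRes = await fetch("https://api.openai.com/v1/embeddings", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({ model: "text-embedding-3-small", input: query }),
});
const vector: number[] = (await embRes.json()).data[0].embedding;

// 2. Vector search against the collection.
const searchRes = await fetch("http://localhost:6333/collections/my-collection/points/search", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ vector, limit: 5, with_payload: true }),
});
console.log((await searchRes.json()).result); // scored points with payloads
```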
### Delete Collection

```
use_mcp_tool
server_name: better-qdrant
tool_name: delete_collection
arguments: {
  "collection": "my-collection"
}
```
## License

MIT