# Model Context Protocol Server for Dify AI

This server enables LLMs to interact with Dify AI's chat completion capabilities through a standardized protocol.
## Docker

```bash
# Build the Docker image
make docker

# Run with Docker
docker run -i --rm mcp/dify https://your-dify-api-endpoint your-dify-api-key
```
## Usage with Claude Desktop

Add the following configuration to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "dify": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-dify",
        "https://your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}
```
Replace `your-dify-api-endpoint` and `your-dify-api-key` with your actual Dify API credentials.
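Outside of Claude Desktop, any MCP client can launch the server the same way over stdio. The following is a minimal sketch, assuming the official TypeScript MCP SDK (`@modelcontextprotocol/sdk`); the client name is arbitrary, and the endpoint and key are the same placeholders as above.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the Dify server as a child process over stdio,
// mirroring the claude_desktop_config.json entry above.
const transport = new StdioClientTransport({
  command: "npx",
  args: [
    "-y",
    "@modelcontextprotocol/server-dify",
    "https://your-dify-api-endpoint", // placeholder endpoint
    "your-dify-api-key",              // placeholder API key
  ],
});

// "example-client" is an arbitrary name for this sketch.
const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);

await client.connect(transport);

// List the tools the server exposes.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```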
## Tools

Restaurant recommendation tool that interfaces with Dify AI.

Parameters:

- `LOCATION` (string): Location of the restaurant
- `BUDGET` (string): Budget constraints
- `query` (string): Query to send to Dify AI
- `conversation_id` (string, optional): For maintaining chat context
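As an illustration of the parameters above, a call through the same sketched client might look like this. The tool name `restaurant-recommendation` and the argument values are assumptions for the example; check `client.listTools()` for the name the server actually registers.

```typescript
// Assumes `client` is the connected MCP client from the earlier sketch.
// The tool name "restaurant-recommendation" is a guess; verify it against
// the names returned by client.listTools().
const result = await client.callTool({
  name: "restaurant-recommendation",
  arguments: {
    LOCATION: "Tokyo",       // string: location of the restaurant
    BUDGET: "3000-5000 JPY", // string: budget constraints
    query: "Quiet izakaya for four people tonight",
    // conversation_id is optional; pass a previous ID to keep chat context:
    // conversation_id: "prev-conversation-id",
  },
});

console.log(result.content);
```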
## Development

```bash
# Initial setup
make setup
# Build the project
make build
# Format code
make format
# Run linter
make lint
```
## License

This project is released under the MIT License.
## Security

This server interacts with Dify AI using your provided API key. Ensure to:

- Keep your API key confidential and never commit it to version control
- Use appropriate access controls for your Dify deployment
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.