A Model Context Protocol (MCP) server that enables seamless management of n8n workflows directly from LLMs and AI agents, available via Smithery.
1. Install the package:

   ```bash
   npm install @dopehunter/n8n-mcp-server
   ```

2. Create a `.env` file:

   ```bash
   cp .env.example .env
   ```

3. Configure your n8n connection by editing the `.env` file and setting:

   - `N8N_BASE_URL`: URL of your n8n instance (e.g., `http://localhost:5678/api`)
   - `N8N_API_KEY`: your n8n API key (generate this in the n8n settings)

   A sample `.env` is shown after these steps.

4. Start the server:

   ```bash
   npm start
   ```

5. Test the server:

   ```bash
   curl -X POST http://localhost:3000/mcp -H "Content-Type: application/json" \
     -d '{"jsonrpc":"2.0","id":"1","method":"mcp.tools.list","params":{}}'
   ```
For more detailed troubleshooting, see the Troubleshooting Guide.
The server exposes the following MCP tools:

- `n8n_list_workflows`: lists the workflows on the connected n8n instance
- `n8n_get_workflow`
  - `workflowId` (string, required): ID of the workflow to retrieve
- `n8n_execute_workflow`
  - `workflowId` (string, required): ID of the workflow to execute
  - `data` (object, optional): data to pass to the workflow
- `n8n_get_executions`
  - `workflowId` (string, required): ID of the workflow to get executions for
  - `limit` (number, optional): maximum number of executions to return
- `n8n_activate_workflow`
  - `workflowId` (string, required): ID of the workflow to activate
- `n8n_deactivate_workflow`
  - `workflowId` (string, required): ID of the workflow to deactivate
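As an illustration, a call to `n8n_execute_workflow` would supply arguments shaped like the following; the workflow ID and payload fields here are made-up placeholders, not values from a real instance:

```json
{
  "workflowId": "42",
  "data": {
    "customerEmail": "jane@example.com"
  }
}
```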
"mcpServers": {
"n8n": {
"command": "docker",
"args": ["run", "-i", "--rm", "--init", "-e", "N8N_API_KEY=$N8N_API_KEY", "-e", "N8N_BASE_URL=$N8N_BASE_URL", "mcp/n8n-mcp-server"]
}
}
}
Alternatively, run the published package with npx:

```json
{
  "mcpServers": {
    "n8n": {
      "command": "npx",
      "args": ["-y", "@dopehunter/n8n-mcp-server"]
    }
  }
}
```
Install the package from npm:

```bash
npm install @dopehunter/n8n-mcp-server
```

Or run it directly without installing:

```bash
npx @dopehunter/n8n-mcp-server
```
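When launching the server this way, the same connection settings from the `.env` section can also be supplied as environment variables, just as the Docker configuration above passes them with `-e`; the API key below is a placeholder:

```bash
# Provide the n8n connection settings via the environment (placeholder API key)
N8N_BASE_URL=http://localhost:5678/api N8N_API_KEY=your-n8n-api-key npx @dopehunter/n8n-mcp-server
```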
For local development, clone the repository and set up the environment:

```bash
git clone https://github.com/dopehunter/n8n_MCP_server_complete.git
cd n8n_MCP_server_complete
npm install
cp .env.example .env
# Edit the .env file with your n8n API details
```
Start the development server:

```bash
npm run start:dev
```

Build the project:

```bash
npm run build
```

Run tests:

```bash
npm test
```

Start the MCP server:

```bash
npm start
```
Configure your LLM client to use the MCP server (by default it listens at `http://localhost:3000/mcp`). Your LLM can now use n8n workflows directly through MCP commands.
To build the Docker image locally:

```bash
docker build -t mcp/n8n-mcp-server .
```
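Once built, the image can be run the same way the Docker client configuration above runs it, forwarding the n8n connection settings as environment variables (the values shown are placeholders):

```bash
# Run the image over stdio, forwarding the n8n connection settings
docker run -i --rm --init \
  -e N8N_API_KEY="your-n8n-api-key" \
  -e N8N_BASE_URL="http://localhost:5678/api" \
  mcp/n8n-mcp-server
```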
See the API Documentation for details on the available MCP functions.
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the ISC License.