A Model Context Protocol (MCP) compliant server implementation for WeCom (WeChat Work) bots.
There are several ways to install WeCom Bot MCP Server:

# Option 1: Install via Smithery (configures the server for the Claude client)
npx -y @smithery/cli install wecom-bot-mcp-server --client claude

# Option 2: Install from PyPI
pip install wecom-bot-mcp-server
Create or update your MCP client's configuration file (other MCP clients use the same mcpServers format; only the file location differs):
// For Windsurf: ~/.windsurf/config.json
{
  "mcpServers": {
    "wecom": {
      "command": "uvx",
      "args": ["wecom-bot-mcp-server"],
      "env": {
        "WECOM_WEBHOOK_URL": "your-webhook-url"
      }
    }
  }
}
# Windows PowerShell
$env:WECOM_WEBHOOK_URL = "your-webhook-url"
# Optional configurations
$env:MCP_LOG_LEVEL = "DEBUG" # Log levels: DEBUG, INFO, WARNING, ERROR, CRITICAL
$env:MCP_LOG_FILE = "path/to/custom/log/file.log" # Custom log file path
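Before starting the server you can confirm the variables are visible to the current process. Below is a minimal, illustrative Python check; the script is not part of the package, and the variable names come from the configuration above.

import os

# Illustrative check: fail fast if the required webhook variable is missing
webhook_url = os.environ.get("WECOM_WEBHOOK_URL")
if not webhook_url:
    raise SystemExit("WECOM_WEBHOOK_URL is not set")

# Optional settings fall back to the server's defaults when unset
print("MCP_LOG_LEVEL:", os.environ.get("MCP_LOG_LEVEL", "<unset>"))
print("MCP_LOG_FILE:", os.environ.get("MCP_LOG_FILE", "<unset>"))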
The logging system uses platformdirs.user_log_dir() for cross-platform log file management. Default log directories:

- Windows: C:\Users\<username>\AppData\Local\hal\wecom-bot-mcp-server
- Linux: ~/.local/share/hal/wecom-bot-mcp-server
- macOS: ~/Library/Application Support/hal/wecom-bot-mcp-server

The log file is named mcp_wecom.log and is stored in the directory above.
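As a rough sketch of how that path is resolved: the appname/appauthor values shown here ("wecom-bot-mcp-server" and "hal") are assumptions inferred from the directories listed above, and the exact result can vary by platform and platformdirs version.

from pathlib import Path
import platformdirs

# Assumed appname/appauthor; resolve the platform-specific log directory
log_dir = Path(platformdirs.user_log_dir("wecom-bot-mcp-server", "hal"))
log_file = log_dir / "mcp_wecom.log"
print(log_file)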
Start the server by running the command installed with the package:

wecom-bot-mcp-server
# Scenario 1: Send weather information to WeCom
USER: "How's the weather in Shenzhen today? Send it to WeCom"
ASSISTANT: "I'll check Shenzhen's weather and send it to WeCom"
await mcp.send_message(
    content="Shenzhen Weather:\n- Temperature: 25°C\n- Weather: Sunny\n- Air Quality: Good",
    msg_type="markdown"
)
# Scenario 2: Send meeting reminder and @mention relevant people
USER: "Send a reminder for the 3 PM project review meeting, remind Zhang San and Li Si to attend"
ASSISTANT: "I'll send the meeting reminder"
await mcp.send_message(
    content="## Project Review Meeting Reminder\n\nTime: Today 3:00 PM\nLocation: Meeting Room A\n\nPlease be on time!",
    msg_type="markdown",
    mentioned_list=["zhangsan", "lisi"]
)
# Scenario 3: Send a file
USER: "Send this weekly report to the WeCom group"
ASSISTANT: "I'll send the weekly report"
from pathlib import Path

await mcp.send_message(
    content=Path("weekly_report.docx"),
    msg_type="file"
)
from wecom_bot_mcp_server import mcp

# Send markdown message
await mcp.send_message(
    content="**Hello World!**",
    msg_type="markdown"
)

# Send text message and mention users
await mcp.send_message(
    content="Hello @user1 @user2",
    msg_type="text",
    mentioned_list=["user1", "user2"]
)
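These calls are coroutines, so they must run inside an event loop. A minimal, self-contained sketch of a script driving the API shown above, assuming WECOM_WEBHOOK_URL is already set in the environment:

import asyncio
from wecom_bot_mcp_server import mcp

async def main():
    # Send a markdown-formatted notification (illustrative content)
    await mcp.send_message(
        content="**Hello World!**",
        msg_type="markdown"
    )

if __name__ == "__main__":
    asyncio.run(main())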
from wecom_bot_mcp_server import send_wecom_file
# Send file
await send_wecom_file("/path/to/file.txt")
from wecom_bot_mcp_server import send_wecom_image
# Send local image
await send_wecom_image("/path/to/image.png")
# Send URL image
await send_wecom_image("https://example.com/image.png")
git clone https://github.com/loonghao/wecom-bot-mcp-server.git
cd wecom-bot-mcp-server
# Using uv (recommended)
pip install uv
uv venv
uv pip install -e ".[dev]"
# Or using traditional method
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -e ".[dev]"
# Using uv (recommended)
uvx nox -s pytest
# Or using traditional method
nox -s pytest
# Check code
uvx nox -s lint
# Automatically fix code style issues
uvx nox -s lint_fix
# Build the package
uv build
# Build and publish to PyPI
uv build && twine upload dist/*
wecom-bot-mcp-server/
├── src/
│   └── wecom_bot_mcp_server/
│       ├── __init__.py
│       ├── server.py
│       ├── message.py
│       ├── file.py
│       ├── image.py
│       ├── utils.py
│       └── errors.py
├── tests/
│   ├── test_server.py
│   ├── test_message.py
│   ├── test_file.py
│   └── test_image.py
├── docs/
├── pyproject.toml
├── noxfile.py
└── README.md
This project is licensed under the MIT License - see the LICENSE file for details.