Tama is a Command-Line Interface (CLI) tool for managing tasks, enhanced with AI capabilities for task generation and expansion. It uses AI (specifically configured for DeepSeek models via their OpenAI-compatible API) to parse Product Requirements Documents (PRDs) and break down complex tasks into manageable subtasks.
Features:

- `tama prd <filepath>`: Automatically generate a structured task list from a `.txt` or `.prd` file.
- `tama expand <task_id>`: Break down a high-level task into detailed subtasks using AI.
- `tama deps`: Detect circular dependencies within your tasks.
- `tama report [markdown|mermaid]`: Generate task reports in Markdown table format or as a Mermaid dependency graph.
- `tama gen-file <task_id>`: Create placeholder code files based on task details.
- `tama next`: Identify the next actionable task based on status and dependencies.
- Uses `rich` for formatted and visually appealing console output (e.g., tables, panels).

Installation:

```shell
git clone https://github.com/Gitreceiver/TAMA-MCP
cd TAMA-MCP
uv venv -p 3.12
# Windows
.\.venv\Scripts\activate
# macOS/Linux
source .venv/bin/activate
```

Install the package with `uv` (install `uv` with `pip install uv` if you don't have it):

```shell
uv pip install .
```

(Alternatively, using pip: `pip install .`)
Tama requires API keys for its AI features. Create a `.env` file in the project root directory:

```shell
# .env file
DEEPSEEK_API_KEY="your_deepseek_api_key_here"
```

(See `.env.example` for a template.)

The application uses settings defined in `src/config/settings.py`, which loads variables from the `.env` file.
Tama commands are run from your terminal within the activated virtual environment.

Core Commands:

```shell
# List tasks
tama list
tama list --status pending --priority high  # Filter

# Show task details
tama show 1    # Show task 1
tama show 1.2  # Show subtask 2 of task 1

# Add a top-level task
tama add "Implement user authentication" --desc "Handle login and sessions" --priority high
# Add a subtask to task 1
tama add "Create login API endpoint" --parent 1 --desc "Needs JWT handling"

# Update task status
tama status 1 done
tama status 1.2 in-progress

# Remove a task or subtask
tama remove 2
tama remove 1.3

# Identify the next actionable task
tama next
```

(Valid statuses: `pending`, `in-progress`, `done`, `deferred`, `blocked`, `review`)
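The `tama next` selection rule described above (an actionable task is pending and all of its dependencies are done) can be sketched as follows. This is a hypothetical illustration of the rule, not Tama's actual data model or implementation:

```python
def next_actionable(tasks):
    """Return the id of the first pending task whose dependencies are all done.

    `tasks` maps task id -> {"status": str, "deps": [task ids]}.
    Illustrative sketch only -- not Tama's real internals.
    """
    for task_id, task in sorted(tasks.items()):
        if task["status"] != "pending":
            continue
        # Actionable only when every dependency has been completed
        if all(tasks[d]["status"] == "done" for d in task.get("deps", [])):
            return task_id
    return None

demo = {
    1: {"status": "done", "deps": []},
    2: {"status": "pending", "deps": [1]},   # all deps done -> actionable
    3: {"status": "pending", "deps": [2]},   # blocked on task 2
}
```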
AI Commands:

Parse a PRD file (`.txt` or `.prd`) and expand a task into subtasks:

```shell
tama prd path/to/your/document.txt
tama expand 1
```
Utility Commands:

```shell
tama deps
tama report markdown                     # Print markdown table to console
tama report mermaid                      # Print mermaid graph definition
tama report markdown --output report.md  # Save to file
tama gen-file 1
tama gen-file 2 --output-dir src/generated
```
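A circular-dependency check like the one `tama deps` performs is classically done with a depth-first search that marks nodes as in-progress ("gray") and finished ("black"); revisiting a gray node means a cycle. A hedged sketch of that standard algorithm (not Tama's actual code):

```python
def has_cycle(graph):
    """Return True if the dependency graph contains a cycle.

    `graph` maps task id -> list of dependency ids.
    Illustrative DFS-coloring sketch, not Tama's implementation.
    """
    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / on current path / finished
    color = {node: WHITE for node in graph}

    def dfs(node):
        color[node] = GRAY
        for dep in graph.get(node, []):
            if color.get(dep, WHITE) == GRAY:
                return True  # back edge to a node on the current path: cycle
            if color.get(dep, WHITE) == WHITE and dfs(dep):
                return True
        color[node] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in graph)
```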
Shell Completion:

```shell
tama --install-completion
```

(Note: This might require administrator privileges depending on your shell and OS settings.)

If you modify the source code, remember to reinstall the package to make the changes effective in the CLI:

```shell
uv pip install .
```
Tama can be used as an MCP (Model Context Protocol) server, allowing other applications to interact with it programmatically. To start the server, run:

```shell
uv --directory /path/to/your/TAMA_MCP run python -m src.mcp_server
```

To register the server in your MCP client (e.g., Cline, Cursor, Claude):

```json
{
  "mcpServers": {
    "TAMA-MCP-Server": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/TAMA_MCP",
        "run",
        "python",
        "-m",
        "src.mcp_server"
      ],
      "disabled": false,
      "transportType": "stdio",
      "timeout": 60
    }
  }
}
```
This will start the Tama MCP server, which exposes Tama's task-management operations as MCP tools.
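With the `stdio` transport configured above, MCP clients exchange JSON-RPC 2.0 messages with the server over its stdin/stdout. A minimal sketch of that wire format (illustrative only; real clients normally use an MCP SDK rather than hand-building messages):

```python
import json

def jsonrpc_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request string, the wire format MCP uses
    over the stdio transport. Illustrative helper, not an SDK."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# e.g. the standard MCP request to enumerate a server's available tools:
list_tools = jsonrpc_request(1, "tools/list")
```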
License:

This project is licensed under the MIT License. See the LICENSE file for details.