You'll need uv, llama-index, ollama, and Cursor IDE. If you don't have uv yet, install it first:
pip install uv
Then scaffold the project:
uv init mcp-server
cd mcp-server
uv venv
.venv\Scripts\activate # On Windows
# OR
source .venv/bin/activate # On Linux/Mac
Create a .env
file in the root of your project and add your API key:
LINKUP_API_KEY=your_api_key_here
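At runtime you'll want that key available in the process environment. A minimal sketch using only the standard library (in a real project you'd likely reach for python-dotenv instead; the load_env helper here is illustrative):

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: read KEY=VALUE lines into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and malformed lines
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

if os.path.exists(".env"):
    load_env()

api_key = os.environ.get("LINKUP_API_KEY")
```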
Run these commands one by one inside your virtual environment:
# Core MCP CLI and HTTP utilities
uv add "mcp[cli]" httpx  # quotes keep the shell from expanding the brackets
# Linkup SDK for orchestrating agents
uv add linkup-sdk
# LlamaIndex integrations
uv add llama-index
uv add llama-index-embeddings-huggingface
uv add llama-index-llms-ollama
# Optional: for using notebooks
uv add ipykernel
After installation, check your uv-managed pyproject.toml. uv records dependencies under the [project] table, so you should see something like this:
[project]
dependencies = [
    "mcp[cli]",
    "httpx",
    "linkup-sdk",
    "llama-index",
    "llama-index-embeddings-huggingface",
    "llama-index-llms-ollama",
    "ipykernel",
]
Create a server.py
file inside the project root:
# server.py
from mcp.cli import app

if __name__ == "__main__":
    app()
You can later replace this with your own FastMCP or Agent orchestrator script.
Make sure Ollama is installed and running:
ollama run llama3.2  # or any model you want
This starts the LLM backend at http://localhost:11434.
Open Cursor Settings and go to the MCP section. Replace the paths below with your actual machine paths. You can get the full path to uv by running:
where uv   # Windows
which uv   # macOS/Linux
Now add this to your Cursor IDE settings:
{
"mcpServers": {
"weather": {
"command": "C:\\Users\\SIDHYA\\AppData\\Roaming\\Python\\Python311\\Scripts\\uv.exe", // Replace with your actual uv path
"args": [
"--directory",
"C:\\Users\\SIDHYA\\Development\\Ai\\mcp-server",
"run",
"server.py"
]
}
}
}
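Conceptually, Cursor joins command and args into the process it launches. A quick stdlib sketch of that mapping (the config is represented as a plain dict, with forward slashes in the path for brevity):

```python
# Mirror of the mcpServers config, as Cursor would read it
config = {
    "mcpServers": {
        "weather": {
            "command": "uv",
            "args": [
                "--directory",
                "C:/Users/SIDHYA/Development/Ai/mcp-server",
                "run",
                "server.py",
            ],
        }
    }
}

# Cursor effectively runs: command followed by args
for name, server in config["mcpServers"].items():
    cmd = [server["command"], *server["args"]]
    print(name, "->", " ".join(cmd))
```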
Open any .py file in Cursor and use the command palette (⌘K or Ctrl+K) to run the "weather" MCP server, which launches server.py.
Your project structure should now look like this:
.mcp-server/
├── .env
├── pyproject.toml
├── server.py
└── rag.py
To update dependencies:
uv pip install --upgrade llama-index
uv pip install --upgrade linkup-sdk
If you're building something around AI agents, local LLMs, or automated RAG pipelines, I'd love to connect or collaborate!
Alternatively, if uv is already on your PATH, a simpler config works:
{
"mcpServers": {
"weather": {
"env": {},
"args": [
"--directory",
"C:\\Users\\SIDHYA\\Development\\Ai\\mcp-server",
"run",
"server.py"
],
"command": "uv"
}
}
}