🧠 Advanced MCP Server Setup with uv, llama-index, ollama, and Cursor IDE


✅ Prerequisites

Before you begin, make sure you have:

  • Python 3.11+ (the example paths below assume Python 3.11)
  • uv installed and on your PATH
  • Ollama installed, for running local LLMs
  • Cursor IDE
  • A Linkup API key, for the linkup-sdk


🛠 Step 1: Project Setup

1.1 Create a New Project Directory

uv init mcp-server
cd mcp-server

1.2 Create and Activate Virtual Environment

uv venv
.venv\Scripts\activate  # On Windows
# OR
source .venv/bin/activate  # On Linux/Mac

🔐 Step 2: Environment Configuration

Create a .env file in the root of your project and add your API key:

LINKUP_API_KEY=your_api_key_here
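
Your code can then read the key at runtime. Here is a minimal sketch, assuming you also add python-dotenv (it is not part of Step 3's dependency list):

# env_check.py — load the Linkup key from .env (assumes: uv add python-dotenv)
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory
api_key = os.environ["LINKUP_API_KEY"]  # raises KeyError if the key is missing
print("LINKUP_API_KEY loaded:", bool(api_key))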

📦 Step 3: Install Required Dependencies

Run these commands one by one inside your virtual environment:

# Core MCP CLI and HTTP utilities (quote the extras so shells like zsh don't expand the brackets)
uv add "mcp[cli]" httpx

# Linkup SDK for orchestrating agents
uv add linkup-sdk

# LlamaIndex integrations
uv add llama-index
uv add llama-index-embeddings-huggingface
uv add llama-index-llms-ollama

# Optional: for using notebooks
uv add ipykernel

🧪 Step 4: Confirm Installation

After installation, check your pyproject.toml. uv records each package under [project] (with a ">=" version lower bound that will vary depending on when you install), something like this:

[project]
dependencies = [
    "httpx",
    "ipykernel",
    "linkup-sdk",
    "llama-index",
    "llama-index-embeddings-huggingface",
    "llama-index-llms-ollama",
    "mcp[cli]",
]
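
You can also smoke-test the imports directly (run it with uv run check_imports.py; the module names below are the standard import names for these distributions, so treat them as assumptions if your versions differ):

# check_imports.py — verify the key packages import cleanly
import httpx
import mcp
import llama_index.core
from linkup import LinkupClient  # linkup-sdk installs the "linkup" module

print("all imports OK")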

⚙️ Step 5: Create a Minimal Server Entry Point

Create a server.py file inside the project root. The mcp package's FastMCP class gives you a minimal stdio server that Cursor can launch:

# server.py
from mcp.server.fastmcp import FastMCP

# The name here should match the server key you register in Cursor (Step 7).
mcp = FastMCP("weather")

@mcp.tool()
def ping() -> str:
    """Health-check tool so the server exposes at least one capability."""
    return "pong"

if __name__ == "__main__":
    mcp.run(transport="stdio")

You can later extend this with your own tools or an agent orchestrator script.
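
For example, since the Cursor config in Step 7 registers this server as "weather" and httpx is already installed, a natural next step is a real tool. Here is a sketch of what that could look like; the Open-Meteo endpoint and its parameters are illustrative assumptions, not part of this tutorial:

# server.py — sketch of an async weather tool (Open-Meteo is an illustrative choice)
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
async def current_temperature(latitude: float, longitude: float) -> str:
    """Return the current temperature for a location via the free Open-Meteo API."""
    params = {"latitude": latitude, "longitude": longitude, "current": "temperature_2m"}
    async with httpx.AsyncClient() as client:
        resp = await client.get("https://api.open-meteo.com/v1/forecast", params=params)
        resp.raise_for_status()
        data = resp.json()
    return f"{data['current']['temperature_2m']} °C"

if __name__ == "__main__":
    mcp.run(transport="stdio")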


🧠 Step 6: Run Ollama Locally

Make sure Ollama is installed and running:

ollama run llama3.2  # or any other model you prefer

Ollama's API server is then available at http://localhost:11434.
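
Before wiring it into anything else, you can confirm the server is reachable. A quick sketch using httpx (installed in Step 3); the /api/tags endpoint lists the models you have pulled locally:

# check_ollama.py — verify the Ollama API is up (assumes the default port 11434)
import httpx

resp = httpx.get("http://localhost:11434/api/tags")
resp.raise_for_status()
print("local models:", [m["name"] for m in resp.json()["models"]])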


🖥️ Step 7: Configure MCP Server in Cursor IDE

7.1 Open Cursor Settings

  • Open Settings → go to the MCP section.
  • Click "Add New Global MCP Server".

7.2 Fill Out the Configuration

Replace the paths with your actual machine paths. You can get the full path to uv by running:

where uv  # On Windows
# OR
which uv  # On Linux/Mac

Now add this to your Cursor IDE settings, substituting your own uv path for "command" and your own project directory in "args" (JSON does not allow inline comments, so keep the notes out of the file):

{
  "mcpServers": {
    "weather": {
      "command": "C:\\Users\\SIDHYA\\AppData\\Roaming\\Python\\Python311\\Scripts\\uv.exe",
      "args": [
        "--directory",
        "C:\\Users\\SIDHYA\\Development\\Ai\\mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}

🧪 Step 8: Test the Integration

  1. Open any .py file in Cursor.
  2. Use the MCP tools (usually accessible via ⌘K or Ctrl+K) to run the “weather” MCP server.
  3. You should see the server spin up using your server.py.

📘 Suggested Directory Structure

mcp-server/
├── .env
├── pyproject.toml
├── server.py
└── rag.py
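
rag.py appears in this layout but is never created in the steps above. Here is a minimal sketch of what it could contain, combining the llama-index, HuggingFace-embeddings, and Ollama packages from Step 3 (the embedding model name and the data/ folder are assumptions):

# rag.py — minimal llama-index RAG pipeline backed by local Ollama (sketch)
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama

# Local embeddings plus the Ollama model started in Step 6.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
Settings.llm = Ollama(model="llama3.2", base_url="http://localhost:11434", request_timeout=120.0)

documents = SimpleDirectoryReader("data").load_data()  # index everything under ./data
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
print(query_engine.query("Summarize these documents."))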

🔁 Keep Things Updated

To update dependencies in the environment:

uv pip install --upgrade llama-index
uv pip install --upgrade linkup-sdk

Note that uv pip install upgrades the active virtual environment only; it does not rewrite the versions recorded in pyproject.toml.

✍️ Author

👋 Hey, I'm Asutosh Sidhya

🌐 Connect with Me

If you're building something around AI agents, local LLMs, or automated RAG pipelines, I'd love to connect or collaborate!
