aahj/model_context_protocol_projects

MCP Examples: Sampling, Notifications, and Roots

This repository is a hands-on set of Model Context Protocol (MCP) examples built to show how MCP clients and MCP servers work together in real workflows.

It focuses on three practical scenarios:

  • Sampling: server requests model generation through the client
  • Logging and progress notifications: server streams status updates while a tool runs
  • Roots-based file access: server is restricted to approved file-system paths

What does this project do, and why was it built?

This project demonstrates how to build MCP-based systems with Python and mcp[cli], using stdio transport for local development.

It was built to make core MCP concepts tangible:

  • how to expose tools from an MCP server
  • how an MCP client invokes those tools
  • how model calls can be delegated via MCP sampling
  • how to enforce safe file access with roots
  • how to improve UX with logging/progress notifications

In short: it is a learning and reference repo for MCP patterns, not a single production app.

What does the architecture look like?

At a high level, each example follows this flow:

User/CLI -> MCP Client -> MCP Server -> Tool logic

For sampling, the server asks the client to call a model:

MCP Server -> ctx.session.create_message(...) -> MCP Client sampling callback -> LLM provider (Gemini/Claude)
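The delegation above can be sketched without the SDK. This is a minimal, SDK-free simulation of the sampling round trip: the names here (SamplingRequest, sampling_callback, rephrase_tool, fake_llm) are illustrative stand-ins, not the mcp package's actual API, and fake_llm stands in for a real Gemini/Claude call.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class SamplingRequest:
    messages: list[str]      # conversation the server wants completed
    max_tokens: int = 100

def fake_llm(prompt: str) -> str:
    # Stand-in for a real provider call (Gemini/Claude); illustrative only.
    return f"Rephrased: {prompt}"

def sampling_callback(req: SamplingRequest) -> str:
    # Client side: the client, not the server, holds the model credentials
    # and decides which provider fulfills the request.
    return fake_llm(req.messages[-1])

def rephrase_tool(text: str, create_message: Callable[[SamplingRequest], str]) -> str:
    # Server side: the tool delegates generation back through the client,
    # mirroring ctx.session.create_message(...) in the real server.
    return create_message(SamplingRequest(messages=[text]))

print(rephrase_tool("hello world", sampling_callback))
```

The point of the shape: the server never imports a model SDK. Swapping Gemini for Claude means swapping the callback on the client, with no server change.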

Project structure

  • sampling_example/
    • server.py: exposes a rephrase tool and triggers sampling via MCP context
    • client.py / client_anthropic.py: starts MCP session and implements sampling callback to model SDK
  • notifications_example/
    • server.py: exposes add tool with ctx.info(...) and ctx.report_progress(...)
    • client.py: receives logging/progress callbacks and prints updates
  • roots_example/
    • mcp_server.py: tools list_roots, read_dir, convert_video with path checks
    • main.py: starts CLI chat + MCP client with provided root directories
    • core/: chat loop, model adapters, utilities, video conversion helpers
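The path checks in roots_example can be sketched in plain stdlib Python. The helper name is_within_roots is hypothetical, not roots_example's actual API; the idea is simply to resolve a requested path and reject it unless it falls under an approved root.

```python
from pathlib import Path

def is_within_roots(path: str, roots: list[str]) -> bool:
    """Return True only if `path` resolves inside one of the approved roots."""
    target = Path(path).resolve()  # normalize and follow symlinks first
    return any(target.is_relative_to(Path(r).resolve()) for r in roots)

# A path under an approved root passes; anything else is rejected.
print(is_within_roots("/tmp/project/data.txt", ["/tmp/project"]))
print(is_within_roots("/etc/passwd", ["/tmp/project"]))
```

Resolving before comparing matters: it defeats "../" traversal and symlink tricks that a naive string-prefix check would let through.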

How do I run this locally?

Prerequisites

  • Python 3.10+
  • uv installed (pip install uv)
  • API key(s), depending on example:
    • GOOGLE_API_KEY for Gemini
    • ANTHROPIC_API_KEY for Claude
  • FFmpeg (only required for roots_example video conversion)

1) Sampling example

cd sampling_example
uv sync
uv run client.py

Notes:

  • client.py uses Gemini (GOOGLE_API_KEY).
  • If using Anthropic, switch to client_anthropic.py as documented in that folder.

2) Notifications example

cd notifications_example
uv sync
uv run client.py
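The logging/progress pattern in this example can be simulated without the SDK. In this sketch, info and report_progress are callbacks the client supplies, mirroring the roles of ctx.info(...) and ctx.report_progress(...); the names add_tool, info, and report_progress are illustrative, not the mcp package's API.

```python
from typing import Callable

def add_tool(a: int, b: int,
             info: Callable[[str], None],
             report_progress: Callable[[int, int], None]) -> int:
    # The tool emits status updates while it works, so the client
    # can show feedback during long-running operations.
    info(f"adding {a} + {b}")
    for step in range(1, 4):
        report_progress(step, 3)  # (current, total), like ctx.report_progress
    return a + b

# A client would print these; here we just collect them to show the flow.
events: list[str] = []
result = add_tool(2, 3,
                  info=lambda msg: events.append(f"log: {msg}"),
                  report_progress=lambda cur, total: events.append(f"{cur}/{total}"))
print(result, events)
```

The real client registers equivalent callbacks on the MCP session, so updates stream in while the tool is still running instead of arriving only with the final result.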

3) Roots example

cd roots_example
cp .env.example .env
# set GOOGLE_API_KEY or ANTHROPIC_API_KEY in .env
uv sync
uv run main.py <root1> [root2] [root3]

Example:

uv run main.py .

How do I deploy it?

This repo is currently designed for local stdio-based execution. There is no production deployment pipeline included yet.

If you want to deploy:

  • package each example as a service/container
  • replace stdio transport with a network transport (for remote clients)
  • inject secrets through environment variables in your platform
  • add observability, auth, and access controls for production traffic

What decisions were made, and why?

  • Split into three focused examples to isolate MCP concepts and keep each learning path simple.
  • Use Python + mcp[cli] to reduce boilerplate and stay close to MCP primitives.
  • Use stdio transport because it is easiest for local development and debugging.
  • Delegate model generation through client-side sampling so server tool logic can remain model-agnostic at the protocol boundary.
  • Enforce roots for file operations to demonstrate safe-by-default filesystem access.
  • Include progress/log callbacks to show user-friendly behavior for long-running tools.

What would be improved next?

  • Add a unified root-level Makefile/task runner for one-command setup and execution.
  • Standardize docs and naming across all examples (some README titles/descriptions currently differ).
  • Add tests (unit + integration) for tool handlers and path-permission checks.
  • Add optional HTTP transport examples for remote deployment scenarios.
  • Add CI for linting, formatting, and automated smoke tests.

Additional Notes:

An MCP server gives a model access to data and functionality in outside (third-party) services. Think of MCP as a standard way for an LLM to communicate with databases, files, APIs, and internal tools.

Why MCP? Without MCP, each LLM integration looks like this: user -> app -> custom tool logic -> LLM

Every tool integration is custom-built:

  • custom auth
  • custom API wrappers
  • custom schemas
  • custom error handling

With MCP: user -> LLM -> MCP Client -> MCP Server -> Tools

Instead of writing custom tool definitions per app, you expose tools once via MCP, and any MCP-compatible client can use them.
