How does MCP work?

Prototype
GitHub

A rapid prototype demonstrating the Model Context Protocol. An MCP server registers tools, the AI selects and calls them via JSON-RPC 2.0, and results are used to generate responses. Runs fully locally without an API key.

Try asking a question like:

How does MCP work?

What is the Model Context Protocol (MCP)?

MCP is an open protocol that standardizes how AI applications connect to external tools and data sources. Instead of each AI model implementing custom integrations, MCP provides a universal interface using JSON-RPC 2.0 messages. This allows any MCP-compatible client to discover and use tools from any MCP server, creating an interoperable ecosystem for AI tool use.
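The JSON-RPC 2.0 framing that MCP builds on can be sketched in a few TypeScript types. This is an illustrative sketch, not the MCP SDK's actual types; the type names here are this sketch's own:

```typescript
// Illustrative shapes for JSON-RPC 2.0 messages as MCP uses them.
// Type names are invented for this sketch; the MCP SDK defines its own.
interface JsonRpcRequest {
  jsonrpc: "2.0";                 // protocol version, always "2.0"
  id: number | string;            // correlates a response with its request
  method: string;                 // e.g. "tools/list" or "tools/call"
  params?: Record<string, unknown>;
}

interface JsonRpcResponse {
  jsonrpc: "2.0";
  id: number | string;
  result?: unknown;                            // present on success
  error?: { code: number; message: string };   // present on failure
}

// Any MCP-compatible client can frame a discovery request this way:
const discoveryRequest: JsonRpcRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
};
```

Because every message follows this one envelope, a client written against the protocol works with any conforming server.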

How the MCP Pipeline Works

Step 1: Tool Discovery

The MCP client sends a tools/list JSON-RPC request to the server. The server responds with all registered tools, including their names, descriptions, and input schemas. This allows the AI to understand what capabilities are available before deciding which tool to use.
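A server-side sketch of this step follows. The three tool names mirror the demo's weather lookup, calculator, and knowledge-base search, but the exact names and schemas are assumptions:

```typescript
// Hypothetical registry mirroring the demo's three tools; schemas assumed.
type ToolDefinition = {
  name: string;
  description: string;
  inputSchema: {
    type: "object";
    properties: Record<string, { type: string }>;
    required?: string[];
  };
};

const registeredTools: ToolDefinition[] = [
  {
    name: "get_weather",
    description: "Look up current weather conditions for a city",
    inputSchema: {
      type: "object",
      properties: { city: { type: "string" } },
      required: ["city"],
    },
  },
  {
    name: "calculate",
    description: "Evaluate an arithmetic expression",
    inputSchema: {
      type: "object",
      properties: { expression: { type: "string" } },
      required: ["expression"],
    },
  },
  {
    name: "search_knowledge_base",
    description: "Search the demo knowledge base for MCP topics",
    inputSchema: {
      type: "object",
      properties: { query: { type: "string" } },
      required: ["query"],
    },
  },
];

// The server answers tools/list with every registered tool.
function handleToolsList(id: number) {
  return { jsonrpc: "2.0" as const, id, result: { tools: registeredTools } };
}
```

The returned `inputSchema` is what lets the model construct valid arguments in the next step.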

Step 2: Tool Selection

Based on the user's query, the AI model analyzes the available tools and selects the most appropriate one. It determines which parameters to pass based on the tool's input schema and the information in the query.
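With an API key, the model itself performs this selection. In a no-key fallback, selection might be approximated by keyword overlap between the query and tool descriptions, roughly like this (a sketch, not the demo's actual logic):

```typescript
// Naive fallback selection: score each tool by how many query words
// appear in its name or description, and pick the highest-scoring tool.
function selectTool(
  query: string,
  tools: { name: string; description: string }[],
): string {
  const words = query.toLowerCase().split(/\W+/).filter((w) => w.length > 2);
  let bestName = tools[0].name;
  let bestScore = -1;
  for (const tool of tools) {
    const haystack = `${tool.name} ${tool.description}`.toLowerCase();
    const score = words.filter((w) => haystack.includes(w)).length;
    if (score > bestScore) {
      bestScore = score;
      bestName = tool.name;
    }
  }
  return bestName;
}
```

A real deployment would let the model reason over the schemas instead, but the contract is the same: query in, one tool name out.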

Step 3: Tool Execution

The client sends a tools/call JSON-RPC request with the tool name and arguments. The server executes the tool and returns the result in a standardized format. All communication uses JSON-RPC 2.0 with proper request IDs, methods, and error handling.
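A single exchange illustrates this step. The calculator tool name and its argument are assumptions carried over from the demo's tool list; the `content` array follows MCP's standard tool-result shape:

```typescript
// A tools/call request naming the tool and its arguments.
const callRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "calculate",                   // which registered tool to run
    arguments: { expression: "6 * 7" },  // must satisfy the tool's inputSchema
  },
};

// The server executes the tool and returns the result as content blocks.
const callResponse = {
  jsonrpc: "2.0",
  id: 2,                                 // echoes the request id
  result: {
    content: [{ type: "text", text: "42" }],
    isError: false,
  },
};
```

Errors travel the same way: a failed execution sets `isError` (or a JSON-RPC `error` object for protocol-level failures), so clients handle both paths uniformly.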

Step 4: Response Generation

The AI model receives the tool results and generates a natural-language response that incorporates the data. This grounds the response in real, retrieved information rather than relying solely on the model's training data.

Key Concepts Demonstrated

  • JSON-RPC 2.0 protocol for client-server communication
  • Tool registration with typed input schemas
  • Dynamic tool discovery at runtime
  • AI-driven tool selection based on user intent
  • Standardized tool execution and result formatting
  • Three demo tools: weather lookup, calculator, knowledge base search
  • Fallback mode that works without any API key
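The four steps compose into one loop. A minimal end-to-end sketch with an in-process "server" and a single tool; all names here are illustrative, not the demo's actual code:

```typescript
// End-to-end sketch of the discover -> select -> execute -> respond loop.
type Tool = {
  name: string;
  description: string;
  run: (args: Record<string, string>) => string;
};

const server: Tool[] = [
  {
    name: "calculate",
    description: "Evaluate an arithmetic expression",
    // Evaluate the expression in strict mode (fine for a local demo sketch).
    run: (args) => String(Function(`"use strict"; return (${args.expression})`)()),
  },
];

function answer(query: string): string {
  // Step 1: discovery (tools/list) -- ask the server what it offers.
  const tools = server;
  // Step 2: selection -- trivially the only tool in this sketch.
  const tool = tools[0];
  // Step 3: execution (tools/call) -- run the tool with arguments.
  const result = tool.run({ expression: query });
  // Step 4: response generation grounded in the tool result.
  return `The ${tool.name} tool returned: ${result}`;
}
```

In the real pipeline, steps 1 and 3 cross a JSON-RPC boundary and step 2/4 involve the model; the control flow, however, is exactly this loop.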

Interactive Features

This demo includes a 4-step pipeline visualization showing tool discovery with registered tool cards, tool selection reasoning, JSON-RPC request/response viewers with syntax highlighting, and response generation. It also features bidirectional hover highlighting that connects tool names in the chat with tool cards in the sidebar.

Technology Stack

Built with Next.js 16, the Vercel AI SDK, Anthropic Claude for generation, and a simulated MCP server/client that uses real JSON-RPC 2.0 message formats. Uses custom AI SDK data stream parts to send MCP pipeline metadata (tools, selection, execution, messages) alongside the LLM response.

Knowledge Base Topics

The demo knowledge base covers four MCP-related topics: the MCP Protocol (JSON-RPC basics, transport layers), MCP Tools and Resources (tool definitions, schemas, resources, prompts), MCP Architecture (client-server model, capability negotiation), and Tool Use Patterns (function calling, agentic loops, multi-tool orchestration).