Commit 2dd930b: Update README (1 parent 42a5f6f)

README.md: 272 additions, 20 deletions
# MCP Python SDK

[![PyPI][pypi-badge]][pypi-url]
[![MIT licensed][mit-badge]][mit-url]
[![Python Version][python-badge]][python-url]
[![Documentation][docs-badge]][docs-url]
[![Specification][spec-badge]][spec-url]
[![GitHub Discussions][discussions-badge]][discussions-url]

[pypi-badge]: https://img.shields.io/pypi/v/mcp.svg
[pypi-url]: https://pypi.org/project/mcp/
[mit-badge]: https://img.shields.io/pypi/l/mcp.svg
[mit-url]: https://github.com/modelcontextprotocol/python-sdk/blob/main/LICENSE
[python-badge]: https://img.shields.io/pypi/pyversions/mcp.svg
[python-url]: https://www.python.org/downloads/
[docs-badge]: https://img.shields.io/badge/docs-modelcontextprotocol.io-blue.svg
[docs-url]: https://modelcontextprotocol.io
[spec-badge]: https://img.shields.io/badge/spec-spec.modelcontextprotocol.io-blue.svg
[spec-url]: https://spec.modelcontextprotocol.io
[discussions-badge]: https://img.shields.io/github/discussions/modelcontextprotocol/python-sdk
[discussions-url]: https://github.com/modelcontextprotocol/python-sdk/discussions
Python implementation of the [Model Context Protocol](https://modelcontextprotocol.io) (MCP), providing both client and server capabilities for integrating with LLM surfaces.

## Overview

The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction.

## Installation

We recommend the use of [uv](https://docs.astral.sh/uv/) to manage your Python projects:

```bash
uv add mcp
```

Alternatively, add `mcp` to your `requirements.txt` and install it with pip:

```bash
pip install mcp
# or, if listed in requirements.txt:
pip install -r requirements.txt
```

## Overview

MCP servers provide focused functionality like resources, tools, prompts, and other capabilities that can be reused across many client applications. These servers are designed to be easy to build, highly composable, and modular.

### Key design principles

- Servers are extremely easy to build with clear, simple interfaces
- Multiple servers can be composed seamlessly through a shared protocol
- Each server operates in isolation and cannot access conversation context
- Features can be added progressively through capability negotiation

### Server-provided primitives

- [Prompts](https://modelcontextprotocol.io/docs/concepts/prompts): Templatable text
- [Resources](https://modelcontextprotocol.io/docs/concepts/resources): File-like attachments
- [Tools](https://modelcontextprotocol.io/docs/concepts/tools): Functions that models can call
- Utilities:
  - Completion: Auto-completion provider for prompt arguments or resource URI templates
  - Logging: Logging to the client
  - Pagination: Pagination for long results

### Client-provided primitives

- [Sampling](https://modelcontextprotocol.io/docs/concepts/sampling): Allow servers to sample using client models
- Roots: Information about locations to operate on (e.g., directories)

Connections between clients and servers are established through transports like **stdio** or **SSE** (note that most clients currently support stdio but not SSE). The transport layer handles message framing, delivery, and error handling.

## Quick Start

### Creating a Server

MCP servers follow a decorator approach to register handlers for MCP primitives like resources, prompts, and tools. The goal is to provide a simple interface for exposing capabilities to LLM clients.
```python
import asyncio

import mcp.server.stdio
import mcp.types as types
from mcp.server import NotificationOptions, Server
from mcp.server.models import InitializationOptions

# Create a server instance
server = Server("example-server")

# Add prompt capabilities
@server.list_prompts()
async def handle_list_prompts() -> list[types.Prompt]:
    return [
        types.Prompt(
            name="example-prompt",
            description="An example prompt template",
            arguments=[
                types.PromptArgument(
                    name="arg1",
                    description="Example argument",
                    required=True,
                )
            ],
        )
    ]

@server.get_prompt()
async def handle_get_prompt(
    name: str,
    arguments: dict[str, str] | None,
) -> types.GetPromptResult:
    if name != "example-prompt":
        raise ValueError(f"Unknown prompt: {name}")

    return types.GetPromptResult(
        description="Example prompt",
        messages=[
            types.PromptMessage(
                role="user",
                content=types.TextContent(
                    type="text",
                    text="Example prompt text",
                ),
            )
        ],
    )

async def main():
    # Run the server over stdio
    async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
        await server.run(
            read_stream,
            write_stream,
            InitializationOptions(
                server_name="example",
                server_version="0.1.0",
                capabilities=server.get_capabilities(
                    notification_options=NotificationOptions(),
                    experimental_capabilities={},
                ),
            ),
        )

asyncio.run(main())
```

### Creating a Client

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Create server parameters for stdio connection
server_params = StdioServerParameters(
    command="path/to/server",
    args=[],  # Optional command line arguments
    env=None,  # Optional environment variables
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the connection
            await session.initialize()

            # List available resources
            resources = await session.list_resources()

            # List available prompts
            prompts = await session.list_prompts()

            # List available tools
            tools = await session.list_tools()

            # Read a resource
            resource = await session.read_resource("file://some/path")

            # Call a tool
            result = await session.call_tool("tool-name", arguments={"arg1": "value"})

            # Get a prompt
            prompt = await session.get_prompt("prompt-name", arguments={"arg1": "value"})

asyncio.run(main())
```

## Primitives

The MCP Python SDK provides decorators that map to the core protocol primitives. Each primitive follows a different interaction pattern based on how it is controlled and used:

| Primitive | Control                | Description                                       | Example Use                  |
|-----------|------------------------|---------------------------------------------------|------------------------------|
| Prompts   | User-controlled        | Interactive templates invoked by user choice      | Slash commands, menu options |
| Resources | Application-controlled | Contextual data managed by the client application | File contents, API responses |
| Tools     | Model-controlled       | Functions exposed to the LLM to take actions      | API calls, data updates      |
### User-Controlled Primitives

**Prompts** are designed to be explicitly selected by users for their interactions with LLMs.

| Decorator                | Description                          |
|--------------------------|--------------------------------------|
| `@server.list_prompts()` | List available prompt templates      |
| `@server.get_prompt()`   | Get a specific prompt with arguments |
### Application-Controlled Primitives

**Resources** are controlled by the client application, which decides how and when they should be used based on its own logic.

| Decorator                      | Description                        |
|--------------------------------|------------------------------------|
| `@server.list_resources()`     | List available resources           |
| `@server.read_resource()`      | Read a specific resource's content |
| `@server.subscribe_resource()` | Subscribe to resource updates      |
### Model-Controlled Primitives

**Tools** are exposed to LLMs to enable automated actions, with user approval.

| Decorator              | Description                   |
|------------------------|-------------------------------|
| `@server.list_tools()` | List available tools          |
| `@server.call_tool()`  | Execute a tool with arguments |

### Server Management

Additional decorators for server functionality:

| Decorator                     | Description                 |
|-------------------------------|-----------------------------|
| `@server.set_logging_level()` | Update server logging level |
### Capabilities

MCP servers declare capabilities during initialization. These map to specific decorators:

| Capability   | Feature Flag                  | Decorators                                                       | Description                     |
|--------------|-------------------------------|------------------------------------------------------------------|---------------------------------|
| `prompts`    | `listChanged`                 | `@list_prompts`<br/>`@get_prompt`                                | Prompt template management      |
| `resources`  | `subscribe`<br/>`listChanged` | `@list_resources`<br/>`@read_resource`<br/>`@subscribe_resource` | Resource exposure and updates   |
| `tools`      | `listChanged`                 | `@list_tools`<br/>`@call_tool`                                   | Tool discovery and execution    |
| `logging`    | -                             | `@set_logging_level`                                             | Server logging configuration    |
| `completion` | -                             | `@complete_argument`                                             | Argument completion suggestions |

Capabilities are negotiated during connection initialization. Servers only need to implement the decorators for capabilities they support.

## Client Interaction

The MCP Python SDK enables servers to interact with clients through request context and session management. This allows servers to perform operations like LLM sampling and progress tracking.

### Request Context

The Request Context provides access to the current request and client session. It can be accessed through `server.request_context` and enables:

- Sampling from the client's LLM
- Sending progress updates
- Logging messages
- Accessing request metadata

Example using request context for LLM sampling:

```python
import json

@server.call_tool()
async def handle_call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    # Access the current request context
    context = server.request_context

    # Use the session to sample from the client's LLM
    result = await context.session.create_message(
        messages=[
            types.SamplingMessage(
                role="user",
                content=types.TextContent(
                    type="text",
                    text="Analyze this data: " + json.dumps(arguments),
                ),
            )
        ],
        max_tokens=100,
    )

    return [types.TextContent(type="text", text=result.content.text)]
```

Using request context for progress updates:

```python
@server.call_tool()
async def handle_call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    context = server.request_context

    # The client only supplies a progress token when it wants updates
    progress_token = context.meta.progressToken if context.meta else None

    if progress_token:
        # Send progress notifications
        await context.session.send_progress_notification(
            progress_token=progress_token,
            progress=0.5,
            total=1.0,
        )

    # Perform operation...

    if progress_token:
        await context.session.send_progress_notification(
            progress_token=progress_token,
            progress=1.0,
            total=1.0,
        )

    return [types.TextContent(type="text", text="Operation complete")]
```

The request context is automatically set for each request and provides a safe way to access the current client session and request metadata.

## Documentation

- [Model Context Protocol documentation](https://modelcontextprotocol.io)
- [Model Context Protocol specification](https://spec.modelcontextprotocol.io)
- [Officially supported servers](https://github.com/modelcontextprotocol/servers)

## Contributing

We are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the [contributing guide](CONTRIBUTING.md) to get started.

## License