Everything MCP Server
Ecosystem reference · No auth · MIT license · Originally built by @Anthropic
The official MCP reference server that exercises every protocol feature — prompts, tools, resources, sampling, and all transports. Built for MCP client developers and protocol compliance testing.
Setup Guide
```json
{
  "mcpServers": {
    "everything": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-everything"
      ]
    }
  }
}
```
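Once the server is wired up, every interaction with it is a JSON-RPC 2.0 message. Below is a minimal sketch of the `initialize` request a client sends first on any transport. The field names follow the MCP spec; the client name, version, and protocol version date are illustrative placeholders.

```python
import json

def initialize_request(request_id: int = 1) -> str:
    # JSON-RPC 2.0 "initialize" request, the first message an MCP client
    # sends. Field names follow the MCP spec; "example-client" and the
    # protocol version date are illustrative.
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {"sampling": {}},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(msg)
```

The server replies with its own capabilities, after which the client sends an `initialized` notification and normal traffic begins.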
Tools
echo
Echoes back the provided message — used for basic connectivity testing.
add
Adds two numbers — demonstrates numeric tool parameters.
longRunningOperation
Simulates a long-running operation with progress reporting.
sampleLLM
Demonstrates the MCP sampling capability — requests an LLM completion from the client.
getTinyImage
Returns a small test image — demonstrates binary content handling.
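As a sketch of how a client invokes these tools, the following builds `tools/call` requests for `echo` and `add`. The method and params shape follow the MCP spec; the argument names (`message`, `a`, `b`) are assumptions based on the schemas this server advertises via `tools/list`, so verify them against the live listing.

```python
import json

def tool_call(request_id: int, name: str, arguments: dict) -> str:
    # JSON-RPC 2.0 "tools/call" request, per the MCP spec.
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Argument names ("message", "a", "b") are assumed from the server's
# advertised tool schemas; check tools/list on your installed version.
echo_req = tool_call(2, "echo", {"message": "hello"})
add_req = tool_call(3, "add", {"a": 2, "b": 3})
```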
About
The Everything MCP Server is the official reference implementation from Anthropic that exercises every feature of the MCP protocol. It is not designed for daily use — it's a test server for developers building MCP clients, IDE integrations, or debugging MCP protocol compliance.
When to use this
- Building an MCP client? Use this server to verify your client handles all protocol features correctly.
- Testing IDE integration? It exposes prompts, tools, resources, and sampling — great for checking your IDE's MCP panel renders everything.
- Learning MCP? The source code is the best reference for how to implement each MCP primitive.
What it demonstrates
The server showcases the full MCP feature set:
| Feature | What it tests |
|---|---|
| Tools | Echo, add, long-running operations, binary content |
| Resources | Dynamic resource listing and content retrieval |
| Prompts | Prompt templates with argument substitution |
| Sampling | Client-side LLM sampling (requesting completions from the host) |
| Progress | Progress reporting during long-running operations |
| Transports | stdio, HTTP+SSE (deprecated), and Streamable HTTP |
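The sampling and progress rows involve server-initiated messages, which is what makes this server useful for client testing. A hedged sketch of both shapes, following the MCP spec (prompt text and token values are illustrative): `sampleLLM` makes the server send a `sampling/createMessage` request back to the client, and `longRunningOperation` emits `notifications/progress` notifications keyed by the `progressToken` the client supplied in its request's `_meta`.

```python
import json

def sampling_request(request_id: int, prompt: str, max_tokens: int) -> str:
    # "sampling/createMessage": a request the SERVER sends to the client,
    # asking the client's host LLM for a completion (shape per MCP spec).
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "sampling/createMessage",
        "params": {
            "messages": [
                {"role": "user", "content": {"type": "text", "text": prompt}}
            ],
            "maxTokens": max_tokens,
        },
    })

def progress_notification(token, progress: int, total: int) -> str:
    # "notifications/progress": no "id" field, since JSON-RPC notifications
    # expect no response. The token echoes the client's progressToken.
    return json.dumps({
        "jsonrpc": "2.0",
        "method": "notifications/progress",
        "params": {"progressToken": token, "progress": progress, "total": total},
    })
```

A client under test must be prepared to answer the sampling request (or reject it) and to correlate progress notifications back to the originating call.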
All three transports
This is one of the few servers that supports all MCP transport modes:
```bash
# stdio (default)
npx @modelcontextprotocol/server-everything

# SSE (deprecated in MCP 2025-03-26)
npx @modelcontextprotocol/server-everything sse

# Streamable HTTP (recommended for remote servers)
npx @modelcontextprotocol/server-everything streamableHttp
```
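Over the stdio transport, each JSON-RPC message is framed as a single line of JSON terminated by a newline. A minimal sketch of that framing, demonstrated against an in-memory stream rather than a real server process:

```python
import io
import json

def write_message(stream, msg: dict) -> None:
    # One JSON object per line, newline-terminated (MCP stdio framing).
    stream.write(json.dumps(msg) + "\n")

def read_message(stream):
    line = stream.readline()
    return json.loads(line) if line else None

# Round-trip demo over an in-memory stream; against the real server you
# would write to the child process's stdin and read from its stdout.
buf = io.StringIO()
write_message(buf, {"jsonrpc": "2.0", "id": 1, "method": "ping"})
buf.seek(0)
msg = read_message(buf)
```

The same messages travel as HTTP bodies under the Streamable HTTP transport; only the framing differs.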
Not for production
This server includes intentionally simple tools (echo, add) and test fixtures. It adds unnecessary tools to your AI's context window if left enabled during normal work. Only enable it when actively testing or developing MCP integrations.
This server can be set up manually using the configs above. Browse AgenticMarket for servers you can install in one command with zero config.
Fetch MCP Server
Give your AI assistant the ability to read any web page. The official Fetch server converts HTML to markdown so your LLM can process web content, read documentation, and scrape data in real time.
Filesystem MCP Server
Give your AI assistant read and write access to local files and directories. The most-used MCP server — lets Claude, Cursor, and other AI tools work with your filesystem directly.
Git MCP Server
Let your AI assistant interact with Git repositories directly. Status, diff, commit, branch, and log — all accessible to your LLM through 12 Git tools.
Time MCP Server
Give your AI assistant awareness of the current time and timezone conversions. Query the current time in any timezone and convert between timezones using IANA names.