Memory MCP Server
Ecosystem reference · No auth required · MIT license · Originally built by @Anthropic
Give your AI assistant persistent memory across conversations. The Memory server stores entities, relations, and observations in a local knowledge graph that persists between sessions.
Setup Guide
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memory"
      ]
    }
  }
}
Tools
create_entities
Create multiple new entities in the knowledge graph with names, types, and initial observations.
create_relations
Create directed relationships between existing entities using active-voice relation types.
add_observations
Add new observations (facts) to existing entities in the knowledge graph.
delete_entities
Remove entities and all their associated relations from the graph.
delete_observations
Remove specific observations from entities.
delete_relations
Remove specific relations between entities.
read_graph
Read the entire knowledge graph — all entities and relations.
search_nodes
Search across entity names, types, and observations by query string.
open_nodes
Retrieve specific entities by name along with their inter-relations.
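To make the tool semantics concrete, here is a minimal Python sketch of the data model the tools above operate on. This is an illustration only: the real server is a Node.js package, and these functions merely mirror the behavior described in the tool list (`create_entities`, `create_relations`, `search_nodes`).

```python
# Minimal sketch of the Memory server's knowledge-graph model.
# Illustration only -- not the server's actual implementation.

graph = {"entities": [], "relations": []}

def create_entities(entities):
    """Add new entities (dicts with name, entityType, observations)."""
    existing = {e["name"] for e in graph["entities"]}
    graph["entities"].extend(e for e in entities if e["name"] not in existing)

def create_relations(relations):
    """Add directed relations; both endpoints must already exist."""
    names = {e["name"] for e in graph["entities"]}
    for r in relations:
        if r["from"] in names and r["to"] in names:
            graph["relations"].append(r)

def search_nodes(query):
    """Match the query against names, types, and observations."""
    q = query.lower()
    return [e for e in graph["entities"]
            if q in e["name"].lower()
            or q in e["entityType"].lower()
            or any(q in o.lower() for o in e["observations"])]

create_entities([
    {"name": "John_Smith", "entityType": "person",
     "observations": ["Speaks fluent Spanish"]},
    {"name": "Acme_Corp", "entityType": "company", "observations": []},
])
create_relations([
    {"from": "John_Smith", "to": "Acme_Corp", "relationType": "works_at"},
])
print(search_nodes("spanish"))  # matches John_Smith via its observation
```

Note how `search_nodes` matches on observations, not just names: querying "spanish" finds John_Smith even though the word appears only in a stored fact.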
About
The Memory MCP Server is part of the official MCP server collection maintained by Anthropic. It provides persistent memory across conversations using a local knowledge graph — giving your AI assistant the ability to remember facts, people, preferences, and relationships between sessions.
How it works
The server stores a knowledge graph in a local JSONL file with three primitives:
Entities — Named nodes with a type and observations:
{
  "name": "John_Smith",
  "entityType": "person",
  "observations": ["Speaks fluent Spanish", "Prefers dark mode"]
}
Relations — Directed connections between entities:
{
  "from": "John_Smith",
  "to": "Acme_Corp",
  "relationType": "works_at"
}
Observations — Atomic facts attached to entities. One fact per observation keeps things clean and deletable.
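Because the store is JSONL, each line of the file is one standalone JSON record, so it can be written and re-read with a few lines of code. The sketch below assumes one record per line; the exact on-disk record shape (for instance, a `type` field distinguishing entities from relations) is an assumption, not something the docs confirm.

```python
import json
import os
import tempfile

# Sketch: round-trip a JSONL knowledge graph.
# Assumption: each line is one JSON record; the "type" field
# separating entities from relations is hypothetical.
records = [
    {"type": "entity", "name": "John_Smith", "entityType": "person",
     "observations": ["Speaks fluent Spanish"]},
    {"type": "relation", "from": "John_Smith", "to": "Acme_Corp",
     "relationType": "works_at"},
]

path = os.path.join(tempfile.mkdtemp(), "memory.jsonl")
with open(path, "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")  # one record per line

with open(path) as f:
    loaded = [json.loads(line) for line in f]

entities = [r for r in loaded if r["type"] == "entity"]
print(len(entities))  # 1
```

One record per line is what makes observations individually deletable: removing a fact means rewriting one line, not re-serializing the whole graph.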
Custom storage location
By default, the knowledge graph is stored as memory.jsonl in the server's working directory. To use a specific path:
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"],
      "env": {
        "MEMORY_FILE_PATH": "/Users/you/.ai-memory/memory.jsonl"
      }
    }
  }
}
This is useful when you want multiple AI tools to share the same memory file, or when you want to back up your knowledge graph with your dotfiles.
Prompting for better memory
The server works best when the AI is instructed to actively use it. Add instructions to your system prompt or Claude Project:
Follow these steps for each interaction:
1. Always begin by retrieving relevant information from your memory
2. While conversing, note new information about:
- People and their preferences
- Project requirements and decisions
- Technical choices and their rationale
3. After each interaction, update memory with any new facts learned
Docker installation
{
  "mcpServers": {
    "memory": {
      "command": "docker",
      "args": ["run", "-i", "-v", "claude-memory:/app/dist", "--rm", "mcp/memory"]
    }
  }
}
The Docker volume claude-memory persists the knowledge graph between container restarts.
Common issues
Memory resets between sessions
Check that MEMORY_FILE_PATH points to a persistent location. If you're using npx without setting this, the file may be created in a temporary directory that gets cleaned up.
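A quick way to sanity-check where the graph will land is to resolve the path the same way the server would. The helper below is hypothetical (only `MEMORY_FILE_PATH` comes from the config above; the default filename and the list of temp-directory prefixes are assumptions):

```python
import os

def resolve_memory_path(default="memory.jsonl"):
    """Return (path, looks_persistent) for the memory file (sketch).

    The temp-prefix heuristic is an assumption, not a server rule.
    """
    path = os.environ.get("MEMORY_FILE_PATH", default)
    persistent = not path.startswith(("/tmp", "/var/folders"))
    return path, persistent

os.environ["MEMORY_FILE_PATH"] = "/tmp/memory.jsonl"
print(resolve_memory_path())  # ('/tmp/memory.jsonl', False)
```

If the check reports a temp location, move the file somewhere durable (e.g. under your home directory) and update `MEMORY_FILE_PATH` to match.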
"Entity not found" errors
Entity names are case-sensitive. John_Smith and john_smith are different entities. Use consistent naming conventions.
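One defensive pattern (our suggestion, not a server feature) is to normalize names to a single convention before every write and lookup, so `john smith` and `JOHN_SMITH` collapse to the same entity:

```python
def canonical_name(raw):
    """Normalize an entity name to Title_Case words joined by underscores.

    Hypothetical helper -- the Memory server itself does no normalization.
    """
    parts = raw.replace(" ", "_").split("_")
    return "_".join(p.capitalize() for p in parts if p)

print(canonical_name("john smith"))  # John_Smith
print(canonical_name("JOHN_SMITH"))  # John_Smith
```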
Graph getting too large
The server loads the entire graph into memory. For very large knowledge graphs (10,000+ entities), searches may slow down. Periodically prune outdated entities.
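A pruning pass can be as simple as filtering the parsed JSONL records, dropping stale entities along with any relation that touches them. This is a sketch (back up the file first), and the per-record shape, including the `type` field, is an assumption:

```python
def prune(records, stale_names):
    """Drop stale entities and any relation pointing at them.

    `records` are parsed JSONL dicts; the record shape is assumed.
    """
    kept = []
    for rec in records:
        if rec.get("name") in stale_names:
            continue  # stale entity
        if rec.get("from") in stale_names or rec.get("to") in stale_names:
            continue  # relation touching a stale entity
        kept.append(rec)
    return kept

records = [
    {"type": "entity", "name": "Old_Project"},
    {"type": "entity", "name": "John_Smith"},
    {"type": "relation", "from": "John_Smith", "to": "Old_Project",
     "relationType": "worked_on"},
]
print(len(prune(records, {"Old_Project"})))  # 1
```

Dropping relations alongside their endpoints matters: a dangling relation would later trigger the "Entity not found" errors described above.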
This server can be set up manually using the configs above. Browse AgenticMarket for servers you can install in one command with zero config.
Related servers
Fetch MCP Server
Give your AI assistant the ability to read any web page. The official Fetch server converts HTML to markdown so your LLM can process web content, read documentation, and scrape data in real time.
Filesystem MCP Server
Give your AI assistant read and write access to local files and directories. The most-used MCP server — lets Claude, Cursor, and other AI tools work with your filesystem directly.
Everything MCP Server
The official MCP reference server that exercises every protocol feature — prompts, tools, resources, sampling, and all transports. Built for MCP client developers and testing.
Git MCP Server
Let your AI assistant interact with Git repositories directly. Status, diff, commit, branch, and log — all accessible to your LLM through 12 Git tools.