Organize your LLM inputs with a standardized schema. No more brittle prompt engineering.
Seamlessly connect LLMs with external tools and APIs. Define tool schemas once, use everywhere.
Debug and trace every part of your LLM's context. Understand exactly what influenced model outputs.
Maintain short- and long-term memory for your AI assistants. Create truly personalized experiences that improve over time.
Works with any LLM framework or provider. Integrate with LangChain, LlamaIndex, or your custom stack.
MCP provides a standardized way to structure all the context your LLM needs. Define your schema once, use it everywhere.
System Instructions - Define behavior and capabilities
User Context - Preferences, history, and goals
Tools - Available APIs and functions
Memory - Short- and long-term storage
Retrieved Documents - RAG results and knowledge
// Define your MCP schema
const mcpSchema = {
  version: "1.0",
  components: {
    systemInstruction: {
      type: "string",
      description: "High-level instruction for the LLM"
    },
    userGoal: {
      type: "string",
      description: "Current user objective"
    },
    memory: {
      type: "object",
      properties: {
        shortTerm: { type: "array" },
        longTerm: { type: "object" }
      }
    },
    tools: {
      type: "array",
      items: {
        type: "object",
        properties: {
          name: { type: "string" },
          description: { type: "string" },
          parameters: { type: "object" }
        }
      }
    },
    retrievedDocuments: {
      type: "array",
      items: { type: "object" }
    }
  }
};

export default mcpSchema;
// Create an MCP context instance
import { MCPContext } from '@modl/mcp';

const shoppingAssistant = new MCPContext({
  systemInstruction: "You are a helpful shopping assistant.",
  userGoal: "Find running shoes under $100",
  memory: {
    shortTerm: [
      { type: "preference", value: "prefers Nike" },
      { type: "dislike", value: "doesn't like bright colors" }
    ],
    longTerm: {
      shoeSize: "US 10",
      favoriteStyles: ["minimalist", "retro"]
    }
  },
  tools: [
    {
      name: "searchProducts",
      description: "Search product catalog",
      parameters: { query: "string", filters: "object", limit: "number" }
    },
    {
      name: "checkInventory",
      description: "Check if product is in stock",
      parameters: { productId: "string" }
    }
  ]
});

// Use with any LLM provider
const response = await llm.generate({
  context: shoppingAssistant.compile()
});
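The memory in the example above is fixed at construction time, but short-term entries typically accumulate as a conversation progresses. One minimal pattern for that, assuming nothing beyond the MCPContext constructor and compile() call shown above, is to keep the raw context config in a plain object, update it between turns, and rebuild the context before each generation:

// A minimal sketch: the config object and update flow are illustrative,
// using only the constructor shape and compile() from the example above.
const config = {
  systemInstruction: "You are a helpful shopping assistant.",
  userGoal: "Find running shoes under $100",
  memory: { shortTerm: [], longTerm: { shoeSize: "US 10" } },
  tools: []
};

// After each turn, append transient observations to short-term memory...
config.memory.shortTerm.push({ type: "preference", value: "wants waterproof soles" });

// ...and promote stable facts to long-term memory so they persist across sessions.
config.memory.longTerm.favoriteBrands = ["Nike"];

// Rebuild and recompile so the next generation sees the updated context.
const nextResponse = await llm.generate({
  context: new MCPContext(config).compile()
});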
MCP works with any LLM provider or framework. Create your context once, then compile it for your specific LLM implementation, as sketched after the list below.
Provider Agnostic - Works with OpenAI, Anthropic, Gemini, and more
Framework Compatible - Integrates with LangChain, LlamaIndex, and custom solutions
Optimized Prompting - Automatically formats context for each model's strengths
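For illustration, here is roughly what provider targeting might look like. The target option passed to compile() is an assumption for this sketch, not confirmed @modl/mcp API; the client call at the end follows the standard openai SDK, with an illustrative model name:

import OpenAI from "openai";

// Assumed API: compiling the same context into each provider's native format.
const openaiMessages = shoppingAssistant.compile({ target: "openai" });
const anthropicPrompt = shoppingAssistant.compile({ target: "anthropic" });

// The compiled context then slots into the provider SDK of your choice.
const openaiClient = new OpenAI();
const completion = await openaiClient.chat.completions.create({
  model: "gpt-4o",
  messages: openaiMessages
});

Because the context object, not the prompt string, is the source of truth, switching providers means changing the compile target rather than rewriting prompts.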
May 5, 2025
New features include improved tool calling, better memory management, and TypeScript type definitions.
April 22, 2025
See how Shopify is using MCP to build personalized shopping experiences.
April 10, 2025
Learn how to coordinate multiple AI agents using structured context sharing.
Start building more reliable, debuggable, and powerful LLM applications with Model Context Protocol.