Define your context structure once and use it consistently across your entire AI application. MCP provides a standardized way to organize all the information that guides an LLM's behavior.
// Define your MCP schema
const mcpSchema = {
  version: "1.0",
  components: {
    systemInstruction: {
      type: "string",
      description: "High-level instruction for the LLM"
    },
    userGoal: {
      type: "string",
      description: "Current user objective"
    },
    // More components...
  }
};
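The snippets below reference an `assistant` object built from this schema. As a rough sketch of how that object might be created and populated (the `createAssistant` factory and `setComponent` method are assumed names, not documented API):

// Hypothetical sketch: create an assistant from the schema and fill in components.
// `createAssistant` and `setComponent` are assumed names for illustration only.
const assistant = createAssistant(mcpSchema);

assistant.setComponent('systemInstruction', 'You are a helpful product expert.');
assistant.setComponent('userGoal', 'Find a laptop under $1,000 for video editing.');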
Works with any LLM provider, including OpenAI, Anthropic, Google, and more. Create your context once, then compile it for your specific LLM implementation.
// Compile for different providers
const openaiPrompt = assistant.compileFor('openai');
const anthropicPrompt = assistant.compileFor('anthropic');
const googlePrompt = assistant.compileFor('google');
Structured approach to short- and long-term memory in your LLM applications. Maintain conversation context and user preferences over time.
// Memory management
assistant.updateMemory({
  shortTerm: [
    { type: "interaction", content: "Last user query and response" }
  ],
  longTerm: {
    userPreferences: {
      theme: "dark",
      detailedResponses: true
    }
  }
});
Standardized way to define and use tools with LLMs. Connect your AI to external APIs, databases, and services.
// Tool definition
const tools = [
  {
    name: "searchDatabase",
    description: "Search the product database",
    parameters: {
      query: "string",
      limit: "number"
    }
  },
  // More tools...
];
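A sketch of how these definitions might be wired to real handlers; `registerTools` and the handler map are assumed names used for illustration, not documented API:

// Hypothetical sketch: register the tools and dispatch calls from the model.
// `registerTools` and the handler wiring are assumed names for illustration.
assistant.registerTools(tools, {
  searchDatabase: async ({ query, limit }) => {
    // Call your own data layer here; this stub just echoes the arguments.
    return { results: [], query, limit };
  }
});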
Debug and trace what influenced model outputs. Understand exactly why your LLM responded the way it did.
// Logging and tracing
const trace = assistant.getContextTrace();
console.log("Context that influenced response:", trace);
// Logs all components that affected the model's output
Coordinate multiple AI agents with shared context. Build complex workflows where agents collaborate on tasks.
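A minimal sketch of what shared context between two agents could look like; `createAssistant`, `setComponent`, and `shareContextWith` are assumed names, not documented API:

// Hypothetical sketch: two agents working from the same shared MCP context.
// `createAssistant`, `setComponent`, and `shareContextWith` are assumed names.
const planner = createAssistant(mcpSchema);
const researcher = createAssistant(mcpSchema);

planner.setComponent('userGoal', "Summarize this quarter's support tickets.");
planner.shareContextWith(researcher); // researcher sees the same goal and memory

// Each agent can still compile the shared context for its own provider.
const plannerPrompt = planner.compileFor('anthropic');
const researcherPrompt = researcher.compileFor('openai');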
Structured integration with knowledge bases and document retrieval. Provide relevant information to your LLM in a consistent format.
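A rough sketch of feeding retrieval results into the context; the `retrievedContext` component, the retriever call, and the document shape are assumptions for illustration:

// Hypothetical sketch: attach retrieved documents as a structured context component.
// `myVectorStore` stands in for your own retriever; `retrievedContext` is an assumed name.
const docs = await myVectorStore.search("return policy", { topK: 3 });

assistant.setComponent('retrievedContext', docs.map(d => ({
  source: d.id,
  content: d.text
})));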
Track changes to your context schema over time. Maintain backward compatibility as your AI application evolves.
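One way this could look in practice, building on the `version` field from the schema example above (the exact migration approach shown here is an assumption):

// Hypothetical sketch: bump the schema version when adding a component,
// keeping existing fields so older contexts still validate.
const mcpSchemaV2 = {
  ...mcpSchema,
  version: "1.1",
  components: {
    ...mcpSchema.components,
    retrievedContext: {
      type: "array",
      description: "Documents retrieved for the current query"
    }
  }
};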
Build user-specific context profiles. Deliver tailored experiences based on preferences, history, and behavior.
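A sketch of loading a stored profile into long-term memory via the `updateMemory` call shown earlier; `loadUserProfile`, `userId`, and the profile shape stand in for your own persistence layer:

// Hypothetical sketch: load a stored per-user profile into long-term memory.
// `loadUserProfile` and the profile fields are assumptions for illustration.
const profile = await loadUserProfile(userId);

assistant.updateMemory({
  longTerm: {
    userPreferences: profile.preferences,     // e.g. theme, response verbosity
    interactionHistory: profile.recentTopics
  }
});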
Works with popular AI frameworks like LangChain, LlamaIndex, and custom solutions. Integrate MCP into your existing stack.
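For example, the compiled context could be passed to LangChain JS as a plain system message (this assumes `compileFor` returns a string; adjust to whatever MCP actually emits):

// Hypothetical sketch: use the compiled context as the system message in LangChain JS.
import { ChatOpenAI } from "@langchain/openai";
import { SystemMessage, HumanMessage } from "@langchain/core/messages";

const model = new ChatOpenAI({ modelName: "gpt-4o-mini" });
const systemPrompt = assistant.compileFor('openai'); // assumed to be a plain string

const reply = await model.invoke([
  new SystemMessage(systemPrompt),
  new HumanMessage("Which laptop should I buy?")
]);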
Build shopping assistants that remember preferences, compare products, and make personalized recommendations.
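Putting the earlier pieces together, a shopping assistant might look roughly like this (the method names are the same assumptions used above; `searchProducts` stands in for your own search function):

// Hypothetical sketch: wiring schema, tools, and memory into a shopping assistant.
// `createAssistant`, `setComponent`, `registerTools`, and `searchProducts` are assumed names.
const shopper = createAssistant(mcpSchema);

shopper.setComponent('systemInstruction',
  'You are a shopping assistant. Recommend products that match stored preferences.');
shopper.registerTools(tools, { searchDatabase: searchProducts });
shopper.updateMemory({
  longTerm: { userPreferences: { budget: 1000, preferredBrands: ['Acme'] } }
});

const prompt = shopper.compileFor('openai');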
Create support agents that access knowledge bases, user history, and support tools to resolve issues efficiently.
Build coding assistants that understand your codebase, development patterns, and can access APIs and documentation.
Develop content assistants with brand guidelines, style preferences, and access to reference materials.