MCP (Model Context Protocol): What It Is and Why It Matters
Model Context Protocol is the emerging standard for connecting AI models to external tools and data. Here's what developers and businesses need to know.
Founding AI Engineer @ Origami
Model Context Protocol (MCP) is an open standard for connecting AI models to external data sources and tools. Instead of building custom integrations for every AI + tool combination, MCP provides a universal interface that works across models and applications.
Think of it like USB for AI. Before USB, every device needed its own cable; one standard connector replaced them all. MCP does the same thing for AI integrations.
The Problem MCP Solves
Today, connecting AI to your tools is painful:
- Every integration is custom - Connect GPT to Salesforce? Build an integration. Connect Claude to Salesforce? Build another one.
- Data access is fragmented - Each app has different APIs, auth methods, and data formats.
- Context is limited - Models only see what you explicitly give them.
- Maintenance is endless - APIs change, integrations break, you're always fixing.
The result: AI applications are islands, disconnected from the systems where your real data lives.
What MCP Does
MCP standardizes how AI models:
- Discover available tools and data sources
- Authenticate to external systems
- Read context from connected sources
- Execute actions through tools
- Write results back to systems
With MCP, you build one connection to a data source, and any MCP-compatible model can use it.
How MCP Works
Architecture
Your AI Application
↓
MCP Client
↓
MCP Protocol
↓
MCP Server
↓
External System
(CRM, Database, API)
Core Components
MCP Client: Lives in the AI application. Handles:
- Connecting to MCP servers
- Sending requests
- Processing responses
MCP Server: Connects to external systems. Provides:
- Available resources (data the model can read)
- Available tools (actions the model can take)
- Authentication management
MCP Protocol: The standard interface between clients and servers. Defines:
- Message formats
- Resource types
- Tool specifications
- Error handling
Message Flow
- Discovery: Client asks server what resources and tools are available
- Context: Client requests data from resources
- Action: Client calls tools to execute actions
- Response: Server returns results to the client (the full round trip is sketched below)
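Under the hood, these messages are JSON-RPC 2.0. Here is a rough sketch of one discovery-to-response round trip written as Python dicts; the method names reflect the published spec, but treat the exact shapes as illustrative rather than normative:

# Discovery: what tools does this server offer?
discovery_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Context: read data from a resource
context_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": "crm://contacts/12345"},
}

# Action: call a tool with arguments
action_request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {"name": "create_contact", "arguments": {"name": "Jane Smith"}},
}

# Response: the server replies with a result keyed to the request id
# (result shape abbreviated here)
action_response = {"jsonrpc": "2.0", "id": 3, "result": {"content": [{"type": "text", "text": "created"}]}}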
MCP Resources
Resources are data sources the model can read. Examples:
- Files in a directory
- Rows in a database
- Records in a CRM
- Messages in a conversation
- Documents in a knowledge base
Resources have:
- URI: Unique identifier (e.g., file:///path/to/doc.md)
- MIME type: Content format (e.g., text/markdown)
- Content: The actual data
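For example, a single CRM contact exposed as a resource might look roughly like this; the field names follow the list above and are illustrative, not the SDK's exact types:

# A hypothetical resource: one CRM contact addressed by a custom URI scheme
contact_resource = {
    "uri": "crm://contacts/12345",     # unique identifier
    "mimeType": "application/json",    # content format
    "text": '{"name": "Jane Smith", "email": "jane@company.com"}',  # the actual data
}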
MCP Tools
Tools are actions the model can take. Examples:
- Send an email
- Create a CRM record
- Query a database
- Call an API
- Write a file
Tools have:
- Name: What it's called (e.g., send_email)
- Description: What it does (used by the model to decide when to use it)
- Parameters: Required inputs with types and descriptions
- Response: What it returns
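Put together, a tool definition for the send_email example might look roughly like this. The inputSchema field uses JSON Schema, which is how MCP describes tool parameters; treat the exact field names as illustrative:

# A hypothetical tool definition: the description and parameter schema are
# what the model sees when deciding whether and how to call the tool.
send_email_tool = {
    "name": "send_email",
    "description": "Send an email to a single recipient.",
    "inputSchema": {                      # JSON Schema describing the inputs
        "type": "object",
        "properties": {
            "to": {"type": "string", "description": "Recipient email address"},
            "subject": {"type": "string", "description": "Subject line"},
            "body": {"type": "string", "description": "Plain-text message body"},
        },
        "required": ["to", "subject", "body"],
    },
    # The response is whatever the tool returns, e.g. {"sent": True, "message_id": "..."}
}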
Why MCP Matters
For Developers
Build once, use everywhere: An MCP server for Salesforce works with any MCP-compatible AI application, whether that's Claude, GPT, Llama, or anything else.
Faster development: Instead of building custom integrations, you use existing MCP servers or build to a standard spec.
Better maintenance: Updates to an MCP server improve all applications using it.
For Businesses
More capable AI: Models can access your actual business data, not just what's in their training.
Composable AI systems: Swap models without rebuilding integrations. Add new data sources without changing your AI application.
Reduced vendor lock-in: Your integrations work across AI providers.
For the AI Ecosystem
Interoperability: Different AI systems can work together through a common protocol.
Accelerated innovation: Developers focus on value-add instead of integration plumbing.
Trust and security: Standardized protocols enable better security auditing and controls.
MCP in Practice
Example: Sales AI with MCP
Without MCP:
- Custom integration to CRM
- Custom integration to email
- Custom integration to calendar
- Custom integration to enrichment API

Each one is unique code to maintain.

With MCP:
- MCP server for CRM
- MCP server for email
- MCP server for calendar
- MCP server for enrichment

Any MCP client can use all of them.
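To make that contrast concrete, here is a rough sketch of the "with MCP" side from the client's point of view: one loop over several servers instead of four bespoke integrations. The Client API is the same simplified, illustrative one used in the code examples later in this post, and the addresses and method names are placeholders:

import asyncio

from mcp import Client  # illustrative import, as in the examples below

async def main():
    # One client, four standard servers; adding a fifth is one more entry.
    servers = ["crm:3000", "email:3001", "calendar:3002", "enrichment:3003"]
    clients = []
    for address in servers:
        client = Client()
        await client.connect(address)
        clients.append(client)

    # Every server answers the same discovery call, so the AI application
    # can treat them uniformly.
    for client in clients:
        tools = await client.list_tools()
        print([tool["name"] for tool in tools])

asyncio.run(main())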
Building an MCP Server
Here's a simplified example of an MCP server that provides CRM data:
import json

# Illustrative API: exact imports and decorators vary by SDK version;
# check the official MCP SDK docs for the current interface.
from mcp import Server, Resource

server = Server("crm-server")

@server.resource("crm://contacts/{contact_id}")
async def get_contact(contact_id: str):
    # `crm` stands in for whatever CRM client this server wraps
    contact = crm.get_contact(contact_id)
    return Resource(
        uri=f"crm://contacts/{contact_id}",
        mime_type="application/json",
        content=json.dumps(contact),
    )

@server.tool("create_contact")
async def create_contact(name: str, email: str, company: str):
    """Create a new contact in the CRM."""
    result = crm.create(name=name, email=email, company=company)
    return {"id": result.id, "created": True}

server.run()
Using an MCP Client
A similarly simplified client that talks to that server:

import asyncio

# Illustrative API: the real MCP client SDK's connection and call methods
# may differ; this shows the shape of the interaction.
from mcp import Client

async def main():
    client = Client()
    await client.connect("localhost:3000")  # Connect to the CRM server

    # Get a resource
    contact = await client.get_resource("crm://contacts/12345")

    # Use a tool
    result = await client.call_tool("create_contact", {
        "name": "Jane Smith",
        "email": "jane@company.com",
        "company": "Acme Corp",
    })

asyncio.run(main())
Current State of MCP
MCP is still early but gaining traction:
What exists:
- Open specification published by Anthropic
- SDKs for Python and TypeScript
- Growing library of community MCP servers
- Integration with Claude Desktop and other tools
What's coming:
- More AI providers adopting MCP
- Enterprise-grade MCP servers for major platforms
- Security and governance frameworks
- Ecosystem of commercial MCP providers
MCP vs Alternatives
vs Custom Integrations
MCP is more standardized. Custom integrations are more flexible for unique requirements but harder to maintain.
vs Function Calling
Function calling is model-specific (OpenAI functions, Claude tools). MCP provides a universal protocol across models. The two are complementary: an MCP client typically exposes a server's tools to the model through that model's native function-calling interface.
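As a sketch of that relationship, here is how an MCP-style tool definition (like the send_email example above) could be mapped onto an OpenAI-style function schema. The adapter is illustrative, not part of either library:

# Hypothetical adapter: turn an MCP-style tool definition into the schema
# an OpenAI-style function-calling API expects.
def mcp_tool_to_openai_function(mcp_tool: dict) -> dict:
    return {
        "type": "function",
        "function": {
            "name": mcp_tool["name"],
            "description": mcp_tool["description"],
            # MCP's JSON Schema input description maps straight across.
            "parameters": mcp_tool.get("inputSchema", {"type": "object", "properties": {}}),
        },
    }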
vs LangChain/LlamaIndex
These are frameworks that can use MCP servers. MCP is the protocol; LangChain is a framework that speaks the protocol.
Getting Started with MCP
If you're interested in MCP:
- Read the spec: modelcontextprotocol.io
- Try existing servers: GitHub has many open-source MCP servers
- Build a simple server: Start with a basic resource provider (a minimal sketch follows this list)
- Integrate with your AI: Use Claude Desktop or an MCP-compatible framework
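For step 3, a basic resource provider can stay very small. This sketch serves markdown files from a local directory, reusing the same simplified, illustrative Server API as the CRM example above; the directory path is a placeholder:

from pathlib import Path

from mcp import Server, Resource  # illustrative import, as in the CRM example

server = Server("docs-server")
DOCS_DIR = Path("./docs")  # placeholder directory of markdown files

@server.resource("file:///docs/{name}")
async def get_doc(name: str):
    path = DOCS_DIR / name
    return Resource(
        uri=f"file:///docs/{name}",
        mime_type="text/markdown",
        content=path.read_text(),
    )

server.run()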
The Bottom Line
MCP is infrastructure. Like HTTP or SQL, it's not exciting on its own—it's what you build on top of it that matters.
For AI to fulfill its potential, it needs to connect to the systems where real work happens. MCP is the emerging standard that makes those connections possible at scale.
Early adoption means you're building on standards that will define the next decade of AI development.