Model Context Protocol has hit 97 million monthly SDK downloads. v2.1 adds Server Cards (.well-known/mcp-server.json) for auto-discovery. Every major AI provider has adopted it. A full guide to what changed and how to upgrade.
Model Context Protocol solves the N-times-M integration problem: before MCP, connecting M AI systems to N tools required M × N custom connectors. MCP provides a universal interface: build one server, and every compatible AI client connects automatically. Anthropic introduced it in November 2024, and in 16 months it grew to 97 million monthly SDK downloads across Python and TypeScript. Every major AI provider (Anthropic, OpenAI, Google, Microsoft and Amazon) now supports MCP. The Linux Foundation's Agentic AI Foundation, co-founded by all five companies, governs MCP alongside the A2A (Agent-to-Agent) protocol. Claude Desktop and Cursor both shipped full MCP v2.1 support in launch week.
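The arithmetic behind the integration problem can be sketched directly. A minimal illustration (the counts are made up for the example, not figures from the protocol):

```python
def connectors_without_mcp(m_clients: int, n_tools: int) -> int:
    # Each AI system needs its own connector for each tool: M x N pairs.
    return m_clients * n_tools


def connectors_with_mcp(m_clients: int, n_tools: int) -> int:
    # Each tool ships one MCP server and each AI system one MCP client,
    # so the integration surface grows additively: M + N.
    return m_clients + n_tools


# Five AI systems and twenty tools:
print(connectors_without_mcp(5, 20))  # 100 custom connectors
print(connectors_with_mcp(5, 20))     # 25 endpoints total
```

The point of the universal interface is that the first function grows quadratically as both sides scale, while the second grows linearly.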
Server Cards expose structured MCP server metadata at .well-known/mcp-server.json. Before Server Cards, discovering an MCP server's tools required connecting to it and querying capabilities — a roundtrip that scales poorly across large enterprise tool stacks. Server Cards expose the full tool manifest, authentication requirements and rate limits at a static URL that registries and crawlers index without connecting. The practical effect: AI agent orchestrators auto-discover available tools across an entire enterprise stack by crawling .well-known endpoints rather than hardcoding tool configurations. This enables agent registries that update automatically as new MCP servers are added to the stack.
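Because a Server Card lives at a static URL, a registry crawler can read it with a plain HTTP GET and no MCP handshake. A minimal sketch of that flow, assuming the fields described above (name, version, tools); the helper names and exact schema are illustrative, not the official SDK API:

```python
import json
from urllib.request import urlopen

WELL_KNOWN_PATH = "/.well-known/mcp-server.json"


def parse_server_card(raw: str) -> dict:
    """Parse a Server Card document and index its tools by name."""
    card = json.loads(raw)
    # Required fields per the v2.1 description: name, version, tools.
    for field in ("name", "version", "tools"):
        if field not in card:
            raise ValueError(f"Server Card missing required field: {field}")
    return {tool["name"]: tool for tool in card["tools"]}


def fetch_server_card(base_url: str) -> dict:
    """Discover a server's tools without opening an MCP connection."""
    with urlopen(base_url.rstrip("/") + WELL_KNOWN_PATH) as resp:
        return parse_server_card(resp.read().decode("utf-8"))


# Parsing a sample card (the manifest an orchestrator would crawl):
sample = """{
  "name": "crm-server",
  "version": "2.1.0",
  "tools": [{"name": "lookup_contact", "description": "Find a contact", "inputSchema": {}}]
}"""
tools = parse_server_card(sample)
```

An orchestrator would run `fetch_server_card` against every known host and merge the resulting tool indexes, which is what replaces hardcoded tool configuration.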
Two steps to become v2.1 compliant. First, create .well-known/mcp-server.json at your server root. Required fields: name, version, a tools array (each entry with name, description and input schema), and authentication type. Optional: rate limits, contact information, and the supported MCP version range. Second, update the SDK: pip install "mcp>=2.1" for Python (quoted so the shell doesn't treat >= as a redirect) or npm install @modelcontextprotocol/sdk@2.1 for TypeScript. The Server Cards format is backwards-compatible: v1.x servers keep working in v2.1 clients, but without the .well-known endpoint your server won't appear in automated registry discovery. Upgrading is not urgent, but it is recommended for any server you want discoverable by enterprise orchestration systems.
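The first step amounts to generating one JSON document from your existing tool definitions. A sketch of that shape, using the required and optional fields listed above; the exact v2.1 field names may differ, so treat the schema here as an assumption:

```python
import json


def build_server_card(name, version, tools, auth_type, rate_limits=None):
    """Assemble a .well-known/mcp-server.json document (illustrative schema)."""
    card = {
        "name": name,
        "version": version,
        "tools": tools,  # each entry: name, description, input schema
        "authentication": {"type": auth_type},
    }
    if rate_limits is not None:
        card["rateLimits"] = rate_limits  # optional field
    return json.dumps(card, indent=2)


# A hypothetical billing server with one tool:
manifest = build_server_card(
    name="billing-server",
    version="1.0.0",
    tools=[{
        "name": "get_invoice",
        "description": "Fetch an invoice by ID",
        "inputSchema": {"type": "object", "properties": {"id": {"type": "string"}}},
    }],
    auth_type="oauth2",
)
```

Serve the resulting document at /.well-known/mcp-server.json and regenerate it whenever your tool set changes, so registries pick up the new manifest on their next crawl.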
OpenAI function calling is model-specific: it works only with OpenAI models and requires per-model definitions. LangChain tools are framework-specific: Python objects that need wrappers to run anywhere else. MCP servers are transport-level: define a tool once and any MCP-compatible client (Claude, Cursor, Copilot, custom agents) connects automatically. With 97 million monthly downloads and every major AI lab on board, MCP is the emerging universal standard. Strategic recommendation: build new tool integrations as MCP servers, and expose them through function-calling adapters only where you need OpenAI-specific features. The long-term bet is MCP.
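The "function calling adapter" path is mostly a renaming exercise, since both formats describe a tool with a name, a description and a JSON Schema for its parameters. A sketch of the mapping, assuming the common shapes of each format rather than any official adapter:

```python
def mcp_tool_to_openai_function(tool: dict) -> dict:
    """Map an MCP-style tool definition to an OpenAI function-calling entry."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # The MCP input schema is already JSON Schema, which is what
            # OpenAI's "parameters" field expects, so it passes through as-is.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }


# A hypothetical MCP tool definition, translated for an OpenAI client:
mcp_tool = {
    "name": "search_docs",
    "description": "Full-text search over documentation",
    "inputSchema": {"type": "object", "properties": {"query": {"type": "string"}}},
}
openai_tool = mcp_tool_to_openai_function(mcp_tool)
```

Because the adapter is this thin, defining tools once as MCP servers and deriving provider-specific formats from them costs little, which is what makes MCP the sensible default layer.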