
Manufact’s $6.3M Seed Boosts MCP for Agents
Manufact raised $6.3M to simplify MCP for AI agents. What it means for builders and enterprises, plus updates ahead of the ai world summit 2026.
TL;DR
Manufact (formerly mcp-use) has raised a $6.3M seed led by Peak XV to build developer infrastructure around the Model Context Protocol (MCP): an open-source SDK for fast prototyping and a managed cloud path for production. The bet is that "agent plumbing" (tool connectivity, security, and operational reliability) is the real bottleneck for shipping AI agents at scale.
A seed round aimed at “agent plumbing”
Manufact Inc., formerly known as mcp-use, has raised $6.3 million in seed funding to build infrastructure that helps developers create and deploy AI agents using the Model Context Protocol (MCP). The round was led by Peak XV, with participation from Liquid 2 Ventures, Ritual Capital, Pioneer Fund and Y Combinator, along with angel investors including a co-founder and COO of Supabase.
The company was founded in 2025 and is positioning itself around the practical problem that teams hit once they move from impressive demos to real AI-agent products: everything around the model, from tool connectivity to operational reliability. In the framing from Manufact’s leadership, the pain isn’t just choosing an LLM; it’s building and maintaining the “connective tissue” that lets agents interact with services, data, and other systems without brittle, one-off integrations.
For builders following agentic AI closely—including the global community around the ai world organisation—this seed round is a useful signal: investors are increasingly backing “agent infrastructure” companies that reduce integration overhead and help teams ship reliable AI workflows in production. It also sets up a broader conversation that fits naturally into the ai world summit and other ai world organisation events: as AI agents become more capable, tool interoperability and governance become the new battleground for differentiation.
What MCP is and why it’s becoming the default
MCP is described as a universal, open standard for connecting AI applications to external systems, and it was introduced by Anthropic as a way to standardize how AI tools and data connect to LLM-powered apps. InfoQ explains MCP as an open standard that connects AI agents to the tools and data they need, using a host/client/server model where the host is the user-facing AI application, the client manages communication, and the server exposes capabilities or data sources through the protocol.
The practical value of this approach is straightforward: instead of rewriting custom integration code for each service and each agent framework, a team can rely on a standardized interface and reuse connectors across tools and environments. That’s why MCP is often compared to a universal connector—because it attempts to turn an ecosystem of incompatible point-to-point integrations into a modular “plug-and-play” layer.
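The host/client/server split described above can be sketched in a few lines of code. The toy below is purely conceptual (the real MCP uses JSON-RPC over stdio or HTTP transports, and all class and tool names here are invented), but it shows why the separation of roles makes connectors reusable: a server built once can be discovered and invoked by any host that speaks the same interface.

```python
# Conceptual toy of MCP's host/client/server roles.
# Not the real protocol (MCP uses JSON-RPC over stdio/HTTP);
# all names here are invented for illustration.

class ToolServer:
    """Server: exposes capabilities behind a standard interface."""
    def __init__(self):
        self._tools = {}

    def register(self, name, fn, description=""):
        self._tools[name] = {"fn": fn, "description": description}

    def list_tools(self):
        return [{"name": n, "description": t["description"]}
                for n, t in self._tools.items()]

    def call_tool(self, name, **kwargs):
        return self._tools[name]["fn"](**kwargs)


class Client:
    """Client: manages communication with one server on the host's behalf."""
    def __init__(self, server):
        self._server = server

    def discover(self):
        return self._server.list_tools()

    def invoke(self, name, **kwargs):
        return self._server.call_tool(name, **kwargs)


class Host:
    """Host: the user-facing AI application; aggregates many servers."""
    def __init__(self, clients):
        self.clients = clients

    def all_tools(self):
        return [t for c in self.clients for t in c.discover()]


# A server wrapped once is reusable by any host that speaks the interface.
weather = ToolServer()
weather.register("get_forecast", lambda city: f"Sunny in {city}",
                 description="Toy forecast lookup")

host = Host([Client(weather)])
print([t["name"] for t in host.all_tools()])        # -> ['get_forecast']
print(host.clients[0].invoke("get_forecast", city="Berlin"))
```

The key property is that the Host never touches a server's internals; it only sees the standardized discover/invoke surface, which is what lets connectors be swapped or reused across applications.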
Adoption momentum matters here because standards only help when enough of the ecosystem agrees to speak the same language. Anthropic has said MCP reached broad uptake across the ecosystem, pointing to more than 10,000 active public MCP servers and adoption by products such as ChatGPT, Cursor, Gemini, Microsoft Copilot and Visual Studio Code. Anthropic also highlighted that major infrastructure providers offer deployment support for MCP, including AWS, Cloudflare, Google Cloud and Microsoft Azure.
In parallel, the news about Manufact reflects an adjacent reality: even when a standard exists, developers still need dependable tooling to prototype, deploy, secure and monitor production workloads. In other words, MCP can reduce fragmentation, but it doesn’t automatically remove the day-to-day engineering work of turning an “MCP-compatible idea” into something a company can run safely at scale.
What Manufact says it’s building for developers
Manufact’s pitch is that it gives developers an easier way to use MCP, especially outside closed-source products, and it wants to become an infrastructure layer that handles operational and connectivity heavy lifting. The company offers an open-source software development kit tied to its library, mcp-use, with the goal of letting teams experiment and prototype quickly while connecting models, tools and integrations with minimal code.
As those prototypes move toward production, Manufact says it can host the setup in its own cloud and take on some of the complexity involved in deploying AI systems “safely and securely,” which is typically where many teams slow down. The company’s founders have described the motivation in practical terms: before MCP existed, embedding AI into real products involved a painful tangle of custom connections, and even with MCP, reliable infrastructure is still needed to support real-world deployments.
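In practice, the "minimal code" workflow that MCP-based frameworks like mcp-use describe usually starts from a declarative server configuration rather than bespoke integration code. The sketch below is hypothetical: the "mcpServers" layout follows the convention popularized by MCP desktop clients, but the server names, commands, and URLs are invented, and the exact mcp-use API may differ.

```python
# Hypothetical sketch of a declarative MCP server configuration,
# using the "mcpServers" layout popularized by MCP desktop clients.
# Server names, commands, and URLs below are illustrative only.
import json

config = {
    "mcpServers": {
        # Each entry tells the client how to launch or reach one server.
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/data"],
        },
        "internal-crm": {
            # Remote servers are typically addressed by URL instead.
            "url": "https://mcp.example.internal/crm",
        },
    }
}

# A framework layered on MCP would consume a config like this to wire
# tools to a model, instead of requiring hand-written integration code.
print(json.dumps(config, indent=2))
print(sorted(config["mcpServers"]))
```

The point of the pattern is that moving from prototype to production becomes a deployment question (where these servers run, who can reach them) rather than a rewrite of the integrations themselves.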
Manufact has also shared download figures for its SDK, saying it reached 3 million downloads in the fourth quarter of 2025 and stood at 5 million at the time of the announcement. Separately, the broader MCP ecosystem has been characterized as fast-growing, with an estimate in the same report that MCP servers see around 7 million downloads per month, reflecting how rapidly developers are experimenting with MCP-based toolchains.
The company says it will use this funding to expand its infrastructure, grow its team to meet rising enterprise demand, and push toward becoming a go-to framework for MCP development. For enterprise teams, that focus hints at a familiar “middle layer” opportunity: developers don’t just need a protocol; they need curated developer experience, deployment patterns, security boundaries, observability, and predictable operations across environments.
From the perspective of the ai world organisation community and its audiences—builders, business leaders, and operators—this is exactly the kind of “under the hood” story that can shape what’s possible in the next wave of AI products showcased at the ai world summit 2025 / 2026 and other ai conferences by ai world. The closer the ecosystem gets to standardized connectivity, the more product teams can focus on differentiated workflows, user experience, and measurable business outcomes instead of integration maintenance.
The bigger shift: open governance and ecosystem scaling
A major boost to MCP’s credibility is its push toward neutral stewardship. Anthropic announced it is donating MCP to the Agentic AI Foundation (AAIF), a directed fund under the Linux Foundation, and described AAIF as co-founded by Anthropic, Block and OpenAI with support from companies including Google, Microsoft, AWS, Cloudflare, and Bloomberg. The stated goal is to help agentic AI evolve transparently and collaboratively through shared development of open standards and community building.
Anthropic also pointed to continued work on MCP’s ecosystem maturity, including an official community-driven registry for discovering available MCP servers and spec improvements such as asynchronous operations, statelessness, server identity, and official extensions. It additionally cited official SDK support across major programming languages and “97M+ monthly SDK downloads across Python and TypeScript,” underscoring how large the developer footprint can become once an integration standard is widely adopted.
On the developer-education side, InfoQ lays out the architectural argument for why MCP matters: standardized protocols can transform an “M×N integration problem” into “M+N modularity,” improving interoperability and future-proofing tool integrations even as models and frameworks change. InfoQ also notes that MCP-compatible servers can expose functionality through a standardized interface—tools, resources, prompts, and sampling—which helps explain why MCP can serve both simple connector use cases and more complex agent workflows.
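The arithmetic behind that "M×N versus M+N" argument is easy to make concrete. With the illustrative figures below (not from the article), eight agent applications and twenty-five tools need 200 bespoke point-to-point integrations, but only 33 protocol implementations once every app speaks one client interface and every tool is wrapped once as a server.

```python
# Illustrative integration-count arithmetic behind "M x N vs. M + N":
# M agent applications, N external tools/services.
M, N = 8, 25  # example figures, not from the article

# Point-to-point: every app needs a custom connector to every tool.
point_to_point = M * N   # 200 bespoke integrations

# Shared protocol: each app implements one client,
# each tool is wrapped once as a server.
with_protocol = M + N    # 33 protocol implementations

print(point_to_point, with_protocol)  # -> 200 33
```

The gap widens as the ecosystem grows: each new tool adds M integrations in the point-to-point world but only one server in the protocol world.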
Taken together, these signals point toward a near-term reality: agentic AI is becoming less about a single model’s capabilities and more about the surrounding ecosystem—tool discovery, permissioning, reliability, and the ability to swap components without breaking everything. That’s why infrastructure startups like Manufact exist, and why large platform players are supporting open standards at the governance level.
What this means for enterprises and AI World audiences
If you run AI programs inside an enterprise, the real question isn’t whether AI agents are “cool,” but whether they can be operated with the same rigor as other production software. Manufact is betting that MCP will be the connective layer, while developer-first infrastructure will make it practical to ship agentic systems without rebuilding integrations for every new tool or deployment.
For practitioners tracking these developments through the lens of the ai world organisation, there are a few strategic angles that consistently matter in real deployments. First, interoperability: MCP’s host/client/server structure and standardized interfaces can make it easier to reuse tool integrations across different products and environments, which reduces long-term maintenance risk. Second, ecosystem and neutrality: MCP’s donation to a Linux Foundation-directed effort is meant to keep the standard open, vendor-neutral, and community-driven, which helps enterprises avoid getting locked into one closed ecosystem.
Third, developer velocity: Manufact’s emphasis on an open-source SDK for fast prototyping and a managed cloud path for production reflects a common adoption pattern—teams need to move from prototype to governed production without rewriting their foundations halfway through. Finally, scale: claims of strong MCP adoption across major AI products and large counts of active MCP servers suggest the protocol is becoming a default “language” for agent tool connectivity, which typically accelerates community tooling and best practices.
This is also why the ai world summit and related ai world organisation events are timely venues to discuss “agent infrastructure” rather than only model benchmarks. In practical terms, ai conferences by ai world can spotlight how organizations are using standards like MCP to connect agents to real business systems, how they handle security and governance, and what engineering patterns are emerging as “the safe way” to deploy agents that can take actions.