At Tyk, we think MCP is absolutely critical to AI adoption in the enterprise, and clearly, we’re not the only ones. OpenAI and Google are both pledging support for the standard in their models and clients, and the ecosystem is already starting to adapt to it.
The thing is, most people using MCP-powered tools today don’t realise what’s going on under the hood. The clients, now more like lightweight agents, often skip the vendors’ official tool use APIs altogether. Why? Because they’re flaky. Prompt engineering and structured outputs still outperform the native integrations.
If you’re looking for a deeper dive into what MCP is and why it matters, check out part one of this series: Making sense of MCP: Why standardisation matters in the AI supply chain.
It’s a workaround, sure, but a telling one: the mere existence of a standard has given developers a base on which to build. And that’s the whole point. Standardisation gives structure. Structure drives adoption. So yes, MCP is a bit messy right now. But that’s fine. It’s early. What matters is what comes next.
So, we’re adding a few tricks to AI Studio – all designed to make MCP not just usable but enterprise-ready. Here’s how we’re laying the groundwork for a scalable, secure AI supply chain:
1. Remote MCP catalogue & server support
We’ve already built the “universal client” inside Tyk AI Studio, a tool that can take any OpenAPI Specification (OAS) document and turn it into an LLM-friendly tool definition for vendors that support tool use.
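To give a sense of what that translation involves, here’s a minimal TypeScript sketch: it maps a single OAS operation onto the JSON-schema-style tool definition most tool-use vendors accept. The types, field names and the `getOrder` example are simplified stand-ins for illustration, not Tyk’s actual implementation; a real mapping also has to handle request bodies, `$ref` resolution and more.

```typescript
// Illustrative types only: a real OAS document is far richer than this.
interface OasOperation {
  operationId: string;
  summary?: string;
  parameters?: { name: string; required?: boolean; schema: object }[];
}

// The JSON-schema-style shape most tool-use vendors accept.
interface ToolDefinition {
  name: string;
  description: string;
  parameters: {
    type: "object";
    properties: Record<string, object>;
    required: string[];
  };
}

function toToolDefinition(op: OasOperation): ToolDefinition {
  const properties: Record<string, object> = {};
  const required: string[] = [];
  for (const p of op.parameters ?? []) {
    properties[p.name] = p.schema;
    if (p.required) required.push(p.name);
  }
  return {
    name: op.operationId,
    description: op.summary ?? op.operationId,
    parameters: { type: "object", properties, required },
  };
}

// Usage: one OAS operation in, one callable tool definition out.
const tool = toToolDefinition({
  operationId: "getOrder",
  summary: "Fetch a single order by ID",
  parameters: [{ name: "orderId", required: true, schema: { type: "string" } }],
});
```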
Now, we’re going a step further. As more MCP clients start supporting remote servers, we’re making sure our universal client toolset is ready to meet them there.
That means Tyk AI Studio can now expose a remote MCP server, making it dead simple for enterprises to plug in their internal APIs and tools – securely, centrally, and without duct-taped scripts or rogue local installs. If your devs prefer to use their own chat or IDE client? No problem. The tools are still there, wired in via the remote server.
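From a client’s point of view, “meeting them there” looks roughly like the sketch below, written against the open-source MCP TypeScript SDK. The endpoint URL and client name are placeholders, and a real connection would carry whatever auth your deployment enforces:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  // Hypothetical endpoint: the real URL comes from your AI Studio deployment.
  const transport = new StreamableHTTPClientTransport(
    new URL("https://ai-studio.example.com/mcp")
  );

  const client = new Client({ name: "demo-client", version: "1.0.0" });
  await client.connect(transport);

  // The tools are defined once, centrally; any MCP-capable chat or IDE
  // client that connects here sees the same toolset.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));
}

main().catch(console.error);
```

The point of the remote-server model is exactly this discovery step: the client asks what tools exist rather than bundling its own definitions.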
2. Secure local MCP server generator
For those who still need to run things locally (e.g., in dev or isolated environments), we’ve created an MCP generator. You feed it an OAS spec, and it spits out a locally installable MCP server – proxied through the Tyk AI Gateway, giving you the observability and policy enforcement you need. So even if you’re going local, you’re not going rogue.
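To make that concrete, a generated server might boil down to something like this: one MCP tool per OAS operation, with every handler calling the gateway rather than the upstream API directly, so policy and observability still apply. The server name, route and gateway URL here are hypothetical placeholders, not Tyk’s actual generated output:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "orders-api", version: "1.0.0" });

// One tool per OAS operation; the handler calls the gateway,
// not the upstream API, so every request is observed and policed.
server.tool(
  "getOrder",
  "Fetch a single order by ID",
  { orderId: z.string() },
  async ({ orderId }) => {
    const res = await fetch(
      `https://gateway.example.com/orders-api/orders/${orderId}`
    );
    return { content: [{ type: "text", text: await res.text() }] };
  }
);

// Local clients (e.g. an IDE) talk to this server over stdio.
await server.connect(new StdioServerTransport());
```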
MCP is gaining traction because it gives teams a common language – not a perfect one, but a usable one. What matters now is how well it holds up in real-world environments across diverse tools, teams, and workflows.
With the updates in Tyk AI Studio, the focus is on removing friction. Whether that means exposing internal tools securely or spinning up local test environments, the goal is a seamless integration that supports how teams actually work.
As adoption builds, what’s going to matter most is how smoothly these standards translate into practice. That’s where the value is and where we’re investing.