  • tmuhlestein 14 hours ago

    At GoDaddy, we’ve been exploring ways to make large-scale LLM integrations more efficient — not just for us, but for the entire ecosystem.

    Today, most Model Context Protocol (MCP) clients inject every tool definition into every prompt. That’s fine for a few tools — but at scale, it means wasted tokens, slower responses, and higher compute costs.

    Our proposal — Advertise-and-Activate — offers a simple alternative: let clients advertise tool summaries first, then activate full definitions only when needed.

    This approach can:

    • Cut token usage by up to 94%

    • Reduce latency and context bloat

    • Improve LLM reasoning and scalability

    • Require no protocol changes; it works today
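    To make the two-phase flow concrete, here is a minimal Python sketch of the pattern: advertise compact one-line summaries in every prompt, and inject a tool's full JSON definition only once the model selects it. The tool names, summaries, and schemas below are invented for illustration; they are not GoDaddy's actual tools or the article's reference implementation.

    ```python
    import json

    # Hypothetical tool catalog. Each entry carries a cheap "advertise" payload
    # (summary) and an expensive "activate" payload (full definition).
    TOOLS = {
        "dns_lookup": {
            "summary": "Resolve DNS records for a domain",
            "definition": {
                "name": "dns_lookup",
                "description": "Resolve DNS records (A, AAAA, MX, TXT, CNAME) for a domain.",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "domain": {"type": "string", "description": "Fully qualified domain name"},
                        "record_type": {"type": "string", "enum": ["A", "AAAA", "MX", "TXT", "CNAME"]},
                    },
                    "required": ["domain"],
                },
            },
        },
        "ssl_check": {
            "summary": "Inspect a site's TLS certificate",
            "definition": {
                "name": "ssl_check",
                "description": "Fetch and validate the TLS certificate chain for a hostname.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"hostname": {"type": "string"}},
                    "required": ["hostname"],
                },
            },
        },
    }

    def advertise() -> str:
        """Phase 1: one-line summaries, injected into every prompt."""
        return "\n".join(f"- {name}: {tool['summary']}" for name, tool in TOOLS.items())

    def activate(name: str) -> dict:
        """Phase 2: full definition, injected only when the model picks a tool."""
        return TOOLS[name]["definition"]

    # Rough size comparison (characters as a crude proxy for tokens):
    full_size = len(json.dumps([t["definition"] for t in TOOLS.values()]))
    summary_size = len(advertise())
    print(f"summaries: {summary_size} chars, full definitions: {full_size} chars")
    ```

    Even with two toy tools the summary block is a fraction of the full definitions; with dozens of tools, each carrying a verbose JSON schema, the gap is where the claimed token savings come from.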

    It’s a small architectural shift with a big impact on performance and cost.

    We’re already testing it internally across engineering workflows — from infrastructure management to developer tooling — and the results are promising.

    This is a key piece of our broader vision for a scalable AI agent ecosystem. While our Agent Name Service (ANS) implementation focuses on secure, internet-scale discovery and verification, patterns like Advertise-and-Activate are crucial for making subsequent agent-to-agent communication via protocols like MCP both efficient and enterprise-ready.

    Read the full article here: Smarter MCP Clients: A Proposal for Advertise-and-Activate: https://www.godaddy.com/resources/news/smarter-mcp-clients-a...

    We’d love to hear your thoughts or experiences.

    — Jay Gowdy & Travis Muhlestein