MattDaEskimo 3 days ago

LLMs can potentially query _something_ and receive a concise, high-signal response that tells them how to communicate with the endpoint, similar to API documentation for humans but programmatic.
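As a rough illustration of what that programmatic "query for documentation" could look like: a toy JSON-RPC-style discovery exchange. The method name `tools/list`, the `get_weather` tool, and the schema shapes below are assumptions for illustration, not quoted from any spec.

```python
import json

def describe_capabilities(request: str) -> str:
    """Toy server: answer a JSON-RPC-style 'tools/list' discovery request.
    (Illustrative only; method name and payload shapes are assumptions.)"""
    req = json.loads(request)
    if req.get("method") == "tools/list":
        # A compact, machine-readable capability listing: the "high-signal
        # response" an LLM could consume instead of human-oriented docs.
        result = {
            "tools": [
                {
                    "name": "get_weather",
                    "description": "Current weather for a city",
                    "inputSchema": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                }
            ]
        }
        return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req.get("id"),
        "error": {"code": -32601, "message": "method not found"},
    })

# The model-side client sends one standardized request instead of scraping docs:
request = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
response = json.loads(describe_capabilities(request))
print(response["result"]["tools"][0]["name"])  # prints "get_weather"
```

The point of a single standard is that the client half of this exchange is written once and works against any conforming endpoint.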

This is huge, as long as there's a single standard and other LLM providers don't try to release their own protocol. Which, historically speaking, is definitely going to happen.

gyre007 3 days ago

> This is huge, as long as there's a single standard and other LLM providers don't try to release their own protocol

Yes, very much this; I'm mildly worried because the competition in this space is huge and there is no shortage of money and crazy people who could go against this.

bloomingkales 3 days ago

They will go against this. I don’t want to be that guy, but this moment in time is literally the opening scene of a movie where everyone agrees to work together in the bandit group.

But, it’s a bandit group.

defnotai 3 days ago

Not necessarily. There's huge demand to simplify the integration process between frontier models and consumers. If specs like this wind up saving companies weeks or months of developer time, then the MCP-compatible models are going to win over the more complex alternatives. That unlocks value for the community, and therefore for the AI companies.