jspahrsummers 3 days ago

We're definitely interested in extending MCP to cover remote connections as well. Both SDKs already support an SSE transport with that in mind: https://modelcontextprotocol.io/docs/concepts/transports#ser...
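For example, a client could point the SSE transport at a remote server roughly like this. This is only a sketch: the URL is a placeholder, and the import paths and signatures follow the current TypeScript SDK layout, so they may shift as the remote story evolves.

    // Sketch: connecting an MCP client to a remote server over SSE.
    // "https://example.com/sse" is a placeholder endpoint.
    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

    const transport = new SSEClientTransport(new URL("https://example.com/sse"));
    const client = new Client(
      { name: "example-client", version: "1.0.0" },
      { capabilities: {} }
    );

    await client.connect(transport);               // opens the SSE stream
    const resources = await client.listResources(); // ordinary MCP request
    console.log(resources);

Everything after connect() is the same as with the local stdio transport; only the transport changes.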

However, it's not quite a complete story yet. Remote connections introduce a lot more questions and complexity—related to deployment, auth, security, etc. We'll be working through these in the coming weeks, and would love any and all input!

jascha_eng 3 days ago

Will you also provide some info on how other LLM providers can integrate with this? So far it looks like it's mostly a protocol for integrating with Anthropic's models and desktop client. That's not what I thought of when I read "open source".

It would be a lot more interesting to write a server for this if it allowed any model to interact with my data. Everyone would benefit from more integrations, and you (Anthropic) would still have the advantage of effectively controlling the protocol.

somnium_sn 3 days ago

Note that both Sourcegraph's Cody and the Zed editor support MCP now, and they offer models other than Claude in their respective applications.

The initial Model Context Protocol release aims to solve the N-to-M relation between LLM applications (MCP clients) and context providers (MCP servers). Each application is free to choose whatever model it wants; we carefully designed the protocol to be model-independent.
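To illustrate that model independence: a server only describes resources and tools, and nothing in it references a particular model. Here's a rough sketch of a minimal server with the TypeScript SDK; the "echo" tool is a made-up example, and exact import paths and signatures may differ across SDK versions.

    // Sketch: a tiny MCP server exposing one tool over stdio.
    // Any MCP client, backed by any model, can call it.
    import { Server } from "@modelcontextprotocol/sdk/server/index.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import {
      ListToolsRequestSchema,
      CallToolRequestSchema,
    } from "@modelcontextprotocol/sdk/types.js";

    const server = new Server(
      { name: "example-server", version: "1.0.0" },
      { capabilities: { tools: {} } }
    );

    // Advertise the available tools to whatever client connects.
    server.setRequestHandler(ListToolsRequestSchema, async () => ({
      tools: [
        {
          name: "echo",
          description: "Echo back the provided text",
          inputSchema: {
            type: "object",
            properties: { text: { type: "string" } },
            required: ["text"],
          },
        },
      ],
    }));

    // Handle tool calls; no model-specific logic anywhere.
    server.setRequestHandler(CallToolRequestSchema, async (request) => {
      if (request.params.name !== "echo") {
        throw new Error(`Unknown tool: ${request.params.name}`);
      }
      return {
        content: [{ type: "text", text: String(request.params.arguments?.text ?? "") }],
      };
    });

    await server.connect(new StdioServerTransport());

The client decides which model (if any) sees the tool results; the server never knows or cares.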

jascha_eng 3 days ago

"LLM applications" just means chat applications here, though, right? This doesn't seem to cover use cases of more integrated software, like a typical documentation RAG chatbot.

nl 3 days ago

OpenAI has Actions, which is also relevant here: https://platform.openai.com/docs/actions/actions-library

Here's one for performing actions on GitHub: https://cookbook.openai.com/examples/chatgpt/gpt_actions_lib...