I don't understand the value of this abstraction.
I can see the value of something like DSPy, where there are higher-level abstractions for wiring together a system of LLMs.
But this seems like an abstraction that doesn't really offer much beyond "function calling, but you use our Python code".
I see the value of the Language Server Protocol, but I don't see the mapping to this piece of code.
That's actually negative value if you're integrating into an existing software system: instead of just exposing functions you've already defined, you have to remap them into this intermediate abstraction.
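To make that concrete, here's a hypothetical sketch of the two paths: exposing an existing function directly as a plain function-calling tool spec (the JSON-schema shape most chat APIs accept), versus remapping it through an extra registration layer. All names here (`to_tool_spec`, `ToolRegistry`, `get_invoice_total`) are made up for illustration, and the type mapping is deliberately simplified.

```python
import inspect

# An existing function in your codebase.
def get_invoice_total(invoice_id: str) -> float:
    """Return the total for an invoice."""
    return 42.0  # stand-in for a real lookup

# Path 1: expose it directly as a plain function-calling tool spec.
def to_tool_spec(fn):
    sig = inspect.signature(fn)
    return {
        "name": fn.__name__,
        "description": fn.__doc__ or "",
        "parameters": {
            "type": "object",
            # Simplified: every parameter treated as a string.
            "properties": {name: {"type": "string"} for name in sig.parameters},
            "required": list(sig.parameters),
        },
    }

spec = to_tool_spec(get_invoice_total)

# Path 2: remap the same function into an intermediate abstraction --
# an extra registration layer that adds indirection without adding
# capability, which is the "negative value" complaint above.
class ToolRegistry:
    def __init__(self):
        self._tools = {}

    def register(self, fn):
        self._tools[fn.__name__] = {"spec": to_tool_spec(fn), "impl": fn}
        return fn

    def call(self, name, **kwargs):
        return self._tools[name]["impl"](**kwargs)

registry = ToolRegistry()
registry.register(get_invoice_total)
```

Both paths end at the same function; the question is whether the second layer earns its keep.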
Here's the play:
If integrations are required to unlock value, then the platform with the most prebuilt integrations wins.
The bulk of mass adopters don't have the in-house expertise or interest in building their own. They want turnkey.
No company can build integrations at scale more quickly than an entire community can.
If Anthropic creates an integration standard and gets adoption, then at best it has a competitive advantage (first mover and ownership of the standard), and at worst it prevents OpenAI et al. from doing the same to it.
(Also, the integration piece is the necessary but least interesting component of the entire system. Way better to commoditize it via a standard and remove it as a blocker to adoption.)
The secret sauce part is the useful part -- the local vector store. Anthropic is probably not going to release that without competitive pressure. Meanwhile this helps Anthropic build an ecosystem.
When you think about it, function calling needs its own local state (embedded db) to scale efficiently on larger contexts.
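One minimal sketch of what that local state could look like: an embedded sqlite cache of tool-call results, keyed by function name and arguments, so repeated calls over a large context don't re-pay the fetch cost every turn. Everything here (`LocalCallCache`, `fetch_doc`) is hypothetical and stdlib-only, just to illustrate the idea.

```python
import json
import sqlite3

class LocalCallCache:
    """Embedded local state for function calling: repeated calls with
    the same arguments are served from sqlite instead of re-executed."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS calls (key TEXT PRIMARY KEY, result TEXT)"
        )

    def get_or_call(self, fn, **kwargs):
        key = f"{fn.__name__}:{json.dumps(kwargs, sort_keys=True)}"
        row = self.db.execute(
            "SELECT result FROM calls WHERE key = ?", (key,)
        ).fetchone()
        if row:
            return json.loads(row[0])
        result = fn(**kwargs)
        self.db.execute("INSERT INTO calls VALUES (?, ?)", (key, json.dumps(result)))
        self.db.commit()
        return result

calls = []  # track how many times the underlying function actually runs
def fetch_doc(doc_id):
    calls.append(doc_id)
    return {"doc_id": doc_id, "text": "..."}

cache = LocalCallCache()
cache.get_or_call(fetch_doc, doc_id="a")
cache.get_or_call(fetch_doc, doc_id="a")  # second call served from local state
```

A real version would add eviction and invalidation, but the shape is the same: the model orchestrates, the embedded db remembers.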
I'd like to see all this become open source / standardized.
I'm not sure what you mean -- the embedding model is independent of the embeddings themselves. Once generated, the embeddings and vector store can live entirely locally, and thus aren't part of any secret sauce.
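That separation is easy to see in code: once the vectors exist, nearest-neighbour search is plain arithmetic with no model in the loop. A stdlib-only sketch with hard-coded toy vectors (in reality both the stored vectors and the query vector would be produced by the embedding model, once, then everything below runs locally):

```python
import math

# Pretend these were produced once by some embedding model;
# after that point the model is no longer needed for search.
store = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "api auth": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(query_vec, k=1):
    # Runs entirely locally: no network, no model weights.
    ranked = sorted(store, key=lambda key: cosine(store[key], query_vec), reverse=True)
    return ranked[:k]

search([0.85, 0.15, 0.05])  # -> ["refund policy"]
```

The only model dependency is at embedding time; the store and the similarity math are commodity.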