doug_durham 3 days ago

The biggest contribution is the LLM-compatible metadata that describes the tool and its arguments. It is trivial to adopt. In Python you can use FastMCP to add a decorator to a function, and as long as that function returns a JSON string you are in business. The decorator extracts the arguments and docstrings and presents them to the LLM.
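To illustrate the idea, here is a minimal sketch of what such a decorator does under the hood, using only the standard library (this is not FastMCP's actual implementation, just the concept: introspect the signature and docstring into LLM-readable metadata):

```python
import inspect
import json

def tool(fn):
    """Sketch of a FastMCP-style decorator: builds tool metadata
    from the function's signature and docstring."""
    sig = inspect.signature(fn)
    fn.metadata = {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "arguments": {
            name: (p.annotation.__name__
                   if p.annotation is not inspect.Parameter.empty else "any")
            for name, p in sig.parameters.items()
        },
    }
    return fn

@tool
def get_weather(city: str) -> str:
    """Return current weather for a city as a JSON string."""
    # hypothetical tool body; a real one would call a weather API
    return json.dumps({"city": city, "temp_c": 21})

print(json.dumps(get_weather.metadata, indent=2))
```

The metadata dict is what gets serialized and shown to the model so it knows the tool's name, purpose, and argument types before deciding to call it.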

jillesvangurp 3 days ago

What makes a spec LLM compatible? I've thrown a lot of different things at gpt o1 and it generally understands them better than I do. OpenAPI specifications, unstructured text, log output, etc.