This looks interesting, but in classic "Show HN" style it's somewhat complicated and not obvious how to get going.
The requirement for an OpenAI key may also be a little off-putting, or at least could do with some indication of realistic costs; most Cursor users will need significant motivation to add to the subscription they already have.
Don't get me wrong, this could be a really worthwhile addition to the LLM coding toolset, but I think the presentation needs some work on how to get up and running quickly.
Graphiti uses OpenAI (or other LLMs) to build the knowledge graph. Setting up the MCP server is fairly straightforward: https://github.com/getzep/graphiti/tree/main/mcp_server
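Roughly, the configuration boils down to an LLM key plus a Neo4j connection for the graph itself. A minimal sketch of the environment you'd set up (the exact variable names here are my assumption, so verify against the mcp_server README):

```
# Hedged sketch of the environment the MCP server expects
# (variable names assumed -- check the README in mcp_server)
export OPENAI_API_KEY="sk-..."           # used to extract entities/relations for the graph
export NEO4J_URI="bolt://localhost:7687" # Graphiti stores the knowledge graph in Neo4j
export NEO4J_USER="neo4j"
export NEO4J_PASSWORD="password"
```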
There's also a Docker Compose setup: https://github.com/getzep/graphiti/tree/main/mcp_server#runn...
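If you go the Compose route, getting running should look roughly like this (hedging again, since the compose file's exact env handling may differ from what's in the repo):

```
# Assumes the repo's compose file wires up Neo4j + the MCP server
# and picks up OPENAI_API_KEY from the environment or an .env file
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server
export OPENAI_API_KEY="sk-..."
docker compose up -d   # the MCP server should then be reachable on http://localhost:8000/sse
```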
The Cursor MCP setup is also simple:
```{ "mcpServers": { "Graphiti": { "url": "http://localhost:8000/sse" } } }```
How complex is the system? Can a local model or the agent itself be used instead?