I embed tons of separate pieces of information and save the vectors in a db. Then I embed the user's question and have a stored procedure in the db calculate the top 10 (or 20, or 50, depending on the model) most similar pieces of information.
I have an editor where I can ask a question and it brings up the most related pieces of info; if I change any of those pieces, it updates the embedding in the db.
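The retrieval step described above (embed the question, rank stored vectors by similarity, take the top k) can be sketched in a few lines. This is a minimal illustration with numpy and toy vectors — in the actual setup the ranking happens inside the db via a stored procedure, and `docs` stands in for the saved embeddings:

```python
import numpy as np

def top_k_similar(query_vec, doc_vecs, k=10):
    """Return indices of the k most cosine-similar rows in doc_vecs,
    best match first."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sims = d @ q                        # cosine similarity per stored piece
    return np.argsort(sims)[::-1][:k]   # highest similarity first

# Toy 3-d vectors standing in for stored embeddings
docs = np.array([
    [1.0, 0.0, 0.0],
    [0.9, 0.1, 0.0],
    [0.0, 1.0, 0.0],
])
query = np.array([1.0, 0.05, 0.0])  # "the user's question", embedded
print(top_k_similar(query, docs, k=2))  # -> [0 1]
```

Real embeddings are hundreds or thousands of dimensions, but the math is identical; a vector extension in the db (pgvector and similar) does the same ranking with an index instead of a full scan.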
That's a good approach. But what I'm looking for is a bit different, more like Segment, but for LLMs. Something where, when a user lands on your website, clicks around, and interacts with your app, you get a full behavioral context out of the box: click path, location, language, currency, etc. You can then inject that context directly into your prompt so the LLM understands what the user is doing and responds without guessing or asking.
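The injection step might look something like the sketch below: render the captured session data as a context block and prepend it to the prompt. All field names here (`click_path`, `location`, etc.) are hypothetical — whatever the tracking layer actually captures would slot in the same way:

```python
def build_context_block(session: dict) -> str:
    """Render behavioral context as a text block to prepend to an LLM prompt."""
    return (
        "## User context\n"
        f"Click path: {' -> '.join(session['click_path'])}\n"
        f"Location: {session['location']}\n"
        f"Language: {session['language']}\n"
        f"Currency: {session['currency']}\n"
    )

# Hypothetical session captured client-side
session = {
    "click_path": ["/pricing", "/pricing/enterprise", "/contact"],
    "location": "Berlin, DE",
    "language": "de",
    "currency": "EUR",
}

prompt = build_context_block(session) + "\nUser question: How much does this cost?"
print(prompt)
```

With that block in front, a question like "How much does this cost?" can be answered in the right currency and with the right plan in mind, instead of the model asking the user what they were looking at.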
What is the application-specific scenario that requires this context? Everyone has different scenarios, and this might not make sense for all of them.