singularity2001 3 days ago

Tangential question: Is there any LLM which is capable of preserving the context through many sessions, so it doesn't have to upload all my context every time?

fragmede 3 days ago

It's a bit of a hack, but the ChatGPT web UI has a limited "memories" feature you can use to customize your interactions with it.

singularity2001 3 days ago

"remember these 10000 lines of code" ;)

In an ideal world, Gemini (or any other 1M-token-context model) would have an internal 'save snapshot' option, so one could resume a blank conversation after 'priming' the internal state (activations) with the whole code base.
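The idea above amounts to persisting the model's KV cache across sessions (what llama.cpp exposes as `--prompt-cache`, or Hugging Face transformers' `past_key_values`). A toy Python sketch of the workflow, with made-up names and no real model, just to show the save/resume shape:

```python
import pickle

# Toy illustration (NOT a real LLM): snapshot an expensive "prefix encoding"
# so a later session can resume without re-processing the whole context.
# In a real stack this state would be the transformer KV cache; the class
# and method names here are hypothetical.

class ToyModel:
    def encode_prefix(self, text: str) -> list[int]:
        # Stand-in for the costly forward pass over the whole code base.
        return [ord(c) for c in text]

    def generate(self, cache: list[int], query: str) -> str:
        # Answers using the cached state plus the new query only;
        # the prefix is never re-encoded.
        return f"answer({len(cache)} cached tokens, query={query!r})"

model = ToyModel()

# Session 1: pay the encoding cost once and snapshot the state to disk.
cache = model.encode_prefix("ten thousand lines of code...")
with open("/tmp/prefix_cache.pkl", "wb") as f:
    pickle.dump(cache, f)

# Session 2 (could be days later): restore the snapshot and skip the prefix.
with open("/tmp/prefix_cache.pkl", "rb") as f:
    restored = pickle.load(f)
print(model.generate(restored, "where is the bug?"))
```

Real KV caches are large tensors tied to a specific model and version, which is one practical reason hosted APIs tend to offer time-limited prompt caching rather than durable snapshots.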