sigmoid10 6 days ago

How can it be run "locally" if you don't support locally hosted LLMs? The overlap between people who wouldn't trust a cloud API wrapper like yours and people who would willingly send their (possibly sensitive) documents to some AI provider's API seems rather small to me. Either embrace the cloud fully and don't worry about data confidentiality, or go fully local and embrace the privacy-anxious community. This in-between seems like a waste of time tbh.

(I'm not trying to sound overly critical - I very much like the idea and the premise. I just wouldn't take this business approach.)

eddythompson80 6 days ago

> This in between seems like a waste of time tbh.

Hard disagree. The “in between” is exactly where most people are already ending up. Initially everyone was so worried about privacy and what OpenAI was doing with their precious private data: “They will train on it. Privacy is important to me. I’m not about to just give OpenAI access to my private, secure Google Drive backups or Gmail history or Facebook private messages or any really private, ‘local only’ information.”

Also, even among those who understand data privacy concerns, when it comes to work data, in the span of 2-3 years all the business folks I know went from “this is confidential business information, please never upload it to ChatGPT and only email it to me” to “just put everything in ChatGPT and see what it tells you.”

The initial worry was driven by not understanding how LLMs work. What if it just learned as you talked to it? What if it used that learning with somebody else? “If I tell it a childhood secret, will it turn around and tell others my secret?”

People understand how it works now, and those concerns have faded. Basically, most understand that the risk is similar to the risk their existing digital life already carries.

sigmoid10 5 days ago

As someone who actually deals with this on a regular basis, I can guarantee you that serious companies definitely do not "just put everything in ChatGPT" if they have any sort of respectable legal department. Especially in Europe, where you have GDPR concerns on top of any business concerns. People who actually understand the privacy issues nowadays either use something like Azure's hosted OpenAI service to stay compliant with the law, or go fully self-hosted with open-weight models. Everything else is a legal time bomb.

eddythompson80 5 days ago

Of course they aren't putting it on ChatGPT. Their data is stored in S3, Snowflake, BigQuery, or Azure Storage, so it makes more sense to use the respective cloud provider's LLM hosting service. You can use OpenAI's GPT models or Anthropic's models hosted on Azure or AWS.

You're telling me companies in Europe aren't putting all their user data in European AWS and Azure regions? Both AWS and Azure are gigantic in Europe.

sigmoid10 4 days ago

What are you getting at? This is not what was said.

spmurrayzzz 6 days ago

Supporting local models can be done by overriding one or two environment variables, as long as your local inference server exposes an OpenAI-compatible endpoint (which the majority of local stacks ship with).
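
For instance, a minimal sketch using the openai Python SDK pointed at an Ollama-style local server (the URL, port, and model name here are assumptions; substitute whatever your local stack actually exposes):

    # Point the standard OpenAI SDK at a local OpenAI-compatible server
    # instead of api.openai.com.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's default; llama.cpp/vLLM expose similar endpoints
        api_key="unused",                      # most local servers ignore the key
    )

    resp = client.chat.completions.create(
        model="llama3.1",  # placeholder: whatever model your server serves
        messages=[{"role": "user", "content": "Summarize this document."}],
    )
    print(resp.choices[0].message.content)

You get the same effect with no code changes at all by setting OPENAI_BASE_URL and OPENAI_API_KEY in the environment, which the SDK picks up automatically.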

Was there some level of support beyond this that you were referring to?

rjakob 6 days ago

Good point. The current focus is on improving AI feedback quality, not the business model, but we’ll definitely consider local model support for privacy-conscious users. Thanks for the input!