chaosprint 6 days ago

it seems that this free version "may use your prompts and completions to train new models"

https://openrouter.ai/deepseek/deepseek-chat-v3-0324:free

do you think this needs attention?

wgd 6 days ago

That's typical of the free options on OpenRouter; if you don't want your inputs used for training, use the paid one: https://openrouter.ai/deepseek/deepseek-chat-v3-0324
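
For reference, in practice the difference is just the model slug on OpenRouter's OpenAI-compatible endpoint. A minimal sketch, assuming the `openai` Python package and an OPENROUTER_API_KEY environment variable:

    # Minimal sketch of calling OpenRouter's OpenAI-compatible API; assumes the
    # `openai` Python package and an OPENROUTER_API_KEY environment variable.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    )

    # The ":free" suffix selects the free variant, which per its listing may use
    # your prompts for training; dropping the suffix selects the paid one.
    model = "deepseek/deepseek-chat-v3-0324"         # paid
    # model = "deepseek/deepseek-chat-v3-0324:free"  # free

    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(resp.choices[0].message.content)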

overfeed 6 days ago

Is OpenRouter planning on distilling models off the prompts and responses from frontier models? That's smart - a little gross - but smart.

numlocked 6 days ago

COO of OpenRouter here. We are simply stating that WE can't vouch for the upstream provider's retention and training policy. We don't save your prompt data, regardless of the model you use, unless you explicitly opt in to logging (in exchange for a 1% inference discount).

overfeed 6 days ago

I'm glad to hear you are not hoovering up this data for your own purposes.

simonw 6 days ago

That 1% discount feels a bit cheap to me - if it was a 25% or 50% discount I would be much more likely to sign up for it.

numlocked 6 days ago

We don’t particularly want our customers’ data :)

oofbaroomf 6 days ago

Yeah, but OpenRouter has a 5% surcharge anyway.

YetAnotherNick 5 days ago

A better way to state it, then: the 1% discount is 20% of the 5% surcharge :)

vintermann 5 days ago

You clearly want it a little if you give a discount for it?

huijzer 6 days ago

Since we are on HN here, I can highly recommend open-webui with some OpenAI-compatible provider. I've been running it with Deep Infra for more than a year now and am very happy. New models are usually available within one or two days after release. I also have some friends who use the service almost daily.
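
For anyone who hasn't tried it: open-webui just needs an OpenAI-compatible base URL and API key (via its connection settings or the OPENAI_API_BASE_URL environment variable, if I remember the knobs right). A quick sanity-check sketch before wiring it in; the DeepInfra base URL and env var name here are assumptions, so substitute whatever your provider documents:

    # Sanity-check an OpenAI-compatible provider before pointing open-webui at it.
    # The base URL and env var name are assumptions for DeepInfra; substitute
    # whatever your provider documents.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://api.deepinfra.com/v1/openai",  # assumed endpoint
        api_key=os.environ["DEEPINFRA_API_KEY"],
    )

    # If this prints a list of model ids, the same base URL and key should work
    # in open-webui's connection settings.
    for model in client.models.list():
        print(model.id)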

l72 6 days ago

I too run open-webui locally and use deepinfra.com as my backend. It has been working very well, and I am quite happy with deepinfra's pricing and privacy policy.

I have set up the same thing at work for my colleagues, and they find it better than openai for their tasks.

jychang 5 days ago

Yeah, open-webui is the best frontend for API queries. Everything seems to work well.

I've tried LibreChat before, but it's terrible at generating titles for chats, often just leaving them as "New Chat". It also lacks a working Code Interpreter.

unquietwiki 6 days ago

I'm using open-webui at home with a couple of different models. gemma2-9b fits in VRAM on an NV 3060 card and performs nicely.

mdp2021 5 days ago

> performs nicely

Do you have a rough indication of tokens/s?

zakki 6 days ago

What is the memory of your NV3060? 8GB?

ngvjmfgb 6 days ago

12GB (edit: that is what mine is)

totetsu 6 days ago

And it's quite easy to set up a Cloudflare tunnel to make your open-webui instance accessible online to just you.

simonw 6 days ago

... or a Tailscale network. I've been leaving open-webui running on my laptop on my desk, then going out into the world and accessing it from my phone via Tailscale. Works great.

totetsu 6 days ago

I would use Tailscale, but I specifically want to use open-webui from a place where I can't install a Tailscale client.

fragmede 5 days ago

where's that?

wkat4242 6 days ago

Yeah, this sounds like the more secure option; you don't want to be dependent on a single flaw in a web service.

wkat4242 6 days ago

Yeah, OpenWebUI is great with local models too. I love it. You can even do a combo: send the same prompt to local and cloud models, even across providers, and compare the results.
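
If you want to script the same comparison outside the UI, here's a rough sketch that fans one prompt out over OpenAI-compatible endpoints; the Ollama base URL, the model names, and the API keys are assumptions for illustration:

    # Rough sketch: fan one prompt out to several OpenAI-compatible endpoints and
    # compare the answers. Base URLs, keys and model names are assumptions.
    import os
    from openai import OpenAI

    PROMPT = "Summarize the trade-offs of running an LLM locally vs in the cloud."

    backends = [
        # (label, base_url, api_key, model)
        ("local-ollama", "http://localhost:11434/v1", "ollama", "llama3.1"),
        ("openrouter", "https://openrouter.ai/api/v1",
         os.environ["OPENROUTER_API_KEY"], "deepseek/deepseek-chat-v3-0324"),
    ]

    for label, base_url, key, model in backends:
        client = OpenAI(base_url=base_url, api_key=key)
        resp = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": PROMPT}],
        )
        print(f"--- {label} ({model}) ---")
        print(resp.choices[0].message.content)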

eurekin 5 days ago

I've tried using it, but its browser tab seems to peg one core at 100% after some time. Has anyone else experienced this?

indigodaddy 5 days ago

Can open-webui update code on your local computer, a la Cursor etc.?

cess11 5 days ago

It has a module system, so maybe it can, but it seems more people are using Aider or Continue for that. There's a bit of stitching things together regardless of whether you show your project to some SaaS or run local models, but if you can manage a Linux system it'll be easy.

Personally I heavily dislike the experience though, so I might not be the best one to answer.

TechDebtDevin 6 days ago

That's because it's a third-party API someone is hosting, trying to arbitrage the infra cost or mine training data, or maybe do something even more sinister. I stay away from OpenRouter APIs that aren't served by reputable, well-known companies, and even then...

madduci 5 days ago

As always, avoid sending sensitive information and you're good to go.

behnamoh 6 days ago

good grief! people are okay with it when OpenAI and Google do it, but as soon as open source providers do it, people get defensive about it...

chaosprint 6 days ago

No, it's got nothing to do with DeepSeek; it's about OpenRouter and the providers there.

londons_explore 6 days ago

I trust big companies far more with my data than small ones.

Big companies have so much data they won't be having a human look at mine specifically. Some small place probably has the engineer looking at my logs as user #4.

Also, big companies have security teams whose job is securing the data, and it won't be going over some unencrypted link to Cloudflare because OP was too lazy to set up HTTPS certs.

henry2023 6 days ago

Equifax.

jimmygrapes 6 days ago

I'm not convinced any humans have worked there for most of my lifetime.