sekdek 2 days ago

Got tired of writing 100-character-long variable names in Java during my 9-to-5, so I decided to relax with JS and built a website where you can clone voices and make them say any text.

Under the hood, it uses the open-source model XTTS-v2.
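
For anyone curious, a minimal sketch of what driving XTTS-v2 can look like, assuming the Coqui TTS Python package (this is illustrative; the paths and text are placeholders, not necessarily my exact setup):

    # Minimal XTTS-v2 voice-cloning sketch (assumes the Coqui TTS package: pip install TTS)
    from TTS.api import TTS

    # Load the pretrained multilingual XTTS-v2 model
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # Clone the voice from a short reference clip and synthesize new speech
    tts.tts_to_file(
        text="Any text you want the cloned voice to say.",
        speaker_wav="reference_voice.wav",  # a few seconds of the target speaker
        language="en",
        file_path="cloned_output.wav",
    )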

What do you think?

FreezerburnV 2 days ago

Speaking as someone with aspirations of voice acting who generally believes in consent for the use of people's voices, stuff like this raises my hackles. I know the cat is out of the bag and there's legal stuff happening right now around this, but I don't like seeing more of these pop up. I know you're doing this for fun and to relax, so I don't want to be a jerk and assume bad faith or anything, but I wanted to be honest about my opinion of this tech.

Vampiero 1 day ago

Come on, all of our butts are on the line. You're not special just because you have aspirations of voice acting. So many jobs are going that it's not even funny. If I were you, I'd start accepting the inevitable instead of denying it.

The second ChatGPT learns to reason effectively, this entire website and its users become useless. We're lucky that AI is kinda bad right now, and we can either hope that it has hit a plateau or prepare for the day when we need to find another job.

But asking nicely will get no one anywhere.

trod1234 1 day ago

I happen to fall into the same group as the previous poster, who is a voice actor.

I know you did this as a challenge for fun, but in fairness you should keep those challenges to yourself and not publicize them when the technologies involved are a Pandora's box or ethically dubious.

I find it really hard to think of a single plausible, beneficial use for this type of technology.

Nearly all uses directly or indirectly cause harm, except maybe voice restoration (someone whose vocal cords were damaged recreating their own voice), and that would only help a limited number of people.

Used in lieu of voice actors, it drives the demand for jobs in that sector to zero, eventually collapsing that economic cycle/market (non-market socialism -> failure).

The same goes for any type of acting, really, where it may violate the actors' publicity or other rights.

Then there's the extreme where people use this technology to extort ransoms through faux kidnappings.

If you created a public-facing website for this, I hope you ran it by a lawyer and have an ironclad liability waiver (though I'm not sure any waiver in this area of law would actually be sufficiently defensible). Inevitably, the question of negligence comes up after harm has been done.

You don't want to be named as a co-conspirator or otherwise in criminal proceedings when your service/application is used by criminals to commit crimes.

The juice isn't worth the squeeze imo.

brewtide 1 day ago

It's a tool. Hammer manufacturers are likely not worried about the broken-glass use cases, but the creator of this website should be?

trod1234 1 day ago

> not worried ... but the creator of this website should be?

They should be worried. Hammers have many legitimate uses; voice cloning, not so much. Claiming it's a tool is meaningless when it has few legitimate, legal uses. If there is any negligence at any step that violates the required duty of care, they will have to deal with costly court battles.

If you do a little research, you'll find this is not a new area of case law either. Napster made many arguments along these lines; we know how that turned out.

Your voice is also considered sensitive biometric data, often with a naturally higher degree of protection under the law in many municipalities/countries, especially when it's processed without proper consent.

When you create something, you always have to be aware of the potential harms related to its use and weigh them prior to public release. This is thoroughly covered in AI/business ethics courses.

usmanmehmood55 20 hours ago

If you want reassurance, a guy commented, "The Trump example on the homepage sounds nothing like Trump."

trod1234 16 hours ago

There is no reassurance to be had, since there have already been examples of voice cloning that are indistinguishable from the real person.

There is a very common saying in IT: "Attacks only get better."

Voice cloning falls under the umbrella of deepfake technologies. When it's done without consent, it is an attack on the person, and it violates any number of laws.

The ethics of building and releasing tools designed for deepfake generation have been widely discussed and debated. The general consensus among academics and industry (reasonable people) has been that it is not appropriate to build and release tools whose primary use is causing harm or breaking the law.

Negligence is a pretty low bar to reach in this instance; to win such a claim, all you need to show is duty, breach, causation, and damages.

It is almost certain that OP doesn't have consent forms from the people whose voices are being used on the front page.