If you want reassurance, one commenter noted that "The Trump example on the homepage sounds nothing like Trump."
But no real reassurance is possible: there are already examples of voice cloning that are indistinguishable from the real person.
There is a very common saying in IT: "Attacks only get better."
Voice cloning falls under the umbrella of deepfake technologies. When it's done without consent, it is an attack on the person, and it can violate any number of laws.
The ethics of building and releasing tools designed for deepfake generation have been widely discussed and debated. The general consensus among academics and industry (reasonable people) is that it is not appropriate to build and release tools whose primary use is causing harm or breaking the law.
Negligence is a pretty low bar to clear in this instance: to win such a claim, a plaintiff need only show duty, breach, causation, and damages.
It is almost certain that OP doesn't have consent forms from the people whose voices are being used on the front page.