OpenAI holds back wide release of voice-cloning tech due to misuse concerns

Voice Engine can clone voices with 15 seconds of audio, but OpenAI is warning of potential misuse.

https://arstechnica.com/information-technology/2024/03/openai-holds-back-wide-release-of-voice-cloning-tech-due-to-misuse-concerns/

Ars Technica

@arstechnica
I remember OpenAI holding back GPT-2 from public release in a very similar fashion: it's too powerful, concerns over misuse, etc. Then they went ahead and released it to the public anyway, with the potential abuses barely mitigated, and those abuses are now materializing exactly as predicted.

At this point, I can’t view this sort of “It’s too powerful to be public!!” statement as anything but prerelease marketing hype.

To be clear, I’m very much in favor of ethical non-release of dangerous tech.

If you’re going to do that, the way to do it is not to send out a press release. It’s to keep your dangerous discovery somewhere between low-key and confidential, and get your research community working to develop mitigations and countermeasures •before• your creation wanders into the view of investors and militaries.

@inthehands @BernieDoesIt OpenAI sent out a press release because, remember, their whole thing is doing a bit about how they're worried about creating Skynet, when they know very well that's not what they're doing.
@MisuseCase
Yes, my point exactly.