Deepfake voices improving kidnapping scams

https://sh.itjust.works/post/1063095

“A reasonably good clone can be created with under a minute of audio, and some are claiming that even a few seconds may be enough.” One mom is wary of answering calls for fear her voice will be cloned for a future virtual kidnapping.

I’m pro-AI, but any technology that can lead to the creation of deepfakes must be explicitly banned.

Naturally, we’re already talking about criminals, but you combat this issue the same way you combat school shootings: ban the root of the issue and actively prosecute anyone who dares acquire it illegally.

Would it even be possible to ban? Every military in the world wants this technology.

Well, the military has tanks too; should we go around selling tanks to the general population?
In the US it is legal to own a tank.
“If it’s legal to buy an AR-15 in America, why can’t you buy a tank? You can!” — John Blumenthal

“Yet one might wonder why functional tanks are classified as ‘destructive devices’ by the ATF and are subject to heavy restrictions, while the clearly destructive, military-style AR-15s are not regulated in most states,” writes John Blumenthal on cleveland.com.
I can’t even be surprised anymore; I’ve been desensitized.