This is going to help scammers a lot. Imagine getting a call from your loved ones explaining they are in trouble and need immediate financial assistance. This is going to affect so many elderly and innocent people. OpenAI is developing these tools again by publicly scrapping your voice samples.
Remember, Microsoft is investing another $100 billion into this company to build the giant AI supercomputer. They are equally responsible for this kind of problem. If this money were invested in cancer research, nobody would have said anything, but everything they do will either enable scammers or steal jobs.
@nixCraft Exactly! This sounds extremely dangerous
@nixCraft techbros have become so isolated from reality that all they can think of is the intellectual exercise of proving “can we do this” and never stop to think “should we?” There’s barely a legitimate use case for this and a whole world of abusive ones
@nixCraft your voice doesn't need to be public for this to be feasible. If they really only need 15 seconds, they can call you with some excuse, record that and train on your responses. Hell, even some voicemail greetings would be sufficient.
@nixCraft Some of this money is being used for cancer research. It seems like you’ve noticed a negative bias in your news about AI: https://www.microsoft.com/en-us/research/project/ai-for-health/
@nixCraft Microsoft investing $100bil into AI when they could've invested it into xz. /hj
@nixCraft To me that's completely wrong.
AI can help better optimize cancer treatments and reduce treatment costs.
@nixCraft I had to check it wasn’t April 1st :/
@nixCraft within the family we agreed on a code word to establish whether it is a fake or the real person on the phone.

@spiezmaestro @nixCraft

We've done that as well with my immediate family.

In our house, we also don't answer any unknown phone callers.
If it's legit or important, they can leave a message.

I've also switched our answering machine message from one in my voice to the generic machine generated voice.

Ain't technology great?!? 🙄

@spiezmaestro @nixCraft Sad but essential tactic these days :(
@nixCraft More evidence that the purpose of #GenerativeAI is spam.
@nixCraft
One measure is to establish a password with one's peers. I have one with my wife and kids. If we can't tell each other the password (e.g. the response is "what the hell do you mean, you have to help me right now!!"), we'll consider it fraudulent and drop the connection.
@voidzero @nixCraft @ATLeagle I have this with my kids so they know never to discuss anything on a phone call from a police station. “I’m at this location I need a lawyer, I am fine, _______ (password)”. We also have a strict no negotiations with hostage takers so they know they’re on their own…. But that’s another story ;)
@PopeASDF
Hah, mine are a little too young for that second part yet :)
@nixCraft @ATLeagle
@nixCraft Who would this help that's not a scammer?
@jeni @nixCraft Advertisers and game studios replacing voice actors comes to mind.
@stefanie @nixCraft still sounds like a scam to me 🤷
@jeni @nixCraft I would almost argue all advertising is a scam, because a good product doesn't need advertising
@nixCraft What could possibly go wrong? /s
@nixCraft Thankfully, they're not releasing it just yet, but if they can do it, so can others...

@nixCraft we're all going to need personal passwords for family.

Fucking hell

@nixCraft Already had the conversation w mother after a spell of articles. Never used fb/stuff so where would they get my voice sample? Hope.... Sigh!
@dany_57987 @nixCraft If you use Win10 or 11, you have given Microsoft permission to use your microphone and listen in on you at any time without your knowledge.
It's part of the EULA.

@stefanie @nixCraft Nope, never did I let a windows install connect to any of that sh*t.

The frightening part is that if I had not had 35 years of software exp I would probably have done that unknowingly.

@nixCraft another dangerous use case is: it can say things in my voice I would never say, like right-wing propaganda. 💩
@lea oh yes, I agree. This could damage your image forever, and you wouldn't be able to go outside because of fake propaganda the AI made. This is just a nightmare waiting to happen
@nixCraft It’s already happening. The mother of a former coworker got a call about his daughter being in a car crash and needing monetary assistance. She almost didn’t believe him when he told her his daughter was fine and upstairs in her bedroom. The call was too realistic to her.
@nixCraft I struggle to think of one single non-malicious use of this technology????
@nixCraft @kastwey Yes, but this already exists; you have Eleven Labs, for example, and there are also open source models, so maybe the problem is not only with the big giants. Moreover, Microsoft has not yet released the model to the public

@nixCraft

I remember when you only used safe words for sexual peccadillos... sigh

@nixCraft Scraping… but yes. Totally.
@nixCraft ..true, there is no stopping it, however; the technology is already out there, also in the open source community

@ErikJonker @nixCraft It might be a good time to look back at more physical interactions - imagine the world's tightest weapons systems could be easily tricked by this.

https://en.m.wikipedia.org/wiki/We_begin_bombing_in_five_minutes


@nixCraft the traditional phone call with no two-way authentication is dead
@nixCraft woo hoo, more things that should be VERY illegal but aren't
@nixCraft have they *ever* considered maybe not building the machine that destroys the world?

@trisweb @nixCraft

"We are building the Torment Nexus from the beloved sci-fi novel 'We Shouldn't Build the Torment Nexus'"

@nixCraft Make sure to have a prearranged code word or phrase to use if in trouble.

@nixCraft It's awful, but that scam has also been going on for decades without AI. Meaning, the evidence of how people will for sure abuse this is right there; they just don't want to look at it.

But of course, this scam is more prominent in lower-income countries or countries with a lot of violence, and, I mean, eeww, right?

@nixCraft

Holy fuck!

And my bank last summer convinced me, for added security, to "use my voice as my PIN".

There is ZERO honest use for this kind of tech. Absolutely fucking zero.

@Lily_and_frog @nixCraft My bank, too. I never opted in nor recorded a sample. But I called them today and my voice print matched, the rep said.

@nixCraft

scraping !== scrapping

@nixCraft
Other ways to determine something is awry. Key thing is to know voice calls can be faked now.

“Mom, what’s wrong with Woofy?”
“Woofy’s fine dear. Where are you?”
(Terminator hangs up) “Your foster parents are dead.”

@nixCraft

Look, *someone* was going to do it. It may as well be a venture-funded company founded by a scary rich-people cult.

@nixCraft they know what people will do with their technologies, "Open" "AI" needs to be destroyed
@nixCraft Prediction: soon we start seeing articles on the theme of "how to choose a family password/challenge question."
@nixCraft
I'm going to use this to create messages from myself from the future to remember to take out the garbage and other critical tasks.
@nixCraft I thought about that, but I do think it's going to sound off to the recipient, because you can't use it to have a conversation; you can only use it to create recordings. So it may work for a voicemail, but not for an actual phone conversation or anything like that.

@anthropy @nixCraft people are already pulling this scam off now, without voice copying. But they have to stick to relatives whose voices are less familiar, grandchildren especially.

This just makes the scam more widely usable.

And if you think people won’t fall for it, you don’t know enough about how scams work.

@lkanies @nixCraft I never said you can't scam people even with stupid non-voice scams, I never said this isn't a problem, I just said you're not going to be able to have a conversation with this thing because it just generates samples and takes time doing so.

@anthropy @nixCraft there’s no link to the original article, but I assumed it would be connected to the chat functions, so it was generating real time audio.

Even without that, voicemail would become a scam vector (and already is for lots of other scams).

So sure, not being able to be real time reduces the problem for a while. But it’s still a huge increase in danger, without thinking at all about the consequences.

@lkanies @nixCraft I don't think it's generating real-time audio; at best it's a chat where you have to wait X seconds before you get a response, so not exactly real time. And again, I never said there wasn't an "increase in danger", I literally said "I also thought this", I merely added that it isn't real time or usable for real-time applications.

this didn't have to be an offense/defense kind of conversation, and honestly I'm not planning to get into one either, so I'll leave it at this.