I like that Microsoft is being responsible and will not release any API or software until they are certain that the technology will be used responsibly. #GenerativeAI #vasa1
https://www.microsoft.com/en-us/research/project/vasa-1/
(wp.me pn73N-tjT)
From Microsoft Research: The woman in this video is not real. VASA-1 is capable of not only producing lip movements that are exquisitely synchronized with the audio, but also capturing a large spectrum of facial nuances and natural head motions that contribute to the perception of authenticity & liveliness.
Oof. This is a horrible deep fake nightmare…
Microsoft's VASA-1: Lifelike Audio-Driven Talking Faces Generated in Real Time #vasa #vasa1 #microsoft #ai #research #artificialintelligence
Single portrait photo + speech audio = hyper-realistic talking face video with precise lip-audio sync, lifelike facial behavior, and naturalistic head movements, generated in real time.
Microsoft’s VASA-1 can #deepfake a person with one photo and one audio track
#privacy #Microsoft #vasa1
Here we go again... What they are capable of creating with AI tech is incredible. I do enjoy the development of AI and all its uses that can help or entertain us.
Though it comes with challenges. When general trust goes out the window, some gatekeepers will take up the role of enforcing trust and present themselves as the keeper and judge of truth, and that worries me greatly. AI used extensively in the wrong way can lead to so much mistrust that someone has to come in and enforce a monopoly on trust. With that much power, temptations arise.
On the other hand, maybe we will learn to go back to analog for truth.
In any case, the need to learn how to be skeptical of any digital content only grows. Separating fake from real is a much-needed skill.
#AI #VASA1 #Deepfake
Microsoft’s VASA-1 can deepfake a person with one photo and one audio track