Microsoft CEO calls for tech industry to ‘act’ after AI photos of Taylor Swift circulate on X
Satya Nadella spoke to Lester Holt about artificial intelligence and its ability to create deepfake images of others. After the pictures of Taylor Swift circulated, he called for action.
Isn’t this something that could have been done with Photoshop in 30 minutes? What’s the difference, when a nearly perfect result could have been produced just as easily?
P.S. I haven’t seen the images being discussed, which makes this even more alarming: legislation could be passed based on images you’re not even morally allowed to review. It could all be fictional and I would never know.
“Allowing entities other than us to control AI is dangerous. We must act!”
– Microsoft probably
I have no problem using the law to stop abusive deepfakes, but I do have a problem using the law to take AI away from regular people. Regular people need to be able to run their own AIs. All the worst outcomes involve taking AI away from regular people first.
MFW they make owning and operating AI illegal before guns
:: surprised Pikachu face ::
AI generation can be used for disinformation, which could destabilize or outright end the world as we know it.
But fake Taylor Swift pictures? That is where we draw the line…
"TAYLOR SWIFT WAS A LINE THAT SHOULD NOT HAVE BEEN CROSSED.
PREPARE TO REAP THE WHIRLWIND!"
– The White House, apparently.
I will never understand how Taylor Swift became this super duper billionaire royalty who I have to hear about every day now…
You mean you can’t understand how other people can like things you don’t?
No, nor would I want the insight into the minds of those who stan billionaires that such understanding would require.
Didn’t we all see this coming? Porn deepfakes were already a thing, and even before generative AI people were photoshopping women into explicit situations.
I’d even say that right now we have much better tools to deal with these fakes than we did before AI, and all that’s required is legislative action.
The tech is already capable of automatic facial recognition at scale, and we could give victims the tools to automatically send take-down notices and have them enforced.
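To make that claim concrete, here is a minimal sketch of the matching step only, assuming the open-source `face_recognition` Python library. The file names and the 0.6 distance cutoff are illustrative assumptions, not a real pipeline; an actual system would feed matches into human review before any take-down notice goes out.

```python
# Minimal sketch: check whether a posted image appears to contain a protected
# person's face, as a first step before drafting a take-down notice.
# Assumes the open-source `face_recognition` library (dlib-based).
# File names and the 0.6 threshold are illustrative choices.
import face_recognition

# Reference photo of the person who opted in to automated protection.
reference_image = face_recognition.load_image_file("reference_photo.jpg")
reference_encoding = face_recognition.face_encodings(reference_image)[0]

# Image scraped from the platform being monitored.
candidate_image = face_recognition.load_image_file("posted_image.jpg")
candidate_encodings = face_recognition.face_encodings(candidate_image)

for encoding in candidate_encodings:
    # Lower distance means a closer match; 0.6 is the library's usual default cutoff.
    distance = face_recognition.face_distance([reference_encoding], encoding)[0]
    if distance < 0.6:
        print(f"Possible match (distance={distance:.2f}), flag for take-down review")
        break
else:
    print("No match found in this image")
```

The scale question is mostly about running this comparison across a platform’s upload stream, which is exactly the kind of thing the big platforms already do for other content-matching tasks; the missing piece is the legal obligation to act on the result.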