Why do you think only one of those can be true?
Why would Apple care about the privacy implications of OpenAI? No one will blame Apple for privacy concerns that arise because of them.
Now that OpenAI’s technology is integrated all the way across Apple’s flagship software and devices, I guarantee you people will blame Apple if OpenAI fumbles privacy, even if the fumble is entirely on OpenAI’s end.
I’ve been hearing mixed reactions to Apple choosing OpenAI, partly because of the recent drama and partly because of Sam Altman specifically. To me, it feels like a “keep your enemies closer” decision on Apple’s part: the company sucks, but they do have a competitive (potentially superior) service at the moment.
And Apple has jack without some kind of partnership.
Well, the ChatGPT Mac app and the system-wide Siri AI integration are two different things.
Imagine if it was just the OpenAI app.
The masses want AI, even if they don’t know why. And OpenAI is a big name, even if they make Google look privacy-conscious. The smartest thing for Apple to do is to funnel as many inevitable OpenAI users on their platforms through their own sanitized version of the service.
I don’t like to blindly imagine things like most of the Lemmy user base.