IIRC they demonstrated an interaction with Siri where it asks the user for consent before enriching the data through ChatGPT. So yeah, that seems to mean your data is sent out (if you consent).
I’d say the burden of proof is on Apple to show that it’s being done on-device or that all processing is done on iCloud servers.
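To be concrete about what that consent step implies, here’s a minimal sketch of the routing decision as demoed (hypothetical names and types, not Apple’s actual API):

```swift
// Hypothetical sketch of the consent gate shown in the demo.
// None of these names are Apple's real APIs.
enum PromptRoute {
    case onDevice          // handled by the local model, nothing leaves the device
    case externalChatGPT   // forwarded to OpenAI, only after explicit per-request consent
}

func route(_ prompt: String, userConsented: Bool) -> PromptRoute {
    // The demo showed Siri asking before each ChatGPT hand-off,
    // so consent is the only thing standing between the prompt and OpenAI.
    userConsented ? .externalChatGPT : .onDevice
}

// Example: the user declines, so the prompt stays local.
print(route("Summarize my notes", userConsented: false)) // onDevice
```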
You’re saying that OpenAI is just going to hand over their full ChatGPT model for Apple to set up on their own servers for free?
But from the article itself:
the partnership could burn extra money for OpenAI, because it pays Microsoft to host ChatGPT’s capabilities on its Azure cloud
I could see it if they created a smaller version of their LLM to run locally, but I would expect Apple to pay even for that.
I think you may be confusing this ChatGPT integration with Apple’s own LLM that they’re working on…
Again, from the linked article:
Still, Apple’s choice of ChatGPT as Apple’s first external AI integration has led to widespread misunderstanding, especially since Apple buried the lede about its own in-house LLM technology that powers its new “Apple Intelligence” platform.
Do you have proof they’re sending it to OpenAI?
I believe I heard it’s done on-device or on iCloud servers and then deleted.
I mean, that’s the claim, at least:
https://security.apple.com/blog/private-cloud-compute
https://security.apple.com/blog/private-cloud-compute/ (see the section on Verifiable Security).
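As I read that section, the gist is that the device only agrees to send data to nodes whose software build has been publicly logged and can be attested. A rough sketch of that check (made-up names, not Apple’s actual protocol):

```swift
// Rough illustration of the "verifiable" idea from the PCC blog post:
// the device refuses to ship data unless the server cryptographically
// attests to running a software build that appears in a public log.
// All names here are made up for illustration.
struct AttestedNode {
    let softwareMeasurement: String  // hash of the build the node claims to run
}

func willingToSend(to node: AttestedNode,
                   publiclyLoggedBuilds: Set<String>) -> Bool {
    // If the attested build was never published for inspection, don't send.
    publiclyLoggedBuilds.contains(node.softwareMeasurement)
}

// Example: an unlogged build gets refused.
let node = AttestedNode(softwareMeasurement: "sha256:abc123")
print(willingToSend(to: node, publiclyLoggedBuilds: ["sha256:def456"])) // false
```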
Thanks! It’s a good read and I like the idea of a private cloud compute (PCC) system, but that doesn’t mention anywhere that ChatGPT will be running in that PCC system (if you were trying to imply that).
And while OpenAI could implement something similar to PCC, I haven’t seen them announce that anywhere either.
I don’t trust OpenAI but I do trust that Apple is doing what it can.