tonygiorgio 11 hours ago

> Although PCC is currently unique to Apple, we can hope that other privacy-focused services will soon crib the idea.

IMHO, Apple's PCC is a step in the right direction given where general AI privacy stands today. It's not a perfect system, since it's not fully transparent or auditable, and I don't like their new opt-out photo-scanning feature running on PCC, but there's a lot to be inspired by.

My startup is going down this path too, building on top of AWS Nitro and Nvidia Confidential Computing to provide end-to-end encryption from the AI user to the model running on the enclave side of an H100. It's not widely known that you can do this with H100s, but I'd really like to see more of it in the next few years.

mnahkies 10 hours ago

I didn't actually realize that AWS supported this, I thought Azure was the only one offering it (https://azure.microsoft.com/en-us/blog/azure-confidential-co...)

Are you speaking of this functionality? https://developer.nvidia.com/blog/confidential-computing-on-... (and am I just failing to find the relevant AWS docs?)

tonygiorgio 9 hours ago

Yes, you're correct on both, though I think Google Cloud recently started supporting it as well. AWS will likely have GPU enclave support with Trainium 2 soon (AFAIK that feature isn't publicly offered yet, but I could be wrong).

We work with Edgeless Systems, who manage the GPU enclave on Azure that we talk to from our AWS Nitro instance. While not ideal, thanks to the power of enclaves and the attestation verification process, we at least know we're not leaking private data by going through a third-party GPU enclave provider.
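The core of that attestation check is comparing the launch measurement reported by the enclave against the digest you expect for the audited image. A minimal sketch of just that step (hypothetical values; real Nitro/NVIDIA attestation documents are signed structures whose certificate chain must be verified too):

```go
package main

import (
	"crypto/subtle"
	"encoding/hex"
	"fmt"
)

// verifyMeasurement compares a hex-encoded launch measurement from an
// attestation document against the digest expected for a known, audited
// enclave image. Constant-time compare avoids leaking match position.
func verifyMeasurement(attestedHex, expectedHex string) bool {
	expected, err := hex.DecodeString(expectedHex)
	if err != nil {
		return false
	}
	got, err := hex.DecodeString(attestedHex)
	if err != nil || len(got) != len(expected) {
		return false
	}
	return subtle.ConstantTimeCompare(got, expected) == 1
}

func main() {
	// Hypothetical measurement values for illustration.
	expected := "a1b2c3"
	fmt.Println(verifyMeasurement("a1b2c3", expected)) // true
	fmt.Println(verifyMeasurement("deadbe", expected)) // false
}
```

If the measurement doesn't match what you built and audited, you refuse to send any data — that's what lets us use someone else's GPU enclave without trusting them blindly.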

blueblimp 9 hours ago

And the most important thing about PCC, in my opinion, isn't the technical aspect (though that's nice) but that Apple views user privacy as something good to be maximized. That differs from the view championed by OpenAI and Anthropic (and by now adopted by Google and virtually every other major LLM provider) that user interactions must be surveilled for "safety" purposes. The lack of privacy isn't due to a technical limitation — it's intended, and they often brag about it.

flossposse 2 hours ago

If Apple really wanted to maximize privacy, they wouldn't be constantly collecting so much information in the first place (capture the network traffic from an Apple device sometime - it's crazy). User interactions on Apple devices definitely seem to be surveilled for "safety" purposes.

From my perspective, Apple's behavior indicates that what they want to maximize is their own control, and their position as the gatekeeper others must pay in order to get access to you.

natch 7 hours ago

Something good to be maximized within the constraints of the systems they have to work with, sure. But at some point, with enough compromises, it becomes maximizing the perception of privacy rather than the reality. Promoting these academic techniques may just be perception management on Apple's part if the keys are not controlled solely by the user.