Security research on Private Cloud Compute

(security.apple.com)

82 points | by todsacerdoti 3 hours ago

11 comments

  • mmastrac 34 minutes ago

    I feel like this is all smoke and mirrors to redirect attention from the likelihood of intentional silicon backdoors that are effectively undetectable. Without open silicon, there's no way to detect that -- say -- when registers r0-rN are set to values [A, ..., N] and a jump to address 0xCONSTANT occurs, additional access is granted to a monitor process.
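
    A software model of what such a trigger might amount to (purely illustrative -- the register pattern and trigger address below are invented, and a real backdoor would live in silicon, invisible to any code audit):

        // Hypothetical model of a silicon backdoor trigger.
        // The magic values and addresses are invented for illustration.
        struct CPUState {
            var registers: [UInt64]      // r0...rN
            var monitorAccess = false    // privilege bit no software can observe
        }

        // Imagined hook on every jump: if the registers hold a magic
        // pattern when control reaches a specific address, silently escalate.
        func onJump(to address: UInt64, state: inout CPUState) {
            let magicPattern: [UInt64] = [0x41, 0x42, 0x43]  // [A, ..., N]
            let triggerAddress: UInt64 = 0xDEAD_BEEF         // 0xCONSTANT
            if address == triggerAddress,
               Array(state.registers.prefix(magicPattern.count)) == magicPattern {
                state.monitorAccess = true  // grant the monitor process access
            }
        }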

    Of course, this limits the potential attackers to 1) exactly one government (or some N-eyes alliance of them) or 2) one company, but there's really no way you can trust remote hardware.

    This _does_ increase the trust that the VMs are safe from other attackers, but I guess this depends on your threat model.

    • rvnx 6 minutes ago

      Concrete example of such backdoors: https://www.bloomberg.com/news/features/2018-10-04/the-big-h...

      The system protects you against Apple employees, but not against law enforcement.

      No matter how many layers of technology you put in place, at the end of the day, US companies have to comply with US law.

      Requests can be routed to specific investigation/debugging/beta nodes just by turning on a flag for specific users.

      It's not ultimate privacy, but at least it will prevent Apple engineers from snooping on private chat logs.

      (like the Google engineer who was stalking a young girl through Gmail https://www.gawkerarchives.com/5637234/gcreep-google-enginee... , or Zuckerberg himself reading chat logs https://www.vanityfair.com/news/2010/03/mark-zuckerberg-alle... )

      • 0xCMP a minute ago

        IIRC, no real proof was ever provided for that Bloomberg article (though it was also never retracted). Many people looked for the chips, and from everything I heard there was never a concrete case where one was found.

        That doesn't make the possible threat less real (see the recent news in Lebanon), but that story in particular seems not to have stood up to closer scrutiny.

    • yalogin 26 minutes ago

      This is an interesting idea. However, what does open hardware mean? How can you prove that the design or architecture that was “opened” is actually what was built? What does the attestation even mean in this scenario?

    • SheinhardtWigCo 17 minutes ago

      Yeah, but considering the sheer complexity of modern CPUs and SoCs, this is still the case even if you have the silicon in front of you. That ship sailed some time ago.

  • dewey 2 hours ago

    Looks like they are really writing everything in Swift on the server side.

    Repo: https://github.com/apple/security-pcc

    • tessela 37 minutes ago

      I hope this helps people consider Swift 6 as a viable option for server-side development: it offers many of Rust's modern safety features, with memory management through ARC that is both simpler than Rust's ownership system and more predictable than Go's garbage collector.
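
      A minimal sketch of the data-race safety Swift 6 enforces at compile time (names here are illustrative, not from the PCC repo):

          // An actor serializes access to its state; Swift 6 rejects
          // unsynchronized access at compile time, so no data race is possible.
          actor RequestCounter {
              private var count = 0
              func increment() -> Int {
                  count += 1
                  return count
              }
          }

          let counter = RequestCounter()

          // A thousand concurrent increments, with no locks and no GC pauses:
          // ARC frees memory deterministically as references drop.
          await withTaskGroup(of: Void.self) { group in
              for _ in 0..<1_000 {
                  group.addTask { _ = await counter.increment() }
              }
          }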

  • kfreds 2 hours ago

    Wow! This is great!

    I hope you'll consider adding witness cosignatures to your transparency log though. :)
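
    Roughly what verification would look like on the client side (a sketch assuming a simple N-of-M threshold policy and Ed25519 witness keys via CryptoKit; the checkpoint format is an assumption, not Apple's actual log design):

        import CryptoKit
        import Foundation

        struct Witness {
            let name: String
            let publicKey: Curve25519.Signing.PublicKey
        }

        // Accept a checkpoint only if at least `threshold` independent
        // witnesses have cosigned the exact same serialized tree head.
        func isCosigned(checkpoint: Data,
                        signatures: [String: Data],
                        witnesses: [Witness],
                        threshold: Int) -> Bool {
            let validCount = witnesses.filter { w in
                guard let sig = signatures[w.name] else { return false }
                return w.publicKey.isValidSignature(sig, for: checkpoint)
            }.count
            return validCount >= threshold
        }

    The point of the cosignatures is that a split-view attack becomes detectable: the log operator can't show different checkpoints to different clients without the witnesses' signatures diverging.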

  • ngneer 15 minutes ago

    How is this different from a bug bounty?

    • alemanek 6 minutes ago

      Well, they are providing a dedicated environment from which to attack their infrastructure. But they also have a section called “Apple Security Bounty for Private Cloud Compute” in the linked article, so this is a bug bounty plus additional goodies to help you test their security.

    • davidczech 8 minutes ago

      Similar, but a lot of documentation is provided, along with source code for cross-reference and a VM-based research environment, instead of having to apply for a physical Security Research Device.