I feel like this is all smoke and mirrors to redirect from the likelihood of intentional silicon backdoors that are effectively undetectable. Without open silicon, there's no way to detect that -- say -- when registers r0-rN are set to values [A, ..., N] and a jump to address 0xCONSTANT occurs, additional access is granted to a monitor process.
Of course, this limits the potential attackers to 1) exactly one government (or an N-eyes coalition) or 2) one company, but there's really no way you can trust remote hardware.
This _does_ increase the trust that the VMs are safe from other attackers, but I guess this depends on your threat model.
Concrete example of such backdoors: https://www.bloomberg.com/news/features/2018-10-04/the-big-h...
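To make the trigger described above concrete, here's a toy model of such a backdoor; every constant and name is made up, and it's a sketch of the *shape* of the logic, not a real exploit: the check lives only in the chip's jump handling, so no software-level audit would ever see it.

```python
# Illustrative model of a hypothetical silicon backdoor trigger.
# All constants (MAGIC_REGS, MAGIC_JUMP) are invented for this sketch.

MAGIC_REGS = [0xDEADBEEF, 0x1337, 0xCAFEBABE]  # hypothetical r0..r2 values
MAGIC_JUMP = 0x0000F00D                        # hypothetical trigger address

class BackdooredCPU:
    def __init__(self):
        self.regs = [0] * 8
        self.monitor_access = False  # hidden extra privilege bit

    def jump(self, addr):
        # Undocumented check baked into the jump logic: invisible to any
        # inspection of the software running on the chip.
        if addr == MAGIC_JUMP and self.regs[:3] == MAGIC_REGS:
            self.monitor_access = True
        # ... normal jump handling would continue here

cpu = BackdooredCPU()
cpu.jump(MAGIC_JUMP)             # ordinary jump: nothing happens
assert not cpu.monitor_access
cpu.regs[:3] = MAGIC_REGS
cpu.jump(MAGIC_JUMP)             # magic registers + magic address: trigger
assert cpu.monitor_access
```

The point of the sketch is that the trigger condition is indistinguishable from normal operation until the exact magic state occurs, which is why only open silicon (or destructive die analysis) could rule it out.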
The system is protecting you against Apple employees, but not against law enforcement.
No matter how many layers of technology you put in place, at the end of the day US companies have to respect US law.
The requests can be routed to specific investigation / debugging / beta nodes.
All it takes is turning on a flag for specific users.
It's not like ultimate privacy, but at least it will prevent Apple engineers from snooping into private chatlogs.
(like some pervert at Gmail was stalking a little girl https://www.gawkerarchives.com/5637234/gcreep-google-enginee... , or Zuckerberg himself reading chatlogs https://www.vanityfair.com/news/2010/03/mark-zuckerberg-alle... )
IIRC, no real proof was ever provided for that Bloomberg article (though it was also never retracted). Many looked for the chips, and from everything I heard there was never a concrete case where one was found.
Doesn't make the possible threat less real (see recent news in Lebanon), but that story in particular seems not to have stood up to closer scrutiny.
This is an interesting idea. However, what does open hardware mean? How can you prove that the design or architecture that was “opened” is actually what was built? What does the attestation even mean in this scenario?
Yeah, but, considering the sheer complexity of modern CPUs and SoCs, this is still the case even if you have the silicon in front of you. That ship sailed some time ago.
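The attestation question above can be made concrete with a toy model of measured boot: the device hashes the firmware it loaded and you compare that against a published reference measurement. The names and values here are illustrative, and the signing step is omitted; the point is what the check does and does not cover.

```python
import hashlib

# Toy model of measured-boot attestation. In real systems the measurement
# is signed by a hardware key; here we just compare hashes.

published_measurement = hashlib.sha256(b"open firmware v1.0").hexdigest()

def attest(loaded_firmware: bytes) -> str:
    # What the hardware actually measures: the *software* it runs.
    return hashlib.sha256(loaded_firmware).hexdigest()

# Attestation succeeds: the firmware matches what was published...
assert attest(b"open firmware v1.0") == published_measurement
# ...but nothing in this check says anything about the transistors doing
# the measuring. A backdoored chip could report the same hash.
assert attest(b"tampered firmware") != published_measurement
```

So attestation pins the software stack to a published design, but the root of trust is still the silicon itself, which is exactly the gap the comment is pointing at.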
Looks like they are really writing everything in Swift on the server side.
Repo: https://github.com/apple/security-pcc
I hope this helps people consider Swift 6 as a viable option for server-side development: it offers many of the same safety guarantees as Rust, with memory management through ARC that is simpler than Rust's ownership system and more predictable than Go's garbage collector.
Wow! This is great!
I hope you'll consider adding witness cosignatures on your transparency log though. :)
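For readers unfamiliar with the suggestion, the idea behind witness cosignatures is that a log checkpoint is only trusted once a quorum of independent witnesses has signed the same checkpoint, so the log operator can't silently show different views to different clients. A minimal sketch, with signature verification stubbed out and all names invented:

```python
# Sketch of quorum-based witness cosignature checking. Real systems verify
# cryptographic signatures; here a cosignature is just (witness, signed_hash).

REQUIRED_WITNESSES = 2  # quorum threshold, e.g. 2-of-3

def checkpoint_trusted(checkpoint_hash, cosignatures, known_witnesses):
    valid = sum(
        1 for (witness, signed_hash) in cosignatures
        if witness in known_witnesses and signed_hash == checkpoint_hash
    )
    return valid >= REQUIRED_WITNESSES

witnesses = {"witness-a", "witness-b", "witness-c"}
cosigs = [("witness-a", "abc123"), ("witness-b", "abc123")]

assert checkpoint_trusted("abc123", cosigs, witnesses)          # quorum met
assert not checkpoint_trusted("abc123", cosigs[:1], witnesses)  # only 1 sig
```

The design choice is that trust shifts from one log operator to an assumption that fewer than the quorum of witnesses collude, which is a much weaker assumption to have to make.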
How is this different than a bug bounty?
Well, they are providing a dedicated environment from which to attack their infrastructure. But they also have a section called “Apple Security Bounty for Private Cloud Compute” in the linked article, so this is a bug bounty plus additional goodies to help you test their security.
Similar, but a lot of documentation is provided, plus source code for cross-reference, and a VM-based research environment instead of having to apply for a physical Security Research Device.