23 comments

  • Lammy 2 hours ago

    Downside of trading privacy for security: anything that makes a network connection creates metadata about you, and the metadata is the real danger for analyzing your social connections: https://kieranhealy.org/blog/archives/2013/06/09/using-metad...

    The problem isn't about the big corporations themselves but about the fact that the network itself is always listening and the systems the big corporations build tend to incentivize making as many metadata-leaking connections as possible, either in the name of advertising to you or in the name of Keeping You Safe™: https://en.wikipedia.org/wiki/Five_Eyes

    Transparent WWW caching is one example of a pro-privacy setup that used to be possible and is no longer feasible due to pervasive TLS. I used to have this kind of setup in the late 2000s when I had a restrictive Comcast data cap. I had a FreeBSD gateway machine and had PF tied in to Squid so every HTTP request got cached on my edge and didn't hit the WAN at all if I reloaded the page or sent the link to a roommate. It's still technically possible if one can trust their own CA on every machine on their network, but in the age of unlimited data who would bother?
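The setup described above can be sketched roughly as follows. This is a hypothetical excerpt, not the original config: the interface name (em1), Squid port (3128), and cache paths/sizes are all assumptions, and the `transparent` keyword is the Squid 2.x-era spelling (newer versions call it `intercept`):

```
# /etc/pf.conf (excerpt) -- hypothetical FreeBSD gateway rules
int_if = "em1"   # LAN-facing interface (assumed name)

# Redirect all outbound HTTP from the LAN to the local Squid instance,
# so requests never reach the WAN if the cache can answer them
rdr pass on $int_if inet proto tcp from $int_if:network to any port 80 -> 127.0.0.1 port 3128

# /usr/local/etc/squid/squid.conf (excerpt)
http_port 127.0.0.1:3128 transparent      # "intercept" on Squid 3.1+
cache_dir ufs /var/squid/cache 10000 16 256   # ~10 GB on-disk cache
```

With rules like these, a repeated request for the same HTTP URL is served from the local cache without touching the WAN. None of it survives contact with HTTPS, though, since the proxy can't see inside TLS without a CA trusted on every client machine.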

    Other example: the Mac I'm typing this on phones home every app I open in the name of “““protecting””” me from malware. Everyone found this out the hard way in November 2020 and the only result was to encrypt the OCSP check in later versions. Later versions also exempt Apple-signed binaries from filters like Little Snitch so it's now even harder to block. Sending those requests at all effectively gives interested parties the ability to run a “Hey Siri, make a list of every American who has used Tor Browser” type of analysis if they wanted to: https://lapcatsoftware.com/articles/ocsp-privacy.html

    • kmeisthax 41 minutes ago

      Transparent HTTP caching as a way to avoid leaking metadata is not pro-privacy. It only works because the network is always listening, to both metadata and message content. The reason why people worry about metadata is because it's a way to circumvent encryption (and the law). Metadata is holographic[0] to message content, so you need to protect it with the same zeal content is protected.

      But letting everyone have the message content so that metadata doesn't leak isn't helpful. Maybe in the context in which it was deployed, when pervasive deep packet inspection was something only China wasted CPU cycles on, your proxy made sense. But it doesn't make sense today.

      [0] X is holographic to Y when the contents of X can be used to completely reconstruct Y.

    • anthk an hour ago

      Stop using a damn Mac first.

      • Lammy 41 minutes ago

        It's my work computer — not my choice. At home I use a Corebooted 51nb neo-ThinkPad.

  • WaitWaitWha 2 hours ago

    One key part is that the crypto wars were around export, lest we forget "PGP Source Code and Internals".

    If there had been no international business, crypto of any strength could and would have been used.

    • convolvatron 2 hours ago

      there was a huge chilling effect on both product and protocol design. In the 90s I had to fill out a form and submit it to RSA in order to get a copy of their library. I eventually got it after waiting six months, but I had to agree not to redistribute it in any way.

      Efforts to design foundational cryptographic protocols were completely hamstrung by the spectre of ITAR and the real possibility that designs would have to be US-only. Right around the time the US gave up, the commercial community was taking off, and it wasn't interested in further standardization except where it created moats for business - which is why we're still stuck in the 90s as far as the network layer goes.

  • g-b-r 20 minutes ago

    Aside from everything else, I don't understand what Whittaker's point was; she seemed to ultimately be advocating for something, but I can't understand what, exactly.

    It's probably in the talk's last sentences:

    > We want not only the right to deploy e2ee and privacy-preserving tech, but the power to make determinations about how, and for whom, our computational infrastructures work. This is the path to privacy, and to actual tech accountability. And we should accept nothing less.

    But who are "we" and "whom", and what "computational infrastructure" is she referring to?

  • hobs 2 hours ago

    In a nutshell, I don't think we would have seen much change - corporations only engage in security insofar as they are required to. Even in this "metastatic SSL-enabled growth" we've basically sold out security to the lowest common denominator, and core actors in the industry just use these security features as a fig leaf to pretend they give a single crap.

    Now, would CERTAIN industries exist without strong cryptography? Maybe not, but commerce doesn't really care about privacy in most cases, it cares about money changing hands.

    • red_admiral 3 minutes ago

      Cryptocurrency, if you accept it and its ecosystem as an industry, would certainly not exist. And as for privacy, a fairy dies every time someone praises bitcoin for being anonymous.

    • InDubioProRubio 2 hours ago

      I don't know; they sure make sure the paper trail is shredded with the Azure Document 365 subscription. When it comes to security from liability, everything is top notch.

  • anovikov an hour ago

    How could that have stayed relevant for more than a few more years? The world does not end with the US. Regardless of the ban, strong crypto would have been developed elsewhere, as open source, and proliferated to the point of making continuation of the ban impossible. By ~2005 or earlier, it would have been either the US closing itself off from the global Internet, becoming a digital North Korea of sorts, or allowing strong crypto.

    • wmf an hour ago

      Popular OSes and browsers have almost entirely come from the US. If people had a choice between IE with weak crypto or Opera with strong crypto they absolutely would have chosen IE.

  • ForHackernews 2 hours ago

    I haven't seen the talk, but it sounds plausible to me: Technical people got strong crypto so they didn't worry about legislating for privacy.

    We still have this blind spot today: Google and Apple talk about security and privacy, but what they mean by those terms is making it so only they get your data.

    • MattJ100 an hour ago

      > Technical people got strong crypto so they didn't worry about legislating for privacy.

      The article debunks this, demonstrating that privacy was a primary concern decades ago (e.g. A Cypherpunk's Manifesto), and that mass surveillance was already happening even further back.

      I think it's fair to say that security has made significantly more progress over the decades than privacy has, but I don't think there is evidence of a causal link. Rather, privacy rights are held back because of other separate factors.

      • thecrash an hour ago

        As you point out, decades ago privacy was a widespread social value among everyone who used the internet. Security through cryptography was also a widespread technical value among everyone (well at least some people) who designed software for the internet.

        Over time, because security and cryptography were beneficial to business and government, cryptography got steadily increasing technical investment and attention.

        On the other hand, since privacy as a social value does not serve business or government needs, it has been steadily de-emphasized and undermined.

        Technical people have coped with the progressive erosion of privacy by pointing to cryptography as a way for individuals to uphold their privacy even in the absence of state-protected rights or a civil society which cares. This is the tradeoff being described.

      • ForHackernews an hour ago

        > demonstrating that privacy was a primary concern (e.g. Cypherpunk's Manifesto) decades ago. Also that mass surveillance was already happening even further back.

        How does that debunk it? If they were so concerned, why didn't they do anything about it?

        One plausible answer: they were mollified by cryptography. Remember when it was revealed that the NSA was sniffing cleartext traffic between Google data centers[0]? In response, rather than campaigning for changes to legislation (requiring warrants for data collection, etc.), the big tech firms just started encrypting their internal traffic. If you're Google and your adversaries are nation state actors and other giant tech firms, that makes a lot of sense.

        But as far as user privacy goes, it's pointless: Google is the adversary.

        [0] https://theweek.com/articles/457590/why-google-isnt-happy-ab...

  • RamAMM 2 hours ago

    The missed opportunity was to provide privacy protection before everyone stepped into the spotlight. The limitations on RSA key sizes etc. (symmetric key lengths, 3DES limits) did not materially affect the outcomes, as we can see today. What did happen is that regulation was passed to allow 13-year-olds to participate online, much to the detriment of our society. What did happen was that businesses, including credit agencies, leaked ludicrous amounts of PII with no real harm to their bottom lines. The GOP themselves leaked the name, SSN, sex, and religion of over a hundred million US voters, again with no harm to the leaking entity.

    We didn't go wrong in limiting export encryption strength to the evil 7, and we didn't go wrong in loosening encryption export restrictions. We entirely missed the boat on what matters by failing to define and protect the privacy rights of individuals until nearly all that mattered was publicly available to bad actors through negligence. This is part of the human propensity to prioritize today over tomorrow.

    • elric 2 hours ago

      > What did happen is that regulation was passed to allow 13 year olds to participate online much to the detriment of our society.

      That's a very hot take. Citation needed.

      I remember when the US forced COP(P?)A into being. I helped run a site aimed at kids back in those days. Suddenly we had to tell half of those kids to fuck off because of a weird and arbitrary age limit. Those kids were part of a great community, had a sense of belonging which they often didn't have in their meatspace lives, they had a safe space to explore ideas and engage with people from all over the world.

      But I'm sure that was all to the detriment of our society :eyeroll:.

      Ad peddling, stealing and selling personal information, that has been detrimental. Having kids engage with other kids on the interwebs? I doubt it.

      • ryandrake an hour ago

        Kids are not stupid, though. They know about the arbitrary age limit, and they know that if they are under that limit, their service is nerfed and/or not allowed. So, the end effect of COPPA is that everyone under 13 simply knows to use a fake birthdate online that shows them to be over the limit.

        • elric an hour ago

          Sure, it's one of the many rules that are bent and broken on a daily basis. That doesn't make it any less stupid. And it falls on the community owner to enforce, which is doubly stupid: the only way to prove age is to provide ID, which requires a lot of administration, and that data then becomes a liability.

      • dfxm12 an hour ago

        > COP(P?)A

        COPA [0] is a different law which never took effect. COPPA [1] is what you're referring to.

        > Ad peddling, stealing and selling personal information, that has been detrimental.

        I agree and what's good for the gander is good for the goose. Why did we only recognize the need for privacy for people under an arbitrary age? We all deserve it!

        0 - https://en.wikipedia.org/wiki/Child_Online_Protection_Act

        1 - https://en.wikipedia.org/wiki/Children%27s_Online_Privacy_Pr...

      • bippihippi1 an hour ago

        the issue with online kids isn't just the availability of the internet to kids but the availability of the kids to the internet

  • ikmckenz 37 minutes ago

    This is a good article, and it thoroughly debunks the proposed tradeoff between fighting corporate vs. government surveillance. It seems to me that the people who concentrate primarily on corporate surveillance mostly want government solutions (privacy regulations, for example), and eventually get it into their heads that the NSA are their friends.