US has investigated claims WhatsApp chats aren't private

(bloomberg.com)

85 points | by 1vuio0pswjnm7 4 hours ago

239 comments

  • coppsilgold 22 minutes ago

    No closed-source E2EE client can be truly secure because the ends of e2e are opaque.

    Detecting backdoors is only truly feasible with open source software, and even then it can be difficult.

    A backdoor can be a subtle remote code execution "vulnerability" that can only be exploited by the server. If used carefully, exfiltrating data inside expected client-server communications, it can be all but impossible to detect. This approach also makes it more likely that almost no insider will even be aware of it; it could be a small patch applied during the build process or to the binary itself (for example, removing a bounds-check branch). This is also another reason why reproducible builds are a good idea for open source software.
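
    As a rough illustration of why reproducible builds help here: anyone can rebuild from the published source and compare the result against the shipped artifact. A minimal sketch, assuming the build is actually deterministic (the paths are placeholders):

      import hashlib
      import sys

      def sha256(path: str) -> str:
          # Hash the file in chunks so large binaries don't need to fit in memory.
          h = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(1 << 20), b""):
                  h.update(chunk)
          return h.hexdigest()

      # argv[1]: the artifact you built yourself from the published source
      # argv[2]: the artifact the vendor actually ships
      local, shipped = sha256(sys.argv[1]), sha256(sys.argv[2])
      print("match" if local == shipped else "MISMATCH: shipped binary differs from the source build")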

    • JasonADrury 9 minutes ago

      >Detecting backdoors is only truly feasible with open source software, and even then it can be difficult.

      This is absurd. Detecting backdoors is only truly feasible on binaries; there's no way you can understand compiler behavior well enough to be able to spot hidden backdoors in source code.

    • TZubiri 13 minutes ago

      With all due respect to Stallman, you can actually study binaries.

      The claim Stallman would make (after punishing you for an hour for saying Open Source instead of Free Software) is that Closed Software (Proprietary Software) is unjust. But in the context of security, the claim would be limited to Free Software also being capable of being secure.

      You may be able to argue that Open Source reduces risk in threat models where the manufacturer is the attacker, but in any other threat model, security is an advantage of closed source. It's automatic obfuscation.

      There are a lot of advantages to Free Software; you don't need to make up extra ones.

      • sigmoid10 2 minutes ago

        This. Closed source doesn't stop people from finding exploits, in the same way that open source doesn't magically make people find them. The Windows kernel is proprietary and closed source, but people constantly find exploits in it anyway. What matters is that there is a large audience that cares about auditing. OTOH if Microsoft really wanted to sneak in a super hard to detect spyware exploit, they probably could - but so could the Linux kernel devs. Some exploits have been openly sitting in the Linux kernel for more than a decade despite everyone being able to audit it in theory. Who's to say they weren't planted by some three-letter agency that coerced a developer?

      • parhamn 10 minutes ago

        Explain how you detect a branched/flagged sendKey (or whatever it would be called) call in the compiled WhatsApp iOS app?

        It could be interleaved in any of the many analytics tools in there too.

        You have to trust the client in E2E encryption. There's literally no way around that. You need to trust the client's OS (and in some cases, other processes) too.

        • JasonADrury 6 minutes ago

          >Explain how you detect a branched/flagged sendKey (or whatever it would be called) call in the compiled WhatsApp iOS app?

          Vastly easier than spotting a clever bugdoor in the source code of said app.

          • refulgentis 2 minutes ago

            Putting it all on the table: do you agree with the claim that binary analysis is just as good as source code analysis?

            • JasonADrury a few seconds ago

              Binary analysis is vastly better than source code analysis; reliably detecting bugdoors via source code analysis requires an unrealistically deep knowledge of compiler behavior.

      • oofbey 6 minutes ago

        What’s the state of the art of reverse engineering source code from binaries in the age of agentic coding? Seems like something agents should be pretty good at, but I haven't read anything about it.

        • JasonADrury 2 minutes ago

          I've been working on this; the results are pretty great when using the fancier models. I have successfully had gpt5.2 complete fairly complex matching-decompilation projects, as well as projects with more flexible requirements.

        • refulgentis 4 minutes ago

          Agents are sort of irrelevant to this discussion, no?

          Like, it's assuredly harder for an agent than having access to the code, if only because there's a theoretical opportunity to misunderstand the decompile.

          Alternatively, it's assuredly easier for an agent because given execution time approaches infinity, they can try all possible interpretations.

      • refulgentis 10 minutes ago

        This comment comes across as unnecessarily aggressive and out of nowhere (Stallman?), and it's really hard to parse.

        Does this rewording reflect its meaning?

        "You don't actually need code to evaluate security, you can analyze a binary just as well."

        Because that doesn't sound correct?

        But that's just my first pass, at a high level. Don't wanna overinterpret until I'm on surer ground about what the dispute is. (i.e. don't want to mind read :) )

        The steelman, on my current understanding, is limited to "you can check if it writes files/accesses the network, and if it doesn't, then by definition the chats are private and it's secure", which sounds facile. (Presumably something is being written somewhere for the whole chat thing to work; it can't be pure P2P because someone's app might not be open when you send.)

  • martinralbrecht 5 hours ago

    WhatsApp's end-to-end encryption has been independently investigated: https://kclpure.kcl.ac.uk/ws/files/324396471/whatsapp.pdf

    Full version here: https://eprint.iacr.org/2025/794.pdf

    We didn't review the entire source code, only the cryptographic core. That said, the main issue we found was that the WhatsApp servers ultimately decide who is and isn't in a particular chat. Dan Goodin wrote about it here: https://arstechnica.com/security/2025/05/whatsapp-provides-n...

    • vpShane 2 hours ago

      > We didn't review the entire source code

      And you don't see the issue with that? Facebook was bypassing security measures on mobile by sending data to itself on localhost using WebSockets and WebRTC.

      https://cybersecuritynews.com/track-android-users-covertly/

      An audit of 'they can't read it cryptographically' doesn't help when the app itself can read it, and the app sends data in all directions. Push notifications can be used to read messages.

      • miduil an hour ago

        > Push notifications can be used to read messages.

        Are you trying to imply that WhatsApp is bypassing e2e messaging through Push notifications?

        Unless something has changed, this table highlights that both Signal and WhatsApp are using a "Push-to-Sync" technique to notify about new messages.

        https://crysp.petsymposium.org/popets/2024/popets-2024-0151....

        • itsthecourier an hour ago

          Push-to-Sync. We observed 8 apps employ a push-to-sync strategy to prevent privacy leakage to Google via FCM. In this mitigation strategy, apps send an empty (or almost empty) push notification to FCM. Some apps, such as Signal, send a push notification with no data (aside from the fields that Google sets; see Figure 4). Other apps may send an identifier (including, in some cases, a phone number). This push notification tells the app to query the app server for data, the data is retrieved securely by the app, and then a push notification is populated on the client side with the unencrypted data. In these cases, the only metadata that FCM receives is that the user received some message or messages, and when that push notification was issued. Achieving this requires sending an additional network request to the app server to fetch the data and keeping track of identifiers used to correlate the push notification received on the user device with the message on the app server.
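
          A rough sketch of the client-side flow described above; every name and endpoint here is invented, not any real messenger's API. The point is only that the push itself carries no content:

            # Conceptual "push-to-sync" sketch: the push carries only an opaque id,
            # the app pulls the ciphertext itself, decrypts locally, and builds the
            # notification on the device.

            SERVER_QUEUE = {"m1": b"ciphertext-bytes"}   # stands in for the app server

            def fetch_ciphertext(msg_id: str) -> bytes:
                # Placeholder for an authenticated request to the app's own server.
                return SERVER_QUEUE[msg_id]

            def decrypt_locally(ciphertext: bytes) -> str:
                # Placeholder for the app's normal E2EE session decryption.
                return ciphertext.decode()

            def show_local_notification(text: str) -> None:
                print(f"[notification] {text}")

            def on_push(push_payload: dict) -> None:
                # The FCM payload is empty or nearly empty: at most an opaque id.
                msg_id = push_payload["id"]
                show_local_notification(decrypt_locally(fetch_ciphertext(msg_id)))

            on_push({"id": "m1"})  # Google only learns that *something* arrived, and when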

    • digdigdag an hour ago

      > We didn't review the entire source code

      Then it's not fully investigated. That should put any assessments to rest.

      • 3rodents an hour ago

        By that standard, it can never be verified because what is running and what is reviewed could be different. Reviewing relevant elements is as meaningful as reviewing all the source code.

        • giancarlostoro 36 minutes ago

          Or they could even take out the backdoor code and then put it back in after review.

          • taneq 28 minutes ago

            Ah yes, the Volkswagen solution.

      • ghurtado 25 minutes ago

        I have to assume you have never worked on security cataloging of third party dependencies on a large code base.

        Because if you had, you would realize how ridiculous it is to state that app security can't be assessed until you have read 100% of the code.

        That's like saying "well, we don't know how many other houses in the city might be on fire, so we should let this one burn until we know for sure"

      • Barrin92 an hour ago

        As long as the client-side encryption has been audited, which to my understanding is the case, it doesn't matter. That is literally the point of encryption: communication across adversarial channels. Unless you think Facebook has broken the laws of mathematics, it's impossible for them to decrypt the content of messages without the users' private keys.

        • maqp an hour ago

          Well, the thing is, the key exfiltration code would probably reside outside the TCB. It's not particularly hard to have some function grab the signing keys and send them to the server. Then you can impersonate the user in a MITM. That exfiltration is one-time, and it's quite hard to recover from.

          I'd much rather not have blind faith in WhatsApp doing the right thing, and instead just use Signal so I can verify myself that its key management is doing only what it should.

          Speculating over the correctness of the E2EE implementation isn't very productive; the metadata leak we know Meta takes full advantage of is reason enough to stick to proper platforms like Signal.

          • subw00f 35 minutes ago

            Not that I trust Facebook or anything but wouldn’t a motivated investigator be able to find this key exfiltration “function” or code by now? Unless there is some remote code execution flow going on.

        • hn_throwaway_99 35 minutes ago

          The issue is what the client app does with the information after it is decrypted. As Snowden remarked after he released his trove, encryption works, and it's not like the NSA or anyone else has some super secret decoder ring. The problem is endpoint security is borderline atrocious and an obvious achilles heel - the information has to be decoded in order to display it to the end user, so that's a much easier attack vector than trying to break the encryption itself.

          So the point other commenters are making is that you can verify all you want that the encryption is robust and secure, but that doesn't mean the app can't just send a copy of the info to a server somewhere after it has been decoded.

    • cookiengineer an hour ago

      Why did you not mention that the WhatsApp APK, even on devices where it was not installed from Google Play, loads Google Tag Manager's scripts?

      It is reproducibly loaded in each chat, and a MitM firewall can also confirm that. I don't know why the focus of audits like these is always on a specific part of the app or only on the cryptography, and not on the overall behavior of what is leaked and transferred over the wire, or on potential side-channel or bypass attacks.

      Transport encryption is useless if the client afterwards copies the plaintext of the messages to another server, or, say, to an online service for translation, you know.

      • tptacek 42 minutes ago

        There's a whole section, early, in the analysis Albrecht posted that surfaces these concerns.

    • morshu9001 41 minutes ago

      They also decide what public key is associated with a phone number, right? Unless you verify in person.

    • some_furry 4 hours ago

      Thank you for actually evaluating the technology as implemented instead of speculating wildly about what Facebook can do based on vibes.

      • chaps 2 hours ago

        Unfortunately a lot of investigations start out as speculation/vibes before they turn into an actual evaluation. And getting past speculation/vibes can take a lot of effort and political/social/professional capital before even starting.

  • cosmicgadget 6 hours ago

    > “We look forward to moving forward with those claims and note WhatsApp’s denials have all been carefully worded in a way that stops short of denying the central allegation in the complaint – that Meta has the ability to read WhatsApp messages, regardless of its claims about end-to-end encryption.”

    My money is on the chats being end to end encrypted and separately uploaded to Facebook.

    • gruez 5 hours ago

      >being end to end encrypted and separately uploaded to Facebook

      That's a cute loophole you thought up, but whatsapp's marketing is pretty unequivocal that they can't read your messages.

      >With end-to-end encryption on WhatsApp, your personal messages and calls are secured with a lock. Only you and the person you're talking to can read or listen to them, and no one else, not even WhatsApp

      https://www.whatsapp.com/

      That's not to say it's impossible that they are secretly uploading your messages, but the implication that they could be secretly doing so while not running afoul of their own claims because of cute word games is outright false.

      • blibble 5 hours ago

        > but whatsapp's marketing is pretty unequivocal that they can't read your messages.

        well that's alright then

        facebook's marketing and executives have always been completely above board and completely honest

        • gruez 5 hours ago

          Read the rest of my comment?

          >That's not to say it's impossible that they are secretly uploading your messages, but the implication that they could be secretly doing so while not running afoul of their own claims because of cute word games is outright false.

      • a0123 4 minutes ago

        > That's a cute loophole you thought up, but whatsapp's marketing is pretty unequivocal that they can't read your messages.

        If Facebook says it, then... Sorted!

      • codyb 5 hours ago

        The thing is, if they were uploading your messages, then they'd want to do something with the data.

        And humans aren't great at keeping secrets.

        So, if the claim is that there's a bunch of data, but everyone who is using it to great gain is completely and totally mum about it, and no one else has ever thought to question where certain inferences were coming from, and no employee ever questioned any API calls or database usage or traffic graph.

        Well, that's just about the best damn kept secret in town and I hope my messages are as safe!

        And I'm no fan of Meta...

        • 3eb7988a1663 4 hours ago

          Where were the Facebook whistleblowers about the numerous iOS/Android gaps that let the company gain more information than it was supposed to see? Malicious VPNs, scanning other installed mobile applications, whatever. As far as I know, the big indictments have come from the outside.

          • gruez 3 hours ago

            >Malicious VPNs

            AFAIK that was a separate app, and it was pretty clear that it was MITMing your connections. It's not any different than, say, complaining about how there weren't any whistleblowers for Fortinet (who sell enterprise firewalls).

            >scanning other installed mobile applications

            Source?

      • cosmicgadget 5 hours ago

        Are messages and calls data at rest or data in motion? The UI lock feature refers to 'chats' which could be their term for data at rest.

        I wonder what the eula says.

      • conscion 3 hours ago

        My guess is that they are end-to-end encrypted, and that because of Facebook's scale they're able to probabilistically guess at what's in the encrypted messages (e.g. a message with X hash has Y probability of containing the word "shoes").

        • ghurtado 19 minutes ago

          > they're able to probabilisticly guess at

          That's not how encryption works at all. At least not any encryption used in the last 100 years.

          You'd probably have to go all the way back to the encryption methods of the Roman Empire for that statement to make sense.

        • gruez 3 hours ago

          That seems unlikely given that they use the signal protocol: https://signal.org/blog/whatsapp-complete/

        • stefs an hour ago

          That would still be very close to educated mind reading

      • netsharc 5 hours ago

        I wonder if keyword/sentiment extraction on the user's device counts as reading "by WhatsApp"...

        There's the conspiracy theory about mentioning a product near the phone and then getting ads for it (which I don't believe), but I feel like I've mentioned products on WhatsApp chats with friends and then got an ad for them on Instagram sometime after.

        Also, claiming "no one else can read it" is a bit brave; what if the user's phone has spyware that takes screenshots of WhatsApp... (Technically of course it's outside of their scope to protect against this, but try explaining that to a judge who sees their claim and the reality.)

      • blindriver 2 hours ago

        "We can't read your messages! They are encrypted on disk and we don't store the keys!"

        "What encryption do you use?"

        "DES."

    • matthewdgreen 5 hours ago

      I really doubt this. Any such upload would be visible inside the WhatsApp application, which would make it the world's most exciting (and relatively straightforward) RE project. You can even start with a Java app, so it's extra easy.

      • cosmicgadget 5 hours ago

        If you claim REing a flagship FAANG application is "extra easy", either that claim needs to be laughed out of the room or you do.

        • gruez 3 hours ago

          Do FAANG apps have anti-debug or code obfuscation? At least for Google, their apps are pretty lightly protected. The maximum extent of obfuscation is the standard compilation/optimization process that most apps go through (e.g. R8 or ProGuard).

        • quesera 5 hours ago

          Reverse engineering is easy when the source code is available. :)

          The difference between source code in a high-level language and AArch64 machine language is surmountable. The effort is made easier if you can focus on calls to the crypto and networking libraries.

          • cosmicgadget 4 hours ago

            The source is available?

            Understanding program flow is very different from understanding the composition of data passing through the program.

            • quesera 4 hours ago

              At some level, the machine code is the source code -- but decompiling AArch64 mobile apps into something like Java is common practice.

              As GP alludes, you would be looking for a secondary pathway for message transmission. This would be difficult to hide in AArch64 code (from a skilled practitioner), and extra difficult in decompiled Java.

              It would be "easy" enough, and an enormous prize, for anyone in the field.

              • cosmicgadget 3 hours ago

                I am familiar with disassembly and decompilation and what you just said is a huge handwave.

                > a secondary pathway for message transmission

                That's certainly the only way messages could be uploaded to Facebook!

                • quesera 3 hours ago

                  I'm curious why you think it's handwavy.

                  I've done this work on other mobile apps (not WhatsApp), and the work is not out of the ordinary.

                  It's difficult to hide subtleties in decompiled code. And anything that looks hairbally gets special attention, if the calling sites or side effects are interesting.

                  (edit for edit)

                  > That's certainly the only way messages could be uploaded to Facebook!

                  Well, there's a primary pathway which should be very obvious. And if there's a secondary pathway, it's probably for telemetry etc. If there are others, or if it isn't telemetry, you dig deeper.

                  All secrets are out in the open at that point. There are no black boxes in mobile app code.

                  • cosmicgadget 3 hours ago

                    > if there's a secondary pathway, it's probably for telemetry etc.

                    Seems like a good channel upon which to piggyback user data. Now all you have to do is obfuscate the serialization.

                    > It's difficult to hide subtleties in decompiled code.

                    Stripped, obfuscated code? Really? Are we assuming debug ability here?

                    > All secrets are out in the open at that point. There are no black boxes in mobile app code.

                    What about a loader with an encrypted binary that does a device attestation check?

                    • quesera 2 hours ago

                      I've lost track of our points of disagreement here. Sure, it's work, but it's all doable.

                      Obfuscated code is more difficult to unravel in its original form than in the decompiled form. Decompiled code is a mess with no guideposts, but that's just a matter of time and patience to fix. It's genuinely tricky to write code that decompiles into deceptive appearances.

                      My original position is that it'd be difficult to hide side-channel leakage of chat messages in the WhatsApp mobile app. I have not worked on the WhatsApp app, but if it's anything like the mobile apps I have analyzed, I think this is the correct position.

                      If the WhatsApp mobile apps are hairballs of obfuscation and misdirection, I would be a) very surprised, and b) highly suspicious. Since I don't do this work every day any more, I haven't thought much about it. But there are so many people who do this work every day, and WhatsApp is so popular, I'd be genuinely shocked if there were fewer than hundreds of people who have lightly scanned the apps for anything hairbally that would be worth further digging. Maybe I'm wrong and WhatsApp is special though. Happy to be informed if so.

        • martinralbrecht 4 hours ago

          Note that WhatsApp has a web client, too: https://eprint.iacr.org/2025/794

    • varenc 5 hours ago

      If this was happening en masse, wouldn't this be discovered by the many people reverse engineering WhatsApp? Reverse engineering is hard, sophisticated work, but given how popular WhatsApp is, plenty of independent security researchers are doing it. I'm quite skeptical that Meta could hide some malicious code in WhatsApp that breaks the E2EE without it being discovered.

      • solenoid0937 5 hours ago

        It would be trivial to discover and would be pretty big news in the security community.

        I'd wager most of these comments are from nontechnical people, or technical people that are very far removed from security.

        • cosmicgadget 4 hours ago

          I'm technical and work in security. Since it is trivial, please explain. Ideally not using a strawman like "well just run strings and look for uploadPlaintextChatsToServer()".

          • solenoid0937 4 hours ago

            I don't see why standard RE techniques (DBI/Frida + MITM) wouldn't work, do you?

            WhatsApp is constantly RE'd because it'd be incredibly valuable to discover gaps in its security posture; the community would find any exfil here.
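
            For flavor, this is roughly what the Frida side of that looks like with the Python bindings: hook a function in a target process on a test device you control and log what crosses it. The module/export chosen here (libssl's SSL_write) is just an illustration, not a claim about WhatsApp's internals:

              import sys
              import frida

              # JavaScript injected into the target process; logs the size of every
              # buffer handed to SSL_write. Purely illustrative instrumentation.
              JS = """
              Interceptor.attach(Module.getExportByName('libssl.so', 'SSL_write'), {
                onEnter(args) {
                  send({ bytes: args[2].toInt32() });
                }
              });
              """

              def on_message(message, data):
                  print(message)

              device = frida.get_usb_device()
              session = device.attach("WhatsApp")   # process name on a device you own
              script = session.create_script(JS)
              script.on("message", on_message)
              script.load()
              sys.stdin.read()                      # keep the hooks alive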

            • martinralbrecht 4 hours ago

              We did reverse engineer it, and we're cryptographers, not reverse engineering experts: https://eprint.iacr.org/2025/794

            • cosmicgadget 4 hours ago

              If people are trivially hooking iOS and Android applications then sure, it's just an exercise in dynamic analysis.

              Mobile applications are outside my domain so I am surprised platform security (SEL, attestation, etc.) has been so easily defeated.

      • palata 5 hours ago

        Before that, Meta employees would know about it. Pretty convinced that someone would leak it.

      • cosmicgadget 5 hours ago

        Well, they wouldn't be breaking E2EE; they'd be breaking the implicit promise of E2EE. The chats are still inaccessible to intermediaries, they'd just be stored elsewhere. Like Apple and Microsoft do.

        I am not familiar with the state of app RE. But between code obfuscators and the difficulty of distinguishing between 'normal' phone home data and user chats when doing static analysis... I'd say it's not out of the question.

    • random3 6 hours ago

      That’s because they have such a good track record with regard to privacy? https://www.docketalarm.com/cases/California_Northern_Distri...

      • fn-mote 5 hours ago

        That document is the dismissal of claims by an economist about Facebook’s privacy practices. I don’t see how it supports your argument.

    • steve_taylor 5 hours ago

      > My money is on the chats being end to end encrypted and separately uploaded to Facebook.

      If governments of various countries have compelled Meta to provide a backdoor and also required non-disclosure (e.g. a TCN secretly issued to Meta under Australia's Assistance and Access Act), this is how I imagined they would do it. It technically doesn't break encryption as the receiving device receives the encrypted message.

    • guerrilla 5 hours ago

      > My money is on the chats being end to end encrypted and separately uploaded to Facebook.

      This is what I've suspected for a long time. I bet that's it. They can already read both ends, no need to b0rk the encryption. It's just them doing their job to protect you from fourth parties, not from themselves.

    • RajT88 an hour ago

      Facebook Messenger similarly claims to be end-to-end encrypted, and yet if it thinks you are sending a link to a pirate site, it "fails to send". I imagine there are a great many blacklisted sites which they shadow-block, despite "not being able to read your messages".

      My pet conspiracy theory is that the "backup code" which "restores" encrypted messages is there to annoy you into installing the app instead of chatting on the web.

      • loeg an hour ago

        The client probably just downloads a blacklist of banned domains. That doesn't mean messages that are sent are not E2E encrypted.

        • RajT88 an hour ago

          Facebook has lost any benefit of the doubt, imo.

    • FabHK 5 hours ago

      It should be detectable if it sends twice the data.

      • rurban an hour ago

        It encrypts to all the keys registered for that user's phone number, because users switch phones but keep their number. Each new WhatsApp install gets a new private key; the old key is not shared. This feature was added later, so the old WhatsApp devs wouldn't know.

        So it would be trivial to also encrypt to an NSA key, as was done on Windows.
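
        A toy sketch of why that kind of fan-out is risky when the server controls the device list. This uses PyNaCl sealed boxes purely for illustration, not WhatsApp's actual Signal-protocol sessions:

          from nacl.public import PrivateKey, SealedBox

          # The user's real devices, plus one key the server silently appended.
          devices = [PrivateKey.generate() for _ in range(2)]
          extra = PrivateKey.generate()
          device_list_from_server = [d.public_key for d in devices] + [extra.public_key]

          # A naive client encrypts one copy per key it was told about. Without
          # out-of-band device verification, the extra recipient is invisible.
          message = b"meet at noon"
          ciphertexts = [SealedBox(pk).encrypt(message) for pk in device_list_from_server]

          # Whoever holds the appended key reads its copy like any legitimate device.
          print(SealedBox(extra).decrypt(ciphertexts[-1]))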

  • prakashn27 2 hours ago

    Ex-WhatsApp engineer here. The WhatsApp team puts so much effort into making end-to-end encrypted messages possible. From the time I worked there, I know for sure it is not possible to read the encrypted messages.

    From a business standpoint they don’t have to read these messages, since the WhatsApp Business API provides the necessary funding for the org as a whole.

    • M95D 5 minutes ago

      From what you know about WA, is it possible for the servers to MitM the connection between two clients? Is there a way for a client to independently verify the identity of the other client, such as by comparing keys (is it even possible to view them?), or comparing the contents of data packets sent from one client with the ones received on the other side?

      Thanks.

    • maqp an hour ago

      Nice! Hey, question: I noticed Signal at one point had the same address on the Google Play Store as WA. Can you tell us if Signal devs shared office space with WA during the integration of the Signal protocol? Related to that, did they hold the WA devs' hands during the process, meaning that at least at the time it was sort of greenlit by Moxie or something? If this stuff is under NDA I fully understand, but anything you can share I'd love to hear.

    • 46493168 an hour ago

      Facebook has never been satisfied with direct funding. The value is in selling attention and influencing users’ behavior.

    • blindriver 2 hours ago

      It only takes one engineer across all the teams at WhatsApp with different directives to make all your privacy work completely useless.

      • rustyhancock an hour ago

        The legal and liability protection these messaging services get from E2EE is far too big to break it.

        Besides, I get the feeling we're so cooked these days from marketing that when I get freaked out that an advert matches what I was thinking about, it's probably because they made me think about it in the first place.

        Or maybe I need to update my meds?

      • philipallstar 2 hours ago

        Assuming there's no code review or audit, I suppose.

      • cactusfrog an hour ago

        I would be surprised if the code was hidden from other engineers.

        • maqp an hour ago

          How are you hiding it from IDA Pro, though?

  • codethief 4 hours ago

    Matthew Green's take from 3 days ago:

    > There’s a lawsuit against WhatsApp making the rounds today, claiming that Meta has access to plaintext. I see nothing in there that’s compelling; the whole thing sounds like a fishing expedition.

    https://bsky.app/profile/matthewdgreen.bsky.social/post/3mdg...

  • ubermonkey 2 minutes ago

    WhatsApp belongs to Meta.

    Why would anyone believe those chats are private?

  • youknownothing 5 hours ago

    Just to throw in a couple of possibly outlandish theories:

    1. As others have said, they could be collecting the encrypted messages and then trying to decrypt them using quantum computing; the Chinese have reportedly been trying to do this for many years now.

    2. With metadata and all the information from other sources, they could infer what the conversation is about without the need to decrypt it: if I visit a page (Facebook cookies, they know), then I send a message to my friend John, and then John visits the same page (again, cookies), they can be pretty certain that the content of the message was me sharing the link.
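
    A toy version of that inference, working only on event timing and URLs and never on message content (all data here is made up):

      # (actor, kind, target, timestamp) events from cookies and message metadata
      events = [
          ("alice", "visit", "example.com/shoes", 100),
          ("alice", "message", "john", 130),
          ("john", "visit", "example.com/shoes", 190),
      ]

      def guess_shared_links(events, window=300):
          """Guess which URL a message contained: the sender saw it before sending,
          and the recipient loaded the same URL shortly after receiving."""
          guesses = []
          for who, kind, target, t in events:
              if kind != "message":
                  continue
              seen_before = {u for w, k, u, tv in events
                             if w == who and k == "visit" and tv <= t}
              seen_after = {u for w, k, u, tv in events
                            if w == target and k == "visit" and t <= tv <= t + window}
              guesses += [(who, target, u) for u in seen_before & seen_after]
          return guesses

      print(guess_shared_links(events))  # [('alice', 'john', 'example.com/shoes')]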

    • solenoid0937 5 hours ago

      (1) made me chuckle. I've worked at nearly every FAANG including Meta. These companies aren't nearly as advanced or competent as you think.

      I no longer work at Meta, but in my mind a more likely scenario than (1) is: a senior engineer proposes a 'Decryption at Scale' framework solely to secure their E6 promo, and writes a 40-page Google Doc to farm 'direction' points for PSC. Five PMs repost this on Workplace to celebrate the "alignment" so they can also include it in their PSCs.

      The TL and PMs immediately abandon the project after ratings are locked because they already farmed the credit for it. The actual implementation gets assigned to an E4 bootcamp grad who is told by a non-technical EM to pivot 3 months in because it doesn't look like 'measurable impact' in a perf packet. The E4 gets fired to fill the layoff quota and everyone else sails off into the sunset.

    • petcat an hour ago

      I think this is the most likely scenario. The US government is not necessarily trying to read the messages right now, in real-time. But it wants to read the messages at some point in the future.

      https://en.wikipedia.org/wiki/Utah_Data_Center

    • instagib 4 hours ago

      2) Enough metadata can reveal a person's life, habits, and location, which removes the need to analyze the actual bulky content of communications.

      They can analyze the receiver's data, or the receiver's contact tree's data, which is easier to access.

      The number of free or paid data sources is daunting.

    • wasabi991011 5 hours ago

      Re. quantum computing: no chance; the scientific and engineering breakthroughs they would need are too outlandish, like claiming China already had a 2026-level frontier model back in 2016.

  • mrtksn 7 hours ago

    I wonder how these investigations go. Are they just asking them if it is true? Are they working with IT specialists to technically analyze the apps? Are they requesting source code that can be demonstrated to be the same code that runs on user devices, and then analyzing that code?

    • RenThraysk 6 hours ago

      Multiple governments will already know as they have analyzed and reverse engineered it.

    • mattmaroon 6 hours ago

      That will be step 1. Fear of being caught lying to the government is such that that is usually enough. Presumably at least a handful of people would have to know about it, and nobody likes their job at Facebook enough to go to jail over it.

      But you never know.

      • hsuduebc2 6 hours ago

        Companies lie to governments and the public all the time. I doubt that even if something were found and the case were lost, it would lead to prison or any truly severe punishment. No money was stolen and no lives were put at risk. At worst, it would likely end in a fine, and then it would be forgotten, especially given Meta’s repeated violations of user trust.

        The reality is that most users do not seem to care. For many, WhatsApp is simply “free SMS,” tied to a phone number, so it feels familiar and easy to understand, and the broader implications are ignored.

        • mattmaroon 5 hours ago

          Martha Stewart went to jail for lying to the government. The fact that there would be no punishment is why they would tell the truth.

          The government is pretty harsh when they find out you lied under oath. Corporate officers do not lie to the government frequently.

    • TZubiri 7 hours ago

      Anyone can audit the client binaries

  • londons_explore 6 hours ago

    I want whatsapp to decrypt the messages in a secure enclave and render the message content to the screen with a secure rendering pipeline, as is done with DRM'ed video.

    Compromise of the client side application or OS shouldn't break the security model.

    This should be possible with current APIs, since each message could, if needed, simply be a single-frame DRM'ed video if no better approach exists (or until a better approach is built).

    • Retr0id 6 hours ago

      Signal uses the DRM APIs to mitigate threats like Microsoft Recall, but it doesn't stop the app itself from reading its own data.

      I don't really see how it's possible to mitigate client compromise. You can decrypt stuff on a secure enclave but at some point the client has to pull it out and render it.

      • bogwog 6 hours ago

        > I don't really see how it's possible to mitigate client compromise

        Easy: pass laws requiring chat providers to implement interoperability standards so that users can bring their own trusted clients. You're still at risk if your recipient is using a compromised client, but that's a problem that you have the power to solve, and it's much easier to convince someone to switch a secure client if they don't have to worry about losing their contacts.

        • palata 5 hours ago

          > Easy: pass laws requiring chat providers to implement interoperability standards so that users can bring their own trusted clients.

          In Europe that's called the Digital Markets Act.

          • digiown 4 hours ago

            That's not permissionless afaik. "Users" can't really do it. It's frustrating that all these legislations appear to view it as a business problem rather than a private individual's right to communicate securely.

            • palata 2 minutes ago

              Right, I get what you mean.

              But in a way, I feel like sometimes it makes sense not to completely open everything. Take a messaging app: it makes sense not to just make it a free-for-all. As a company, if I let you interoperate with servers that I pay for and maintain, I guess it makes sense that I may want to check who you are first. I think?

        • xvector 6 hours ago

          You seem to think the government wants your messages to be private and would "pass laws" to this effect.

          Methinks you put far too much faith in the government, at least from my understanding of the history of cybersecurity :)

      • londons_explore 6 hours ago

        > don't really see how it's possible to mitigate client compromise.

        Think of the way DRM'ed video is played. If the media player application is compromised, the video data is still secure. That's because the GPU does both the decryption and rendering, and will not let the application read it back.

        • gruez 3 hours ago

          That's not what signal's doing though. It's just asking the OS nicely to not capture screen contents. There are secure ways of doing media playback, but that's not what signal's using.

        • Retr0id 6 hours ago

          Video decryption+decoding is a well-defined enough problem that you can ship silicon that does it. You can't do the same thing for the UI of a social media app.

          You could put the entire app within TrustZone, but then you're not trusting the app vendor any less than you were before.

          • Retr0id 5 hours ago

            Although now that I think about it more, you could have APIs for "decrypt this [text/image] with key $id, and render it as a secure overlay at coordinates ($x, $y)"
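
            Sketched as an interface, the idea might look something like this (entirely hypothetical; no platform exposes such an API today):

              from typing import Protocol

              class SecureOverlayRenderer(Protocol):
                  """Hypothetical trusted-path API: the app hands over ciphertext,
                  a key id, and layout, and never touches the plaintext itself."""

                  def render_text(self, key_id: str, ciphertext: bytes,
                                  x: int, y: int) -> None: ...

                  def render_image(self, key_id: str, ciphertext: bytes,
                                   x: int, y: int) -> None: ...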

            • londons_explore 5 hours ago

              Exactly. That's how DRM video works, and I don't see why you couldn't do the same for text.

              • Retr0id 4 hours ago

                Actual DRM uses symmetric keys though; figuring out how to do the crypto in an E2EE-compatible way would be challenging.

        • pennomi 5 hours ago

          There will always, ALWAYS be the analog hole in security models like this.

          • londons_explore an hour ago

            It's pretty hard for the government or service provider to snoop through the analog hole unless they have a camera on your forehead...

      • willis936 6 hours ago

        By avoiding untrustworthy clients. All Windows devices should be considered compromised after last year.

        • Retr0id 6 hours ago

          That's not mitigating client compromise, that's a whole other thing - trying to construct an uncompromiseable client.

          You don't build defense-in-depth by assuming something can't be compromised.

          • willis936 6 hours ago

            Clients can always be compromised. I'm not talking about a client that can't be compromised, but simply a client that is not compromised out-of-the-box.

            • Retr0id 6 hours ago

              That seems orthogonal to the subject of this discussion, i.e. "Compromise of the client side application or OS shouldn't break the security model."

        • cobertos 6 hours ago

          Windows has been sending usage history back to their servers for longer than just last year

        • GraemeMeyer 6 hours ago

          Why last year?

          • willis936 6 hours ago

            Windows Recall, the intrusive addition of AI features (is there even a pinky promise that they're not training on user data?), more built-in ads, and less user control (most notably the removal of the ability to use the OS without an account - something that makes sense in the context of undisclosed theft of private information).

            This was 2025. I'm excited for what 2026 will bring. Things are moving fast indeed.

      • HumblyTossed 6 hours ago

        This. The gap in E2E is the point at which I type in clear text and the point at which I read clear text. Those can be exploited.

    • rsync 3 hours ago

      “I want whatsapp to decrypt the messages in a secure enclave and render the message content to the screen with a secure rendering pipeline, as is done with DRM'ed video.“

      If you are sophisticated enough to understand, and want, these things (and I believe that you are) …

      … then why would you want to use WhatsApp in the first place?

      • londons_explore 43 minutes ago

        Because my goal isn't to have my communication secure - but to have everyone's communication secure.

        And the network effect of WhatsApp (3 billion users) currently seems like the best route to that.

    • OtherShrezzing 6 hours ago

      This is what a layman would assume happens from Meta’s WhatsApp advertising. They show the e2e process, and have the message entirely unreadable by anyone but the phone owner.

      • kevin_thibedeau 6 hours ago

        e2e means unreadable by a middleman. That is a small inconvenience if you can readily compromise an endpoint.

        • Almondsetat 6 hours ago

          People keep talking about e2ee as if it was some brain-to-brain encoding that truly allowed only the recipient person to decrypt the message

          • dijit 6 hours ago

            because it used to be that the ends and the middlemen were different entities.

            In the universe where they are the same entity (walled-gardens) there is only the middleman.

            In such cases you either trust them or you don't; anything more is not required, because they can compromise their own endpoints in a way you cannot detect.

  • lukeschlather 5 hours ago

    It seems obvious that they can. It's my understanding for FB Messenger that the private key is stored encrypted with a key that is derived from the user's password. So it's not straightforward, but Meta is obviously in a position to grab the user's password when they authenticate and obtain their private key. This would probably leave traces, but someone working with company authorization could probably do it.

    For WhatsApp they claim it is like Signal, with the caveat that if you have backups enabled it works like Messenger. Although, interestingly, if you have backups enabled the key may be stored with Apple/Google rather than Meta; it might be the case that with backups enabled your phone vendor can read your WhatsApp messages but Facebook cannot.
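
    A stripped-down sketch of the password-derived key wrapping described above, illustrative only (PBKDF2 plus PyNaCl here, not Messenger's actual scheme). The point is that whoever sees the password at login, or can brute-force a weak one, can unwrap the key:

      import hashlib
      import os
      from nacl.public import PrivateKey
      from nacl.secret import SecretBox

      password = b"correct horse battery staple"   # whatever the user types at login
      salt = os.urandom(16)

      # Derive a 32-byte wrapping key from the password, then wrap the device key.
      wrapping_key = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)
      device_key = PrivateKey.generate()
      wrapped = SecretBox(wrapping_key).encrypt(bytes(device_key))

      # Anyone who learns the password (and has the salt + wrapped blob) recovers it.
      recovered = PrivateKey(SecretBox(wrapping_key).decrypt(wrapped))
      assert bytes(recovered) == bytes(device_key)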

  • solenoid0937 5 hours ago

    So many people that strongly believe WhatsApp isn't E2EE!

    Quick, someone set up a Kalshi or Polymarket or whatever claiming that WhatsApp isn't E2EE.

    I'll gladly bet against the total volume of people that believe it isn't E2EE -- it'll be an easy 2x for you or me.

  • vbezhenar 5 hours ago

    WhatsApp is considered insecure and is banned from military use in Russia. Telegram, on the other hand, is widely used. Of course that's not definitive, but it's just food for thought.

    • vimda 2 hours ago

      Telegram which famously didn't have _any_ end to end encryption for ages, and even now only has very limited opt-in "secret chats"?

      • vbezhenar 19 minutes ago

        I'm not going to promote Telegram; I just wanted to highlight that WhatsApp is not considered trustworthy by a geopolitical enemy of the US. I don't think that Telegram is bad, and when your life depends on it, you can click the "Secret Chat" button; it's not a big deal.

    • gruez 5 hours ago

      > but just a food for thought.

      ...that Telegram is backdoored by the Russians? The implication you're trying to make seems to be that the Russians must be choosing Telegram because it's secure, but you're ignoring the possibility that they're choosing Telegram because they have access to it. After all, do you think they want the possibility of their military scheming against them?

      • p1anecrazy 3 hours ago

        I guess their point was that Russian military doesn‘t care if Russian intelligence reads their messages

        • gruez 3 hours ago

          Maybe OP should clearly state their thesis rather than beating around the bush with "... just food for thought", so we don't have to guess what he's trying to say.

  • hiprob 5 hours ago

    I know the default assumption with Telegram is that they can read all your messages, but unlike WhatsApp they seem less cooperative, and I never got the notion that they ever read private messages until the Macron incident; even then, they only do so if the other party reports them. How come they are able to be this exception despite not having end-to-end encryption by default?

  • moffers an hour ago

    I feel fairly confident an oddly-shaped donation from Mark Z’s foundation will make this go away.

    • Kiboneu an hour ago

      I'd bet that shape would look like a tube with a cap on.

  • ohcmon 5 hours ago

    Next time you use a true, real, independently audited E2E communication channel, don't forget to check who the authority is that says the "other end" is the end you think it is.

  • miohtama 6 hours ago

    Both things cannot be true at the same time

    - WhatsApp encryption is broken

    - EU's and UK's Chat Control spooks demand Meta to insert backdoor because they cannot break the encryption

    The Guardian has its own editorial flavour on tech news, so expect them to use any excuse to bash the subject.

    • Retric 6 hours ago

      Just because Adam has a back door doesn’t mean Eve also has a back door.

    • preisschild 5 hours ago

      > EU's and UK's Chat Control spooks demand Meta to insert backdoor because they cannot break the encryption

      Those are not law, so no, the EU doesn't demand that.

    • dyauspitr 6 hours ago

      They’re just not sharing the back door with the EU?

  • 0x_rs 5 hours ago

    It's a proprietary, closed-source application. It can do whatever it wants, and it doesn't even need to "backdoor" encryption when all it has to do is forward everything matching some criteria to their servers (and by extension to anyone they comply with). It's always one update away from dumping your entire chat history into a remote bucket, and it would still not be in contradiction with their promise of E2EE. Furthermore, it already has the functionality to forward messages when a chat is reported [0]. Facebook's Messenger has also worked that way for years. [1] There were also rumors a couple of years ago that the on-device scanning practice would be expanded to comply with surveillance proposals such as ChatControl. This doesn't mean it's spying on each and every message now, but it has the potential to do so, and it would be more feasible today than ever before, hence the importance of software the average person can trust, software that isn't as easily subject to their government's tantrums about privacy.

    0. https://www.propublica.org/article/how-facebook-undermines-p...

    1. https://archive.is/fe6zY

    • paxys 4 hours ago

      You are also using proprietary, closed-source hardware and an operating system underneath the app that can do whatever they want. This line of reasoning ultimately leads to: unless you craft every atom and every bit yourself, your data isn't secure. Which may be true, but it is a pointless discussion.

      • threatofrain 3 hours ago

        No, it means you calculate how much risk you're taking on, vendor by vendor. Do all companies have the same reputation in your eyes?

      • OutOfHere 3 hours ago

        That's a bad take because the vendors there are different; they're not Meta. As such, it's not pointless.

  • nindalf 5 hours ago

    This reads like a nothingburger. Couple of quotes from the article:

    > the idea that WhatsApp can selectively and retroactively access the content of [end-to-end encrypted] individual chats is a mathematical impossibility

    > Steven Murdoch, professor of security engineering at UCL, said the lawsuit was “a bit strange”. “It seems to be going mostly on whistleblowers, and we don’t know much about them or their credibility,” he said. “I would be very surprised if what they are claiming is actually true.”

    No one apart from the firm filing the lawsuit is actually supporting this claim. A lot of people in this thread seem very confident that it's true, and I'm not sure what precisely makes them so confident.

    • Snoozus 5 hours ago

      I find this wording also "a bit strange".

      It is not a mathematical impossibility in any way.

      For example, they might be able to read the backups, or the keys might somehow (accidentally or not) be leaked...

      And then the part about Telegram not having end2end encryption? What's this all about?

      • FabHK 5 hours ago

        Telegram defaults to not e2ee; you have to initiate a "secret" chat to get e2ee.

  • modeless 5 hours ago

    Meanwhile Apple has always been able to read encrypted iMessage messages and everyone decided to ignore that fact. https://james.darpinian.com/blog/apple-imessage-encryption

    • Flere-Imsaho 5 hours ago

      And it's worse if you live in the UK:

      https://support.apple.com/en-us/122234

      In fact on this page they still claim iMessage is end-to-end encrypted.

    • gruez 5 hours ago

      >has always been able to read encrypted iMessage messages

      ...assuming you have iCloud backups enabled, which is... totally expected? What's next, complaining about BitLocker being backdoored because Microsoft can read your OneDrive files?

      • modeless 5 hours ago

        If you read the link you would know that contrary to your expectation other apps advertising E2EE such as Google's Messages app don't allow the app maker to read your messages from your backups. And turning off backups doesn't help when everyone else has them enabled. Apple doesn't respect your backup settings on other people's accounts. Again, other apps address this problem in various ways, but not iMessage.

        • gruez 5 hours ago

          >If you read the link you would know that contrary to your expectation other apps advertising E2EE don't allow the app maker to read your messages.

          What does that even mean? Suppose iCloud backups didn't exist, but you could still take screenshots and save them to iCloud Drive. Is that also "Apple has always been able to read encrypted iMessage messages"? Same goes for "other people having iCloud backups enabled". People can also snitch on you, or get their phones seized. I feel like people like you and the article author are just redefining the threat model of E2EE apps just so they can smugly go "well ackshually..."

          • modeless 5 hours ago

            It means, for example, Google Messages uses E2EE backups. Google cannot read your E2EE messages by default, period. Not from your own backup, not from other peoples' backups. No backup loophole. Most other E2EE messaging apps also do not have a backup loophole like iMessage.

            It's not hard to understand why Apple uploading every message to themselves to read by default is different from somebody intentionally taking a screenshot of their own phone.

            • gruez 3 hours ago

              >Google cannot read your E2EE messages by default, period.

              Is icloud backups opt in or opt out? If it's opt in then would your objection still hold?

              • modeless an hour ago

                I'm less concerned with whether it is technically opt in or opt out and more concerned with whether it is commonly enabled in practice.

                What would resolve my objection is if Apple either made messages backups E2EE always, as Google did and as Apple does themselves for other data categories like Keychain passwords, or if they excluded E2EE conversations (e.g. from ADP people) from non-E2EE backups, as Telegram does. Anything short of that does not qualify as E2EE regardless of the defaults, and marketing it as E2EE is false advertising.

      • Snoozus 5 hours ago

        Absolutely, they intentionally make stuff sound secure and private while keeping full access.

    • razingeden 5 hours ago

      I remember reading this recently. Not saying it’s true but it got my attention

      TUESDAY, NOVEMBER 25, 2025 Blind Item #7 The celebrity CEO says his new chat system is so secure that even he can't read the messages. He is lying. He reads them all the time.

  • sailfast an hour ago

    “Fox has investigated whether henhouse is secure” News at 11.

  • david_allison 7 hours ago

    It was my understanding that the backups are unencrypted. Is that still the case?

    • evanjrowley 6 hours ago

      On Android, if you allow it to backup to your Google cloud storage, it will say the backups are encrypted. That was my experience when I set it up a few weeks ago.

      Exactly who has the ability to decrypt the backup is not totally clear.

      It may be a different situation for non-Android users, Android users who are not signed in with a Google account, Android users who are not using Google Play Services, etc.

      • bayindirh 6 hours ago

        You can explore your Google Cloud's Application Storage part via Rsync, AFAIK. So you can see whether your backups are encrypted or not.

        I remember that you had to extract at least two keys from the android device to be able to read "on-device" chat storage in the days of yore, so the tech is there.

        If copies of the keys are not on the Google Drive side, we can say that the backups are at least "superficially" encrypted.

  • calibas 6 hours ago

    It's vulnerable to man-in-the-middle attacks, and the man-in-the-middle happens to be Meta.

    The tricky part would be doing it and not getting caught though.

  • OutOfHere 3 hours ago

    The issue here is that WhatsApp doesn't work with third-party clients (outside of EU anyway). It does now in EU via BirdyChat and Haiket, but the features are too limiting: https://about.fb.com/news/2025/11/messaging-interoperability...

    Ideally, WhatsApp would fully support third-party open-source clients that can ensure that the mathematics are used as intended.

  • timpera 6 hours ago

    Lots of uninformed conspiratorial comments with zero proof in here, but I'd really like WhatsApp to get their encryption audited by a reliable, independent 3rd party.

  • znpy 6 hours ago

    I always assumed this to be true, to be honest.

    Nowadays all of the messaging pipeline on my phone is closed source and proprietary, and thus unverifiable at all.

    The iPhone operating system is closed, the runtime is closed, the whatsapp client is closed, the protocol is closed… hard to believe any claim.

    And I know that somebody’s gonna bring up the alleged e2e encryption… a client in control of somebody else might just leak the encryption keys from one end of the chat.

    Closed systems that do not support third party clients that connect through open protocols should ALWAYS be assumed to be insecure.

    • gruez 3 hours ago

      >Closed systems that do not support third party clients that connect through open protocols should ALWAYS be assumed to be insecure.

      So you're posting this from an open core CPU running on an open FPGA that you fabricated yourself, right? Or is this just a game of one-upmanship where people come with increasingly high standards for what counts as "secure" to signal how devoted to security they are?

    • solenoid0937 5 hours ago

      It doesn't need to be open source for us to know what it's doing. Its properties are well understood by the security community because it's been RE'd.

      > a client in control of somebody else might just leak the encryption keys from one end of the chat

      That has nothing to do with closed/open source. Preventing this requires remote attestation, and I don't know of any messaging app out there that really does this, closed or open source.

      Also, ironically, remote attestation is the antithesis of open source.

  • Ms-J 7 hours ago

    Who do they expect to fall for the claims that a Facebook owned messenger couldn't read your "encrypted" messages? It's truly funny.

    Any large scale provider with headquarters in the USA will be subject to backdoors and information sharing with the government when they want to read or know what you are doing.

    • olalonde 6 hours ago

      Me? I'd be very surprised if they can actually read encrypted messages (without pushing a malicious client update). The odds that no one at Meta would blow the whistle seem low, and a backdoor would likely be discovered by independent security researchers.

      • nindalf 5 hours ago

        I'd be surprised as well. I know people who've worked on the WhatsApp apps specifically for years. It feels highly unlikely that they wouldn't have come across this backdoor and they wouldn't have mentioned it to me.

        Happy to bet $100 that this lawsuit goes nowhere.

      • riazrizvi 6 hours ago

        If there is such a back door, it hardly follows that it would be widely known within the company. From the sparse reports of Facebook/Meta being caught doing this in the past, it's for favor trading and leverage at the highest levels.

      • SoftTalker 5 hours ago

        That was my reaction on reading the headline. Of course Meta can read them; they own the entire stack. The real question is: do they?

      • Snoozus 5 hours ago

        Is there an independent audit of the Whatsapp client and of the servers?

    • Aurornis 6 hours ago

      > Any large scale provider with headquarters in the USA will be subject to backdoors and information sharing with the government when they want to read or know what you are doing.

      Not just the USA. This is basically universal.

      • j45 6 hours ago

        It's not guaranteed or by default.

        This type of generalized defeatism does more harm than good.

        • Aurornis 6 hours ago

          > It's not guaranteed or by default.

          Nation state governments do have the ability to coerce companies within their territory by default.

          If you think this feature is unique to the USA, you are buying too much into a separate narrative. All countries can and will use the force of law to control companies within their borders when they see fit. The USA actually has more freedom and protections in this area than many countries, even though it’s far from perfect.

          > This type of generalized defeatism does more harm than not.

          Pointing out the realities of the world and how governments work isn’t defeatism.

          Believing that the USA is uniquely bad and closing your eyes to how other countries work is more harmful than helpful.

          • j45 4 hours ago

            Understanding that the cloud is someone else's computer is something I've repeated many, many times in my comments.

            The OP's assumption that this is just the way it is, and that everyone should accept their communications being compromised, is the issue.

        • embedding-shape 6 hours ago

          No, assuming that anything besides what you can verify yourself is compromised isn't "defeatism", although I'd agree that it's overkill in many cases.

          But for data you absolutely want to keep secret? It's probably the only way to guarantee that someone else somewhere cannot see it: assume by default that if it's remote, someone will eventually be able to access it. If not today, it'll be stored and decrypted later.

        • Ms-J 5 hours ago

          This is correct. Yes, every government has the ability to use violence and coercion, but that takes coordination, among other things. There are still places, and areas within those places, where enforcement, and the ability to keep it secret, is nearly impossible.

    • huijzer 6 hours ago

      I have reached the point that I think even the chat control discussion might be a distraction, because essentially they can already get anything. Yes, the government needs to fill in a form to make a request, but I believe that's mostly automated.

      • gruez 5 hours ago

        >I have reached the point that I think even the chat control discussion might be a distraction because essentially they can already get anything.

        Then why are politicians wasting time and attracting ire by attempting to push it through? The same goes for the UK demanding backdoors. If they already have it, why start a big public fight over it?

      • j45 6 hours ago

        Such initiatives are likely trying to make it easier.

    • mattmaroon 6 hours ago

      I think you can safely remove “in the USA” from that sentence.

    • rdtsc 6 hours ago

      > Any large scale provider with headquarters in the USA will be subject to backdoors

      Wonder what large scale provider outside USA won’t do that?

    • preisschild 6 hours ago

      > Any large scale provider with headquarters in the USA will be subject to backdoors and information sharing with the government when they want to read or know what you are doing.

      That's just wrong. Signal, for example, is headquartered in the US and does not even have this capability (apart from some metadata).

    • kgwxd 6 hours ago

      They're only concerned that someone at Meta whom they don't already control could read their personal messages.

    • hsuduebc2 6 hours ago

      I do not believe them either. The swift start of the investigation by U.S. authorities only suggests there was no obstacle to opening one, not that nothing could be found. By “could not,” I mean it is not currently possible to confirm, not that there is necessarily nothing there.

      Personally, I would never trust anyone big enough that it (in this case, Meta) needs and wants to be deeply entangled in politics.

  • josefrichter 6 hours ago

    I am not into conspiracy theories, but I find it very unlikely that our governments can’t read all our messages across platforms.

  • Ms-J 2 hours ago

    This was slid off the first page of HN so quickly.

    As someone wisely pointed out in this thread, the reason Facebook is doing this is: "it's for favor trading and leverage at the highest levels."

    • dang 2 hours ago

      It set off the flamewar detector, which is the usual reason that happens.

      We'll either turn off that software penalty or merge the thread into a submission of the original Bloomberg source - these things take a bit of time to sort through!

      Edit: thread merged from https://news.ycombinator.com/item?id=46836487 now.

      • Ms-J an hour ago

        It does have an amplifying effect when issues like this come up: users who don't read in time won't see the thread at all because of the volume of other submissions.

        Thank you for the insight as to why it happened.

    • CalRobert 2 hours ago

      Just came here after seeing it in the Guardian and really disappointed it's not on the front page. Telling.

      • dang 2 hours ago

        Telling in what way?

  • oefrha 6 hours ago

    I always assumed Meta has a backdoor that at least allows them to compromise key individuals if the men in black ask, but a law firm representing NSO courageously defending the people? Come the fuck on.

    > Our colleagues’ defence of NSO on appeal has nothing to do with the facts disclosed to us and which form the basis of the lawsuit we brought for worldwide WhatsApp users.

    • zugi 5 hours ago

      > I always assumed Meta has backdoor that at least allows them to compromise key individuals if men in black ask

      According to Meta's own voluntarily published official statements, they do not.

      * FAQ on encryption: https://faq.whatsapp.com/820124435853543

      * FAQ for law enforcement: https://faq.whatsapp.com/444002211197967

      These representations are legally binding. If Meta were intentionally lying in them, it would invite billions of dollars of liability. They use similar terminology to Signal and the best private VPN companies: we can't read and don't retain message content, so law enforcement can't ask for it. They do keep some "meta" information and will provide it with a valid subpoena.

      The latter link even clarifies Meta's interpretation of their responsibilities under "National Security Letters", which the US Government has tried to use to circumvent 4th amendment protections in the past:

      > We interpret the national security letter provision as applied to WhatsApp to require the production of only two categories of information: name and length of service.

      I guess we'll see if this lawsuit goes anywhere or discovery reveals anything surprising.

  • foooorsyth 2 hours ago

    The reality that most encryption enthusiasts need to accept is that true E2EE where keys don’t leave on-device HSMs leads to terrible UX — your messages are bound to individual devices. You’re forced to do local backups. If you lose your phone, your important messages are gone. Lay users don’t like this and don’t want this, generally.

    Everything regarding encrypted messaging is downstream of the reality that it’s better for UX for the app developer to own the keys. Once developers have the keys, they’re going to be compelled by governments to provide them when warrants are issued. Force and violence, not mathematical proofs, are the ultimate authority.

    It’s fun to get into the “conspiratorial” discussions, like where the P-256 curve constants came from or whether the HSMs have backdoors. Ultimately, none of that stuff matters. Users don’t want their messages to go poof when their phone breaks, and governments will compel you to change whatever bulletproof architecture you have to better serve their warrants.
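
    A toy illustration of that tradeoff, assuming a simple key-wrapping scheme (the names and flow here are hypothetical, not any vendor's actual design): whoever holds the key that wraps the backup key can restore your history, and can also be compelled to hand it over.

      # Toy sketch of the custody tradeoff: device-bound wrapping keys give
      # real E2EE but lose history with the phone; provider-held wrapping
      # keys give painless recovery but also a point of compulsion.
      import os
      from cryptography.hazmat.primitives.ciphers.aead import AESGCM

      backup_key = AESGCM.generate_key(bit_length=256)  # encrypts chat history

      def wrap(wrapping_key: bytes, secret: bytes) -> bytes:
          nonce = os.urandom(12)
          return nonce + AESGCM(wrapping_key).encrypt(nonce, secret, None)

      # Option A: the wrapping key never leaves the phone's HSM, so a lost
      # phone means lost history (the UX problem described above).
      device_hsm_key = AESGCM.generate_key(bit_length=256)
      device_wrapped = wrap(device_hsm_key, backup_key)

      # Option B: the provider escrows the wrapping key. Recovery is easy,
      # but the provider (and anyone who can compel it) can unwrap the blob.
      provider_key = AESGCM.generate_key(bit_length=256)
      provider_wrapped = wrap(provider_key, backup_key)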

  • cft 3 hours ago

    I trust Telegram more: Putin never had any problems with WhatsApp, only with Telegram.

  • webdoodle 2 hours ago

    > US reportedly investigate claims that Meta can read encrypted WhatsApp messages

    Lol, Fox guarding the hen house.

  • AndrewKemendo 6 hours ago

    If your personal threat model at this point is not literally:

    “everything I ever do can be used against me in court”

    …then you are not up-to-date with the latest state of society

    Privacy is the most relevant when you are in a position where that information is the difference between your life or your death

    The average person going through their average day breaks dozens of laws because the world is a Kafkaesque surveillance capitalist society.

    The amount of information that exists about the average consumer is so unbelievably vast that any litigator could argue that nearly any human on the planet is in violation of something, given enough pressure.

    If you think you’re safe in this society because you “don’t do anything wrong“ then you’re compromised and don’t even realize it

  • ralusek 7 hours ago

    I mean, at the very least, if their clients can read it, then they can read it through their clients, right? And if their clients can read it, it'll be because of some private key stored on the client device that they must be able to access, so they could always get that. And this is just assuming they've been transparent about how it's built; they could also simply have backdoors on their end.

    • basch 7 hours ago

      They can also just... brute force passwords. The PIN used to encrypt FB Messenger chats is 6 digits, for example.

      • farbklang 6 hours ago

        But that is a PIN, which can be rate limited / denied, not a cryptographic key that can be brute forced offline by generating and comparing hashes(?)

        • barbazoo 6 hours ago

          They likely wouldn’t rate limit themselves; rate limiting only applies when you access it through their cute little enter-your-PIN UI.

          • solenoid0937 5 hours ago

            The PIN is used when you're too lazy to set an alphanumeric passphrase or offload the backup to Apple/Google. Now sure, this is most people, but such are the foibles of E2EE: getting E2EE "right" (e.g. supporting account recovery) requires people to memorize a complex password.

            The PIN interface is also backed by an HSM on the backend. The HSM performs the rate limiting, so they'd need a backdoored HSM.

            • barbazoo 4 hours ago

              That added some context I didn’t have yet, thanks. I'm still not seeing how Meta, if it were a bad actor, wouldn't be able to brute force the PIN of a particular user. If this were some black-box terminal run by someone else it would be harder, but Meta owns the stack here, so it seems plausible that they could inject themselves somewhere fairly easily.

              • solenoid0937 4 hours ago

                If you choose an alphanumeric PIN, they can't brute force it because of the sheer entropy (and because the key is derived from the alphanumeric PIN itself).

                However, most users can't be bothered to choose such a PIN. In that case they choose a 4- or 6-digit PIN.

                To mitigate the risk of brute force, the PIN is rate limited by an HSM. The HSM, if it works correctly, should delete the encryption key after too many failed attempts.

                Now sure, Meta could insert itself between the client and HSM and MITM to extract the PIN.

                But this isn't a Meta specific gap, it's the problem with any E2EE system that doesn't require users to memorize a master password.

                I helped design E2EE systems for a big tech company and the unsatisfying answer is that there is no such thing as "user friendly" E2EE. The company can always modify the client, or insert themselves in the key discovery process, etc. There are solutions to this (decentralized app stores and open source protocols, public key servers) but none usable by the average person.
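
                To put rough numbers on the entropy gap described above (the KDF choice and parameters here are illustrative, not any vendor's actual settings):

                  # A 6-digit PIN has only 10**6 candidates, so without an HSM
                  # capping attempts, offline brute force is trivial; a 12-char
                  # alphanumeric secret is a different story entirely.
                  import hashlib, math, os

                  def derive_key(secret: str, salt: bytes) -> bytes:
                      return hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 600_000)

                  key = derive_key("483921", os.urandom(16))  # PIN-derived key

                  pin_space = 10 ** 6          # 6-digit PIN
                  passphrase_space = 62 ** 12  # 12 chars of [A-Za-z0-9]
                  print(f"PIN keyspace:        2^{math.log2(pin_space):.1f}")
                  print(f"Passphrase keyspace: 2^{math.log2(passphrase_space):.1f}")

                  # The HSM's whole job in the PIN case is to cap attempts (e.g.
                  # wipe the wrapped key after N failures) so the roughly 2^20
                  # keyspace can never be enumerated.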

            • basch 5 hours ago

              That might be a different PIN? Messenger requires a PIN to be able to access encrypted chat.

              Every time you sign in to the web interface or sign back into the app you enter it. I don't remember an option for an alphanumeric PIN or for offloading it to a third party.

              • solenoid0937 4 hours ago

                Oh my bad! I was talking about WhatsApp.

                The Messenger PIN is rate limited by an HSM; you merely enter it through the web interface.

                Of course, the HSM could be backdoored, or the client could exfiltrate the secret, but the latter would be easy to discover.

                It's hard to do any better here without making the user memorize a master password, which tends to fail miserably in real life.

  • xvector 6 hours ago

    What even are these low effort, uninformed conspiratorial comments saturating the comment section?

    Sure, Meta can obviously read encrypted messages in certain scenarios:

    - you report a chat (you're just uploading the plaintext)

    - you turn on their AI bot (inference runs on their GPUs)

    Otherwise they cannot read anything. The app uses the same encryption protocol as Signal and it's been extensively reverse engineered. Hell, they worked with Moxie's team to get this done (https://signal.org/blog/whatsapp-complete/).

    The burden of proof is on anyone that claims Meta bypassing encryption is "obviously the case."

    I am really tired of HN devolving into angry uninformed hot takes and quips.

  • rambojohnson 3 hours ago

    I mean no shit, right?

  • alex1138 6 hours ago

    Zuck didn't buy it in good faith. It wasn't "we'll grow you big by using our resources but be absolutely faithful to the privacy terms you dictate". Evidence: Brian Acton very publicly telling people that they (Zuck, possibly Sandberg) reneged.

    Zuck thinks we're "dumb fucks". That's his internet legacy: copying products, buying them up, wiping out competition.

  • mlmonkey 4 hours ago

    I'm shocked, shocked! that there's gambling going on here ...

  • renegade-otter 6 hours ago

    Anyone trusting Facebook to follow basic human decency and, yes, laws, is a fool.

    • dang 2 hours ago

      Maybe so, but please don't post unsubstantive comments to Hacker News. We're trying for something different here.

    • xvector 6 hours ago

      Anyone blindly believing every random allegation is also a fool, especially when the app in question has been thoroughly reverse engineered and you can freely check for yourself that it's using the same protocol as Signal for encryption

      • gherkinnn 5 hours ago

        Allegations against a company that circumvented Android's security to track users?

        I don't have any proof that Meta stores WhatsApp messages, but I feel it in my bones that it at the very least tried to do so. And if that ever comes to light, precisely nobody will be surprised.

        https://cybersecuritynews.com/track-android-users-covertly/

        • gruez 3 hours ago

          >And if ever that comes to light, precisely nobody will be surprised.

          The amount of ambient cynicism on the internet basically makes this a meaningless statement. You could plausibly make the same claim for tons of other conspiracy theories, eg. JFK was assassinated by the CIA/FBI, Bush did 9/11, covid was intentionally engineered by the chinese/US government, etc.

      • jlarocco 4 hours ago

        That raises the question: why not just use Signal and avoid a company whose founder thinks we're all "dumbfucks" and which has a long history of scandals and privacy violations?

        The evidence is pretty clear that Facebook wants to do everything they legally can to track and monitor people, and they're perfectly okay crossing the line and going to court to find the boundaries.

        Using a company like that for encrypted messaging seems like an unnecessary risk. Maybe they're not decrypting it, but they're undoubtedly tracking everything else about the conversation because that's what they do.

    • Forgeties79 6 hours ago

      They got caught torrenting unbelievable amounts of content, an act that, committed even just a few times, can get my home internet shut down with no recourse (and that's the best outcome). Literally nothing happened. Combine the fact that nothing legally significant ever happens to them with Zuckerberg's colossal ego and complete lack of ethical foundation, and you have quite the recipe.

      And I’m not even getting into the obvious negative social/political repercussions that have come directly from Facebook and their total lack of accountability/care. They make the world worse. Aside from the inconvenience for hobbyist communities and other groups, all of which should leave Facebook anyway, we would lose nothing of value if Facebook was shut down today. The world would get slightly better.

      • gruez 3 hours ago

        >an act that committed even just a few times can get my home Internet shut down with no recourse (best outcome).

        No, the best (and also most likely) outcome is you using a VPN and nothing happens, like 99.9% of pirates out there.

        >Literally nothing happened.

        Isn't there a lawsuit in the works?

        • Forgeties79 an hour ago

          If you have to do a thing that obscures your act it doesn’t change the fact that there are rules for me and not them. We know for a fact they did it. Did their ISP threaten them? Did they get their internet service shut off?

          Edit: they already won their first case, against authors, back in June. I am very curious to see how this lawsuit goes. Obviously we don't know the results yet, but I would be incredibly surprised to see them lose and/or have to "undo" the training. That's a difficult world to imagine, especially under the current US admin. Smart money says the damage is done and they'll find some new way to be awful or otherwise break rules we can't.

          • gruez 38 minutes ago

            >If you have to do a thing that obscures your act it doesn’t change the fact that there are rules for me and not them. We know for a fact they did it. Did their ISP threaten them? Did they get their internet service shut off?

            Is there any indication they didn't use a VPN? If they did use a VPN, how is it "there are rules for me and not them", given that anyone can also use VPN to pirate with impunity?

      • bayarearefugee 6 hours ago

        > Literally nothing happened.

        The truly wealthy live by an entirely different set of rules than the rest of us, especially when they are willing to prostrate themselves to the US President.

        This has always been true to some degree, but it is more true than ever (there used to be some limits based on accepted decorum), and they just don't even try to hide it anymore.

        • Forgeties79 5 hours ago

          I think the not hiding it part is what’s starting to stick in my craw. We all knew it was happening on some level, but we felt that there were at least some boundaries somewhere out there even if they were further than ours. Now it just feels like the federal government basically doesn’t exist and companies can do whatever they want to us.

  • oldestofsports 6 hours ago

    Surprised pikachu face

  • SirFatty 5 hours ago

    Of course they can. Why wouldn't you assume this to be the case?

  • jijji 6 hours ago

    If anybody believes that Facebook would allow people to send a totally encrypted message to somebody, they're out of their mind. They're pretty much in bed with law enforcement at this point. I mean, I don't know how many people have been killed in Saudi Arabia this year for writing Facebook messages to each other that went against what the government wanted, but it's probably a large number.

    • xvector 5 hours ago

      This reads like another low effort conspiratorial comment.

      WhatsApp has been reverse engineered extensively, they worked with Moxie's team to implement the same protocol as Signal, and you can freely inspect the client binaries yourself!

      If you're confident this is the case, you should provide a comment with actual technical substance backing your claims.

  • oncallthrow 5 hours ago

    This should surprise nobody. Do you really think that the intelligence agencies of the US etc would allow mainstream E2E encryption? Please stop being so naive

  • kachapopopow 6 hours ago

    Yes, it is a well-known fact that it is not E2EE but client-to-server encrypted. Otherwise your message history wouldn't work.

    • codexetreme 6 hours ago

      This might be a rookie question, but exactly why would chat history not work?

      • ryanscio 6 hours ago

        It would, just not on new devices without moving the keys over via an already-trusted device. This is presumably what WhatsApp does.

    • xvector 6 hours ago

      This is a total misunderstanding of how E2EE works.

      I need to either enter my password or let the app access my iCloud Keychain to let it derive the backup encryption key.

      It's also well known that they worked with Moxie's team to implement the same E2EE protocol as Signal, so messages themselves are E2EE as well.
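
      A minimal sketch of the backup-key point above, assuming a PBKDF2-based derivation and AES-GCM for the backup blob (the KDF, parameters, and salt handling are assumptions for illustration, not WhatsApp's actual scheme): the server-side blob is useless without the key derived from the password, or from the equivalent secret held in the iCloud Keychain.

        # The stored backup is just ciphertext without the password-derived key.
        import os, hashlib
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        def backup_key_from_password(password: str, salt: bytes) -> bytes:
            return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)

        salt = os.urandom(16)
        key = backup_key_from_password("correct horse battery staple", salt)
        nonce = os.urandom(12)
        encrypted_backup = nonce + AESGCM(key).encrypt(nonce, b"chat history...", None)
        # What the server would store: salt + encrypted_backup, neither of which
        # reveals the messages without the password-derived key.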