82 comments

  • elashri 4 days ago

    This reminds me of the joke that says if you want to harm someone, do it under an LLC and you will not go to jail.

    It's a joke of course, but behind it is a lot of criticism of the current system.

    • nostrademons 4 days ago

      Even better: do not personally harm someone, but instead create an LLC, hire people, and then create an incentive structure such that if your employees want to continue getting paid, they have to harm people.

      • fma 4 days ago

        Isn't that basically what happened to Michael Cohen?

    • jamiek88 4 days ago

      Same as if you want to kill someone do it with a car.

      • wlesieutre 4 days ago

        That’s not true, you have to do it with a car and say “oops” afterward

        • Ekaros 4 days ago

          Bonus points for getting drunk...

      • Smar 4 days ago

        And if done with a truck, recipient might be in for a surprise ;)

  • aliasxneo 4 days ago

    > control of corporate activity alone is insufficient

    That's very vague. It seems like the accusation brought forth was that Zuckerberg knowingly concealed the addictive behavior, but the best the prosecution came up with was just, "He's the CEO, therefore he is liable." Unless I'm completely misunderstanding this.

  • asdefghyk 4 days ago

    Let's be clear: Mark Zuckerberg could make changes to avoid this harm to children. As the controller of these companies, he would surely be fully aware of the problem; it's been all over the media for years. These social media companies place profit over the harm done to children.

    • twohaibei 4 days ago

      This has been clear for a long time. The question is who has the power and will to do something sufficient to stop it. The money is so huge they can buy enough protection. And social media is not even the only industry doing it knowingly. As long as fine < profit, fines will simply be paid. And as long as execs don't go to jail, they will still be happy to rake in the billions.

    • sirspacey 4 days ago

      Genuine question:

      How is this clear? What lever could Zuck pull that would prevent this harm?

      From the research I’ve read, the harm of social media lies almost exclusively in what people post on it (ex: bullying) or what they are seeking to accomplish by posting (ex: looksmaxxing). I struggle to see how that harm is caused by the ability to share it.

    • parsimo2010 4 days ago

      Zuck could probably prevent lots of other bad things happening too. He has enough money to feed every starving person in San Francisco. Does that mean that if anyone in SF dies of starvation that we should try Zuck for manslaughter? I don’t think blame falls to him personally because he failed to do something that others wished he had.

      The judge did not say that Meta itself isn’t responsible. I think their corporate priorities are probably misaligned with the good of humanity. But it isn’t Zuck by himself who did all this- it’s teams of researchers and software devs and product managers- that’s a corporation.

      You probably couldn’t impose a large enough fine to prevent Meta optimizing for kids’ attention- they are such a huge source of profit. If you really want to protect the kids you need straight up regulatory bans.

      • jinushaun 4 days ago

        That’s not really a compelling argument. Facebook is his company. He steers the ship. People are complaining that he’s telling the people rowing the boat to run over people in the water. Whether his hand is on the steering wheel is moot.

        Secondly, him not directly feeding starving people in SF is not the same. It’s unrelated. He is not the mayor of SF, but he is the CEO of Meta.

      • MattPalmer1086 4 days ago

        > Does that mean that if anyone in SF dies of starvation that we should try Zuck for manslaughter?

        There is an obvious difference between not preventing a harm that is a direct result of your activity, and not preventing a harm which isn't.

    • ThrowawayTestr 4 days ago

      If you were in his shoes, what would you do?

    • carabiner 4 days ago

      Expect this to grow as Zuck has revealed himself to be a centrist Trump supporter. He's pals with Musk, but he runs Threads as a liberal version of X/Twitter. It's the most blatant form of divide and conquer I've seen from the billionaire class.

  • system2 4 days ago

    If it wasn't for him, it would be someone else. He is not the inventor of the internet. It is completely the parents' fault for allowing their children to use social media. Do we blame car manufacturers for young people driving and killing themselves or others? No.

    • almatabata 4 days ago

      Remember when facebook got caught running psychological experiments on its users? (https://www.theguardian.com/technology/2014/jul/02/facebook-...)

      I do not trust myself to identify when companies tweak the algorithm to mess with my psyche. How can I trust myself to identify it for my own kids, except when it becomes really obvious in the content or in their behavior?

      > Do we blame car manufacturers for young people driving and killing themselves or others?

      Cars had no seat belts, had no crumple zone, would crush the driver when the engine got pushed in. We literally put more and more requirements on cars to make them as safe as reasonably possible.

      If social media has negative effects why would we not legislate these companies in such a way as to reduce the negatives?

      You can argue there is no negative effect if you want. But if they have negative effects, then don't we need to rein them in?

      • nostrademons 4 days ago

        Pretty much every business runs psychological experiments on their customers. That's what the ad industry is, that's why retailers run sales to gather pricing data, that's what A/B testing is for.

        It's interesting to look at the triple standard of harm across different institutions though:

        Science: your job is to learn things, you must bring every possible harm you can think of from your experiment to an ethical review board, and if there's a chance it might hurt people, you can't run your experiment.

        Business: your job is to maximize profit, you may harm people in the pursuit of this as long as you do so within the bounds of the law.

        Defense: your job is to kill people, do what you want as long as it's our enemies.

        Clearly, people value killing foreigners > making money > discovering new knowledge.

        • almatabata 4 days ago

          > Pretty much every business runs psychological experiments on their customers. That's what the ad industry is, that's why retailers run sales to gather pricing data, that's what A/B testing is for.

          Sure, but there are differences between the psychological experiments. I would argue that running a sale is benign compared to manipulating words in social media posts to see how it influences your mood. What effect will that have on people with already fragile psyches?

          I understand that whatever Facebook has done is legal, but I still think there should be restrictions on the kinds of psychological tests you can run like this. Unless I overestimate the effect these studies have.

          > Clearly, people value killing foreigners > making money > discovering new knowledge.

          Killing foreigners is mostly a way to maintain or gain influence, so I would say it's more: power > making money > discovering new knowledge. The killing-foreigners part is just a means to an end.

          • ahahahahah 4 days ago

            > I would argue that running a sale is benign compared to manipulating words in social media post to see how it influences your mood. What effect will that have on people with already fragile psyches?

            This is such a dumb or disingenuous position.

            Let's imagine you are a social media company, and these things happen:

            1. people (researchers, media, doesn't matter) are reporting that they think the proliferation of negative sentiment on your social media is having a negative effect on users' moods.

            2. based on your extensive experience and understanding of user behavior you think they are wrong. but still you recognize that if they are actually right it would be best for you to know that so you can make changes to fix it.

            3. so, you decide to run an experiment where for some of the users you reduce the likelihood that they will see the types of posts that those researchers have identified as problematic, and you monitor the same sorts of signals that those researchers were looking at.

            4. you get the results of your experiment. it doesn't actually matter what they say, if they say there's no effect, you now know that and apply it in the future, if they say there's a negative effect you try to mitigate it, if they say there's a positive effect maybe you try to amplify it. but really, that's not important because...

            5. you tell people that you've done this research and made (or didn't make) changes based on it. or it comes out in a leak by a disgruntled ex-employee, or in a court case, also doesn't matter.

            6. those same people as in (1), or others, frame the experiment you've done as a terrible thing where you were manipulating people's emotions for your own gain.

    • croes 4 days ago

      If cars could be driven without keys we would blame car manufacturers.

      How do you prevent kids using social media without 100% surveillance or not allowing internet access at all?

      • kelipso 4 days ago

        Guns don't need keys, do they? Sure, they're trying, but the cases against the gun companies didn't go anywhere.

        And I am certain car companies won't be held liable for allowing starting without keys. Keys are only there because of theft.

        • trynumber9 4 days ago

          If you don't have a safe with a key for your gun who is to blame? You bought a gun and you stored it unsafely.

          Responsible gun owners have their guns locked up. Perhaps responsible parents can pay attention to their children's internet usage?

        • lsllc 4 days ago

          In most places they are required to be locked up by law, for example in Massachusetts:

          > "It shall be unlawful to store or keep any firearm in any place unless such firearm is secured in a locked container or equipped with a tamper-resistant mechanical lock or other safety device, properly engaged so as to render such firearm inoperable by any person other than the owner or other lawfully authorized user."

          https://www.mass.gov/info-details/mass-general-laws-c140-ss-...

        • croes 4 days ago

          Maybe gun companies win in court, but we still blame them.

          • richwater 4 days ago

            And the blame will be continually misplaced until society realizes we have a mental health crisis not a trigger finger crisis.

            • croes 4 days ago

              Fixing mental health is a lot harder than making it harder to get and fire a gun.

      • aliasxneo 4 days ago

        > How do you prevent kids using social media without 100% surveillance or not allowing internet access at all?

        I don't give them a phone, and all the screens in my house are in the shared living space. They also don't know the WiFi password, should they try to sneak in a device.

        Could they use a friend's phone? Sure. I'm not really concerned in that case, because there's honestly nothing I can do, but it's such minimal exposure with such a huge barrier that I doubt its effectiveness in sustaining much addiction.

      • fairity 4 days ago

        > How do you prevent kids using social media without 100% surveillance or not allowing internet access at all?

        Do the parental controls on mobile phones not work? I'm not a parent, so I don't know, but it seems unbelievable to me that there isn't an effective way to grant internet access to kids without social media.

        • croes 4 days ago

          You underestimate the cleverness of kids to circumvent such controls.

          Do you think antivirus software prevents all viruses?

          The same is true for parental controls.

        • seb1204 4 days ago

          Often not, to be honest. And then once they're 14, they get some permissions back on their accounts.

        • nuancebydefault 4 days ago

          Parental controls will not completely remove addiction. The algorithms are steering towards that, regardless of the age rating of content.

          Still, is it Zuckerberg's fault or just how capitalism works (optimize for max profit)?

          I believe only strict regulations can help. Is there any regulation against addiction to (social) media?

      • Dalewyn 4 days ago

        >How do you

        The same way as with everything else: Do your job as a parent.

        • croes 4 days ago

          In school, and when they play with their friends outside?

          You can’t control them 100% of the time.

          • Dalewyn 4 days ago

            Do your job as a parent and you don't have to control them 100% of the time.

            • croes 4 days ago

              That’s not how humans work.

              Yours sounds like the no-true-Scotsman fallacy.

    • saghm 4 days ago

      I don't understand this logic. If he was found liable, and then someone else did it instead, they could be sued as well for the same thing? "Multiple people might be motivated to do the same thing for profit that society thinks shouldn't be allowed" isn't an argument against stopping it.

    • nashashmi 4 days ago

      Just because someone else can also kill doesn't absolve the killer for killing.

      But I get your point. Killing is illegal; social media harm is not. The analogy remains.

    • YakBizzarro 4 days ago

      "If it wasn't for him, it would be someone else." wow, I will use this defence, it's perfect!

      • system2 4 days ago

        It already was someone else, actually. Many times, over and over again. He is not the inventor of social media. His site (or his acquisition of Instagram) was just another tool someone else had already created in the past. His stuff was just more popular. China and Russia have their own versions too. If his stuff shuts down now, tomorrow someone else will create similar ones and fill the void in no time.

        • croes 4 days ago

          Then someone else is liable too.

          You could say the same about drug dealers: if one stops, another replaces him.

      • croes 4 days ago

        A German philosopher once said that this is a stupid way to renounce responsibility: "At this moment someone in Berlin is raping a woman; if I don't do it …"

  • jinushaun 4 days ago

    Not surprised. It’s an unwinnable argument.

    However, I stand by my belief that Social Media is a cancer on society. I divide history between “before Twitter” and “after Twitter”. Facebook went to shit when they started copying Twitter.

    • eddd-ddde 4 days ago

      I've come to believe that society simply does not scale. Humans were never meant to be exposed to instant global connections.

  • asdefghyk 4 days ago

    Governments need to legislate what they want. In this case, something like: children under 16 are not to have access to social media such as Facebook. The penalty for allowing access (contrary to the new proposed legislation) needs to be huge, like jail for company executives. And if it cannot be done by some date, the social media service needs to be shut down. (This is similar to what happens with gambling services.)

    • rileymat2 4 days ago

      > But the judge found a lack of specifics about what Zuckerberg did wrong, and said “control of corporate activity alone is insufficient” to establish liability. Her decision does not affect related claims against Meta itself.

      It is unclear to me how your position is different; it would seem that any fair law would have the same aspects, where you would have to prove specifics. So without specifics, hold the company liable; with specifics, hold the individuals.

  • ilrwbwrkhv 4 days ago

    "They 'trust me'. Dumb f*cks." - Mark Zuckerberg

  • mrtksn 4 days ago

    Recently there was this post about how bureaucracy structures itself to remove any responsibility.

    Apparently free market enterprises are not that different after all, they happen to collect all profits and those who run these systems don't have liability.

    I'm extremely curious how the upcoming version of the USA will work, as the people who collected all the profits, complained endlessly, and bore no responsibility for any problems are about to run it. Very interesting times.

  • Spivak 4 days ago

    This has got to be literally the dumbest batch of random blame being thrown around to explain "people get a dopamine hit when others like their posts and comments," which gets compounded when it's people they know IRL. Reddit and HN make it an explicit part of the platform by keeping score.

    Like no shit people, attention is addicting. This has been the cause of people doing stupid things since the dawn of community. Platforms providing the community / audience aren't responsible for the high of being on stage.

    • slibhb 4 days ago

      Strongly agree. I am weirded out by the "Facebook is evil" thing. Facebook (and other tech companies) have become a scapegoat for upper class people who don't find their lives meaningful or are disappointed that their children are on SSRIs.

    • youoy 4 days ago

      There is a difference between keeping score of comments/posts and giving a general feed for everyone vs curating a visual personal feed to maximise addiction...

    • barbazoo 4 days ago

      > Platforms providing the community / audience aren't responsible for the high of being on stage.

      Not at all? I don't think that's how the real world works.

  • endofreach 4 days ago

    "Too rich to be liable". What a shame. Someday people will realize what he is responsible for. But if a poor man sells weed to kids, or even worse, truly addictive drugs...

    • IncreasePosts 4 days ago

      No one said "Too rich to be liable".

      > But the judge found a lack of specifics about what Zuckerberg did wrong, and said “control of corporate activity alone is insufficient” to establish liability. Her decision does not affect related claims against Meta itself.

      • noworriesnate 4 days ago

        Laundering liability is expensive but a great option for those who are rich and unscrupulous. So yes... it is that they are too rich to be liable.

        • IncreasePosts 3 days ago

          How do you differentiate laundering liability and someone legitimately not knowing something?

        • jamiek88 4 days ago

          Zuck takes the wealth, Meta and society take the fall.

          If I could undo anything in this world it would be social media.

          • thrill 4 days ago

            "If I could undo anything in this world it would be social media."

            ... he posted.

            • jamiek88 4 days ago

              Ugh that tired old thought terminator.

              ‘You criticize society yet you are part of society’ isn’t the gotcha you think it is.

              • ThrowawayTestr 4 days ago

                You should act on your principles when you have ample ability to do so.

                • jamiek88 4 days ago

                  Again. Thought terminator. Thanks for piling on though! That really helped the conversation.

                  • ThrowawayTestr 4 days ago

                    What is preventing you from not using social media?

                    • noworriesnate 3 days ago

                      Probably because it's how society operates now and while they wish society was different, they don't want to become a hermit.

                      • IncreasePosts 3 days ago

                        No, this is not how society operates. Recall just a few days ago when chronically social-mediaed people got the shock of their life when Trump won handily. Social media did not prepare them for this outcome. Why? Because a large fraction of society doesn't use social media.

                        • noworriesnate 2 days ago

                          > Because a large fraction of society doesn't use social media.

                          They probably do use social media, but not in the bubble of the people who were surprised at the outcome of the election. That's actually one of the problems with social media. People are able to connect exclusively with in group and cut everyone else out of their lives.

        • SauntSolaire 4 days ago

          Even so, putting quotations around something never said is disingenuous.

  • nashashmi 4 days ago

    > But the judge found a lack of specifics about what Zuckerberg did wrong, and said “control of corporate activity alone is insufficient” to establish liability. Her decision does not affect related claims against Meta itself.

    So direct evidence that Zuck directed the wrongdoing was not given. If it was, Zuck would know precisely what he has to defend himself against.

    Instead, Zuck was labeled responsible by the plaintiff because he could stop the harm and did not stop the harm. This is not something anyone can defend themselves against. For starters, Zuck would have to admit that harm was done to be able to defend himself.

    Secondly, the method of harm should lead directly to him. Without a trial establishing the method of harm that was taken, he is not able to defend himself.

    What were the AGs thinking? Are they complicit in making the case weak enough for public dismissal?

  • croes 4 days ago

    And now ask the same judge about copyright violations through social media.

    • trynumber9 4 days ago

      Don't the judges need proof to show that the social media company deliberately ignored copyright claims before saying they are liable for copyright infringement?

      Am I wrong? This is simply my understanding of liability.

      • croes 4 days ago

        They need certain controls to prevent uploads; in the EU they need processes so others can report claims, and they must act on those claims accordingly.

        So you must prove you at least tried.

  • FpUser 4 days ago

    >"control of corporate activity alone is insufficient"

    Compare with Pavel Durov.

    >"Her decision does not affect related claims against Meta itself."

    It does not smell right in combination with the first statement.

  • outside1234 4 days ago

    And even if he was, he is a billionaire, so he wouldn't be held accountable