7 comments

  • j_bum a day ago

    > Here's a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?

    I do not buy this argument. Of course, most of the content on these platforms is innocuous, and may as well be paint drying.

    What's harmful are the harnesses that these companies have built to exploit the content.

    > Of course not. Because infinite scroll is not inherently harmful.

    Yes it is [0].

    > Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful.

    Yes, they can be [1] [2].

    > These features only matter because of the content they deliver. The "addictive design" does nothing without the underlying user-generated content that makes people want to keep scrolling.

    These harnesses only work because people feed the machine. The harnesses are still harmful.

    This whole argument is predicated on a strawman that makes no sense.

    A gun doesn't work without bullets. But if a company designs and hands out the gun to the world, they should be liable for the consequences, even if they rely on users for the ammunition.

    [0] https://doi.org/10.1145/3544548.3580729

    [1] https://doi.org/10.1145/3491101.3519829

    [2] https://counterhate.com/research/deadly-by-design/

  • superkuh a day ago

    Even beyond the dangerous legal precedent it sets, we're all cheering for a legal precedent that human persons don't have volition or free will, and that multi-media can somehow bypass normal sensory pathways and act directly on wanting the way drugs do. And that's simply not true. Believing that, and enshrining it as legal precedent, means the government can now use violent force to regulate anything shown on a screen. This is going to cause incredible damage to our society as a whole and to individual people's lives. Government use of force is far more dangerous than unsupported memes and old wives' tales from the 1970s.

    • voidmain a day ago

      I too fear what governments will actually do in this area. But I think you may be underestimating the threat to personal agency.

      Imagine you are trapped in a Groundhog Day-style time loop - but you are not the person who remembers the previous loops. "Z" is. He tries to convince you to do something, over and over and over, thousands or millions of times, refining his approach based on your reactions while you remember nothing. Are you really confident that your free will protects you from being taken advantage of in this situation?

      Now imagine that instead of a time loop, Z has a million clones of you. He tries his persuasion on one of them at a time, refining it until it works reliably before using it on you. You are just as vulnerable.

      Now suppose he has a billion people, not identical to you but drawn from the same distribution. He has a harder computational problem: mapping the high-dimensional manifold of their responses to build a model of you accurate enough to manipulate you. But with enough data he can approximate the results of the previous case, with only a tiny fraction of his experimentation ever visible to you.
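
      The "population as a proxy for the individual" step can be made concrete with a toy simulation. Everything in it (the linear response model, the number of dimensions, the noise level, the neighbor count) is an illustrative assumption, not a description of any real recommender system: the persuader never experiments on the target, only on lookalikes drawn from the same distribution, and still recovers a usable estimate of the target's hidden preferences.

      ```python
      # Toy sketch, under invented assumptions: people have hidden preference
      # profiles, only a noisy slice of which leaks through observable behavior.
      import random

      random.seed(0)

      HIDDEN_DIMS = 5    # full hidden preference profile
      OBSERVED_DIMS = 3  # dimensions that leak through public behavior
      NOISE = 0.05       # observation noise

      def make_person():
          return [random.uniform(-1, 1) for _ in range(HIDDEN_DIMS)]

      def observe(person):
          # Public signals reveal only a noisy slice of the hidden profile.
          return [p + random.gauss(0, NOISE) for p in person[:OBSERVED_DIMS]]

      def respond(person, message):
          # Linear "does this message land?" score; higher = more persuaded.
          return sum(p * m for p, m in zip(person, message))

      target = make_person()
      population = [make_person() for _ in range(10_000)]

      # Step 1: find population members whose observable behavior
      # resembles the target's.
      target_obs = observe(target)

      def distance(person):
          obs = observe(person)
          return sum((a - b) ** 2 for a, b in zip(obs, target_obs))

      lookalikes = sorted(population, key=distance)[:200]

      # Step 2: probe each dimension on the lookalikes (never on the target)
      # and estimate the target's hidden preferences from their responses.
      estimate = []
      for i in range(OBSERVED_DIMS):
          probe = [0.0] * HIDDEN_DIMS
          probe[i] = 1.0
          estimate.append(sum(respond(p, probe) for p in lookalikes) / len(lookalikes))

      print(estimate)  # rough reconstruction of target[:OBSERVED_DIMS]
      ```

      The point of the sketch is the asymmetry: the target contributed nothing but a few noisy public observations, yet the experimenter ends up with a working model of their responses.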

      Any relationship where one party gets to surveil and monitor not only the other party, but millions or billions of like parties, has the potential to be a deeply abusive one. We should not tolerate such situations whether the surveilling party is a government or not.

    • 46493168 10 hours ago

      There are a few books I'd recommend, if you're open to learning more about this subject.

      The first is “Addiction by Design: Machine Gambling in Las Vegas” by Natasha Dow Schüll. The second, and arguably more direct and fascinating, is “The Age of Surveillance Capitalism” by Shoshana Zuboff. Both are incredibly eye-opening in their treatment of technology and how it is designed to influence behavior.

      • superkuh 9 hours ago

        And for you, to help understand the vast gulf between drugs that directly modify incentive salience and the simple, normal perception of multi-media screens via our senses (which doesn't): https://sites.lsa.umich.edu/berridge-lab/selected-review-art...

        • Balinares 6 hours ago

          I'm not seeing where the content you linked is supporting your argument.

          • superkuh 2 hours ago

            It's background education in the basics, so you can understand what drug addiction is and the difference in the neural populations active for wanting versus liking. I guess I can spell it out.

            Addictive drugs directly increase wanting by activating the downstream targets of the dopaminergic populations that predict the valence of stimuli and control wanting and motivation. By taking a chemically addictive drug you don't even have to enjoy the stimuli associated with it; you will still be conditioned to want it and motivated to re-experience the stimuli surrounding it.

            This is vastly different in mechanism and result from simply seeing or hearing what's on a screen. Screens cannot directly increase incentive salience, regardless of the actual valence of the stimuli. You have to actually enjoy the thing and the experiences around it to form habits.

            Do you see the difference now? One thing, the chemical drug, is addictive. The other is merely enjoyable. One will hook anyone, because it is addictive; the other only leads to addiction-like behaviors in contexts like random-interval operant conditioning, and only if you intrinsically enjoy the thing in the first place and belong to the fairly small subset of people predisposed to behavioral addictions.