The case against social media is stronger than you think

(arachnemag.substack.com)

311 points | by ingve a day ago

253 comments

  • narag 3 hours ago

    If you're a politician, you need people to vote for you. "Your" people will. Try not to alienate others too much, so you can fish for moderates and get to 50%.

    If you're an "influencer", you need engagement. You can live off 10% easily. And you need retention. So keep the message heated.

    • CM30 2 hours ago

      Hmm, I'm not sure the former holds true anymore. We're seeing societies getting far more polarised, with some extreme rhetoric and proposals coming from political parties, especially in places like the US and Western Europe.

      Kinda makes me wonder if politicians and political parties are fishing for engagement and focusing on the most extreme parts of their supporter base too.

      • daveguy 14 minutes ago

        > Kinda makes me wonder if politicians and political parties are fishing for engagement and focusing on the most extreme parts of their supporter base too.

        They definitely are. The goal for Trump this election was clearly to stoke the base with inflammatory rhetoric bolstered by influencers spouting that same rhetoric.

        "They're eating the cats, they're eating the dogs, they're eating the pets."

    • eastbound 3 hours ago

      We’ve succeeded in making people vote for the fight against global warming, which clearly means people have to scale back their lifestyles, so I think there can be a large enough audience to make this topic the platform of one party.

      • MangoToupe 2 hours ago

        Have we? I don't recall that option ever being on the ballot in the US.

        • BobbyTables2 2 hours ago

          Well, one party’s stated solution for high oil imports was not to reduce consumption but rather “drill baby, drill!”

          Inefficient regulation also incentivizes car companies to make larger less efficient vehicles because they can’t make the smaller ones efficient enough. And the public has no problem buying enormous vehicles… (Doesn’t everyone need an off road extended cab 4x4 truck for commuting to the office?)

          Frankly, I do feel there is a segment that seems to over-focus on conservation to the point of impracticality.

          However, “single use” consumption has got to end. I don’t even see the debate here. Plastic lids, styrofoam containers: they’ve gotta go. Maybe not an outright ban, but the culture has to change. I ordered a pastry in a bakery, and the clerk put it into a large styrofoam container with an inch-thick stack of napkins, a plastic grocery bag, and a plastic fork/knife.

          Unfortunately I was eating it there… All that waste for one pastry baked there?

          On the other hand, I wonder if Amazon is the devil we assume. If I drive my car around town to get a few items, maybe it’s more fuel-efficient to just have them delivered along with others’?

          • pessimizer 2 hours ago

            > Well, one party’s stated solution for high oil imports was not to reduce consumption but rather “drill baby, drill!”

            It's important to note that the other party's response to that was "Who says we don't want to drill!?" followed by a disaster in the Gulf of Mexico (or "America" I think we're supposed to say now.)

            It's never on the ballot unless it truly does not matter to anyone with any power.

            • tzs an hour ago

              It's also important to note that one party wants to drill while also greatly increasing development of renewables so we can reduce the need for future drilling; it wants to tighten regulatory limits on emissions over time, provide incentives to adopt more energy-efficient appliances, and it recognizes that the world needs to reach net zero sometime in the next few decades and is trying to get there gradually.

              The other party wants to drill while doing everything it can to discourage renewables, is eliminating as many limits on emissions as it can and stopping enforcement of those it cannot yet eliminate, and its views on addressing global warming are a superposition of {it is a hoax by the Chinese to harm the US; it may be happening but humans have no way to influence it; it is good; even if global warming is as bad as predicted and we get a few degrees of rise, it is no problem, because we can increase fossil fuels enough to make cheap air conditioning available and get by fine, just like Dubai gets by fine with an average temperature some 35°F higher than ours [1]}. They also want to eliminate funding for satellites that monitor the climate and eliminate emissions reporting requirements for the industries that do most of the emitting.

              [1] https://www.heritage.org/environment/commentary/how-fossil-f...

          • azinman2 2 hours ago

            > Well, one party’s stated solution for high oil imports was not to reduce consumption but rather “drill baby, drill!”

            The idiotic lie here is that the US doesn’t really have the right refineries to handle US-based crude, so it has to swap oil with other countries that do. Building out new refineries isn’t easy or quick, yet it would be necessary to actually reduce oil imports and become self-sufficient.

  • Lerc 19 hours ago

    Part of me thinks that if the case against social media was stronger, it would not be being litigated on substack.

    A lot of things suck right now. Social media definitely gives us the ability to see that. Using your personal ideology to link correlations is not the same thing as finding causation.

    There will undoubtedly be some damaging aspects of social media, simply because it is large and complex. It would be highly unlikely for all those factors to align in the direction of good.

    All too often a collection of cherry picked studies are presented in books targeting the worried public. It can build a public opinion that is at odds with the data. Some people write books just to express their ideas. Others like Jonathan Haidt seem to think that putting their efforts into convincing as many people as possible of their ideology is preferable to putting effort into demonstrating that their ideas are true. There is this growing notion that perception is reality, convince enough people and it is true.

    I am prepared to accept aspects of social media are bad. Clearly identify why and how and perhaps we can make progress addressing each thing. Declaring it's all bad acts as a deterrent to removing faults. I become very sceptical when many disparate threads of the same thing seem to coincidentally turn out to be bad. That suggests either there is an underlying reason that has been left unstated and unproven or the information I have been presented with is selective.

    • procaryote 9 hours ago

      > Part of me thinks that if the case against social media was stronger, it would not be being litigated on substack.

      It's litigated all over and has been for a decade.

      Australia, for example, has set an age limit of 16 for social media. France, 15. Schools and countries are trying various phone bans. There's research into it. There are whistleblowers describing Facebook's own research, suppressed because it would show some of the platform's harm.

      Perhaps you spend too much time on social media?

      • pembrook 8 hours ago

        You’re strengthening OP’s point instead of undermining it.

        The “some governments banned it for kids” argument is an appeal to authority, a logical fallacy.

        The actions of tech-reactionist leftist governments absolutely do not constitute sound science or evidence in this matter.

        And if you’re claiming the French government only makes government policy based on sound data, I will point you to their currently unraveling government over the mathematically impossible social pension scheme they’ve created.

        • procaryote 7 hours ago

          Responding to the point "it's [only] litigated on substack", things like government bans are relevant counter-examples

          The bans might be unfounded or well founded, you might agree with them or not, but clearly the idea that social media might be bad has spread beyond substack

        • throw4847285 7 hours ago

          Your argument contains the fallacy fallacy, a logical fallacy in which one wrongly cites an informal fallacy in order to discredit a valid argument.

          The actions of several democratic governments are evidence that there is enough popular support for these actions to argue for a broader trend. And before you try for a gotcha, I am well aware that a democratic government can enact regulations without a direct vote proving that a majority of people support such an action. But inasmuch as a government reflects the will of the governed, etc etc etc.

          • pembrook 4 hours ago

            Huh? Claiming something is true because a government supports it, is quite possibly the most cut-and-dry definition of an appeal to authority I've ever seen.

            • dcow 3 hours ago

              Governments aren’t banning or restricting it because “god said it was bad”. Nor is the GGGP arguing that we should take it seriously because governments do so. Those would be specific appeals to authority. The GGGP argument uses examples of cases where social media has been taken seriously enough to result in government regulation to directly rebut the GGGGP’s claim that social media is only being discussed on substack and not more broadly.

            • Chris2048 3 hours ago

              "several democratic governments"

      • zarzavat 7 hours ago

        > set an age limit of 16 to have social media

        This just shows how futile it is. How do you actually stop someone from using social media? If a 15 year old signs up for Mastodon what is Australia going to do about it?

        • procaryote 6 hours ago

          I'm guessing it's mostly useful as a guide for parents, but I haven't seen any hard data

          It shows it's not just a debate on substack though

          • danhau 5 hours ago

            Indeed. I think most phones already have some kind of parental control. I know Apple devices do. With Screen Time you can limit your kids' social media use. Shouldn't be rocket science to ban those apps automatically, if that isn't already possible. OS vendors could use that to implement the country-specific bans outright. This does require, though, that parents set up their kids' phones correctly.

    • Llamamoe 18 hours ago

      I feel like, regardless of all else, algorithmic curation itself is going to be bad, especially when it's contaminated by corporate and/or political interests.

      We have evolved to parse information as if its prevalence is controlled by how much people talk about it, how acceptable opinions are to voice, and how others react to them. Algorithmic social media intrinsically destroys that. It changes how information spreads, but not how we parse its spread.

      It's parasocial at best, and very possibly far worse at worst.

      • armchairhacker 16 hours ago

        No doubt the specific algorithms used by social media companies are bad. But what is "non-algorithmic" curation?

        Chronological order: promotes spam, much of it from paid actors. Manual curation by "high-quality, trusted" curators: who are they, and how will they find content? Curation by friends and locals: this is probably an improvement over what we have now, but it's still dominated by the friends and locals who are most outspoken and charismatic; moreover, it's hard to maintain, because curious people will try going outside their community, especially those who are outcasts.

        EDIT: Also, studies have shown people focus more on negative (https://en.wikipedia.org/wiki/Negativity_bias) and sensational (https://en.wikipedia.org/wiki/Salience_(neuroscience)#Salien...) things (and thus post/upvote/view them more), so an algorithm that doesn't explicitly push negativity and sensationalism may appear to.
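        A toy simulation makes this concrete. In the sketch below (every number is a made-up assumption, not a measurement), the ranker never sees a post's tone; it ranks purely by clicks. A modest click-rate gap is enough to make the top of the feed overwhelmingly negative:

```python
import random

random.seed(0)

# Toy model: each post has a valence (-1 = negative, +1 = positive).
# The ranking algorithm knows nothing about valence; it only counts clicks.
# Per the negativity-bias literature, users click negative posts a bit more.
posts = [{"valence": random.choice([-1, 1]), "clicks": 0} for _ in range(1000)]

for post in posts:
    # 100 impressions per post; negative posts are clicked 30% of the
    # time, positive posts 20% -- a modest behavioral bias.
    p_click = 0.30 if post["valence"] < 0 else 0.20
    post["clicks"] = sum(random.random() < p_click for _ in range(100))

# "Neutral" feed: rank purely by engagement, keep the top 100.
feed = sorted(posts, key=lambda p: p["clicks"], reverse=True)[:100]
negative_share = sum(p["valence"] < 0 for p in feed) / len(feed)

print(f"negative share of corpus: {sum(p['valence'] < 0 for p in posts) / len(posts):.0%}")
print(f"negative share of top-100 feed: {negative_share:.0%}")
```

        So even a "neutral" engagement-ranked feed can look like it was built to push negativity; the bias lives in the behavioral data, not in the ranking rule.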

        • rightbyte 16 hours ago

          > Chronological order: promotes spam, which will be mostly paid actors.

          If users choose who to follow, this is hardly a problem. Also, classical forums dealt with spam just fine.

          • squigz 16 hours ago

            > Also classical forums dealt with spam just fine.

            Err... well, no, it was always a big problem, still is, and is made even more so by the technology of our day.

            • doctor_blood 13 hours ago

              Not really? On something like Xenforo2, there's a setting that makes a new account's posts invisible until that account is manually approved by a mod - in conjunction with the spam prevention tools - https://xenforo.com/docs/xf2/spam/#content - we really don't need to do much work.

              Because all new accounts need to be verified by an actual human, we can filter out 99% of spam before other users see it, and between a dozen mods for a community of 140k people we only need to spend ~15 minutes a week cleaning out spam.

              • nradov 12 hours ago

                So then you end up with power tripping mods who abuse their position to push certain narratives. In some cases we've even seen foreign governments paying mods on popular sites such as Reddit to push their propaganda.

                • mid-kid 9 hours ago

                  You mean like how the current twitter owner tweaks the algorithm to push his narrative? This is why there was never one big forum, and there never should've been.

                • camgunz 5 hours ago

                  This is a problem with centralization, not with mods.

          • armchairhacker 16 hours ago

            How will users choose who to follow? This was a real problem when I tried Mastodon/Lemmy/Bluesky, I saw lots of chronological posts but none of them were interesting.

            Unfortunately, classical forums may have dealt with spam better because there were fewer people online back then. Classical forums that exist today have mitigations and/or are overrun with spam.

            • camgunz 5 hours ago

              What used to happen is there would be human-powered networks ("if you like me, check out X/Y/Z"), rather than algorithm-powered networks. Sadly, the existence and dominance of algorithm-powered networks has withered humans' networking muscle. We can probably build it back though.

        • wkat4242 12 hours ago

          > Also, studies have shown people focus more on negative (https://en.wikipedia.org/wiki/Negativity_bias) and sensational (https://en.wikipedia.org/wiki/Salience_(neuroscience)#Salien...) things (and thus post/upvote/view them more), so an algorithm that doesn't explicitly push negativity and sensationalism may appear to.

          This is exactly why it's a problem. It doesn't even matter whether the algorithm is trained specifically on negative content. The result is the same: negative content is promoted more because it sees more engagement.

          The result is more discontent in society, people are constantly angry about something. Anger makes a reasonable discussion impossible which in turn causes polarisation and extremes in society and politics. What we're seeing all over the world.

          And the user-sourced content is a problem too, because anyone can use it to run manipulation campaigns. At least with traditional media there was an editor who would make sure fact checking was done. The social media platforms don't answer for the content they publish.

          • bluGill 2 hours ago

            It isn't just social media. I've been identified as a Republican and the previous owners of my house as Democrats, and since forwarding has expired I get their 'spam' mail. The names are different, but otherwise the mail from each party is exactly the same: 'donate now to stop [the other party's] evil agenda'. They know outrage works and lean into it.

          • nradov 12 hours ago

            Fact checking with traditional media was always pretty spotty. Even supposedly high quality publications like the NY Times frequently reported fake news.

        • mikewarot 13 hours ago

          I've been curating my own feeds manually for decades now. I choose who to follow, and actively seek out methods of social media use that are strictly based on my selections and show things in reverse chronological order. Even Facebook can do this with the right URL if you use it via the web[1].

          You start with almost nothing on a given platform but over time you build up a wide variety of sources that you can continue to monitor for quality and predictive power over time.

          [1] https://www.facebook.com/?sk=h_chr

        • pyrale 7 hours ago

          > But what is "non-algorithmic" curation?

          Message boards have existed for a very long time, maybe you're too young to remember, but the questions you're raising have very obvious answers.

          They're not without issues, but they have a strong benefit: everyone sees the same thing.

      • Lerc 17 hours ago

        I have wondered if it's not algorithmic curation per se that is the problem, but personalised algorithmic curation.

        When each person is receiving a personalised feed, there is a significant loss of common experience. You are not seeing what others are seeing and that creates a loss of a basis of communication.

        I have considered the possibility that the solution might be to enable many areas of curation, but within each domain everyone sees the same thing. In essence, subreddits. The problem then becomes the nature of the curators; subreddits show that human curators are also not ideal. Is there an opportunity for public algorithmic curation? You subscribe to the algorithm itself and see the same thing as everyone else who subscribes. The curation is neutral (but will be subject to gaming; the fight against bad actors will be perpetual in all areas).
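        A minimal sketch of that idea (names and the gravity formula are illustrative assumptions, not any platform's actual ranking): a "public algorithm" is just a deterministic, inspectable scoring function over the shared pool of posts, with no per-user inputs, so every subscriber to it sees the identical feed.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Post:
    id: int
    score: int        # e.g. upvotes
    age_hours: float

# A published, inspectable formula (HN-style gravity decay, as one example).
def gravity(post: Post) -> float:
    return post.score / (post.age_hours + 2) ** 1.8

def newest(post: Post) -> float:
    return -post.age_hours

# Users subscribe to an algorithm, not to a personalised model. The feed is
# a pure function of (posts, algorithm), so it is the same for everyone.
def feed(posts: list[Post], algorithm: Callable[[Post], float], n: int = 30) -> list[int]:
    return [p.id for p in sorted(posts, key=algorithm, reverse=True)[:n]]

pool = [Post(1, 120, 5.0), Post(2, 40, 0.5), Post(3, 300, 48.0)]
print(feed(pool, gravity))   # identical for every subscriber to "gravity"
print(feed(pool, newest))    # a different shared lens over the same pool
```

        Personalisation would mean adding the user as an argument to the scoring function; keeping it out is precisely what preserves the common experience.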

        I agree about the tendency for the prevalence of conversation to influence individuals, but I think it can be resisted. I don't think humans live their lives controlled by their base instincts; most learn to find a better way. It is part of why I do not like the idea of de-platforming.

        I found it quite instructive when Jon Stewart did an in-depth piece on trans issues. It made an extremely good argument, but it infuriated me to see, a few days later, so many people talking about how great it was because Jon agreed with them and he reaches so many people. They completely missed the point. The reason it was good is that it made a good case. This cynical "it's good if it reaches the conclusion we want and lots of people" is what is destroying us. Once you feel it is not necessary to make your case, just to shout the loudest, you lose the ability to win over people who disagree, because they don't like you shouting and you haven't made your case.

        • Llamamoe 9 hours ago

          > the solution might be to enable many areas of curation but in each domain the thing people see is the same for everyone.

          Doesn't this already happen to some extent, with content being classified into advertiser-friendly bins and people's feeds being populated primarily by top content from within the bins the algorithm deems they have an interest in?

          > Once you feel like it is not necessary to make your case, but just shout the loudest, you lose the ability to win over people who disagree because they don't like you shouting and you haven't made your case.

          To some extent, this is how human communication always worked. I think the biggest problem is that the digital version of it is sufficiently different from the natural one, and sufficiently influenceable by popular and/or powerful actors, that it enables very pathological outcomes.

      • enaaem 6 hours ago

        Social media should be liable for the content that their automatic curation put forward. If a telecom company actively gives your number to scammers to call you up, they should not hide behind the argument that it is not them scamming you, but someone else. Applying regular anti-fraud and defamation laws will probably put an end to algorithmic curation.

    • majormajor 18 hours ago

      It's increasingly discussed in traditional media too so let's toss out that first line glib dismissal.

      More and more people declaring it's net-negative is the first step towards changing anything. Academic "let's evaluate each individual point about it on its own merits" is not how this sort of thing finds political momentum.

      (Or we could argue that "social media" in the Facebook-era sense is just one part of a larger entity, "the internet," that we're singling out.)

      • delusional 18 hours ago

        > More and more people declaring it's net-negative is the first step towards changing anything.

        I accept that "net-negative" is a cultural shorthand, but I really wish we could go beyond it. I don't think people are suddenly looking at both sides of the equation and evaluating rationally that their social media interactions are net negative.

        I think what's happening is a change in the novelty of social media. That is, the the net value is changing. Originally, social media was fun and novel, but once that novelty wears away it's flat and lifeless. It's sort of abstractly interesting to discuss tech with likeminded people on HN, but once we get past the novelty, I don't know any of you. Behind the screen-names is a sea of un-identifiable faces that I have to assume are like-minded to have any interesting discussions with, but which are most certainly not like me at all. Its endless discussions with people who don't care.

        I think that's what you're seeing: a society caught up in the novelty, losing that naive enjoyment. Not a realization of net effects.

      • logicchains 18 hours ago

        >It's increasingly discussed in traditional media too so let's toss out that first line glib dismissal.

        Traditional media is the absolute worst possible source for anything related to social media because of the extreme conflict of interest. Decentralised media is a fundamental threat to the business model of centralised media, so of course most of the coverage of social media in traditional media will be negative.

        • alisonatwork 17 hours ago

          Unfortunately most of what people understand as "social media" is not decentralized, and most of the biggest names on Substack in particular come directly out of "traditional media", which is exactly why it's not a real alternative. Substack is just another newspaper except now readers have to pay for every section they want to read.

          • bluebarbet 17 hours ago

            The difference between traditional and social media is not just technical. Traditional media hosts a profession (journalism) with a code of ethics, editorial oversight, minimal standards, a mission of truth-seeking. It's easy to be cynical but those things have generally served us well. The Substack jungle is not a good replacement.

            • ivewonyoung 16 hours ago

              > Traditional media hosts a profession (journalism) with a code of ethics, editorial oversight, minimal standards, a mission of truth-seeking

              Which traditional media outlets follow those things nowadays? Genuine question, looking for information and news to consume.

              • jpalawaga 9 hours ago

                almost all of the major ones? voices on the internet have led people to believe that the journalists at major publications are biased, and that somehow also means they're lying and unethical.

                what's interesting is that those opinions are taken at face value, without anyone ever doing any practical evaluation of traditional media outlets.

                the reality is, if you ever read any alt-news publication, it becomes evident extremely quickly how devoid of standards those publications actually are.

            • paganel 5 hours ago

              Yes, they get paid to spill out stuff that materially benefits those that do the paying, there’s another name for that that I won’t use on a Sunday.

        • Theodores 17 hours ago

          I wish to quibble with you on this as there is a love/hate relationship between the conventional media and social media.

          The mainstream media have several sources, including the press releases that get sent to them, the newswires they get their main news from and social media.

          In the UK the press, and in particular the BBC, were early adopters of Twitter. Most of the population would not have heard of it had it not been for the journalists at the BBC. The journalists thought it was the best thing since the invention of the printing press. Latterly Instagram has become an equally useful source to them and, since Twitter became X, there is less copying and pasting of tweets.

          The current U.S. President seems capable of dictatorship via social media, so following his messages on social media is what the press do. I doubt any journalist has been on whitehouse.gov for a long time, the regular web and regular sources have been demoted.

      • krapp 18 hours ago

        "net-negative" sounds like a rigidly defined mathematically derived result but it's basically just a vibe that means "I hate social media more than I like it."

        • sedawkgrep 17 hours ago

          I'm struggling to understand your point, especially since the conclusion you posit is rather glib and dismissive.

          Net-negative is not quantifiable. But it is definitely qualifiable.

          I don't think you have to think of things in terms of "hate it more than I like it" when you have actual examples on social media of children posting self-harm and suicide, hooliganism and outright crimes posted for viewership, blatant misinformation proliferation, and the unbelievably broad and deep effect powerful entities can have on public information/opinion through SM.

          I think we can agree all of these are bad, and a net-negative, without needing any mathematical rigor.

          • krapp 17 hours ago

            My point is that "More and more people declaring social media net-negative" doesn't mean anything, and it certainly isn't a valid "first step towards changing anything" because it isn't actionable.

            >I don't think you have to think of things in terms of "hate it more than I like it" when you have actual examples on social media of children posting self-harm and suicide, hooliganism and outright crimes posted for viewership, blatant misinformation proliferation, and the unbelievably broad and deep effect powerful entities can have on public information/opinion through SM.

            Sure, and then there's plenty of children not posting self-harm and suicide, hooliganism and outright crimes posted for viewership, and plenty of information and perfectly normal, non-harmful communication and interaction. "net-negative" implies there is far more harmful content than non-harmful, and that most people using social media are using it in a negative way, which seems more like a bias than anything proven. I can agree that there are harmful and negative aspects of social media without agreeing that the majority of social media content and usage is harmful and negative.

            • sedawkgrep 15 hours ago

              While I appreciate the idea that moving without factual data is often detrimental (which is what I believe you're implying here), I don't share the opinion that SM deserves any benefit of the doubt.

              I'm old enough to have lived as an adult pre-SM, and from my perspective the overwhelming impact of social media has been more inflammatory, degrading, divisive, etc., etc., etc., than whatever positives you think you're getting.

              A family friend's teenage daughter isn't allowed a cell phone, and thus has zero presence or view into SM spaces. Unlike nearly all her friends, she doesn't suffer from depression, anxiety, or any other common malady that is so prevalent today with the youth. Yes it's anecdotal, but it's also stark.

              We got along just fine before SM, and we'd be just fine again without it.

              • krapp 5 hours ago

                That's just your perspective, based on the fact that controversy makes headlines and normality doesn't. One might conclude based on headlines and populist political rhetoric that the US is a crime-filled hellhole, awash in gang violence and illegal aliens swarming over the border raping and pillaging and burning entire cities to the ground, whereas in reality crime is lower than it has been for years. Perceptions created by the media aren't always accurate, and "social media is a cancer" is absolutely a media-driven narrative. Remember when TikTok was a CCP mind-control weapon turning our children into sleeper agents? When Twitter was a threat to the very existence of Western democracy that controlled human speech and could topple governments at will? The vast Marxist conspiracy behind all social media that rigged elections for the DNC? The louder such narratives become, the more we should question the motives of whoever holds the bullhorn.

                A lot of people using social media aren't teenagers. A lot of teenagers are depressed and anxious for reasons other than using social media. A lot of teenagers use social media and aren't depressed and anxious because of it. A lot of teenagers find community and support for their issues through social media. Your extrapolation from a sample size of "one teenage girl and her friends that I'm aware of" to the billions of people currently using social media, and your conclusion that social media is responsible for all of the maladies common to youth doesn't really mean much.

                • sedawkgrep 33 minutes ago

                  Your first paragraph is just as applicable to social media as it is to traditional media... possibly more so. So claiming that the media lies or deceives and shouldn't be believed does not lend credence to anything you're saying. When you say "media-driven narrative", where do you think that's coming from? I probably see 10x as much social media as traditional media, and it's all over the place. So it's not the old guard barking at the new.

                  The reality is social media today lacks most of the rigor and accuracy that traditional media needed to be trustworthy. There's virtually no vested interest in anyone on social media being honest and forthright about anything.

                  Your second paragraph is simply your perspective (and full of broad statements), and like you say, your opinion on that matter doesn't mean any more to me than apparently mine to you.

                  Yet here we are, with more depression, anxiety, and civil unrest nationally than we've had since probably Vietnam. At least all that unrest is what I see predominantly on SM.

      • paganel 5 hours ago

        What’s being discussed in the traditional media has no value anymore because it’s a dead medium, inhabited by dinosaurs.

      • Lerc 17 hours ago

        I did not consider it a glib dismissal, and I would not consider traditional media an appropriate avenue to litigate this either. "Trial by media" is a term used to describe something that we generally think shouldn't occur.

        The appropriate place to find out what is and isn't true is research. Do research, write papers, discuss results, resolve contradictions in findings, reach consensus.

        The media should not be deciding what is true, they should be reporting what they see. Importantly they should make clear that the existence of a thing is not the same thing as the prevalence of a thing.

        >Academic "let's evaluate each individual point about it on its own merits" is not how this sort of thing finds political momentum.

        I think much of my post was in effect saying that a good deal of the problem is the belief that building political momentum is more important than accuracy.

        • alwa 10 hours ago

          Weren’t you, in your initial post, suspicious that the research process was settling on a pessimistic consensus view? Figuring that, because most every formal study is coming up negative (or “no effect supported”), it must be that the research is selective and designed to manipulate? And that a phenomenon can’t exhibit a diversity of uniformly bad effects without “an underlying reason that has been left unstated and unproven”?

          I don’t know how I’d state or prove a single underlying reason why most vices are attractive-while-corrosive and still, on the whole, bad. It feels like priests and philosophers have tried for the whole human era to articulate a unified theory of exactly why, for example, “vanity is bad”. But I’m still comfortable saying gambling feels good and breaks material security, lust feels good and breaks contentment (and sometimes relationships), and social media feels good and breaks spirits.

          I certainly agree that “social media” feels uncomfortably imprecise as a category—shorthand for individualized feeds, incentives toward vain behavior, gambling-like reinforcement, ephemerality over structure, decontextualization, disindividuation, and so on; as well as potentially nice things like “seeing mom’s vacation pics.”

          If we were to accept that social media in its modern form, like other vices, “feels good in the short term and selectively stokes one’s ego,” would that be enough of a positive side to accept the possibility for uniformly negative long-run effects? For that matter, and this is very possible—is there a substantial body of research drawing positive conclusions that I’m not familiar with?

        • non_aligned 14 hours ago

          > The appropriate place to find out what is and isn't true is research. Do research, write papers, discuss results, resolve contradictions in findings, reach consensus.

          Few hot-button social issues are resolved via research, and I'm not sure they should be. On many divisive issues in social sciences, having a PhD doesn't shield you from working back from what you think the outcome ought to be, so political preferences become a pretty reliable predictor of published results. The consensus you get that way can be pretty shoddy too.

          More importantly, a lot of it involves complex moral judgments that can't really be reduced to formulas. For example, let's say that on average, social media doesn't make teen suicides significantly more frequent. But are we OK with any number of teens killing themselves because of Instagram? Many people might categorically reject this for reasons that can't be dissected in utilitarian terms. That's just humanity.

        • TheOtherHobbes 7 hours ago

          There's plenty of research. Plenty. None of it is positive.

          Summaries with links here. https://socialmediavictims.org/effects-of-social-media/

          It's really not hard to confirm this.

          The problem isn't that "building political momentum is more important than accuracy", it's that social media is a huge global industry that pumps out psychological, emotional, and political pollution.

          And like all major polluters, it has a very strong interest in denying what it's doing.

        • autoexec 15 hours ago

          > The media should not be deciding what is true, they should be reporting what they see.

          Largely I don't think the media has been dictating anything. They've just been reporting on the growing body of evidence showing that social media is harmful.

          What you'd call "trial by media" is just spreading awareness and discussion of the evidence we have so far which seems like a very good thing. Social media moves faster than scientific consensus, and there's a long history of industry doing everything they can to slow that process down and muddy the waters. We've seen facebook doing exactly that already by burying child safety research.

          A decade or more of "Do thing, say nothing" is not a sound strategy when the alternative is letting the public know about the existing research we have showing real harms and letting them decide for themselves what steps to take on an individual level and what concerns to bring to their representatives who could decide policy to mitigate those harms or even dedicate funding to further study them.

    • beowulfey 20 minutes ago

      There are a lot of biochemical hypotheses for why social media is unhealthy that I personally buy into.

    • kmacdough 2 hours ago

      I'm with you on the skepticism, but I also think the underlying point is worth acknowledging:

      Social media represents a step change in how we consume news about current events. No longer are there central sources relied on by huge swaths of the population. Institutions which could be held accountable as a whole and stood to lose from poor reporting. Previous behemoths like NYT, WaPo, Bloomberg are now comparatively niche and fighting for attention. This feels so obvious it's not necessary to litigate, but if someone has statistics to the contrary, I'll be happy to look deeper and re-evaluate.

      I agree, one should not immediately succumb to fear of the new. At the same time, science is slow by design. It takes years to construct, execute and report on proper controlled studies. Decades to iterate and solidify a holistic analysis. In the mean time, it seems naive to run forward headlong, assuming the safest outcome. We'll have raised a generation or two before we can possibly reach analytical confidence. Serious irreparable damage could be done far before we have a chance to prove it.

    • Aunche 15 hours ago

      > I am prepared to accept aspects of social media are bad. Clearly identify why and how and perhaps we can make progress addressing each thing.

      Companies intentionally design social media to be as addictive as possible, which should be enough to declare them as bad. Should we also identify each chemical in a vape and address each one individually as well before banning them for children? I think such a ban for social media would probably be overkill, but it should not be controversial to ban phone use in school.

    • ushiroda80 15 hours ago

      I don't think the reasoning needs to be that complex. Addictive things are harmful, and social media is designed to be addictive (and increasingly so). Higher addictiveness correlates with more harm, and children in particular are vulnerable to addictive things. So the expectation for social media, which is highly addictive, is that it will be highly harmful, unless there are clear reasons that it's not.

    • tempodox 7 hours ago

      > I am prepared to accept aspects of social media are bad. Clearly identify why and how

      That has been done over and over again, but as long as law makers and regulators remain passive, nothing will improve.

    • boppo1 an hour ago

      The Nepalese just elected a government on Discord. Who says we can’t litigate on substack? Hell, it might be the future.

    • solid_fuel 18 hours ago

      There's a lot of money in social media, literally hundreds of billions of dollars. I expect the case against it will continue to grow, like the case against cigarettes did.

      I will say this, and this is anecdotal, but other events this week have been an excellent case study in how fast misinformation (charitably) and lies (uncharitably) spread across social media, and how much social media does to amp up the anger and tone of people. When I open Twitter, or Facebook, or Instagram, or any of the smaller networks I see people baying for blood. Quite literally. But when I talk to my friends, or look at how people are acting in the street, I don't see that. I don't see the absolute frenzy that I see online.

      If social media turns up the anger that much, I don't think it's worth the cost.

      • Lerc 17 hours ago

        >There a lot of money in social media, literally hundreds of billions of dollars. I expect the case against it will continue to grow, like the case against cigarettes did.

        I don't think it follows that something making money must do so by being harmful. I do think strong regulation should exist to prevent businesses from introducing harmful behaviours to maximise profits, but to justify that opinion I have to believe that there is an ability to be profitable and ethical simultaneously.

        >events this week have been an excellent case study in how fast misinformation (charitably) and lies (uncharitably) spread across social media

        On the other hand The WSJ, Guardian, and other media outlets have published incorrect information on the same events. The primary method that people had to discover that this information was incorrect was social media. It's true that there was incorrect information and misinformation on social media, but it was also immediately challenged. That does create a source of conflict, but I don't think the solution is to accept falsehoods unchallenged.

        If anything, education is required to teach people to discuss opposing views without rising to anger or personal attacks.

        • solid_fuel 17 hours ago

          > I don't think it follows that something making money must do so by being harmful.

          My point isn't that it's automatically harmful, simply that there is a very strong incentive to protect the revenue. That makes it daunting to study these harms.

          > On the other hand The WSJ, Guardian, and other media outlets have published incorrect information on the same events. The primary method that people had to discover that this information was incorrect was social media.

          I agree with your point here too, and I don't think the solution is to completely stop or get rid of social media. But, the problem I see is there are tons of corners of social media where you can still see the original lies being repeated as if they are fact. In some spaces they get challenged, but in others they are echoed and repeated uncritically. That is what concerns me - long debunked rumors and lies that get repeated because they feel good.

          > If anything education is required to teach people to discuss opposing views without rising to anger or personal attacks.

          I think many people are actually capable of discussing opposing views without it becoming so inflammatory... in person. But algorithmic amplification online works against that and the strongest, loudest, quickest view tends to win in the attention landscape.

          My concern is that social media is lowering people's ability to discuss things calmly, because instead of a discussion amongst acquaintances everything is an argument against strangers. And that creates a dynamic where people who come to argue are not arguing against just you, but against every position they think you hold. We presort our opponents into categories based on perceived allegiance and then attack the entire image, instead of debating the actual person.

          But I don't know if that can be fixed behaviorally, because the challenge of social media is that the crowd is effectively infinite. The same arguments get repeated thousands of times, and there's not even a guarantee that the person you are arguing against is a real person and not just a paid employee, or a bot. That frustration builds into a froth because the debate never moves, it just repeats.

          • Lerc 17 hours ago

            >My point isn't that it's automatically harmful, simply that there is a very strong incentive to protect the revenue. That makes it daunting to study these harms.

            The problem is that having an incentive to hide harms is being used as evidence for the harm, whether it exists or not.

            Surely the same argument could be applied that companies would be incentivised to make a product that was non-harmful over one that was harmful. Harming your users seems counterproductive at least to some extent. I don't think it is a given that a harmful approach is the most profitable.

            • solid_fuel 17 hours ago

              > The problem is that having an incentive to hide harms is being used as evidence for the harm, whether it exists or not.

              No, the incentive to hide harm is being given as a reason that studies into harm would be suppressed, not as evidence of harm in and of itself. This is a direct response to your original remark that "Part of me thinks that if the case against social media was stronger, it would not be being litigated on substack."

              Potential mechanisms and dynamics that cause harm are in the rest of my comment.

              > Harming your users seems counterproductive at least to some extent.

              Short term gains always take precedence. Cigarette companies knew about the harm of cigarettes and hid it for literally decades. [0] Fossil fuel companies have known about the danger of climate change for 100 years and hid it. [1]

              If you dig through history there are hundreds of examples of companies knowingly harming their users, and continuing to do so until they were forced to stop or went out of business. Look at the Sacklers and the opioid epidemic [2], hell, look at Radithor. [3] It is profitable to harm your users, as long as you get their money before they die.

              [0] https://academic.oup.com/ntr/article-abstract/14/1/79/104820... [1] https://news.harvard.edu/gazette/story/2021/09/oil-companies... [2] https://en.wikipedia.org/wiki/Sackler_family [3] https://en.wikipedia.org/wiki/Radithor

              • Lerc 16 hours ago

                >No, the incentive to hide harm is being given as a reason that studies into harm would be suppressed, not as evidence of harm in and of itself. This is a direct response to your original remark that "Part of me thinks that if the case against social media was stronger, it would not be being litigated on substack."

                That seems like a fair argument. I don't think it means that it grants opinions the weight of truth. I think it would make it fair to identify and criticise suppression of research and advocate for a mechanism by which such research can be conducted. An approach that I would support in this area would be a tax or levy on companies with large numbers of users, ear-marked for funding independent research into the welfare of their user base and effects on society as a whole.

                >Short term gains always take precedence.

                That seems a far worthier problem to address.

                >If you dig through history there are hundreds of examples of companies knowingly harming their users

                I don't deny that these things exist, I simply believe that it is not inevitable.

                • Eisenstein 14 hours ago

                  > That seems a far worthier problem to address.

                  If we can't fix the underlying problem immediately, treating the symptoms seems reasonable in the meantime.

      • Tade0 14 hours ago

        > If social media turns up the anger that much, I don't think it's worth the cost.

        It doesn't. It's just that when people can publish whatever with impunity, they do just that.

        Faced with the reality of what they're calling for they would largely stop immediately.

        I believe the term for that is "keyboard warrior".

        • NeutralCrane 13 hours ago

          What you are describing is the mechanism by which social media turns up the anger.

    • make_it_sure 6 hours ago

      Seems that you're the guy that likes to be against the norm, even if you're wrong. Social media being controlled by corporations, with algorithms built to create addiction, should be enough, unless you have other motives to ignore all this.

    • nathan_compton 16 hours ago

      All this is good except that to achieve any kind of actual political action in this actual universe in which we live, we must use rhetoric. Asking people to be purely rational is asking them to fail to change anything about the way our culture works.

    • techpineapple 2 hours ago

      I think the problem with social media is it’s easy to exploit, all the most powerful people in the world perceive themselves to benefit from social media. This isn’t true for something like smoking.

    • nobodywillobsrv 10 hours ago

      The problem is that this kind of long-form "think" misses the basics and even uses polarising denialist phrases like "fear mongering".

      There is an obvious incoherence and even misreasoning present in the people most ruined by the new media.

      For example, you might want to drive the risk of something to zero. To do that, you need to calmly respond to every bad event of that type with more policy restrictions, at some cost. This should be uncontentious to describe, yet again and again the pattern is to confuse the desires, the costs, and the interventions.

      I can't even mention examples of this without risking massive karma attacks. That is the state of things.

      I used to think misreasoning was just something agit prop accounts did online but years ago started hearing the broken calculus being spoken by IRL humans.

      We need a path forward to make people understand they should almost all disagree, but they MUST agree on how they disagree, or else they don't actually disagree. They are just barking animals waiting for a chance to attack.

    • logicchains 18 hours ago

      There's a concerted assault on social media from the powers that be because social media is essentially decentralised media, much harder for authoritarians to shape and control than centralised media. Social media is why the masses have finally risen up in opposition to what Israel's been doing in Gaza, even though the genocide has been going on for over half a century: decentralised information transmission allowed people to see the reality of what's really going on there.

      • beeflet 15 hours ago

        It's not decentralized at all. It represents a total commercialization of the town square.

        The situation you reference with regard to Israel/Gaza is only possible because TikTok is partially controlled by Chinese interests. But it also goes to show that TikTok could have easily been banned or censored by western governments. Just kick them off the App Stores and block the servers. For example, there is no Net Neutrality support in the USA that would defend them if the government wanted to quietly throttle their network speed.

        Social media as it exists now is not decentralized in any meaningful capacity.

  • mihaic 9 hours ago

    I think the biggest problem in arguing against tech and social media is that in truth you rely on counterfactual positions, which describe how the world would look without that thing.

    A world without online dating, for instance, wouldn't just be the same as now except without those apps. New forms of socializing would emerge, which you could argue are more local and healthier for society.

    When talking about social media, I now ignore the more powerful arguments about how much better the world could be without people spending hours on their smartphones, and focus on the problem that it's a surrogate for socialization where everyone wants to sell you something, which most people seem to agree is wrong.

  • geoffbp 6 hours ago

    I fell into the trap over some months of installing fb, Reddit, x, Instagram. It really is amazing how addictive they are. I’ve since removed the apps but still have an account and limit myself to using the web versions only

  • hedayet 6 hours ago

    I used to spend 4+ hours a day glued to Facebook. Last November I hit a tipping point and quit social media altogether:

    Facebook: deactivated.
    Twitter: deleted my account altogether.
    LinkedIn: removed the app—now I only post and check messages via desktop with a news-feed eradicator.
    Google + Chrome + YouTube on mobile: deleted; now I just use Safari in incognito mode.

    Once the apps were no longer at my fingertips, quitting was surprisingly easy. I don’t miss them at all and I’m enjoying life much more.

    As for HN, I browse only sporadically—and it’s never felt addictive to me anyway.

    • rightbyte 6 hours ago

      A tip is to also have Leechblock on Firefox mobile such that you can't easily cheat.

      I realized I had this muscle memory visiting some detrimental sites in a loop, so I blocked them.

    • 0xDEAFBEAD 5 hours ago

      Individual quitting could be a bad solution for the systemic effects of social media though, if it leaves the remaining population of social media users even more radicalized. (I'm assuming that individual quitters tend to be more level-headed than the average user. If you self-assess as being high-risk for radicalization, then yeah I support individual quitting in your particular case.)

  • isodev 19 hours ago

    I think, to be clear, that’s “The case against algorithmic* social media”, the kind that uses engagement as a core driver.

    • parasti 9 hours ago

      I recently learned that Tiktok has a thing called "Streak Pets". Imagine taking a dopamine addiction-inducing activity and imagine gamifying that to maximize engagement in that activity. Imagine the brainstorming sessions at Tiktok where they navigate around the glaring issue of the fried brain circuitry of their own users.

      • raziel2p 5 hours ago

        mind-blowing. like Tamagotchis for the modern age.

    • elric 9 hours ago

      Where "engagement" is short for driving eyeballs to ads, optimizing for this ad nauseam, until the platform is raking in the dough and they stop caring about their users in the slightest.

  • stack_framer 16 hours ago

    I did my own informal research study—I quit social media cold turkey. My findings: I feel much better. I don't need any other data.

    • delis-thumbs-7e an hour ago

      Same here. Even very depressing news doesn’t cause an endless cycle of anxiety and scrolling when you don’t have constant new reinforcement in the form of comments; I just get bored, switch off and take some distance, as we should. I had already stopped using FB after 2016, when I noticed that instead of being better informed, like I felt I was, I was actually dumber and knew less about what was going on. After that I gradually quit forums, IG, reddit. The only one left is YouTube, which I watch occasionally for some comedy. I feel better, more calm, and I feel I am in the driver’s seat.

      I think it is insane that we give these companies this much power and influence over our lives and societies while they contribute almost nothing.

    • coffeefirst 12 hours ago

      I have informally reproduced your study and reached the same conclusion.

  • zyxzevn an hour ago

    The problem with social media (and all media) is opinion-based censorship, causing group-think, plus the chaos of uncategorized replies.

    Different opinions do matter. But due to the algorithms, the most emotional responses are promoted. There is no way to promote facts or what people think are facts.

    So most discussion will be extremely emotional and not based on facts and their value. This is even true in scientific discussions.

    Combined with group-think, these emotions can grow and lead to catastrophic outcomes.

    • johnecheck 3 minutes ago

      > There is no way to promote facts or what people think are facts.

      There is no way with existing platforms and algorithms. We need systems that actually promote the truth. Imagine if claims (posts) you see come with a score* that correlates with whether the claim is true or false. Such a platform could help the world, assuming the scores are good.

      How to calculate these scores is naturally the crux of the problem. There's infinite ways to do it; I call these algorithms truth heuristics. These heuristics would consider various inputs like user-created scores and credentials to give you a better estimate of truth than going with your gut.

      Users clearly need algorithmic selection and personalized scores. A one-size-fits-all solution sounds like a Ministry of Truth to me.

      * I suggest ℝ on [-1,1].

      -1 : Certainly false

      -0.5 : Probably false

      0 : Uncertain

      0.5 : Probably true

      1 : Certainly true
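
      As a rough sketch of what one such truth heuristic might look like (everything here is hypothetical and illustrative: the function name, the weights, and the idea of mapping raters to credibility values are made up, not any platform's real API), it could combine per-user ratings weighted by rater credibility:

```python
# Hypothetical "truth heuristic": aggregate per-user ratings in [-1, 1],
# weighting each rater by a non-negative credibility value, into one
# overall score in [-1, 1] (0 = uncertain).
def truth_score(ratings, credibility):
    # ratings: {user: score in [-1, 1]}
    # credibility: {user: weight >= 0}; unknown raters get weight 0
    total_weight = sum(credibility.get(u, 0.0) for u in ratings)
    if total_weight == 0:
        return 0.0  # no credible raters: report "uncertain"
    score = sum(credibility.get(u, 0.0) * r for u, r in ratings.items())
    score /= total_weight
    # clamp to the documented range
    return max(-1.0, min(1.0, score))

# Two trusted raters say "probably true" (0.5); one low-credibility
# rater says "certainly false" (-1.0).
print(truth_score({"alice": 0.5, "bob": 0.5, "carol": -1.0},
                  {"alice": 2.0, "bob": 2.0, "carol": 0.5}))  # ~0.33
```

      The hard part remains where the credibility weights come from; a personalized version would let each user decide whose credibility to trust, rather than imposing one global weighting.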

    • btreecat an hour ago

      > The problem with social media (and all media) is opinion-based censorship, causing group-think. And the chaos of replies that are uncategorized.

      All people are biased. It's also impossible to avoid the bias needed to filter out the firehose of data.

      What you're describing is often a form of moderation.

      > Different opinions do matter. But due to the algorithms, the most emotional responses are promoted. There is no way to promote facts or what people think are facts.

      This is tuneable. We have tuned the algos for engagement, and folks engage more with stuff they emotionally react to.

      People could learn to be less emotionally unstable.

      > So most discussion will be extremely emotional and not based on facts and their value. This is even true in scientific discussions.

      I think you're overfitting. Moderation drives a lot of how folks behave in a community.

      > Combined with group-think, these emotions can grow and lead to catastrophic outcomes.

      Group think is also how we determined that mammals are mammals and that the earth isn't the center of the universe. Sometimes a consensus is required.

      • hsartoris an hour ago

        > People could learn to be less emotionally unstable.

        How does it make sense to make billions of people responsible for abating the consequences of choices made by a few social media companies?

  • blitz_skull 18 hours ago

    The last week has taken me from “I believe in the freedom of online anonymity” to “Online anonymity possesses a weight that a moral, civil society cannot bear.”

    I do not believe humans are capable of responsibly wielding the power to anonymously connect with millions of people without the real weight of social consequence.

    • jacobedawson 17 hours ago

      The strongest counterpoint to that is the intense chilling effect that zero anonymity would have on political dissent and discourse that doesn't match the status quo or party line. I feel that would be much more dangerous for our society than occasionally suffering the consequence of some radicalized edge cases.

      • slg 17 hours ago

        In that instance, the anonymity is treating the symptom and not the root cause of the problem you fear. The actual problem is a society that does not tolerate dissent.

        • NoahZuniga 17 hours ago

          You might live in an extremely free country and have no fear of political persecution but still fear social persecution.

          If someone I was friends with made racist remarks, they wouldn't be prosecuted for that. But I would stop being their friend. Similarly, if I were the only one in my friend group against racism and advocated fiercely against it, they would probably stop being my friends.

          • slg 17 hours ago

            >If someone I was friends with made racist remarks, they wouldn't be prosecuted for that. But I would stop being their friend.

            So you want your friend to be able to anonymously express their racism while being able to hide it from you? I can't imagine advocating for that as a desired goal rather than a negative side effect.

            >Similarly if I was the only one in my friend group against racism and advocate firefly against it, they would probably stop being my friends.

            If we are talking about a society level problem, I think it is a little silly to think a society as toxic as this hypothetical one could be saved by anonymous internet posting.

            For the record, I'm not as against anonymous posting as the person who started this specific comment thread, I just think this line of argument is advocating for a band-aid over bigger issues.

            • NoahZuniga 16 hours ago

              These were just extreme examples to indicate that there can be social repercussions to dissenting.

              Maybe a more convincing example is that if I advocate for making it easier to build housing because that will lower the cost of housing and many of my friends are homeowners, they might really not like me because lowering the cost of housing directly lowers their net worth.

              Are these people evil for not wanting to lose their retirement savings (wrapped up in their home)?

              Edit: also

              > So you want your friend to be able to anonymously express their racism while being able to hide it from you?

              While on the specific example of racism I'm pretty convinced of my moral correctness, I am not bold enough to declare that every bit of my worldview is the universally correct one. I am also not so bold to say that I will always be instantly convinced of my incorrectness by a friend challenging my worldview (if they actually do have a better stance on some thing). My conclusion is that my friend should have some place to platform his better opinion without (having to fear) alienating me. And the only way to achieve this as far as I know is anonymous platforms.

            • giardini 11 hours ago

              slg says "So you want your friend to be able to anonymously express their racism while being able to hide it from you? I can't imagine advocating for that as a desired goal rather than a negative side effect."

              Deceit is a characteristic of our humanity. We all deceive others and ourselves. If people are to be allowed to be fully expressive as humans they need to be able to deceive. And so they require anonymity.

              See Robert Trivers' works

              https://www.amazon.com/stores/author/B001ITVRUO/about

              • fercircularbuf 6 hours ago

                I don't see the logic in this argument. What's the difference from your argument if I state that murder is a characteristic of our humanity? If people are to be allowed to be fully expressive as humans they need to be able to murder.

                • Chris2048 3 hours ago

                  > What's the difference from your argument if I state that murder is a characteristic of our humanity?

                  It's unclear that it's true. I think the implication is deceit is a human characteristic because all humans do it, perhaps even subconsciously; Is the same true of murder?

            • foxglacier 16 hours ago

              I live in a society as toxic as that. It's New Zealand. One of the minor parties currently in government aims to undo systemic racism. However, the popular opinion is that they are the racists because of that. I don't dare tell people that I voted for them because I'll be judged as a racist by some of my family members and loose friends. If I say it on the local internet groups, others will be hostile to me for it. Anonymity helps people to speak up about these issues.

              How do we solve those bigger issues when we live in an emperor's new clothes society? Wait for children who haven't learnt the rules to point them out?

              • synecdoche 7 hours ago

                I understand this view is unpopular, but nevertheless: for something to be systemic there needs to be some set of rules governing it. I have yet to see any evidence of discriminatory rules as part of any western company or government policy, except for affirmative action and equivalent policies, which do have such rule sets, where some group is prioritised to the detriment of others.

                • hoss1474489 4 hours ago

                  Explicit and obvious encoding in rules isn’t what makes something systemic.

              • Chris2048 3 hours ago

                > aims to undo systemic racism ... they are the racists because of that

                this sounds like a suspicious characterisation - how are they trying to undo systemic racism, and what do they identify as "systemic racism"?

          • imtringued 5 hours ago

            Your case doesn't sound reasonable and it also doesn't fit the current zeitgeist.

            What people these days are worried about isn't that they are racist and have no outlets for their racism. It's that they worry that whatever they say will be reinterpreted as racism when they were making an honest attempt to not be racist.

            • NoahZuniga 4 hours ago

              > What people these days are worried about isn't that they are racist and have no outlets for their racism. It's that they worry that whatever they say will be reinterpreted as racism when they were making an honest attempt to not be racist.

              So you agree with my point that people could face social persecution for dissenting (even when they are correct), so we should have anonymous platforms where they can champion their ideas.

              > Your case doesn't sound reasonable and it also doesn't fit the current zeitgeist.

              These were just extreme examples to indicate that there can be social repercussions to dissenting.

              Maybe a more convincing example is that if I advocate for making it easier to build housing because that will lower the cost of housing and many of my friends are homeowners, they might really not like me because lowering the cost of housing directly lowers their net worth.

        • tempodox 7 hours ago

          Some ailments (society that does not tolerate dissent) cannot be cured, but that doesn’t invalidate protection against their effects.

        • Spivak 17 hours ago

          I think we should operate on the premise that no society in the history of humanity has tolerated dissent and none ever will. So treating the symptom is all we can do. It's the basis of why privacy is necessary in any respect.

          The rational, tolerant society you imagine is so far-fetched that we don't even pretend it can exist in fantasies.

      • avazhi 11 hours ago

        Maybe the chilling effect is the point, and maybe it's been demonised unfairly.

        To be clear, I think freedom of speech is a bedrock foundation of intellectual society and should be the starting point for modern societies.

        But perhaps we really should outlaw anonymity when it comes to expression. Allow people to express themselves, but it shouldn't emanate from the void.

      • Barrin92 16 hours ago

        >the intense chilling effect that zero anonymity would have on political dissent

        Chilling the discourse would be a feature, not a bug. In fact what discourse in most places these days needs is a reduction in temperature.

        This kind of defence of anonymity is grounded in the anthropologically questionable assumption that when you are anonymous you are "who you really are" and when you face consequences for what you say you don't. But the reality is, we're socialized beings and anonymity tends to turn people into mini-sociopaths. I have many times, in particular when I was younger said things online behind anonymity that were stupid, incorrect, more callous, more immoral than I would have ever face-to-face.

        And that's not because that's what I really believed in any meaningful sense, it's because you often destroy any natural inhibition to behave like a well-adjusted human through anonymity and a screen. In fact even just the screen is enough when you look at what people post with their name attached, only to be fired the next day.

      • phendrenad2 16 hours ago

        Well, perhaps people should think twice before stirring the pot. Maybe the incentive to get your 20 seconds of fame by making some snappy comment on a public figure's post is part of what's driving incivility online.

        • nathan_compton 16 hours ago

          I actually don't think incivility per se is the problem. The problem is that social media encourages us to be inauthentic because we all subconsciously cater to the gaze, both courting its attention and terrified of it at the same time. This is way worse than people being rude.

    • creata 15 hours ago

      If you're talking about reactions to the murder of Charlie Kirk, I really don't think anonymity is the problem here, because the opinions I've seen people express anonymously aren't much different to the opinions I've seen people express with their names attached.

      If anything, the ones where people have attached their names tend to be a bit more extreme. Maybe attaching your name to something makes it feel more important to signal what group you're in.

      • beeflet 15 hours ago

        Named users are not more brave than anonymous users, but they are more reckless

        • pitched 14 hours ago

          Anonymous users are much less likely to get defensive because they have nothing to defend.

          • userbinator 11 hours ago

            4chan seems to be a counterexample to that.

            • Thorrez 3 hours ago

              Are you saying 4chan users are defensive? They seem pretty unapologetically offensive to me.

    • Longlius 17 hours ago

      Anonymity has no real impact on this. People post heinous things under their full legal names just as readily.

      I'd argue if all it took was people saying some mean things anonymously to change your opinion, then your convictions weren't very strong to begin with.

      • ks2048 17 hours ago

        > People post heinous things under their full legal names just as readily.

        I disagree with "just as readily" (i.e. most of the most heinous things are indeed bots or trolls).

        Also, I imagine that without the huge amount of bots and anonymous trolls, the real-name-accounts would not post as they do now - both because their opinions are shaped by the bots AND because the bots give them the sense that many more people agree with them.

      • numpad0 17 hours ago

        IMO it's a bit of mental gymnastics to think that anonymity has to do with this, when extremist narratives always come attached with a memorable full name and a face.

      • add-sub-mul-div 17 hours ago

        You're right. It's the weakest who are the most susceptible to demagoguery.

    • rkomorn 18 hours ago

      They're unfortunately not much more capable of responsibly connecting with people non-anonymously, I'd say.

      See examples like finding someone's employer on LinkedIn to "out" the employee's objectionable behavior, doxxing, or to the extreme, SWATing, etc.

      • qarl 17 hours ago

        Yeah. People use their real identities on Facebook, and it doesn't help a bit.

        • ks2048 17 hours ago

          > it doesn't help a bit.

          I would replace "it doesn't help a bit" with "it doesn't solve the problem". My casual browsing experience is that X is much more intense / extreme than Facebook.

          Of course, the bigger problem is the algorithm - if the extreme is always pushed to the top, then it doesn't matter if it's 1% or 0.001% - with a big enough pool, you only see extremes.

          • __MatrixMan__ 13 hours ago

            I bet if we didn't tolerate advertising and were instead optimising for what the user wanted we'd come up with something much more palatable.

            • rkomorn 9 hours ago

              A lot of this is driven by the user's behavior, not just advertising, though.

              "The algorithm" is going to give you more of what you engage with, and when it comes to sponsored content, it's going to give you the sponsored content you're most likely to engage with too.

              I'd argue that, while advertising has probably increased the number of people posting stuff online explicitly designed to try and generate revenue for themselves, that type of content's been around since much earlier.

              Heck, look at Reddit or 4chan: they're not sharing revenue with users and I'd say they're at least not without their own content problems.

              I'm not sure there's a convincing gap between what users "want" and what they actually engage with organically.

              • __MatrixMan__ 4 hours ago

                Reddit and 4chan both get their money from advertisers though, so they have an incentive to try to boost engagement above whatever level might be natural for their userbase.

                Social interaction is integrated with our brain chemistry at a very fundamental level. It's a situation we've been adapting to for a million years. We have evolved systems for telling us when it's time to disengage, and anybody who gets their revenue from advertising has an incentive to interfere with those systems.

                The downsides of social media: the radicalization, the disinformation, the echo chambers... These problems are ancient and humans are equipped to deal with them to a certain degree. What's insidious about ad-based social media is that the profit motive has driven the platforms to find ways to anesthetize the parts of us that would interfere with their business model, and it just so happens that those are the same parts that we've been relying on to address these evils back when "social media" was shouting into an intersection from a soap box.

                • rkomorn 2 hours ago

                  But neither Reddit nor 4chan really have the feed optimization that you'd find on Meta properties, YouTube, or TikTok.

                  I'm certainly not going to disagree with the notion that ad-based revenue adds a negative tilt to all this, but I think any platforms that tries to give users what they want will end up in a similar place regardless of the revenue model.

                  The "best" compromise is to give people what they ask for (eg: you manually select interests and nothing suggests you other content), but to me, that's only the same system on a slower path: better but still broken.

                  But anyway, I think we broadly are in agreement.

    • ACCount37 7 hours ago

      And this is how we get things like TSA and Patriot Act.

      "I was totally in favor of freedom, until one bad thing happened, and now I think freedom should never have existed in the first place!"

      • neuronic 6 hours ago

        I tend to agree, especially because the most harming political influencers are NOT (pseudo-)anonymous. But... it's not just a "bad thing" that is happening. It is the foundational destruction of free societies as we know them. Debates and democratic discourse are replaced with hate, oppression and violence.

        I believe that social dynamics like shame and consequences are disabled by pseudo-anonymization. Pretty much the same effect as people becoming more aggressive and vocal in the confines of their cars. You'd never flip off random people in a supermarket as some would do for getting cut off in traffic.

        This substack post offers a few interesting theories and ideas about how that comes to be. However, the most concerning to me is the asymmetric impact of emotional manipulation due to social media enabled network dynamics.

        In particular:

        > Online discussions are dominated by a surprisingly small, extremely vocal, and non-representative minority. Research on social media has found that, while only 3% of active accounts are toxic, they produce 33% of all content. Furthermore, 74% of all online conflicts are started in just 1% of communities, and 0.1% of users shared 80% of fake news. Not only does this extreme minority stir discontent, spread misinformation, and spark outrage online, they also bias the meta-perceptions of most users who passively “lurk” online.

        The brain responds to alarmist, negative and distressing information with much higher priority. At the same time, very few radical and extreme influencers can utilize this mechanism, amplified by social media trying to boost ad revenue. Counterfactual information which directly appeals to the biases and psychology of users is posted and wrapped into click-baity designs to maximize attention and revenue. Tribes are forming and very few elite users can steer the information consumption of users - not just what but also how.

        This is highly damaging to society and there is no more institutional trust anywhere to retrieve reliable information on which discussions can be based. Everyone selectively chooses their "reliable sources". This is the absolute opposite of how PKI works, it's like everyone just picks the Root Certs they like (for us techies).

        This is of course ironic because all studies and knowledge humanity has to offer are a single search prompt away. But it simply doesn't matter if institutional trust is gone and studies are dismissed because they are coming from "woke" or "radical right wing" sources - completely obliterating what we are trying to achieve with peer review and so on.

    • scandox 6 hours ago

      I'm making 2 assumptions here: that you're American and that you're referring to the recent assassination. What I find odd is that American history is packed with assassinations and domestic terrorism and yet it is this recent event that has affected your thinking. In your own parlance, what gives?

    • scottgg 10 hours ago

      Moreover it’s not even possible for us to engage in _honest debate_ about the impact of social media anymore.

      Absolutist positions without nuance are the norm, and the folks who control these platforms control to a very large extent the narrative to push surrounding them, both directly through the platforms themselves and indirectly through lobbying and the obscene pool of capital they have siphoned off.

    • sporkxrocket 16 hours ago

      Are you talking about the Charlie Kirk thing? What does that have to do with online anonymity? They caught the shooter.

      • moduspol 14 hours ago

        Also there is no shortage of people saying abhorrent things with their real names attached.

    • padjo 7 hours ago

      No offence but if you're swinging between two poles like this in such a short time, you probably haven't considered the topic deeply enough or for long enough.

      Anonymity can be very powerful for marginalised groups and it can be abused by trolls. Its value is contextual and not some simple good/bad dichotomy.

      Successfully integrating technology into society is, like most political topics, complicated, requires a nuanced understanding of issues and a willingness to find compromise and less than perfect solutions. Sadly the political system (and the side in power now particularly) is increasingly offering moral absolutes and simplifications.

    • cramsession 17 hours ago

      Why is that? Some irony as well that you're posting anonymously. Are you comfortable giving us your identification right now?

    • tossaway0 15 hours ago

      I don’t think it has much to do with being named. It’s the assumption that most people have that what they’re reading is being said by someone whose opinion they would actually value if they knew them.

      Disclosing names wouldn’t help. People actually knowing the person would help.

    • balamatom 8 hours ago

      Nice bloody try, guv, I mean "blitz_skull".

      Tell me now will ya, who will effect "the real weight of social consequence" over anonymous 1-to-1M connections, other than other humans, the same kind that by your premise are not "capable of responsibly wielding [...] power" over such things?

      (Or are there multiple kinds? Eh?)

      Would "the real weight of social consequence" work the way you want it to when embodied by a commission? When codified by law? In the form a bot? As crowd? A corp? Me? Nah, you of course.

      It's ever telling how the legitimacy of millions of strangers being able to decide the fate of any one individual is hardly ever called into question - only ever the ability of one to talk back.

    • boplicity 16 hours ago

      Plenty of people are perfectly willing to be publicly despicable online in their social media accounts, using their real names. Pretty easy to find them.

      The problem is the leaders of the large social media organizations do not care about the consequences of their platforms enough to change how they operate. They're fine with hosting extremist and offensive content, and allowing extremists to build large followings using their platforms. Heck, they even encourage it!

    • XorNot 17 hours ago

      What a bizarre conclusion given the multiple high profile individuals and politicians who overtly and directly called for violent oppression and civil war against their political enemies in the last week.

    • __MatrixMan__ 13 hours ago

      Really? With people being tracked down and fired for expressing their political views, it seems like online anonymity is more important than ever.

      Or better yet, we need some kind of zero-knowledge doodad which enforces scarcity of anonymous handles such that a given voice is provably a member of your same congressional district, or state, or zip code, or whatever, and is known to not be spinning up new identities all willy-nilly like, but can't be identified more precisely than that.
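      The scarce-pseudonym idea above can be approximated in a toy sketch. To be clear, this is not real zero-knowledge: it assumes a trusted issuer who sees identities (a genuine scheme would use blind signatures or ZK credentials), and every name and key in it is hypothetical. It only illustrates the two properties the comment asks for: one handle per person per epoch, and a district attestation that doesn't reveal who the person is.

      ```python
      # Toy sketch, NOT real zero-knowledge: a trusted issuer verifies
      # residents out-of-band, derives one deterministic pseudonym per
      # person per epoch, and attests (handle, district) with a MAC.
      import hmac
      import hashlib

      class Issuer:
          def __init__(self):
              self.pseudonym_key = b"issuer-secret-1"  # derives handles
              self.mac_key = b"issuer-secret-2"        # attests (handle, district)

          def issue(self, real_identity: str, district: str, epoch: str):
              # Deterministic: the same person in the same epoch always gets
              # the same handle, so no spinning up extra identities.
              msg = f"{real_identity}|{epoch}".encode()
              handle = hmac.new(self.pseudonym_key, msg, hashlib.sha256).hexdigest()[:16]
              tag = hmac.new(self.mac_key, f"{handle}|{district}|{epoch}".encode(),
                             hashlib.sha256).hexdigest()
              return handle, district, tag

          def verify(self, handle: str, district: str, epoch: str, tag: str) -> bool:
              # A platform checks the attestation without ever learning
              # which resident the handle belongs to.
              expected = hmac.new(self.mac_key, f"{handle}|{district}|{epoch}".encode(),
                                  hashlib.sha256).hexdigest()
              return hmac.compare_digest(expected, tag)

      issuer = Issuer()
      h1, d1, t1 = issuer.issue("alice@example.com", "district-7", "2025-Q3")
      h2, d2, t2 = issuer.issue("alice@example.com", "district-7", "2025-Q3")
      assert h1 == h2                                        # one handle per epoch
      assert issuer.verify(h1, d1, "2025-Q3", t1)            # attestation holds
      assert not issuer.verify(h1, "district-8", "2025-Q3", t1)  # district can't be faked
      ```

      The deterministic HMAC is what enforces scarcity; rotating the epoch string is what lets handles reset over time without linking old ones to new ones.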

    • tryauuum 18 hours ago

      what happened?

    • mythrwy 14 hours ago

      Would you require identification to copy and tape up a bunch of fliers around town?

      Anonymity is necessary sometimes in my opinion.

    • analognoise 17 hours ago

      We don’t have a moral or civil society anyway; we can’t even prosecute Trump's numerous illegal actions (even when convicted!). Can’t get the Epstein files. Can’t even point out Charlie Kirk was not a great person (while politicians said nothing about the school shooting the same day), and where it’s legal to kill 40,000 of us a year due to poor medical coverage so we can prop up the stock.

      I’m not sure, given the moral dystopia we currently inhabit, what positive benefit would accrue from removing online anonymity?

  • _wire_ 21 hours ago

    These question-begging, click-bait something-is-something-other-than-you-think posts are something less entertaining than the poster thinks.

    • abnercoimbre 19 hours ago

      Yup. Soon as I read:

      > I am going to focus on the putative political impacts of social media

      I closed the tab.

      • IshKebab 18 hours ago

        Yeah I closed it when I saw the size of the scroll bar. If you need 100k words to make your point write a book.

        • stevage 17 hours ago

          Huh, I often have the reverse sentiment with a lot of books: this should have been a blog post. There's often a good intro which lays out the thesis, but each chapter is way too long, spelling out details that are obvious or superfluous.

    • dwedge 7 hours ago

      As soon as I saw that his article has an introduction and the first line of it mentions that he wrote a long essay (as though long == good) I closed it

    • greyadept 17 hours ago

      The author could have made the same points without using words like “polemicizing”, “putative”, and “epistemic”.

  • mallowdram 16 hours ago

    The missing link to our epistemic collapse is language. The acceleration of language, which is arbitrary, accelerates language distortion. The contagion on social media is merely a symptom of the disease of language.

    “Historical language records reveal a surge of cognitive distortions in recent decades” https://www.pnas.org/doi/10.1073/pnas.2102061118

    • throawaywpg 14 hours ago

      acceleration? as in the literal speed at which we translate information through language?

      • mallowdram 12 hours ago

        Not translate it, simply transmitting it. This stuff is just arbitrary. we can say anything we want, it means nothing. Look at high speed conflicts now, each side accuses the other of being the same villain. It means we're saying nothing.

            The initial conditions are arbitrary, very indirect perception. How we ever assumed we were communicating is quite strange. Everything is primate, every word is first a negotiation for status. Then control. Perhaps manipulation. That words refer directly to anything outside of a momentary context is impossible. Plus every word isn't simply arbitrary, it's metaphors, and they separate things by attributes that are based in folk science/psychology. We basically have to unlearn and replace words.

  • picafrost 8 hours ago

    Designing tools is designing behavior. It shouldn't be surprising that the behavior of a society changes when the primary form of discourse shifts from communication among peers to maximizing the engagement of strangers due to the financial needs of the platform.

  • lattalayta 4 hours ago

    I’m finding that social media is less “social”. Fewer of my friends and family are posting and more and more businesses, ads, and “creators” are filling the gap

  • amatecha 15 hours ago

    "everyone publicly talking in the same room" social media really sucks. I've really enjoyed the smaller-scale, better-curated interaction on mastodon. It feels like a giant step forward in how people can connect and socialize online.

    • beeflet 14 hours ago

      A giant step forward into the echo chamber

      • amatecha 13 hours ago

        I interpret this as "I expect my opinions to be heard by people who don't want to hear them". Show me the ill effects of having an opt-in, consent-based social space where it's not infiltrated by unwelcome participants?

        • zeta0134 3 hours ago

          Well, imagine for a moment that the unwelcome participants are the ones against murder. Everyone currently in the group thinks murdering people who disagree with you is a fine way to solve problems. Outsiders might share insights and opinions to discourage this way of thinking, so they're not allowed to join.

        • beeflet 12 hours ago

          Firstly, there is still pretty centralized moderation done through the bureaucracy of mastodon operators and federation. Similar situation in bluesky. The whole advantage of these "networks" is that they allow you to opt-in to blocklists. A federation essentially becomes a massive aggregation of blocklists, because members that do not obey the blocking policies of others will become defederated. These aren't really opt-in consent-based social spaces here because the vast majority of the "consent" is still delegated by a third party.

          These networks appeal to control freaks who subscribe to many massive blocklists so they don't have to confront challenging ideas. I oppose them on the grounds that being a control freak is bad for the individual and society in general.

          Unfortunately, we are all unwelcome participants in society at large. This is the idea behind protests for example. Real life is not consent-based, so the more time you spend in these networks the more poorly acclimated to reality you will become and the more removed you will be from the public arena of ideas.

          The ill-effects are that whatever (political?) faction that embraces these sort of networks will become mentally weak and will continue to lose debates, and eventually (political?) power.

          • amatecha 7 hours ago

            I don't know about whoever you're referring to but I actually just want to chat with cool people (and hear about what they're up to) and don't want to see any bigoted, ignorant bullshit. It has worked out so well that I have zero interest in any other "social networking" protocol/software whatsoever. In my circles I see exactly zero control freaks, probably because I only associate/interact with people who have respect for the agency of others. I just block/mute people who violate that agency (tho I've only had to do so once or twice) and the server I'm on generally correlates with that vibe.

            The "public arena of ideas" has almost nothing to offer me. If I dare to peruse something like Reddit or Twitter I am immediately aware of the overwhelming averageness of the ideas and degree of insight generally at hand. Such places are poor venues for depthful, nuanced discussion, especially about any difficult topics, especially with the outrage-bolstering "algorithms" in full force, forcing divisive content in everyone's face.

      • tdb7893 10 hours ago

        Echo chambers aren't good but the large scale social media I've tried has a tendency to put me in an echo chamber (specifically one trying to wring out all the engagement possible, often with stuff to make me angry) and also elevate low quality opinions (often factually incorrect or philosophically incoherent).

        Smaller and more personally curated social media has been better for sourcing broad opinions actually if I put just a little work into it.

  • homeonthemtn 17 hours ago

    Social media is a cancer on our society. It is both the asbestos and cigarettes of our generation.

    • SapporoChris 14 hours ago

      Asbestos was used in cigarette filters, notably in Kent Micronite cigarettes from 1952 to 1956.

      https://en.wikipedia.org/wiki/Kent_(cigarette)

    • infotainment 16 hours ago

      Agreed, and I feel like the right answer might be to treat it exactly like cigarettes. For example:

      1. Ban in most places except very specific ones. E.g., "would you like to sit in the social media use section today?"

      2. Make it extremely expensive to access and use. This would likely do wonders to cut down on use, just as it did for cigarettes.

    • lanfeust6 14 hours ago

      You're on social media right now. Probably you could better qualify what it is you think is a problem.

      • delis-thumbs-7e 37 minutes ago

        Are we? In a sense yes, in that the content is user-generated and strangers are discussing the content, but is any newsfeed with a comment section or a hobbyist forum social media? Can I monetise my contributions here, advertise products, sell my karma points, or create networks of like-minded individuals here? Is there any point for me to try to turn my personality into a brand or try to ragebait you into interacting with me by saying that your aunt Mary likes Duran Duran, or that your goldfish looks silly, in order to trick the algorithm into showing more of my content? Can I channel people from here to a website showing lewd pictures of my parrot in order to get them to pay me?

        If all social media was like HN, I think we would be fine y’know.

  • 1vuio0pswjnm7 13 hours ago

    Person A and person B (or group B) want to communicate by using the internet

    Idea: Use person C's website

    This was never a good idea for A and B but turned out to be a great idea for C

    C derives the benefit, C became a billionaire, but it is taking a very long time for A and B to realise they are not getting a good deal

    Sadly in 2025 A and B believe there is no other way to communicate via the internet other than through C

    C could disappear and the internet would live on, and A and B would indeed be able to communicate

    A and B pay internet subscription fees, but generally do not pay subscription fees to C

    The internet is worth something, people are willing to pay for it; C's value is questionable, few would be willing to pay for it

    If not for the internet, C would not be a billionaire

    If not for the internet, A and B could not communicate via C

    The case for the internet is stronger than the case for C

    • 1vuio0pswjnm7 an hour ago

      s/A and C/A and B/

    • meonkeys 4 hours ago

      Yes, this. It's a symptom of late-stage capitalism.

  • vinceguidry 37 minutes ago

    The profit motive is largely what drives these problems.

    My big dream is a social media platform for humans. Self-hostable Zoom / Discord alternative that just works. AGPL-licensed and eventually turned over to the GNU project for long-term maintenance once it's feature-complete. Mastodon is nice and all, but micro-blogging isn't really for ordinary humans.

  • softwaredoug 13 hours ago

    The article mentions political polarization increased most in seniors (65+).

    Social media or not, I would guess it’s largely because many retirees don’t have anything to do. They’re isolated. They want connection and purpose. While younger adults have jobs and obligations.

    My retired dad lived alone. He could talk nonstop about that crazy thing Trump did, but I wasn’t following closely, and somewhat tuned my dad out to not get lost in a rabbit hole. My dad got this from cable news.

    Isolation to me is the root cause at any age. People who only see the world through media (social or otherwise). It’s easy to become radicalized when you don’t have any attachments other than your political affiliations.

  • xnx 18 hours ago

    Social media would be entirely different if there were no monetization on political content. There's a whole lot of ragebaiting/engagement-farming for views. I don't know how to filter for political content, but it's worth a shot. People are free to say whatever they want, but they don't need to get paid for it.

    • stevage 17 hours ago

      Strangely I never see political content on YouTube. Maybe the algorithm worked out quickly I'm simply not interested. Whereas twitter/mastodon/bluesky are awash in it, to the point of making those platforms pretty unusable for me.

      I guess the difference is that YouTube content creators don't casually drop politics in because it will alienate half their audience and lose revenue. Whereas on those other platforms the people I follow aren't doing it professionally and just share whatever they feel like sharing.

      • 0xDEAFBEAD 9 hours ago

        Youtube is the one platform that actually tunes "the algorithm" in a responsible way.

      • timeon 16 hours ago

        Interesting, I do not see politics on Mastodon, while YouTube recommends me not just random politics, but conspiracy theories about politics.

        On Mastodon, those I follow do not post about politics and if they do it is hidden behind content warning.

        YouTube is probably location based as I have no account there and that type of content is relatively mainstream where I live.

    • ants_everywhere 16 hours ago

      They get paid in political power; that's why it's so ragebait-driven.

  • Argonaut998 8 hours ago

    I started using X a few weeks ago and I’m already seeing it impact my mind negatively. It is pure controlled and distilled propaganda that’s clearly made to intentionally shape how we think, across the different skinner boxes that is each different social media platform. I’ll be deleting my account.

    Reddit is by far the worst though since everything is clearly botted yet people pretend it’s organic leading to a kind of false sense of security that what you see is curated and willed by the “people”.

    It’s far more than “engagement” and the “algorithm” - it’s beyond that. It’s all blatantly manufactured as some Aquino-esque psyop.

  • dfee 10 hours ago

    I often wonder when I see articles like this if HN counts as social media.

    And then, the continuous re-discovery or the ails of social media on social media is a trip, in itself.

    • AdamN 2 hours ago

      I don't think it has the hallmarks of social media (aggressive engagement mechanisms, etc...). We've had newsgroups for 30+ years and social media requires more than just message boards. The owner needs to push the content such that there is enough engagement to make money from advertising (or monthly fees).

    • mid-kid 9 hours ago

      Technically, yes, but at least it's not filter bubbling everyone.

      • raziel2p 5 hours ago

        there may not be an algorithm at work here as there is on Instagram or TikTok, but there's still a bubble - the name, design and discourse of HN itself works as a filter.

        • mid-kid 3 hours ago

          What you're describing is an echo chamber, when people willingly share only things that they know resonate with the group.

          While still somewhat detrimental, it's at least a visible problem - everyone is aware of it because everyone sees the same thing.

          With filter bubbles, you create personalized echo chambers that nobody is aware of unless they're inside one.

      • topspin 6 hours ago

        There is a "karma" score. You know yours and you've checked the scores of others.

        This is social media. It's a fairly benign manifestation of it, I suppose, but it's social media nonetheless.

        Something to consider while carefully crafting denigrations of what we all think is meant when discussing "social media." Especially if you'd rather not see HN and similar places damaged by righteous politicians.

  • cramsession 17 hours ago

    Without social media, we'd be left with mainstream media, which is a very narrow set of channels that those in power can control. Despite rampant censorship on social media, it's still the best way to circumvent propaganda and give people a voice.

    • sethammons 17 hours ago

      > it's still the best way to circumvent propaganda and give people a voice.

      I think it can amplify propaganda but still give people a voice, which is better than no voice I think

    • n1b0m 16 hours ago

      It's still propaganda, just from Russian and Chinese bots.

      • cramsession 16 hours ago

        The vast majority of bots are from Israel.

    • nicce 16 hours ago

      Without social media, people would go out and talk face-to-face or even arrange meetings, like before social media.

      • cramsession 16 hours ago

        That's not media, it's communication with people you know.

    • add-sub-mul-div 16 hours ago

      The idea of social media reducing net propaganda is a wild take.

      • synecdoche 9 hours ago

        Without it, there would be no way to get information from the source. In msm all we get is the msm view. When compared to what was actually said, done or written then you have a chance to make your own opinion. You only then can compare what is in msm and what is not. And the bias is relentless. Which makes it a propaganda machine.

        Of course there is garbage in social media as there is in every field. Find the source if there is one recorded. Msm rarely if ever refer to any. And no wonder. It would risk undermining their publication, which they peddle as unbiased.

      • cramsession 16 hours ago

        We would have no idea what was going on in Gaza if it wasn't for social media. It really exposed how biased (which probably isn't even a strong enough word) our msm is.

  • freshtake 15 hours ago

    The problem is that our thoughts, opinions, and ultimately actions are the product of our exposure. Social media gives a small number of companies (and their algorithms) unparalleled and unchecked control over our exposure.

    We should be educating children at a young age about the benefits and risks of social media. We haven't adapted the way we educate society in light of massive tech changes.

    This will likely be a topic that future humans look back on and wonder why we did this to ourselves.

    • lumost 14 hours ago

      Is the media even “social” anymore? How much of Reddit is simply bots generating catchy takes and then generating commentary on those takes? You can easily be deceived into thinking that a vast number of people believe something, or think the way you do, or think the way you do but were swayed by some thought process.

      Repeat the process long enough and with enough variation and tuning and anyone can be made to believe anything.

    • adrr 14 hours ago

      People have stopped reading the news and rely on social media as their main news source. That's the scariest thing.

      • topspin 6 hours ago

        > That's the scariest thing.

        Written "news" is frequently a report about a series of X posts by various authorities, thought leaders and celebrities, embedded directly in the story.

        Square that circle.

      • moduspol 14 hours ago

        At the same time, "reading the news" has become less and less valuable. And social media has an overwhelming impact on the tone and content of "the news," too.

  • amelius 7 hours ago

    I would be in favor of a social media ban around elections.

    In any case, there's nothing wrong with trying it out and seeing what other benefits it brings.

  • drraah 15 hours ago

    I've seen numerous posts from researchers on X demonstrating that people high in psychopathy, low in empathy, and low in cognitive ability are overrepresented on social media. They post more often and fuel polarisation in politics. The extremism is entertaining to others and rewarded with exposure. Political moderates don't tend to get as emotionally invested and are less likely to voice their opinions in the first place. But underlying the extremism and polarisation are real issues. There's often an overlooked middle ground that technology can step in to highlight.

  • 1970-01-01 16 hours ago

    You reap what you sew. Stupid and uninformed voices receiving equivalent status to wise scientific experts was a mistake. Witnessing the flat Earth crowd growing over the decades encapsulates everything wrong with social media.

    • giardini 10 hours ago

      "1970-01-01" stepped in it 5 hours ago by saying: >"You reap what you sew.<

      "You reap what you sow" is correct. You sew cloth with a needle and thread but sow seeds by throwing them on fertile ground, hoping they will sprout, grow and you will later reap a harvest.

  • alexfromapex 19 hours ago

    My main case against it at this point is that everything you post will be accessible by "bad" AI.

  • jparishy 18 hours ago

    We, consumers online, are sliced and diced on every single dimension possible in order to optimize our clicks for another penny.

    As a side benefit, when you do this enough, the pendulum that swings over the middle line for any of these arbitrary-but-click-improving divisions builds momentum until it hits the extremes. On either side, it doesn't matter, because it will swing back just as hard, again and again.

    As a side benefit the back and forth of the pendulum is very distracting to the public so we do not pay attention to who is pushing it. Billions of collective hours spent fighting with no progress except for the wallets of rich ppl.

    It almost feels like a conspiracy but I think it's just the direct, natural result of the vice driven economy we have these days

  • averageRoyalty 14 hours ago

    The social media problem is very simple to solve. Ban advertising on social media (from platform or users) and ban usage of user data external to the platform.

    When you remove the incentive to engage users, the companies will engage in less abusive practices to push engagement.

    I've never seen this proposed, and I'm confused why.

    • pitched 14 hours ago

      I think the way you define ads and social media would be important. We would end up getting something like the cookie banners again instead of real change.

    • positron26 14 hours ago

      - information silos still exist

      - social incoherence because silos cannot communicate laterally is still there

      - the ads will likely go native to become "content" and more revenue will shift to influencers

      Just saying it's not quite that easy, but yes, ad monetization is a great force of evil.

  • visarga 7 hours ago

    I see this step progression:

    1. people having real problems, like employment, housing, health, or education access

    2. they go online (or watch TV) finding all sorts of extreme takes and biases; these theories provide simple explanations and ways to pin the blame on others

    3. they converge on identity based reasoning, where dialogue becomes impossible, tribal; their posts signal adherence to in-group and rejection of the out-group, no longer tied to reason

    4. they vote against their own best interests, such as recent elections (Trump) and referendums (Brexit) or refuse the vaccine (10x higher death rate, observed in hindsight)

    The thing that is different now is that we have social networks, and that the outcomes are so drastic they are surprising everyone. Could be a coincidence, but I don't think it is.

    So it's: real problems -> toxic social media & tv takes -> identity politics

  • mightyham 15 hours ago

    To me this just reads like fear mongering and shilling for the status quo political establishment. I've recently been learning a bit about Russian history and it has similarities to their conservative nobility throughout the 19th century trying through various means to suppress the spread of liberalism in the public and intelligentsia. The point being that Russia had serious social ills like serfdom, and radical political ideas were absolutely warranted.

    Social media is destabilizing for the influence of establishment sources of information, and more of the public (right and left) is finding out more accurate information about how the world works, then coming to natural conclusions about how to address various social ills. Polarization may be increasing, but people forming stronger opinions is also exactly what you would expect in the face of increased revelation about unsolved social problems. Ultimately, I'm optimistic about the long term effects of social media on politics.

  • gerdesj 16 hours ago

    "In conclusion: " "...in particular in the U.S., but probably across Europe as well. ..."

    The world is rather larger than the US and Europe. I physically endure myopia and frankly Mr Witkin seems to figuratively suffer from it.

    I need only mention the name: TikTok.

  • profsummergig 18 hours ago

    I used to be disappointed in myself that I didn't understand Discord well enough to use it.

    Now I'm glad I never understood it well enough to use it.

    • stevage 17 hours ago

      Huh. I'm on a few discords. They're very easy and obvious to use, and I really enjoy them. And because they are generally well divided by channel, it's easy to avoid the bits you don't want.

      • profsummergig 4 hours ago

        You may remember that one needed to use Discord to use MidJourney initially. I was able to use it for that (although a lot of messages that streamed by were confusing to me).

        After that I joined a couple of Discords with tens of thousands of users. Nothing ever seemed to happen on them. I knew I was doing something wrong but I couldn't figure it out.

  • api 18 hours ago

    It's more specific than social media. It's engagement maximizing (read: addiction maximizing) algorithms. Social media wasn't nearly as bad until algorithmic engagement maximizing feeds replaced temporal or topic based feeds and user-directed search.

    Two people walk past you on the street. One says "hi," and the other strips naked and smears themselves with peanut butter and starts clucking like a chicken. Which one maximizes engagement?

    A politician says something sane and reasonable. Another politician mocks someone, insults someone, or says something completely asinine. Which one maximizes engagement?

    This is why our president is a professional troll, many of our public intellectuals are professional trolls, and politics is becoming hyper-polarized into raging camps fixated on crazy extremes. It maximizes engagement.

    The "time on site" KPI is literally destroying civilization by biasing public discourse toward trash.

    I think "trash maximizes engagement" should be considered an established fact at this point. If you A/B test for engagement you will converge on a mix of trolling, tabloid sensationalism, fear porn, outrage porn, and literal porn, and that’s our public discourse.

  • scarface_74 19 hours ago

    I really hate the narrative that social media has increased polarization knowing that my still living parents grew up in the Jim Crow south where they were literally separated from society because of the color of their skin.

    The country has always been hostile to “other”. People just have a larger platform to get their message out.

    • linguae 18 hours ago

      As someone whose grandparents endured Jim Crow, I largely agree in the sense that social media did not create America’s divides. Many of the divides in American society are very old and are very deep, with no easy fixes.

      Unfortunately algorithmic social media is one of the factors adding fuel to the fire, and I believe it’s fair to say that social media has helped increase polarization by recommending content to its viewers purely based on engagement metrics without any regard for the consequences of pushing such content. It is much easier to whip people into a frenzy this way. Additionally, echo chambers make it harder for people to be exposed to other points of view. Combine this with dismal educational outcomes for many Americans (including a lack of critical thinking skills), our two-party system that aggregates diverse political views into just two options, a first-past-the-post election system that forces people to choose “the lesser of two evils,” and growing economic pain, and these factors create conditions that are ripe for strife.

      • dfxm12 17 hours ago

        Unfortunately algorithmic social media is one of the factors adding fuel to the fire

        Saying social media fans the flames is like saying ignorance is bliss. Mainstream media (cable news, radio, newspapers, etc) only gives us one, largely conservative, viewpoint. If you're lucky, you'll get one carefully controlled opposing viewpoint (out of many!). As you say, our choices are usually evil and not quite as evil.

        Anger is not an unreasonable reaction when you realize this. When you realize that other viewpoints exist, the mainstream media and politicians are not acting in anyone's best interest but their own, there really are other options (politically, for news, etc.). Social media is good at bringing these things to light.

        There are no easy fixes to the divides you're talking about, but failing to confront them and just giving in to the status quo, or worse, continuing down our current reactionary trajectory, is probably the worst way to approach them.

      • scarface_74 18 hours ago

        So there wasn’t enough fuel in the fire when marauding Klansmen were hanging Black people?

        It was the current President of the US that led a charge that a Black man running for President wasn’t a “real American” and was a secret Muslim trying to bring Sharia law to the US, and close to half of the US was willing to believe it.

        https://www.youtube.com/watch?v=WErjPmFulQ0

        This was before social media in the northern burbs of Atlanta where I had a house built in 2016. We didn’t have a problem during the seven years we lived there. But do you think they were “polarized” by social media in the 80s?

        It’s just like how police brutality didn’t start with the rise of social media. Everyone just has cameras and a platform now.

    • 0xDEAFBEAD 9 hours ago

      Race relations were better in the 2000-2010 period, according to Gallup data:

      https://news.gallup.com/poll/1687/race-relations.aspx

      Easy to cherry-pick stuff. You can cherry-pick Jim Crow south; I can cherry-pick Chicago in the 90s:

      https://www.youtube.com/watch?v=rDmAI67nBGU

      I think we have to get past black-and-white thinking and see it as a matter of degree. With 340 million people in the USA, realistically, at least a few of them will always be racist. The question is how powerful and influential the racists are. That's a question which social media feeds into.

      • scarface_74 2 hours ago

        You call 60 years of racial segregation that affected an entire race of people in several states “cherry picking”?

        It’s a huge difference between “a few people being racist” and laws enforcing segregation and laws against interracial marriage.

        The racists have always been in power. You can look at the justice system, the disparity between sentencing for the same crimes across races etc.

        The Supreme Court said you can’t use race as a basis for college admissions. But you can use it as a basis for arresting someone.

        Fox News is the most popular news network and isn’t part of social media.

    • tolerance 19 hours ago

      > The country has always been hostile to “other”. People just have a larger platform to get their message out.

      And a consequence of this is that some people’s perspective of the scale of the nation’s hostilities is limited to the last 5 years or so.

    • nextaccountic 18 hours ago

      One of the factors that led to the Rwandan genocide was the broadcast of the RTLM radio station

      https://en.wikipedia.org/wiki/Rwandan_genocide#Radio_station...

      The radio didn't create the divide, and it wasn't the sole factor in the genocide, but it ingrained in the population a sense of urgency about eliminating the Tutsi, along with a stream of mostly fake news to show that the other side was already committing atrocities against Hutus

      When the genocide happened, it was fast and widespread: people would start killing their own neighbors at scale. In 100 days, a million people were killed.

      The trouble with social media is that they somehow managed to shield themselves from the legal repercussions of heavily promoting content similar to what RTLM broadcast. For example, see the role of Facebook and its algorithmic feed in the genocide in Myanmar

      https://systemicjustice.org/article/facebook-and-genocide-ho...

      It's insane that they can get away with it.

      • scarface_74 17 hours ago

        And there wasn’t a history of genocide of other before then? Hitler in Germany and the mass murder in Tulsa in 1921 didn’t need social media.

        History has shown people don’t need a reason to hate and commit violence against others.

        • macintux 17 hours ago

          People don’t need guns to kill, either, but that doesn’t mean that they don’t make for more effective weapons.

        • ants_everywhere 16 hours ago

          I think you're underestimating the role deliberate propaganda has played in mass murder.

          Propaganda and ideology were a major part of the Nazi rise to power.

          Marx, Engels, and Mussolini were all in the newspaper business. Jean-Paul Marat's newspaper was very influential in promoting the French reign of terror, with some claiming he was directly responsible for the September Massacres. Nationwide propaganda was a major priority from day one for Lenin and those after him in Soviet Russia.

          Similarly with the Cambodian genocide, Great Leap Forward, Holodomor, etc.

          Propaganda even played a big role in Julius Caesar's campaign against the Gauls some two millennia before social media.

    • gdulli 17 hours ago

      But we made progress away from that and now we've regressed back towards it recently, aided by social media.

      • scarface_74 16 hours ago

        Exactly when did we make progress? In 2008, before social media really took off, how much of the population was yelling that a Black man wasn’t a “real American” and was a “secret Muslim”?

        Before then we had the “Willie Horton ads”. Not to mention that Clinton performatively oversaw the electrocution of a mentally challenged Black man to show that he was tough on crime.

        https://jacobin.com/2016/11/bill-clinton-rickey-rector-death...

        Yes I know that Obama was also a champion of laws like the Defense of Marriage Act. We have always demonized the “other” in this country. It was just hidden before.

        • gdulli 10 hours ago

          That black man you're referring to was elected President twice, despite his haters. Which does not mean that racism was conquered but does indicate progress since the aforementioned Jim Crow era.

          • scarface_74 10 hours ago

            We were talking about “polarization”. Not the fact that he was elected. Was social media to blame in 2008 for the “divisiveness”?

            Right now the Supreme Court said that ICE could target people based on the color of their skin, and it’s not like Obama won the hearts and minds of the states where Jim Crow was the law of the land in the 60s.

    • jwilber 19 hours ago

      The article mentions this. It tries to argue the significance of that platform.

  • johnea 19 hours ago

    Man, blah, blah, blah...

    That article needs to have about 80% of the words cut out of it.

    When the author straight up tells you: I'm posting this in an attempt to increase my subscribership, you know you're in for some blathering.

    In spite of that, personally I think algorithmic feeds have had a terrible effect on many people.

    I've never participated, and never will...

  • hbarka 17 hours ago

    Full anonymity in social media should not be allowed. It becomes a cover for bad actors (propagandists, agents, disinformation, bots, age-inappropriate, etc.) It doesn’t have to be a full identity, but knowing your user metadata is open during interactions can instill a sense of responsibility and consequence of social action. As in real life.

    • creata 15 hours ago

      People should be able to say things without those things following them around for the rest of their lives.

      > As in real life.

      No, your proposal is very different to real life. In real life, the things you say will eventually be forgotten. You won't be fired for things you said or did years ago, because people will have moved on.

      Having a convenient index of everything anyone has ever shared is very different to real life.

      • hbarka 13 hours ago

        > You won't be fired for things you said or did years ago, because people will have moved on

        You realize that the evidence is against you on that one. Just recently, who was that UK ambassador that Prime Minister Keir Starmer just fired?

    • makeitdouble 17 hours ago

      Real life needs full anonymity too. Not everywhere, but it's critical to have some.

      For instance a political vote needs to be anonymous. Access to public space typically is (you're not required to identify to walk the street) even if that anonymity can be lifted etc.

      Real life is complex, and for good reasons; if we want to take it as a model, we should integrate its full complexity as well.

      • hbarka 15 hours ago

        In the United States, political votes are not anonymous. There is a database of how someone voted.

        If you’re out in public, you’re also not fully anonymous. You display metadata such as race, gender, age, behavior. Now you could wear a ski mask in broad daylight, but I doubt you’d be allowed inside a bank. And the bank has a right to judge you for that.

        • makeitdouble 13 hours ago

          > There is a database of how someone voted.

          That cannot be right, that's the fundamental core of the voting process in our democracies. You might be thinking about the party registrations or voluntary polls ?

          > You display metadata

          What you show to the world has no requirement to be accurate. If you look like a rich 70-year-old Asian lady when going to the park, there will be no check that that's actually what you are (unless the police come at you for an identity check...). That's particularly relevant for gender: you're typically not required to present your official assignment, and how you behave isn't tied to your official identity.

    • idle_zealot 16 hours ago

      Looking at any random fullrealname Facebook account will disabuse you of this notion. People will tie vile shit to their identities without a second thought.

      Rather than sacrifice the cover that anonymity grants vulnerable people, journalists, and activists, I think we should come at this issue by placing restrictions on how social media platforms direct people to information. The impulse to restrict and censor individuals rather than restrict powerful organizations profiting from algorithmic promotion of the content you deem harmful is deeply troubling.

      The first step here is simple: identify social media platforms over some size threshold, and require that any content promotion or algorithmic feed mechanism they use is dead-simple to understand and doesn't target individuals. That avoids the radicalization rabbithole problem. Make the system trivial and auditable. If they fail the audit then they're not allowed to have any recommendation system for a year. Just follows and a linear feed (sorting and filtering are allowed so long as they're exposed to the user).
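      A minimal sketch of what such a "trivial and auditable" feed could look like (names, types, and parameters are illustrative, not any real platform's API): posts only from accounts the user follows, ordered by time, with any sorting or filtering chosen by and visible to the user rather than decided per-person by an opaque model.

      ```python
      from dataclasses import dataclass

      @dataclass
      class Post:
          author: str
          timestamp: int  # e.g. seconds since epoch
          text: str

      def linear_feed(posts, follows, newest_first=True, keyword=None):
          """No personalization: only followed accounts, in time order.
          Sorting and filtering choices are exposed to the user, not hidden."""
          visible = [p for p in posts if p.author in follows]
          if keyword is not None:  # user-chosen filter, applied transparently
              visible = [p for p in visible if keyword in p.text]
          return sorted(visible, key=lambda p: p.timestamp, reverse=newest_first)

      posts = [
          Post("alice", 2, "a linear feed is easy to audit"),
          Post("bob", 1, "hot take of the day"),
          Post("carol", 3, "unfollowed accounts never appear"),
      ]
      feed = linear_feed(posts, follows={"alice", "bob"})
      print([p.author for p in feed])  # alice (t=2) before bob (t=1); carol excluded
      ```

      The whole behavior fits in a dozen lines an auditor can read, which is the sense in which it is "dead-simple to understand."
      
      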

      To reiterate: none of this applies if you're below some user cutoff.

      Q: Will this kill innovation in social media? A: What fucking innovation?

      • hbarka 15 hours ago

        > cover that anonymity grants [] journalists

        Quite the contrary, a core journalism principle is accountability and transparency. Readers must know who the reporter is to assess credibility, context, and potential conflicts of interest. Attribution builds trust, allows audiences to verify the source, and distinguishes reporting from anonymous or propagandistic material. This is different from covering source anonymity, but the audience is still relying on the journalist’s _known_ integrity that they’re not just making up some bullshit source.

    • krapp 16 hours ago

      Kiwifarms is an obvious object lesson in why anonymity online is necessary, and hardly the only one.

      • creata 16 hours ago

        I agree with you, but it's funny that someone else could say the opposite (i.e., that Kiwifarms shows how anonymity lets people get away with saying and doing horrible things) and still sound reasonable.

        • beeflet 15 hours ago

          Not really. There is a massive crowd of public, named people who harass Chris Chan called "A-logs".

      • beeflet 15 hours ago

        I think the kiwifarms could be a net positive if they incentivize anonymity on the internet through harassment.

  • epolanski 17 hours ago

    Looking at this very comment section the author may have a point.

    • beeflet 14 hours ago

      The solution to social division is to force everyone to use news.epolanski.com, the site where you can only post things that epolanski agrees with

      • epolanski 4 hours ago

        I was referring to the amount of toxicity in the comments.

  • 793212408435807 17 hours ago

    Number 3 will shock you!

    What a shame that these clickbait headlines make it to the front page.