The political effects of X's feed algorithm

(werd.io)

69 points | by benwerd 2 hours ago

77 comments

  • periodjet a minute ago

    It couldn’t be possible for a social media feed to influence users in the direction of issues important to the Democratic Party, could it?

    Or would that just be considered an unalloyed good?

  • cyrusradfar an hour ago

    Lovely thought Ben. Good to hear from you!

    I spent a lot of my life and money thinking about building better algorithms (over five years).

    We have a bit of a chicken-and-egg problem. Is it the algorithm, or is it the preferences of the users, that is the problem?

    I'd argue the latter.

    What I learned which was counter-intuitive was that the vast majority of people aren't interested in thinking hard. This community, in large part, is an exception where many members pride themselves on intellectually challenging material.

    That's not the norm. We're not the norm.

    My belief that every human was by their nature "curious" and wanting to be engaged deeply was proven false.

    This isn't to claim that this is our nature, but when testing with huge populations in the US (specifically), that's not how adults are.

    The problem, to me, is deeper and is rooted in our education system and work systems that demand compliance over creativity. Algorithms serve what users engage with; if users were no longer interested in ragebait and clickbait and instead focused on thoughtful content, the algorithms would adapt.

    • stetrain an hour ago

      > Is it the algorithm or is it the preference of the Users which is the problem. I'd argue the latter.

      > Algorithms serve what Users engage with

      User engagement isn't actually the same thing as user preference, even though I think many people and companies take the shortcut of equating the two.

      People often engage more with things they actually don't like, and which create negative feelings.

      These users might score higher on engagement metrics when fed this content, but actually end up leaving the platform or spending less time there, or would at least answer in a survey question that they don't like some or most of the content they are seeing.

      This is a major reason I stopped using Threads many months ago. Their algorithm is great at surfacing posts that make me want to chime in with a correction, or click to see the rest of the truncated story. But that doesn't mean I actually liked that experience.
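        The gap between the two metrics is easy to sketch. A toy illustration with made-up item names and scores (not any platform's actual ranking code): sorting the same feed by predicted engagement versus by stated preference produces near-opposite orderings, so optimizing the first can actively work against the second.

        ```python
        # Toy data: (item, predicted_engagement, stated_preference).
        # All names and numbers are hypothetical.
        items = [
            ("outrage_bait",     0.9, 0.2),  # high clicks, low reported liking
            ("truncated_story",  0.8, 0.3),
            ("thoughtful_essay", 0.4, 0.9),
            ("friend_update",    0.3, 0.8),
        ]

        # Rank the same feed two ways.
        by_engagement = sorted(items, key=lambda it: -it[1])
        by_preference = sorted(items, key=lambda it: -it[2])

        # The engagement-ranked feed leads with exactly the items
        # users report disliking.
        print([name for name, _, _ in by_engagement])
        print([name for name, _, _ in by_preference])
        ```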

      • cyrusradfar 40 minutes ago

        Thanks for the thoughtful response.

        Curious about this. Don't have an angle, just trying to survey your perspective.

        You shared: > People often engage more with things they actually don't like, and which create negative feelings.

        Do you think this is innate or learned? And, in either case, can it be unlearned?

    • sonofhans an hour ago

      > Algorithms serve what users engage with; if users were no longer interested in ragebait and clickbait and instead focused on thoughtful content, the algorithms would adapt.

      Algorithms have been adapted; they are successful at their goals. We’ve put some of the smartest people on the planet on this problem for the last 20 years.

      Humans are notoriously over-sensitive to threats; we see them where they barely exist, and easily overreact. Modern clickbait excels at presenting mundane information as threatening. Of course this attracts more attention.

      Also, loud noises attract more attention than soft noises. This doesn’t mean that humans prefer an environment full of loud noises.

    • socalgal2 an hour ago

      It is not the norm here either.

    • rl3 an hour ago

      >This community, in large part, is an exception where many members pride themselves on intellectually challenging material.

      That's not the norm. We're not the norm.

      I recommend against putting HN on a pedestal. It just leads to disappointment.

      • cyrusradfar an hour ago

        It's true -- I do enjoy this community even though it's failed to serve my every thought with the love that I surely deserve!

    • prometheus76 an hour ago

      All you need to do is read the other comments on this very page and you will see that there are very strict cultural and political norms here too, but for some reason they are invisible as such to those who hold them. They consider their views to be "common knowledge" and "what any reasonable person believes" because they, too, live in curated bubbles.

      Any comment that challenges mainstream science, materialism/physicalism, and leftist politics gets downvoted into oblivion here because HN is definitely not a haven for people who "pride themselves on intellectually challenging material."

      TL;DR: It's an echo chamber here, too, but most people who hold the worldview that is enforced here often cannot see their own presuppositions, nor do they see that their views are political in nature.

      • sonofhans an hour ago

        I’d expect this to be down-voted too. It has nothing to do with the article and makes no concrete claims. It’s easy to slag anything in vague terms, but it adds nothing to this discussion.

      • rl3 an hour ago

        >Any comment that challenges mainstream science ...

        Stupid mainstream science.

        >... and leftist politics ...

        >... nor do they see that their views are political in nature.

        You don't say. Personally, I respect comments that prove their own claims.

  • Aurornis 2 hours ago

    I don’t know if I buy the explanation that this was due to the feed algorithm. It looks like an artifact of being exposed to X’s current user base instead of their old followers. When Twitter switched to X there was a noticeable shift in the average political leanings of the platform toward alignment with Musk, as many left-leaning people abandoned the platform for Bluesky, Mastodon, and Threads.

    So changing your feed to show popular posts on the platform instead of just your friends’ Tweets would be expected to shift someone’s intake toward the average of the platform.

    • noelsusman an hour ago

      I'm not sure what your point is. How is "being exposed to X's current user base instead of their old followers" not equivalent to "turning on the feed algorithm"? You doubt the effect is due to the algorithm, but your alternative explanation describes exactly what the algorithm does.

    • SecretDreams 2 hours ago

      Is this the result of a feedback loop from Musk joining, or did his joining just accelerate the overall decline of the platform? Some might say it was going this way even before he picked it up, but it was certainly an inflection point when he joined either way.

      All modern social media is pretty toxic to society, so I don't participate. Even HN/Reddit is borderline. Nothing is quite as good as the irc and forum culture of the 2000s where everyone was truly anonymous and almost nobody tied any of their worth to what exchanges they had online.

      • bpodgursky an hour ago

        The moderation changes absolutely changed posting behavior. People got banned for even faintly gesturing the wrong direction on many issues and it frightened large accounts into toeing the line.

      • tokyobreakfast an hour ago

        > Even HN/Reddit is borderline.

        It's the proliferation of downvoting. It disincentivizes speaking your honest opinion and artificially boosts mass-appeal ragebait.

        It's detrimental to having organic conversations.

        "But the trolls" they say.

        In practice it's widely abused.

        Using HN as an example, there are legitimate textbook opinions that will boost your comment to the top, and ones that will quickly sink to the bottom and often be flagged away for disagreement. Ignoring obvious spam which is noise, there is no correlation to "right" or "wrong".

        That's one advantage old-school discussion forums and imageboards have. Everyone there and all comments therein are equally shit. No voting with the tribe to reinforce your opinion.

        What's worse is social media allowed the mentally ill to congregate and reinforce their own insane opinions with plenty of upvotes, which reinforces their delusions as a form of positive feedback. When we wonder aloud how things have become more radicalized in the last 20 years — that's why. Why blame the users when you built the tools?

        • SecretDreams an hour ago

          I like voting (up and down) but I also agree with your take. Reddit salts the votes, but maybe the solution is to allocate a certain total of votes (up or down) that a user can spend weekly. Make it so that when you are voting, it's much more meaningful and truly reflects an opinion you either really agree with or really disagree with.

          Ultimately, I think it comes back to people value their online persona way too much and this is something we've intentionally marched towards.

    • an hour ago
      [deleted]
    • excalibur an hour ago

      I don't know what changes have been made more recently, but I know there was a definite change to the Twitter algorithm a few months ago that filled the feeds of conservatives with posts from liberals and vice versa. It seemed to be specifically engineered to provoke conflict.

  • jmugan 2 hours ago

    Oddly enough, X is the only platform I've been able to teach to not show me culture war stuff, from either side. It just shows me AI in the "For You."

    • guywithahat an hour ago

      I have the same thought; my X algo has become less political than Hacker News. I suppose it depends on how you use it, but my feed is entirely technical blogs, memes, and city planning/construction content.

    • PaulHoule an hour ago

      I've been pretty consistent about telling Bluesky I want to see less of anything political and also disciplined about not following anybody who talks about Trump or gender or how anybody else is causing their problems. I see very little trash.

      • jmugan an hour ago

        Maybe it has gotten better recently. I tried and tried with Bluesky, but it would not abide.

        • PaulHoule 2 minutes ago

          It was bad the week Trump got elected, it’s gotten better since then.

    • kypro an hour ago

      The uncomfortable truth to most "the algorithm is biased" takes is that we humans are far more politically biased than the algorithms and we're probably 90% to blame.

      I'm not saying there is no algorithmic bias, and I tend to agree the X algorithm has a slight conservative bias, but for the most part the owners of these sites care more about keeping your attention than trying to get you to vote a certain way. Therefore if you're naturally susceptible to culture war stuff, and this is what grabs your attention, it's likely the algorithm will feed it to you.

      But this is a far broader problem. These are the types of people who might have watched politically biased cable news in the past, or read politically biased newspapers before that.

      • quirkot 38 minutes ago

        the issue brought up in the article isn't that "the algorithm is biased" but that "the algorithm causes bias". A feed could perfectly alternate between position A and position B and show no bias at all, but still select more incendiary content on topic A and drive bias towards or away from it.

  • arwhatever 2 hours ago

    I deleted my account after many years when X recently made the Chronological Feed setting ephemeral, defaulting back to the Algorithmic Feed each time the page is refreshed.

    No way I'm going to let that level of outrage-baiting garbage even so much as flash before my eyes.

    • dagelf an hour ago

      Train it: I just have to spend 3 minutes every other year to tap the 3 dots on every post and choose "Not Interested", for an epic feed unmatched anywhere.

      • quirkot 37 minutes ago

        Train the algorithm so that you can be the sort of product you want to see in the world

    • socalgal2 an hour ago

      I just click "following" at the top and never see anything I didn't ask to see. It resets once every few months to the other tab which I assume is just the cookie setting expiring.

  • rbanffy 2 hours ago

    And this is why the price for Twitter was, in the end, remarkably low.

    • piloto_ciego 2 hours ago

      Yeah, this was always the play in hindsight. Like, I didn't get it: "why would you pay that kind of money for a web forum?!" It wasn't the forum that was important; Twitter (for better or worse) has wormed its way into the fabric of American discourse. He was basically buying the ideological thermostat for the country and turning the dial to the right.

      • dlev_pika an hour ago

        This is even worse outside of NA. In many countries it is the de facto communication channel of government and businesses.

      • RetpolineDrama an hour ago

        Or from the other perspective: Meta and Google have had their finger on the scale for more than a decade (along with old twitter).

        In Twitter's case, you had regime officials directing censorship illegally through open emails and meetings.

        It's no surprise that the needle moves right when you dial back the suppression of free expression even a little bit (X still censors plenty)

        • dylan604 an hour ago

          How is it illegal? It is their platform to do what they want with it. You can disagree and not use it, but it is theirs to do with as they see fit. If this was a government run operation paid for with tax dollars, then it would be an issue.

      • 0ckpuppet an hour ago

        As opposed to the government funding turning it to the reality-bending left? There was direct communication from Senators and members of Congress directing Twitter to block and ban based on certain topics. And Twitter obliged.

    • cowpig 2 hours ago

      dark

  • ppeetteerr 2 hours ago

    Why anyone is still using X after 2025 is a mystery (I know, it's where everyone is, but the moral implications are wild)

    • spankalee 2 hours ago

      Seriously. The CEO is openly posting white supremacist content like it's Stormfront. If you don't support that, you should get out.

      • Herring an hour ago

        I don’t know which country you’re in, but in the US Trump won the popular vote. Plenty of people here are perfectly happy with Stormfront.

        • spankalee an hour ago

          I think the idea that if you don't support white supremacy you should get off the site owned and run by a clear white supremacist applies regardless of how elections go.

          • yndoendo a few seconds ago

            I recommend _Culture in Nazi Germany_ by Michael H Kater. [0] It is very dry but goes into detail of the culture of the era from late 1920s to end of WWII.

            One aspect he highlights at the end is that Fascism was not rejected by Germany's citizens, current and former, including those who emigrated. In their minds it was merely incorrectly implemented. A number of Zionists who migrated from Germany to Palestine were supporters of Fascism. It was not until the mid-to-late 1960s that people started to realize and admit that Fascism was bad.

            I personally will never fund Elon Musk. Anyone who says empathy is bad is a bad person at heart. Empathy is intelligence, and those who lack it lack strong intelligence. There is no way to put yourself in the position others have gone through without empathy.

            [0] https://academic.oup.com/ahr/article-abstract/128/3/1512/728...

        • daveguy an hour ago

          Less than you might think.

          He didn't win a majority of the vote, just a plurality. And less than 2 of 3 eligible voters actually voted. So he got about 30% of the eligible population to vote for "yay grievance hate politics!" Which is way more than it should be, but a relatively small minority compared to the voter response after all ambiguity about the hate disappeared. This is why there's been a 20+ point swing in special election outcomes since Trump started implementing all the incompetent corrupt racist asshatery.

    • apparent 43 minutes ago

      Lots of info is shared there first. It shows up in news articles and podcasts 12-24 hours later. Not everything shared there is true, of course, so one has to do due diligence. But it definitely surfaces content that wouldn't show up if I just read the top 2-3 news websites.

    • dagelf an hour ago

      I didn't get it either until I trained the algorithm to feed me what I want by clicking the three dots and selecting "Not Interested" on anything I never wanted to see again. It listens; what's left is really unmatched anywhere. I've really looked, and occasionally still do out of curiosity.
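      The "Not Interested" training described above maps onto a very simple mechanism. A hypothetical sketch (the decay factor and topic names are assumptions for illustration, not X's real ranking internals): treat each click as a multiplicative down-weight on a topic's score, and a couple of clicks are enough to sink that topic to the bottom of the feed.

      ```python
      # Hypothetical feedback model, not any platform's actual algorithm.
      DECAY = 0.5  # assumed penalty applied per "Not Interested" click

      scores = {"politics": 1.0, "ai": 1.0, "sports": 1.0}

      def not_interested(topic: str) -> None:
          """Each click halves the topic's ranking score."""
          scores[topic] *= DECAY

      # Two clicks on political posts...
      not_interested("politics")
      not_interested("politics")

      # ...and the topic drops to the bottom of the ranked feed.
      feed_order = sorted(scores, key=scores.get, reverse=True)
      print(feed_order)
      ```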

    • haunter an hour ago

      Live update for sport events. People post highlights and replays before anyone else.

  • apparent an hour ago

    What does it mean to have someone on a chronological feed, versus the algorithmic one? Does that mean a chronological feed of the accounts they follow? I hardly ever use that, since I don't follow many people, and some people I follow post about lots of stuff I don't care about.

    from the study:

    > We assigned active US-based users randomly to either an algorithmic or a chronological feed for 7 weeks
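    The quoted design is a standard randomized controlled trial: flip a coin per user, serve one feed or the other, then compare average outcomes between arms. A minimal simulation of that estimator follows; the data, outcome measure, and the +0.3 "true effect" are all made up for illustration and are not the study's numbers.

    ```python
    import random

    random.seed(0)

    users = range(1000)
    # Coin-flip assignment: True = algorithmic feed, False = chronological.
    algorithmic = {u: random.random() < 0.5 for u in users}

    def attitude(u):
        # Simulated outcome: algorithmic-feed users shift +0.3 on average.
        return random.gauss(0.3 if algorithmic[u] else 0.0, 1.0)

    outcomes = {u: attitude(u) for u in users}
    treated = [outcomes[u] for u in users if algorithmic[u]]
    control = [outcomes[u] for u in users if not algorithmic[u]]

    # Difference in means estimates the average treatment effect of the feed.
    effect = sum(treated) / len(treated) - sum(control) / len(control)
    print(round(effect, 2))
    ```

    Because assignment is random, the two groups differ only in feed type on average, so the difference in means is an unbiased estimate of the feed's effect.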

  • mikepurvis 2 hours ago

    "We need more funding into open protocols that decentralize algorithmic ownership; open platforms that give users a choice of algorithm and platform provider; and algorithmic transparency across our information ecosystem."

    This sounds like a call to separate the aggregation step from the content. Reasonable enough, but does it really address the root cause? Aren't we just as polarized in a world where there are dozens of aggregators into the same data and everyone picks the one that most indulges their specific predilections for engagement, rage, and clicks?

    What does "open" really buy you in this space?

    Don't get me wrong, I want this figured out too, and maybe this is a helpful first step on the way to other things, but I'm not quite seeing how it plays out.

    • bee_rider an hour ago

      I’d hope people wouldn’t intentionally pick the political extremism feed if they had any other option (although it’s hard to say).

      • tarxvf an hour ago

        From where I'm sitting, it seems obvious people do exactly that.

        • mikepurvis 21 minutes ago

          "It's the most interesting one!"

          For a related example I was talking with a colleague recently about how we had both (independently) purchased Nebula subscriptions in an effort to avoid getting YouTube premium and giving Google more money, but both felt the pull back to YouTube because it is so good at leveraging years of subscription and watch history to populate the landing page with content we find engaging.

          If even two relatively thoughtful individuals, choosing to spend money on a platform with the kind of content they'd like to watch, can't seem to beat an engagement-first algorithm, I'm not sure how much hope normies have. Unless the real issue is just being terminally online, period, and the only way to win is simply not to play.

  • kettlecorn an hour ago

    Underrated in X's changes is how blue checkmark users are shown first underneath popular tweets. Most people who pay for blue checkmarks are either sympathetic to Musk's ideology or indifferent. Many blue checkmark users are there to make money from engagement.

    The result is underneath any tweet that gets traction you will see countless blue checkmark users either saying something trolling for their side or engagement-baiting.

    The people who are more ideologically neutral or not aligned with Musk are completely drowned out below the hundreds of bulk replies of blue checkmarks.

    It used to be that if you saw someone, like a tech CEO, take an interesting position, you'd have a varied and interesting discussion in the replies. The algorithm would show you replies in particular from people you follow, and often you'd see some productive exchange that actually mattered. Now it's almost entirely drivel, and you have to scroll through rage bait and engagement slop before getting to the crumbs of meaningful exchange.

    It has had a chilling effect on productive intellectual conversation while also accelerating the polarization of the platform by scaring away many people who care about measured conversation.

    • bool3max an hour ago

      I automatically tune out any blue checkmark post or reply and just assume it's an LLM responding to earn $.003

  • ortusdux an hour ago

    There was a great study from a decade ago showing that baseball cards held by lighter-skinned hands outsold cards held by darker-skinned hands on eBay.

    An algorithm designed today with the goal of helping users pick the most profitable product photo would probably steer people towards using caucasian models, and because eBay's cut is a percentage, they would be incentivized to use it.

    Studies show that conservatives tend to respond more positively to sponsored content. If this is true, algorithm-driven ad-sponsored social sites will tend towards conservative content.

    https://onlinelibrary.wiley.com/doi/abs/10.1111/1756-2171.12...

    https://www.tandfonline.com/doi/full/10.1080/00913367.2024.2...

  • wtp1saac an hour ago

    It is interesting to see a general bias taken away from the study, which I wouldn't necessarily guess given my own experience. My X "For You" feed mostly does not read pro-Trump; instead it mostly pushes very intense pro-European and pro-Canadian economic and political separation from the USA, and very negative narratives about the USA. I suppose it occasionally also introduces pro-Trump posts, and perhaps those do not sway me in the same way given I am a progressive American.

    That said, the Trending tab does tend to push very heavy MAGA-aligned narrative, in a way that to me just seems comical, but I suppose there must be people that genuinely take it at face value, and maybe that does push people.

    Less to do with the article:

    The more I think about it, I'm not really even sure why I use X these days, other than the fact that I don't really have much of an in-person social life outside of work. Sometimes it can be enjoyable, but honestly my main takeaway is that microblogging as a format is genuinely terrible, and X in particular does seem to just feed the most angry things possible. Maybe it's exciting to try to discuss opinions, but it is simultaneously hardly possible to have a nuanced or careful discussion when you have limited characters and someone on the other end who just wants to shout over you.

    I miss being a kid and going onto some forums like for Scratch or Minecraft or whatever. The internet felt way more fun when it was just making cool things and chatting with people about it. I think the USA sort of felt more that way too, but it's hard to know if that was just my privilege. When I write about X, it uncomfortably parallels to how I would consider how my interactions have evolved with my family and friends in real life.

  • dagelf an hour ago

    There's much more diversity of thought on the right; did they get more open-minded?

  • fluoridation 2 hours ago

    I honestly don't understand how or why people are using Twitter to keep up with the news. The only thing I use it for is to follow artists, and even that has been going down in recent weeks with most of my favorites moving over to BlueSky. Maybe I'm just a long-winded idiot, but the character limits barely let me have a conversation on either platform. How are people consuming news like this?

    It just baffles me how different my experience of using the platform is. I literally do not see any news. I'm not entirely convinced that it's Twitter being biased and not just giving each person what they most engage with.

    • stevage an hour ago

      You follow artists, and they are not tweeting their political opinions? Cool.

      I gave up on Twitter when everyone I followed kept adding politics. Even if I agreed with it, I just don't want to marinate in the anger all day.

      • fluoridation an hour ago

        The one exception I think of is the guy from Technology Connections, who I stopped following because I got tired of seeing him in my feed complaining about something or other. And I've noticed he's been putting that into his videos as well, so I might have to do it on YouTube as well.

    • Flere-Imsaho an hour ago

      My feed (UK based) seems to give me the major news stories well before the mainstream (BBC), and I'm talking days if not weeks in some cases. Now, could it be that's how the mainstream decides to cover a particular story? What's worrying is when a story is all over X but isn't covered.

      To give an example, the recent protests in Iran were being covered on X, but the BBC was silent for weeks before finally covering the story (for a few days).

      • dylan604 an hour ago

        Could it also be that "mainstream" news outlets are actually trying to verify information and/or obtain confirmation from other sources? All of that is done in an attempt to avoid promoting false information. People tweeting do not do any of that.

    • rishabhaiover 2 hours ago

      It used to be self-expression in an oddly entertaining way, but that Nikita Bier ruined the whole thing with his metrics-chasing algorithmic shifts.

    • joe_mamba 2 hours ago

      >I honestly don't understand how or why people are using Twitter to keep up with the news.

      Because the MSM news stations themselves pick up the stuff from twitter and just add their own spin flavor. A dozen phone videos from random citizens on-site is always quicker than the time CNN/FOX can send a reporter there. On twitter you at least get the raw footage and can judge for yourself before MSM try to turn it political to rage bait you.

  • 01HNNWZ0MV43FF an hour ago

    You have to find good people. Bad people will find you.

  • kypro an hour ago

    I really wish these points were made in a way that wasn't political or platform-specific, because if you care about this issue it's ultimately unhelpful to frame it as a problem with just X or conservatives, given how politically divided people are.

    I do share the author's concerns and was also concerned back in the day when Twitter was quite literally banning people for posting the wrong opinions there. But it's interesting how the people who used to complain about political bias, now seem to not care, and the people who argued "Twitter is a private company they can do what they want" suddenly think the conservative leaning algorithm now on X is a problem. It's hard to get people across political lines to agree when we do this.

    In my opinion there are two issues here; neither is politically partisan.

    The first is that we humans are flawed and algorithms can use our flaws against us. I've repeatedly spoken about how much I love YouTube's algorithm because, despite some people saying it's an echo chamber, I think it's one of the few recommendation algorithms that will serve you a genuinely diverse range of content. But I suspect that's because I genuinely like consuming a very wide range of political content, and I know I'm in a minority there (probably because I'm interested in politics as a meta subject, but don't have strong political opinions myself). My point is these algorithms can work really well if you genuinely want to watch a diverse range of political content.

    Secondly some recommendation algorithms (and search algorithms) seem to be genuinely biased which I'd argue isn't a problem itself (they are private companies and can do what they want), but that bias isn't transparent. X very clearly has a conservative bias and Bluesky also very clearly has political bias. Neither would admit their bias so people incorrectly assume they're being served something which is fairly representative of public opinion rather than curated – either by moderation or algorithm tweaks.

    What we need is honesty, both from individuals who are themselves seeking out their own bias, and platforms which pretend to not have bias but do, and therefore influence where people believe the center ground is.

    We can all be more honest with ourselves. If you exclusively use X or Bluesky, it's worth asking why that is, especially if you're engaging with political content on these platforms. But secondly, I think we do need more regulation around the transparency of algorithms. I don't necessarily think it's a problem if some platform recommends certain content above other content, or has some algorithm to ban users who post content it doesn't like, but these decisions should be far more transparent than they are today, so people are at least able to feed that into how they perceive the neutrality of the content they're consuming.

  • jmyeet an hour ago

    I blame Google for a lot of this. Why? Because they, more than anyone else, succeeded in spreading the propaganda that "the algorithm" was some unbiased, even all-knowing, black box with no human influence whatsoever. They did this for obvious self-serving reasons: to defend how Google properties ranked in search results.

    But now people seem to think newsfeeds, which increase the influence of "the algorithm", are just a result of engagement and (IMHO) nothing could be further from the truth.

    Factually accurate and provable statements get labelled "misinformation" (either by human intervention or by other AI systems ostensibly created to fight misinformation) and thus get lower distribution. All while conspiracy theories get broad distribution.

    Even ignoring "misinformation", certain platforms will label some content as "political" and other content as not when a "political" label often comes down to whether or not you agree with it.

    One of the most laughable incidents of putting a thumb on the scale was when Grok started complaining about white genocide in South Africa in completely unrelated posts [1].

    I predict a coming showdown over Section 230 about all this. Briefly, S230 establishes a distinction between being a publisher (eg a newspaper) and a platform (eg Twitter) and gave broad immunity from prosecution for the platform for user-generated content. This was, at the time (the 1990s), a good thing.

    But now we have a third option: social media platforms have become de facto publishers while pretending to be platforms. How? Ranking algorithms, recommendations and newsfeeds.

    Think about it this way: imagine you had a million people in an auditorium and you were taking audience questions. What if you only selected questions that were supportive of the government or a particular policy? Are you really a platform? Or are you selecting user questions to pretend something has broad consensus or to push a message compatible with the views of the "platform's" owner?

    My stance is that if you, as a platform, actively suppress and promote content based on politics (as IMHO they all do), you are a publisher, not a platform, in the Section 230 sense.

    [1]: https://www.theguardian.com/technology/2025/may/14/elon-musk...

  • dlev_pika 2 hours ago

    [dead]

  • 2 hours ago
    [deleted]
  • jongjong 2 hours ago

    Oh my. Now that X is affecting people's politics (for the better IMO), suddenly people care about the influence of algorithms over politics...

    • GorbachevyChase an hour ago

      I am shocked, shocked to find that there is social engineering in this establishment!

    • beanjuiceII an hour ago

      yep but you wont find common sense on the matter here unfortunately

  • barfiure 2 hours ago

    This person is confused. Trump was a well known pussy grabber for decades. Epstein was anything but a secret, it seems, given how many politicians and celebrities and moguls he rubbed elbows with. Jerry stopping by the island for a lemonade and a spot of lunch with his high school aged girls? Yeah.

    It comes down to this: you can have visibility into things and yet those in power won’t care whatsoever what you may think. That has always been the case, it is the case now, and will continue to be in the future.

    • jongjong 2 hours ago

      This is a defeatist attitude. Don't know what bubble you're in but these official revelations are driving real change in mine. It's kind of subtle at this point but it's the kind of change that cannot be undone.

      • Herring an hour ago

        Unfortunately speed often matters when it comes to outcomes. Eg if you get a cancer diagnosis like Jobs, you probably shouldn’t waste a year drinking juices and doing acupuncture.