'Visual clutter' alters information flow in the brain

(news.yale.edu)

145 points | by gnabgib 5 days ago

58 comments

  • brikym 3 hours ago

    This article is a good excuse to get rid of outdoor advertising for safety reasons.

    I was watching tours of some very nice metro stations in China and Moscow. I didn't quite understand what was so nice about them until one comment pointed it out: no advertising blasting bright colors into your retinas. Advertising in public spaces needs to be banned. It's visual pollution, and it uses publicly funded assets to make tangible profits while the losses are easily externalised because they're not well understood. If profit is all you care about, then it's easy to justify turning your house into a brothel.

    • leohonexus 22 minutes ago

      I'd say the daily "visual clutter" of living in Chinese cities is on par with, or greater than, that of other cities like Tokyo.

      In Shenzhen for example, it's not uncommon to see police wearing flashing sirens on their shoulders as part of their uniform. Motorcyclists share the same road as pedestrians, and with so many delivery app drivers you're always on the lookout to avoid being hit.

      On SZ and Beijing metro trains, video ads are projected inside tunnels, matching the speed of the carriage - example: https://youtu.be/sp7KDNKpVhY

      Personally I've seen much more advertising on Chinese shopping apps like Taobao, compared to say Amazon. Cluttercore advertising seems to be a deeply rooted culture there.

  • pxc 10 hours ago

    > we can’t read out of the corner of our eyes, no matter how hard we try

    This is not quite true. Reading via peripheral vision is a skill, but most people never have the right kind of motivation (and perhaps not the right kind of visual stimulus) to learn it. I know this because I have personally known someone who read exclusively via her peripheral vision.

    There's an inherited retinal disease that runs in my family. It's a form of 'cone-rod dystrophy'. Cone-rod dystrophies are diseases where the cones and the rods in the eye eventually stop responding to light. The order of the terms in the naming reflects the typical order of the dystrophy: first the cones (responsible for central vision) go, then the rods (responsible for peripheral vision) go, too. As the cones and rods go bad, this results in loss of visual acuity in central and peripheral vision respectively. (It often causes lots of other, less generic problems, too, such as: blind spots, warping/twisting distortions in the visual field, sometimes flickering artifacts, progressive colorblindness, extreme light sensitivity, and an effective reduction of contrast.)

    Between individual cases, there's a lot of variation in how the disease presents. There's no fixed timeline or ordering for the progression (even though the disease's genetic cause has been identified as just a single gene!).

    Anyhow, in my late aunt's case, her central vision was useless long before her peripheral vision could give out. So she learned, somehow, to do things like read her smartphone using only her peripheral vision.

    We chatted about it once or twice. Onlookers often could not comprehend that she was looking at her phone, since her eyes weren't pointed towards it. She once laughed to me about how someone had asked her 'Why are you sniffing your phone?', while she held it up to her face to read a text message.

    My impression is that learning to rely on your peripheral vision in this way is extremely counter-intuitive and difficult to do. (This may have something to do with the mechanisms discussed in TFA.) I wonder if it can even be done at all without first obscuring one's central vision (which I guess you could do artificially with contacts). But evidently it can be done.

    • GuB-42 8 hours ago

      It makes me think of a common trick in stargazing: looking at faint stars with peripheral vision. Fixating on them directly will make them disappear.

      That's because the fovea, the "high resolution" part of the retina we use when fixating on a point, is made almost entirely of cone cells, which give you color vision but are less light-sensitive than the rod cells that make up most of the peripheral retina. In other words, peripheral vision is better in low-light situations.

      • axwdev 7 hours ago

        I discovered this growing up when lying in bed at night. I'd always feel like there was light coming under the door. Then I'd look directly at it and it was pitch black to me. I'd look to the side and the light would seemingly come back. Only years later did I discover the cause you described.

        • fluoridation 3 hours ago

          I discovered it similarly on a camping trip when I was a kid. We were walking around in near pitch darkness, and out of the corner of my eye I saw someone walk away from the group and try to hide behind a table or something. When I turned to look I couldn't see anything at all, but if I averted my eyes I could clearly see there was a person moving around, crouched behind a table. I can't remember exactly, but I think I wasn't so much seeing them as seeing their movement, and mentally perceiving their silhouette.

          I guess it looks sort of like this: https://www.youtube.com/watch?v=M83sF7_fYdM

    • Daub an hour ago

      > Anyhow, in my late aunt's case, her central vision was useless long before her peripheral vision could give out. So she learned, somehow, to do things like read her smartphone using only her peripheral vision.

      All I can say is wow.

    • fluoridation 8 hours ago

      I guess what the article means is that it's not possible to read with peripheral vision except with very large font size and up close. Was your aunt able to read normal print besides titles? Also, I just tried out of curiosity. I must imagine only her fovea was unusable, and the area just around it (I think it's like 5° off-axis) was still fine, because I think I have pretty good peripheral vision, and I can't make out anything if I intentionally look away from what I'm trying to read. Certainly not on a phone; I have to get it so close to my face that it goes out of focus.

      By the way, while it is true that cones are far more densely packed in the fovea, their function is color vision, not central vision. Rods are responsible for motion perception and vision in low-light environments.

      • pxc 8 hours ago

        > except with very large font size and up close

        Yeah, it's hard to say what the determinative factors were, because her general visual acuity was so low that she needed very large fonts very up close anyway. My mom, my sister, and I also have this condition and rely on magnification to varying degrees, and for various reasons: sometimes it's truly about acuity, but sometimes larger sizes and bolder fonts are a clumsy way to make up for contrast issues. Even those of us with usable central vision generally need large fonts anyway. We're also all, for reasons I think are mostly incidental, naturally myopic (although my mom is now farsighted rather than nearsighted, with low acuity; she's legally blind).

        > Was your aunt able to read normal print besides titles?

        No. Even titles, like titles of chapters in a paperback, she could likely only read with magnification. And that's if she could get enough contrast at all. At some point, screens become much easier to read than paper, even at equal sizes.

        I should learn more about the precise anatomy because it's interesting, but currently I know more about the subjectivity of it than the mechanics.

        > By the way, while it is true that cones are far more densely packed in the fovea, their function is color vision, not central vision. Rods are responsible for motion perception and vision in low-light environments.

        There may be other factors in these inherited retinal dystrophies that shape how their progressions affect the field of view, idk. But what I said about which areas are distorted first in cone-rod dystrophies is true, and the reverse is true for rod-cone dystrophies (i.e., retinitis pigmentosa), where people lose their peripheral vision first and their field of view shrinks from the outside in. My assumption so far has been that this corresponds to the density differences you mentioned.

        > Rods are responsible for motion perception and vision in low-light environments.

        I do know that much. :)

        One of the features of this illness that's very prevalent for me and my sister right now is extreme light sensitivity, presumably because our eyes rely increasingly on their rods even during the daytime and even in bright environments. One related thing I've written about on HN before is how the need for lower total light emission pushes both of us to high-contrast dark themes with the lowest brightness possible. OLED screens are really nice when your rods are in better shape than your cones!

        One I haven't mentioned yet is that my colorblindness has been getting worse over time. The last time I took a colorblindness test (administered by a medical professional, at my retinal specialist's office), I could hardly read any of the Ishihara plates at all. (When the doctor came in, he asked me if I only saw in black-and-white, which I found mildly irksome but very amusing. I laughed about it with my family afterwards. I do still see many colors! I just have trouble distinguishing a lot of them, too.) My sister, who was not colorblind at all when she was growing up, is now also colorblind, about as much as me based on her tests.

  • Daub an hour ago

    Is this research not stating the obvious?

    As a painter, I would differentiate background from foreground by decreasing the background's local contrast in saturation, lightness and hue, softening its edges, decreasing contrast between neighboring regions, and lessening its average color values relative to the foreground. Painters have been doing this for hundreds of years.

  • Rugu16 11 hours ago

    Good article!

    “Therefore, the detailed visual information you’re getting is from the car in front of you, but the information of interest is outside of your focus.”

    This must be one of the reasons you get so fatigued and exhausted during rush-hour traffic: there's so much visual information to process.

    • cal85 11 hours ago

      Why does the effect seem to be reversed when out in nature? When I walk in the woods, the visual complexity is arguably much higher than it ever gets in cities, even on a busy highway. But the mental effect seems to be rejuvenating.

      • oscillonoscope 10 hours ago

        It's partly because you're not paying attention. Next time you're out in the woods, try to still hunt for a while. It's a hunting method where you move extremely slowly throughout the woods from cover to cover while watching for animals. You'll find that it takes a lot of mental focus to maintain that level of vigilance.

      • jvanderbot 11 hours ago

        I think the fractal patterns match our million-year-old brains' expectations. "Stuff" in the article means "stuff I need to focus on", which is everywhere in traffic but mostly just in front of you while hiking. In general, focusing 100 yards away is easier on the eyes, and a good walk helps everyone feel better. But this is mostly off-topic opinion.

      • ycombinete 10 hours ago

        That depends on the nature of the nature. Walking through the African veld is also tiring, because you’re constantly processing threat signals.

      • Rugu16 6 hours ago

        I feel the opposite: in nature I find the complexity much lower. Things are not really moving; they are static. Colors also stay within a similar range. You can be very much in passive mode and enjoy the scenery, versus actively trying to process it.

      • _DeadFred_ 5 hours ago

        According to my Claude chat -

        This reality might be like a quantum observation field - cities are full of conscious minds actively observing/measuring/collapsing probability states. Like millions of wave function collapses happening constantly. Nature lets those quantum states breathe, maintaining possibility spaces longer.

        Some people thrive in that urban collapse-field - they want that constant measurement and definition. Others need more quantum coherence time, seeking out spaces where consciousness can maintain superposition longer. It's not about visual complexity or stimulation, but about how much conscious observation is forcing reality into defined states.

        Cities vs nature isn't just about peace or chaos - it's about the density of consciousness collapse. Like the difference between metal (constant forced collapse) and ambient music (sustained possibility states).

    • spike021 10 hours ago

      One of the ways to load balance the visual information is to scan around the car and briefly look at other things.

      Checking mirrors often, looking outside your side window, etc.

      Whenever I do those things it helps refresh me quite a bit.

    • cushpush 11 hours ago

      I noticed intense fatigue wandering Tokyo with my friend who wore magnifying lenses and he could not understand why I was tired. I said it's all the visual stimulation, the signs, the lights, the billboards. I think he was at an advantage with the eyewear, in retrospect.

      • igornadj 2 hours ago

        I found the opposite personally. Something about the neatness and tidiness let my mind relax and see everything similar to a calm flowing stream. Tokyo is one of the most peaceful cities I've been to, even in the busy areas, and by far the biggest and most populated.

      • nox101 8 hours ago

        Isn't this the counterexample? Japanese ads, magazines, documents, and websites are often super visually cluttered. Seems counter to the paper to me.

        If this clutter has negative effects, why has Japanese design settled on it?

        • famahar 6 hours ago

          This may just be one small point, but I recall reading that visual clutter signifies a good bargain, while lots of white space gives the impression of luxury. Most consumers want a good deal.

        • Rugu16 6 hours ago

          It's the same for Times Square, yet people still pay big dollars. Both things can be true: it has a negative effect, but it is still engaging and effective. It gets the eyeballs.

  • aspenmayer 3 hours ago

    Even more compelling reasons to block ads online and in public.

  • cushpush 11 hours ago

    Tangential: staring at a computer screen while having a phone call is distracting. Recommend looking at not-a-screen while talking on-the-phone :)

    • cj 10 hours ago

      I have an (IMO bad) habit of looking away from my computer screen (at visual nothingness) when having concentrated discussions over video calls.

      For whatever reason it’s just easier to talk when staring out the window at a tree than staring at a face on a screen. I call it a bad habit because it results in accidentally ignoring the body language of the person on the other end.

      100% agree on phone calls. I typically just slowly pace around my house when on a phone call without video.

      • 6510 9 hours ago

        Strange, for me it is obvious how it works.

        It goes for locations and activities too, but mostly: if I look at something, it locks and unlocks memories, and the thing I'm looking at also becomes part of the active memory.

        You have a bunch of stuff hashed against the tree, or against a dead gaze, or you don't want the person to be part of the thought process.

        I forgot the code for the warehouse at a previous job. Typing the wrong one locks the place down. I somewhat panicked but went there anyway, got distracted by something, and typed the code without even thinking about it. I could also recall it after walking inside. Got some coffee and it was gone again. I'd been typing that code for years but had never realized I only remember it when looking at the door.

        • cj 8 hours ago

          As I reflect and re-read my comment, I think for me it’s simply that I’m overloaded.

          Looking at a face while talking, versus a tree, just increases the mental energy the call requires. When you’re chronically exhausted, you start cutting out the little things that seemingly don’t matter (like someone’s body language or facial expression during a call).

          Definitely a tangent here. Love the warehouse example though. Similarly, I can’t for the life of me recite my iPhone, Apple Watch, or home security alarm PIN codes. It’s just pure muscle memory at this point. When I try to recall the PIN codes in my head, my mind immediately tries to recall the numbers by visualizing my phone (or alarm keypad) and attempting to remember the movement of my fingers in order to deduce what the numbers are.

  • hbarka 12 hours ago

    Isn’t this really just signal-to-noise ratio? The information flow can be any input, be it visual, auditory, tactile, etc. “Clutter” is not a neutral word and leads the reader.

    • cushpush 11 hours ago

      Aren't you suggesting that we use the ear term for the eye term instead of the eye term for the ear term? ;)

      • Nevermark 9 hours ago

        Signal-to-noise is often used as a modality independent term.

        Someone might reference signal-to-noise issues with modern news sources, body language, or a data set of any particular origin.

    • ben_w 11 hours ago

      "Visual clutter" is an existing term that's been in use for a while now: https://books.google.com/ngrams/graph?content=%22visual+clut...

    • rootusrootus 11 hours ago

      That seems intuitive. E.g., it is pretty common for people to turn down the radio in their car when they are looking carefully for an address.

  • GuB-42 8 hours ago

    What counts as a "visual stimulus" is unclear to me. Does it mean a static object, as opposed to a blank surface, or something that moves? And if it's something that moves, does it matter whether it moves predictably (e.g. a rotating fan) or unpredictably (e.g. a notification)?

    It is kind of obvious that stimuli like notifications are disruptive, but less so in the other cases.

  • furyofantares 7 hours ago

    I always looked at my feet while walking as a kid. People sometimes thought I was depressed, but I was just always thinking, and reducing visual noise in service of that.

    • igornadj 2 hours ago

      I wonder if it's the same as looking up and to the left when thinking?

  • 65 13 hours ago

    It's also interesting how your brain processes information in the corner of your eye. It seems to only process the most basic information - not even color.

    If I take a red Coke can and place it in the corner of my eye, I can't even tell what color it is. I can tell there's an object there, but the color does not come through until I move it slightly more into my direct line of sight.

    • CoastalCoder 12 hours ago

      I was about to post a link about the fact that our retinas themselves can't sense colors in our peripheral vision.

      But apparently that's been debunked!

      https://www.aop.org.uk/ot/science-and-vision/research/2015/1...

      • jangxx 12 hours ago

        Back in high school a teacher told us this "fact" as well, and I remember being very surprised because it did not match my experience at all. I've tested the theory many times since, e.g. when waiting at a pedestrian traffic light: I look off to the side so that the light is at the very edge of my peripheral vision and see if I can perceive it turn green, and I always can. Of course this is not proof that there are no people who can't do this, but I definitely know that I can see color at the edge of my peripheral vision, and I've come to assume that this just varies from person to person.

  • sorokod 11 hours ago

    Some Figma boards end up looking like someone vomited a pile of confetti. A prime example of visual clutter for me.

  • suyash 4 hours ago

    No wonder Steve Jobs was so obsessed with focus: he not only loved a clean interface, but also wore the same type of clothes and lived a minimal lifestyle.

  • DidYaWipe an hour ago

    Another indictment of "transparent" UI (as if we even needed one).

  • krunck 6 hours ago

    Take Amazon's web site for instance....

  • OgsyedIE 13 hours ago

    I figure the OP link and adversarial images (in ML) are special cases of a more general class of tomography problems.

  • mrkeen 9 hours ago

    Interesting! I dragged this icon onto my desktop to read later.

    Oh no!

  • jakupovic 3 hours ago

    Not sure if this counts, but whenever I'm starting a new project or need to figure out something involved, I always start by cleaning up the shop and all the surfaces. I'd always wondered why I did that.

  • mattgreenrocks 7 hours ago

    I've been yelled at one too many times for being bad at finding things amidst clutter.

    I try my best, and it’s like…I just can’t see it without a lot of effort and more time than you’d think.

  • syngrog66 10 hours ago

    I've learned it's best to have everyone's camera off in a VoIP call. For many reasons, one being that if I see faces on screen live, it distracts my brain internals enough to impair my effective IQ in the moment and adds a kind of biological lag. I don't need any fancy academic studies to "prove" it, because I've directly lived through hundreds of A-vs-B cases over many years.

    You lose some things without live faces/cameras, but also gain a lot.

    Blog post of mine on it:

    https://synystron.substack.com/p/video-calls-good-and-bad

  • lupire 13 hours ago

    This is a minor technical detail of a well-known, obvious fact. Research, but not news.

    • botanical76 13 hours ago

      I wonder if this is relevant to tech workers. Does a second monitor, if used not as a productivity boost but as a neutral convenience, hinder your ability to focus on tasks? Should we turn it off to properly focus?

      I have suspected this might be the case for a while, but I'm not aware that it is obvious.

      • ehnto 11 hours ago

        I went from two screens back to one with a tiling window manager.

        What I found was that each tab/window/screen swap was a chance to lose focus, and in practice I do lose focus a lot when switching between screens.

        I don't have the same issue with tiles, and I can also set up workspaces to act a bit like focus rooms. I do a lot of little things to retain focus; another is removing the mouse from my workflow as much as possible, since using it feels a bit like taking your hands off the wheel to fiddle with the car stereo. You switch modes, and that can disrupt focus.

      • lioeters 13 hours ago

        I don't know about other people, but I seem to be able to focus more with visual and auditory noise. Something to do with "stochastic resonance"?

        A quick search yields:

        > Beneficial effects of noise on higher cognition have recently attracted attention. Hypothesizing an involvement of the mesolimbic dopamine system and its functional interactions with cortical areas, the current study aimed to demonstrate a facilitation of dopamine-dependent attentional and mnemonic functions by externally applying white noise in five behavioral experiments.

        > ..These results suggest that white noise has no general effect on cognitive functions.

        Differential effects of white noise in cognitive and perceptual tasks https://pmc.ncbi.nlm.nih.gov/articles/PMC4630540/

        > Perceptual decision-making relies on the gradual accumulation of noisy sensory evidence. It is often assumed that such decisions are degraded by adding noise to a stimulus, or to the neural systems involved in the decision making process itself. But it has been suggested that adding an optimal amount of noise can, under appropriate conditions, enhance the quality of subthreshold signals in nonlinear systems, a phenomenon known as stochastic resonance.

        Stochastic resonance enhances the rate of evidence accumulation during combined brain stimulation and perceptual decision-making https://pmc.ncbi.nlm.nih.gov/articles/PMC6066257/
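
        For intuition, here's a minimal Python sketch of the textbook stochastic-resonance setup (not taken from either paper; the signal, threshold, and noise levels are made-up parameters): a sub-threshold sine wave is fed through a hard threshold detector, and an intermediate amount of added noise makes the hidden signal easiest to recover.

          import numpy as np

          rng = np.random.default_rng(0)
          t = np.linspace(0.0, 10.0, 5000)
          signal = 0.4 * np.sin(2 * np.pi * 1.0 * t)   # amplitude 0.4, below the 1.0 threshold
          THRESHOLD = 1.0

          def output_correlation(noise_std):
              """Threshold the noisy signal into 0/1 'spikes' and correlate with the clean signal."""
              noisy = signal + rng.normal(0.0, noise_std, size=t.shape)
              spikes = (noisy > THRESHOLD).astype(float)
              if spikes.std() == 0.0:   # threshold never crossed: output carries no information
                  return 0.0
              return float(np.corrcoef(spikes, signal)[0, 1])

          for sigma in (0.05, 0.2, 0.5, 1.0, 2.0, 5.0):
              print(f"noise std {sigma:4.2f} -> output/signal correlation {output_correlation(sigma):.3f}")

        The correlation is near zero for very small noise (the threshold never fires), peaks at a moderate noise level, and falls off again as noise swamps the sub-threshold signal, which is the basic shape of the effect the second paper describes.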

      • mega_dean 11 hours ago

        I've noticed that I get very distracted by motion in my peripheral vision, but not by static images. So I can use a second monitor for things like documentation, but not something like Slack that shows new messages, animated gifs/emojis, etc.

        I've also configured my text editor to be very "static": when I type, the only things that can happen are that the cursor moves or text is inserted. I have to manually trigger things like the autocompletion popup, LSP checks, or highlighting the symbol under the cursor.

        • botanical76 10 hours ago

          This reminds me of some of the early linting tools that emerged around the time ES5 / node.js was blowing up. I found the defaults insanely distracting, giving me warnings about unused variables etc. while I was still typing the code (of course it's unused, I just defined it!).

          GitHub Copilot is similar, defaulting to provide suggestions to finish your LOC whenever you stop typing. While the AI tools can be very useful, the benefit is lost if I can't focus on what I'm writing.

        • alpaca128 9 hours ago

          That's one of the reasons I use Vim. It does absolutely nothing without an input and has nothing unnecessary on the screen. Unfortunately pretty much all modern editors and IDEs aim for the exact opposite, the last time I tried VS Code it even had a button floating above the code for some git related stuff.

          So far I haven't seen a piece of software that tries to do everything under the sun while also being enjoyable to use.

  • m3kw9 12 hours ago

    That's the opposite of what Einstein said.

  • Seattle3503 13 hours ago

    Sending this to my partner next time I ask them to clean up.