• Nibodhika@lemmy.world · +28/−1 · 4 days ago

    One thing I find very interesting about how brains process reality: there are diseases that leave blind spots in your visual field. But people with them often don’t notice the spots, because the brain fills in the gaps with the information it expects to be there. So you could see a door closed just as it was when you last looked at it directly, even though someone has since opened it; you’d keep seeing it closed until you looked at it directly again.

    • Lemminary@lemmy.world · +13 · 4 days ago

      We all have blind spots because there’s a hole in the retina at the back of each eye where the optic nerve passes through. The spot sits toward the outer side of each eye’s field of view, and you can find yours with some simple visual tests online.

      • Eiri@lemmy.ca · +2 · 4 days ago

        Another fun thing you can do is look at the sky (not the sun!) on a sunny day and start seeing your own blood circulation (the blue field entoptic phenomenon) and your blind spot.

  • spittingimage@lemmy.world · +10 · edited 4 days ago

    Here’s an interesting related fact: your eyes are constantly making rapid jumps called saccades (plus even tinier movements like microsaccades). During those jumps the brain suppresses visual input, a trick called saccadic masking, so your actual view of the world arrives in stuttering fits and starts. You don’t notice because your brain fills in what you think you’re seeing during the gaps. It’s good enough not to get you weeded out of the gene pool.

    • Etterra@lemmy.world · +5 · 4 days ago

      Yep, your brain makes up an approximation of reality at best. It’s the weirdest fucking thing.

  • TachyonTele@lemm.ee · +48/−8 · edited 5 days ago

    Basically, yes. Our eyes capture the light that goes into them at 24 frames per second (please correct me if I goofed on that) and the image is upside down.

    Our brains turn those images upright, and they also fill in the blanks, essentially guessing what’s going on between glances. The brain is highly adept at pattern recognition and estimation.

    My favorite example of this is our nose. Look at your nose. You can look down and see it a little, and you can close one eye and see more of it. It’s right there in the bottom center of your view, yet you don’t notice it at all day to day.

    That’s because it’s always there, and your brain filters it out. The pattern of the nose being there doesn’t change, so your brain just ignores it unless you intentionally look for it. You can extrapolate that to everything else: most things the brain expects to see, and does see through our eyes, are kind of ignored. They’re there, but not as important as, say, anything that’s moving.

    Also, and this is fun to think about, we don’t even see everything. The electromagnetic spectrum is far wider than the slice our eyes can detect, and some animals, sea life, and insects can see much, much more of it than we can.

    But to answer more directly: you’re right, the brain does crazy heavy lifting for all of our senses, not just sight. Our reality is confined to what our bodies can decipher from the world through our five senses.

    • calabast@lemm.ee · +55/−1 · edited 5 days ago

      We definitely are seeing things faster than 24 Hz, or we wouldn’t be able to tell a difference in refresh rates above that.

      Edit: I don’t think we have a digital, on-off refresh rate of our vision, so fps doesn’t exactly apply. Our brain does turn the ongoing stream of sensory data from our eyes into our vision “video”, but compared to digital screen refresh rates, we can definitely tell a difference between 24 and say 60 fps.

      • TachyonTele@lemm.ee · +27 · 5 days ago

        Yeah it’s not like frames from a projector. It’s a stream. But the brain skips parts that haven’t changed.

      • Ekky@sopuli.xyz · +7/−1 · 5 days ago

        I think I read that fighter pilots need to be able to identify a plane shown for a single frame at 300 fps, and that the theoretical limit of the eye is 1000+ fps.

        Though, whether the brain can manage to process the data at 1000+ fps is questionable.

        • Fester@lemm.ee · +4/−1 · 5 days ago

          I’m using part of this comment to inform my monitor purchases for the rest of my life.

        • nimpnin@sopuli.xyz · +2 · 4 days ago

          Both of these claims are kinda misguided. The brain can detect very short flashes of light (say, one thousandth of a second) and other major changes in light perception; an increase in light especially registers near-instantly. But since vision has no set frame rate, smaller changes in light perception (say, at 100 fps) won’t be registered as distinct events. And the brain actively papers over discontinuities, which is why even 12 fps animation reads as movement, if a bit choppy.

      • Caveman@lemmy.world · +1 · 4 days ago

        I would believe it if someone told me an individual rod or cone in the eye operates at something like 24 fps, but they’re most likely not synced up.

    • Reyali@lemm.ee · +5 · 4 days ago

      If you want a fun experiment of all the things we see but don’t actually process, I recommend the game series I’m On Observation Duty. You flip through a series of security cameras and identify when something changed. It’s incredible when you realize the entire floor of a room changed or a giant thing went missing, and you just tuned it out because your brain never felt a need to take in that detail.

      It’s sorta horror genre and I hate pretty much every other horror thing, but I love those games because they make me think about how I think.

  • zarathustra0@lemmy.world · +13 · 5 days ago

    Your brain is constantly processing the inputs from all of your senses and pretty much ignoring them if they fit with what it is already expecting.

    Your brain is lazy. If everything fits with what it expects, then you believe that what you are seeing is reality and you generally ignore it.

    Generally the mind only focuses on what it believes is salient/interesting/unexpected.

    • Eheran@lemmy.world · +2 · 5 days ago

      Imagine if we had image sensors that could filter like that. Boom, video 100 times smaller in size. “Autonomous” surveillance cameras running on fractions of the power. Etc. Etc. Just far more efficient.

      • Passerby6497@lemmy.world · +4/−1 · 5 days ago

        That’s actually how a lot of video codecs work: they insert a key frame every so often that contains the full image, and the frames in between only encode diffs until the next key frame.
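        A toy sketch of that key-frame-plus-diffs idea (hypothetical frames modeled as dicts of pixel → value; real codecs work on macroblocks and motion vectors, this only shows the shape of it):

```python
# Toy inter-frame compression: periodic key frames, deltas in between.
# "Frames" here are hypothetical dicts mapping pixel -> value.

KEYFRAME_INTERVAL = 4  # assumed interval; real codecs tune this per stream

def encode(frames):
    """Emit ("key", full_frame) periodically, ("delta", changes) otherwise."""
    stream, prev = [], None
    for i, frame in enumerate(frames):
        if prev is None or i % KEYFRAME_INTERVAL == 0:
            stream.append(("key", dict(frame)))  # full image
        else:
            # only the pixels that differ from the previous frame
            diff = {p: v for p, v in frame.items() if prev.get(p) != v}
            stream.append(("delta", diff))
        prev = frame
    return stream

def decode(stream):
    """Rebuild frames by applying each diff on top of the running picture."""
    frames, current = [], {}
    for kind, data in stream:
        if kind == "key":
            current = dict(data)
        else:
            current.update(data)
        frames.append(dict(current))
    return frames
```

        With a mostly static scene the deltas are nearly empty, which is where the size savings come from.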

      • Beacon@fedia.io · +3/−1 · 5 days ago

        We do have those things. That’s how many technologies already work.

        • I_Miss_Daniel@lemmy.world · +6 · 5 days ago

          Yes. We get hints of this now and then when digital TV breaks up and only the moving parts are updating until the next key frame arrives.

  • deafboy@lemmy.world · +4/−1 · 4 days ago

    https://omny.fm/shows/inner-cosmos-with-david-eagleman/ep78-does-your-brain-have-one-model-of-the-world-o

    Why do you see a unified image when you open your eyes, even though each part of your visual cortex has access to only a small part of the world? What is special about the wrinkled outer layer of the brain, and what does that have to do with the way that you explore and come to understand the world? Are there new theories of how the brain operates? And in what ways is it doing something very different than current AI? Join Eagleman with guest Jeff Hawkins, theoretician and author of “A Thousand Brains” to dive into Hawkins’ theory of many models running in the brain at once.

    • MonkderVierte@lemmy.ml · +5 · 4 days ago

      And in what ways is it doing something very different than current AI?

      That’s equivalent to asking “What differentiates a screw from a car?”

  • lath@lemmy.world · +5 · 5 days ago

    I heard a similar thing. But a bit more complicated. It wouldn’t be just the eyes, but all senses used by the brain to edit a filtered vision of reality.

    And while the eyes take in everything they’re capable of, the brain focuses only on what it considers important. Its judgment is far from perfect, though, given the many, many times one searches for something sitting right within their cone of vision yet is unable to see it.

    So while I’m not sure of the details, the brain can be thought of as choosy with what it shows.

  • BarqsHasBite@lemmy.world · +4 · 5 days ago

    I mean, all animals have brains that render reality: visual, audio, predator awareness. It’s not that special; most animals manage it with tiny little brains.