
Vision and Hearing Psychology: An Overview

Among the five human senses, vision and hearing are considered the most sophisticated due to their substantial representation within the cortical regions of the central nervous system (CNS). Our eyes take in sensory information that helps us understand the world around us.

The Visual System

The visual system constructs a mental representation of the world around us. This contributes to our ability to successfully navigate through physical space and interact with important individuals and objects in our environments. This section will provide an overview of the basic anatomy and function of the visual system.

The eye is the major sensory organ involved in vision. Light waves are transmitted across the cornea and enter the eye through the pupil. The cornea is the transparent covering over the eye. It serves as a barrier between the inner eye and the outside world, and it is involved in focusing light waves that enter the eye.

The pupil is the small opening in the eye through which light passes, and the size of the pupil can change as a function of light levels as well as emotional arousal. When light levels are low, the pupil will become dilated, or expanded, to allow more light to enter the eye. When light levels are high, the pupil will constrict, or become smaller, to reduce the amount of light that enters the eye.

After passing through the pupil, light crosses the lens, a curved, transparent structure that serves to provide additional focus. The lens is attached to muscles that can change its shape to aid in focusing light that is reflected from near or far objects. In a normal-sighted individual, the lens will focus images perfectly on a small indentation in the back of the eye known as the fovea, which is part of the retina, the light-sensitive lining of the eye.

The fovea contains densely packed specialized photoreceptor cells. These photoreceptor cells, known as cones, are light-detecting cells. The cones are specialized types of photoreceptors that work best in bright light conditions. Cones are very sensitive to acute detail and provide tremendous spatial resolution. While cones are concentrated in the fovea, where images tend to be focused, rods, another type of photoreceptor, are located throughout the remainder of the retina.

We have all experienced the different sensitivities of rods and cones when making the transition from a brightly lit environment to a dimly lit environment. Imagine going to see a blockbuster movie on a clear summer day. As you walk from the brightly lit lobby into the dark theater, you notice that you immediately have difficulty seeing much of anything. After a few minutes, you begin to adjust to the darkness and can see the interior of the theater. In the bright environment, your vision was dominated primarily by cone activity. As you move to the dark environment, rod activity dominates, but there is a delay in transitioning between the phases.

Rods and cones are connected (via several interneurons) to retinal ganglion cells. Axons from the retinal ganglion cells converge and exit through the back of the eye to form the optic nerve, which carries visual information from the retina to the brain. Because the point where the optic nerve exits the retina contains no photoreceptors, it creates a blind spot in the visual field: even when light from a small object is focused on the blind spot, we do not see it.

We are not consciously aware of our blind spots for two reasons: first, each eye gets a slightly different view of the visual field, so the blind spots do not overlap; second, our visual system fills in the missing portion of the scene. The optic nerve from each eye merges just below the brain at a point called the optic chiasm, an X-shaped structure that sits just below the cerebral cortex at the front of the brain. Once inside the brain, visual information is sent via a number of structures to the occipital lobe at the back of the brain for processing.

Visual information is processed in parallel pathways, generally described as the “what pathway” and the “where/how pathway.” The “what pathway” is involved in object recognition and identification, while the “where/how pathway” is involved with location in space and with how one might interact with a particular visual stimulus (Milner & Goodale, 2008; Ungerleider & Haxby, 1994).

We do not see the world in black and white; neither do we see it as two-dimensional (2-D) or flat (just height and width, no depth). Normal-sighted individuals have three different types of cones that mediate color vision. Each of these cone types is maximally sensitive to a slightly different wavelength of light. According to the trichromatic theory of color vision, all colors in the spectrum can be produced by combining red, green, and blue.

Trichromatic Theory

This figure illustrates the different sensitivities of the three cone types found in a normal-sighted individual.

The trichromatic theory is not the only theory of color vision: another major theory is known as the opponent-process theory. According to this theory, color is coded in opponent pairs: black-white, yellow-blue, and green-red. The basic idea is that some cells of the visual system are excited by one of the opponent colors and inhibited by the other. So a cell that is excited by wavelengths associated with green would be inhibited by wavelengths associated with red, and vice versa. One implication of opponent processing is that we do not experience greenish-reds or yellowish-blues as colors.
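The excitation/inhibition logic of opponent coding can be sketched as a toy computation. This is an illustrative model only: the channel formulas below are simplified assumptions, not the actual physiology.

```python
# Toy sketch of opponent-process coding (illustrative only, not a
# physiological model): three cone-like inputs are recombined into
# three opponent channels.

def opponent_channels(r, g, b):
    """Map toy cone responses (each 0..1) to opponent-channel signals."""
    red_green = r - g              # > 0 pulls toward red, < 0 toward green
    blue_yellow = b - (r + g) / 2  # "yellow" approximated as the r+g average
    black_white = (r + g + b) / 3  # overall lightness channel
    return red_green, blue_yellow, black_white

# A pure green input drives the red-green channel fully toward green:
rg, by, bw = opponent_channels(0.0, 1.0, 0.0)
print(rg)  # -1.0: the same channel can never signal red and green at once,
           # which is why no color looks "greenish-red"
```

Because red and green pull a single channel in opposite directions, no input can drive it both ways at once, which mirrors the theory's claim that greenish-red is not a possible percept.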

Another implication of opponent processing is the experience of negative afterimages. An afterimage is the continuation of a visual sensation after removal of the stimulus. For example, when you stare briefly at the sun and then look away, you may still perceive a spot of light although the stimulus (the sun) has been removed. When color is involved in the stimulus, the color pairings identified in the opponent-process theory lead to a negative afterimage.

But these two theories, the trichromatic theory of color vision and the opponent-process theory, are not mutually exclusive. Research has shown that they simply apply to different levels of the nervous system. For visual processing on the retina, trichromatic theory applies: the cones are responsive to three different wavelengths that represent red, blue, and green. Beyond the retina, at the level of the retinal ganglion cells and onward, color is coded in the opponent pairs described by the opponent-process theory.

Our ability to perceive spatial relationships in three-dimensional (3-D) space is known as depth perception. Our world is three-dimensional, so it makes sense that our mental representation of the world has three-dimensional properties. We use a variety of cues in a visual scene to establish our sense of depth. Some of these are binocular cues, which means that they rely on the use of both eyes. One example of a binocular depth cue is binocular disparity, the slightly different view of the world that each of our eyes receives.

To experience this slightly different view, do this simple exercise: extend your arm fully, extend one finger, and focus on that finger. Now close your left eye without moving your head, then open your left eye and close your right eye, again without moving your head. Your finger will appear to jump against the background as you switch eyes, because each eye sees it from a slightly different position. A 3-D movie works on the same principle: the special glasses you wear allow the two slightly different images projected onto the screen to be seen separately by your left and your right eye.
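The size of binocular disparity can be sketched with simple geometry. The interpupillary distance and viewing distances below are illustrative assumptions, not measured values:

```python
import math

# Back-of-envelope sketch of binocular disparity. The numbers are
# illustrative assumptions (a typical interpupillary distance and
# rough viewing distances), not measured data.

IPD = 0.065  # assumed interpupillary distance in meters (~6.5 cm)

def vergence_angle_deg(distance_m):
    """Angle subtended at the eyes by a target straight ahead, in degrees."""
    return math.degrees(2 * math.atan(IPD / (2 * distance_m)))

# The disparity cue is the difference between these angles for objects at
# different depths, e.g. a fingertip at arm's length vs. a wall behind it:
finger = vergence_angle_deg(0.6)  # fingertip ~60 cm away
wall = vergence_angle_deg(3.0)    # wall ~3 m away
print(f"finger {finger:.2f} deg, wall {wall:.2f} deg, "
      f"disparity {finger - wall:.2f} deg")
```

The several-degree difference for near objects is why the finger "jumps" so visibly in the exercise, while distant objects barely shift between the two eyes' views.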

Although we rely on binocular cues to experience depth in our 3-D world, we can also perceive depth in 2-D arrays. Think about all the paintings and photographs you have seen. Generally, you pick up on depth in these images even though the visual stimulus is 2-D. When we do this, we are relying on a number of monocular cues, or cues that require only one eye. An example of a monocular cue would be what is known as linear perspective.

Linear perspective refers to the fact that we perceive depth when we see two parallel lines that seem to converge in an image. In a two-dimensional figure, monocular cues such as the edges of a road converging as it narrows into the distance are enough for us to perceive depth.

Linear Perspective

Linear perspective in Giotto's "Entrance of Christ into Jerusalem"

Bruce Bridgeman was born with an extreme case of lazy eye that left him stereoblind, or unable to respond to binocular cues of depth. He relied heavily on monocular depth cues, but he never had a true appreciation of the 3-D nature of the world around him. One day, he and his wife went to see a movie that was shot in 3-D, and even though he thought it was a waste of money, Bruce paid for the 3-D glasses when he purchased his ticket. As soon as the film began, Bruce put on the glasses and experienced something completely new. For the first time in his life he appreciated the true depth of the world around him.

There are cells in the nervous system that respond to binocular depth cues. Normally, these cells require activation during early development in order to persist, so experts familiar with Bruce’s case (and others like his) assume that at some point in his development, Bruce must have experienced at least a fleeting moment of binocular vision. It was enough to ensure the survival of the cells in the visual system tuned to binocular cues.

Vision is not an encapsulated system. It interacts with and depends on other sensory modalities. For example, when you move your head in one direction, your eyes reflexively move in the opposite direction to compensate, allowing you to maintain your gaze on the object that you are looking at. This reflex is called the vestibulo-ocular reflex. It is achieved by integrating information from both the visual and the vestibular system (which knows about body motion and position).

You can experience this compensation quite simply. First, while keeping your head still and your gaze straight ahead, wave your finger in front of you from side to side; notice how the image of the finger appears blurry. Now hold your finger steady and look at it while you move your head from side to side; notice how your eyes reflexively move to compensate for the movement of your head and how the image of the finger stays sharp and stable.

Vision also interacts with your proprioceptive system, which tracks where your body parts are in space, and with your auditory system, which helps you understand the sounds people make when they speak. Finally, vision is often implicated in a blending-of-sensations phenomenon known as synesthesia, which occurs when one sensory signal gives rise to two or more sensations. The most common type is grapheme-color synesthesia: about 1 in 200 individuals experience a sensation of color associated with specific letters, numbers, or words. The number 1 might always be seen as red, the number 2 as orange, and so on.

The Auditory System

At first glance, one might think vision and hearing have nothing in common except that both organs are placed on the head, thus serving along with the nose to alleviate the monotony of an otherwise unrelieved spheroid. The differences between the two senses are numerous, yet the similarities between audition and vision are often striking. Both organs serve to collect and sort the stimuli impinging on the organism.

Consider the confusion in our perceptual world if our eyes retained, for much longer durations than they do, impressions of objects that had passed beyond our field of vision. Or, consider what our auditory world would be if the ear were so built that it resonated for several seconds to frequencies commonly found in human speech, instead of damping almost immediately as it does within the entire audible range.

There are many correspondences in the ways these two organs have developed for maximum biological efficiency. Both have developed the capacity to respond to stimuli so weak that the limiting value is imposed almost as much by the nature of the physical stimulus as by the nature of the end organ. They have also developed the capacity to respond to intensities so great that tissues of the body besides the receptor surfaces are likewise threatened.

Many instructive parallels between vision and audition can be drawn. For example, the quantum theory of light has led to attempts to apply similar quantal thinking to audition. It is significant that neither organ reports zero sensation when a stimulus intensity of zero is applied. The eye does not report “black” to the brain in the absence of an external stimulus, but an idioretinal grey instead. Likewise, the ear manufactures its own slight sound level, which is easily demonstrated in an anechoic room.

The first sensation a listener notices in such a room is a change in barometric pressure, with other “noises” following: the beating of the pulse in the external auditory canal, then a very weak but steady sound of wide frequency spectrum caused by slight mandibular movements amplified and resonated in the ear canal, or perhaps by imbalances in the mechanical auditory system proper. It is customary to regard the eye as far superior to the ear in terms of the physical energy transduced. Radiant energy, one would think, is on a different, less powerful level than air waves, which may be so strong as to be sensed by the skin.

But the eardrum has to move in and out only about one-tenth the diameter of a hydrogen molecule (10⁻⁸ mm) before a sound is heard, and the basilar membrane then only has to undergo an excursion of about one-tenth of that at the eardrum. The fact is that in terms of energy at threshold, in the spectral regions where the two organs are most efficient, the eye and ear are roughly similar: vision requires 2.2 to 5.7 × 10⁻¹⁰ ergs, and audition requires about 1 × 10⁻⁹ ergs per second. These values are approximately the weight of a mosquito’s wing, dead or alive.
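The threshold figures above can be compared directly; a quick arithmetic sketch using only the values quoted in the text:

```python
# Quick check of the threshold figures quoted above: how "roughly similar"
# are the eye and the ear? (Values are the order-of-magnitude estimates
# from the text, in ergs / ergs per second.)

vision_low, vision_high = 2.2e-10, 5.7e-10
audition = 1.0e-9

ratio_low = audition / vision_high   # ~1.8x
ratio_high = audition / vision_low   # ~4.5x
print(f"audition threshold is {ratio_low:.1f}x to {ratio_high:.1f}x "
      f"the visual threshold")
```

A factor of roughly 2 to 5 is small on a sensory scale where intensities span many orders of magnitude, which is what "roughly similar" means here.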

What is interesting is that in both organs, sensitivity is almost at its theoretical limit. For the eye, in the scotopic region, the amount of energy needed to stimulate is very close to the magnitude of a single radiation quantum (Hecht). Any further increase in sensitivity would be biologically useless. If the ear had a marked increase in sensitivity, the likely result would be a more noticeable intra-aural noise, together with the masking produced by that noise; ears with unusually great sensitivity (say, 15 dB better than “normal”) may already experience such a condition.

Fifteen dB can be interpreted as a sound-pressure (amplitude) ratio of a little more than 1:5, meaning that the average listener can only just detect a sound whose pressure is more than five times that of the faintest sound a particularly sensitive listener can detect. A 2013 study comparing sensitivity and audibility functions across vision and hearing suggested that human adults show a relationship between threshold levels of hearing and vision, at least in terms of the overall stimulus area defined by the contrast sensitivity (CS) and audibility functions.
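As a sanity check on that figure, the standard decibel definitions convert directly to ratios; a minimal sketch using only the textbook formulas (no data from the studies above):

```python
# Decibels are defined on a power/energy scale as dB = 10*log10(P1/P0),
# and on an amplitude/pressure scale as dB = 20*log10(A1/A0).

def db_to_power_ratio(db):
    """Power (energy) ratio corresponding to a decibel value."""
    return 10 ** (db / 10)

def db_to_amplitude_ratio(db):
    """Amplitude (sound-pressure) ratio corresponding to a decibel value."""
    return 10 ** (db / 20)

print(db_to_amplitude_ratio(15))  # ~5.62: the "little more than 1:5" figure
print(db_to_power_ratio(15))      # ~31.6: the corresponding energy ratio
```

Note that the "little more than 1:5" figure only falls out of the amplitude definition; on the power scale, 15 dB corresponds to a ratio of about 1:32.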

Harris made other comparisons between vision and audition, including wavelength, energy integration, growth and decay of sensation, bilateral interaction, peripheral and central explanations of acuity, the quantum theory of threshold and of discrimination, and intersensory facilitation.

The Interplay Between Vision and Hearing

Did you know that where you are looking may influence how well you hear? A recent study published in Scientific Reports looked at the effect of gaze direction on hearing, with some interesting results. They found that the brain needs to work harder to hear when we are looking away from what we are listening to. This happened even when participants were put in a dark room and asked to either direct their gaze at a speaker in front of them or look away.

When participants looked away, the researchers found that reaction times were slower and brain activity was higher (the brain working harder to listen for the sound). This suggests that our eyes and ears work together more than we may realize: looking at what you are listening to helps you hear it better. The brain appears to expect us to be looking at what we are listening to, and it has to work harder to resolve the misalignment when we are not looking at the sound source.

For everyday listening, this means it will be easier to follow a conversation if we are looking at the people we are speaking to rather than trying to listen when they are behind us or in another room. This is especially important for people with hearing loss, whose ability to follow conversation is already compromised. Your eyes and ears are also physically connected by the nerve pathways responsible for the vestibulo-ocular reflex (VOR), which helps keep you balanced.

For example, when you close your eyes, what do you see? What passes before your shut eyes likely depends on the sounds you hear. Research from the Institute of Neuroscience and Psychology at the University of Glasgow has measured how the brain’s visual cortex uses information gleaned from the ears, as well as the eyes, to see the world. Sounds help create visual imagery and mental expectations: if you are in a street and hear the sound of an approaching motorbike, you expect to see a motorbike come around the corner. The visual cortex uses information gleaned from the ears to better predict what might be seen.

Just as your ears and eyes work together to help you create a complete view of your world, they also work hand-in-hand to keep you balanced. Balance and equilibrium help us stay upright when standing and know where we are in relation to gravity. Our balance system, also known as the vestibular system, helps us walk, run, and move without falling. Balance is controlled through signals to the brain from your eyes, the inner ear, and the sensory systems of the body (such as the skin, muscles, and joints).

If the vestibular system is not functioning properly, your eye muscles cannot adjust as they should because the feedback signals from the ear are impaired, resulting in blurred vision, nausea, or dizziness. You cannot accurately determine where the floor is in relation to your body, and the risk of tripping increases greatly.

Human intuition tells us that our senses are all separate streams of information, but we now know that is not the case. Hearing actually enhances the sense of sight, according to a UCLA study, with both senses working together to help you perceive and participate in the world around you. Visually impaired older adults are more likely to also experience hearing loss, per a study published in the medical journal JAMA Ophthalmology. Vision and hearing loss also track with cognitive decline: research connects each condition with reduced mental functioning over time, and one study, referenced in a news article, found that participants with the most profound vision impairment had the lowest average scores on cognition tests.

Healthy eyes and ears, along with joints, muscles, and the brain, help keep you steady on your feet, reducing your risk of falling. It is obvious that seeing your best helps you stay upright, but many people do not realize that the inner ear also plays an important role in maintaining balance.

Vestibulo-ocular reflex and balance

Summary

Light waves cross the cornea and enter the eye at the pupil. The eye’s lens focuses this light so that the image is focused on a region of the retina known as the fovea. The fovea contains cones that possess high levels of visual acuity and operate best in bright light conditions. Rods are located throughout the retina and operate best under dim light conditions. Visual information leaves the eye via the optic nerve. Information from each visual field is sent to the opposite side of the brain at the optic chiasm.

Two theories explain color perception. The trichromatic theory asserts that three distinct cone groups are tuned to slightly different wavelengths of light, and it is the combination of activity across these cone types that results in our perception of all the colors we see. The opponent-process theory of color vision asserts that color is processed in opponent pairs and accounts for the interesting phenomenon of a negative afterimage.

It is customary to regard the eye as far superior to the ear in terms of physical energy transduced. Radiant energy, one would think, is on a different, less powerful, level than air waves, the latter of which may be so strong as to be sensed by the skin. In terms of energy at threshold, the eye and ear are roughly similar. Both organs operate close to their theoretical limits of sensitivity.

Sense      Energy required at threshold
Vision     2.2 to 5.7 × 10⁻¹⁰ ergs per second
Audition   about 1 × 10⁻⁹ ergs per second
