
Multisensory Perception: Examples and Interactions

Although it has been traditional to study the various senses independently, most of the time, perception operates in the context of information supplied by multiple sensory modalities at the same time. For example, imagine that you witnessed a car collision. You could describe the stimulus generated by this event by considering each of the senses independently; that is, as a set of unimodal stimuli. Your eyes would be stimulated with patterns of light energy bouncing off the cars involved. Your ears would be stimulated with patterns of acoustic energy emanating from the collision. Your nose might even be stimulated by the smell of burning rubber or gasoline. However, all of this information would be relevant to the same thing: your perception of the car collision. Indeed, unless someone were to explicitly ask you to describe your perception in unimodal terms, you would most likely experience the event as a unified bundle of sensations from multiple senses. In other words, your perception would be multimodal.

The way we receive the information from the world is called sensation while our interpretation of that information is called perception.

For the last few decades, perceptual research has pointed to the importance of multimodal perception: the effects on the perception of events and objects in the world that are observed when there is information from more than one sensory modality. Most of this research indicates that, at some point in perceptual processing, information from the various sensory modalities is integrated. Although neuroscientists tend to study very simple interactions between neurons, the fact that they’ve found so many crossmodal areas of the cortex seems to hint that the way we experience the world is fundamentally multimodal. Our intuitions about perception are consistent with this; it does not seem as though our perception of events is constrained to the perception of each sensory modality independently. It will probably require many more years of research before neuroscientists uncover all the details of the neural machinery involved in this unified experience. In the meantime, experimental psychologists have contributed to our understanding of multimodal perception through investigations of the behavioral effects associated with it.

These effects fall into two broad classes. The first class, multimodal phenomena, concerns the binding of inputs from multiple sensory modalities and the effects of this binding on perception. Multimodal phenomena concern stimuli that generate simultaneous (or nearly simultaneous) information in more than one sensory modality. As discussed above, speech is a classic example of this kind of stimulus. When an individual speaks, she generates sound waves that carry meaningful information. If the perceiver is also looking at the speaker, then that perceiver also has access to visual patterns that carry meaningful information. Even so, the visual speech pattern alone is not sufficient for very robust speech perception. Most people assume that deaf individuals are much better at lipreading than individuals with normal hearing. It may come as a surprise to learn, however, that some individuals with normal hearing are also remarkably good at lipreading (sometimes called “speechreading”). In fact, there is a wide range of speechreading ability in both normal hearing and deaf populations (Andersson, Lyxell, Rönnberg, & Spens, 2001).

Multisensory experiences come from our perception and evaluation of food and drink. These experiences involve multiple senses, including vision, audition, touch, taste, and smell.

Let's explore specific examples of how these senses interact in our perception of food.

Multisensory Perception of Food

The multisensory perception of food involves various sensory inputs. Visual cues are perceived while foodstuffs are still outside of the mouth, whereas gustatory and olfactory food cues are primarily perceived once we are actually consuming the food.

Multisensory interactions often take place between oral texture and both olfactory and gustatory cues (see also Bult et al. 2007; Christensen 1980a, 1980b; Hollowood et al. 2002).

Visual and Flavor Perception

Visual cues have a significant impact on our perception and evaluation of flavor. The color of food or drink items can modify flavor perception (e.g., DuBose et al. 1980; Johnson and Clydesdale 1982; Morrot et al. 2001; Oram et al. 1995; Roth et al. 1988; Stillman 1993; Wheatley 1973; Zampini et al. 2007). Lighting conditions, as demonstrated by Wheatley (1973), can also influence flavor perception and even induce nausea.

How Color Affects Taste Perception

For example, in one oft-cited study, DuBose et al. (1980) demonstrated that the perceived flavor of many drinks was significantly influenced by their color, especially when participants could not see the drinks' appropriate colors. A cherry-flavored beverage was more often identified as lime-flavored when it was colored green; a similar effect was reported for the lime-flavored beverage.

However, the effect of color intensity on perceived flavor intensity is rather less clear. Some studies have shown that adding color to solutions can affect the perceived sweetness (e.g., Alley and Alley 1998; Frank et al. 1986; Gifford et al. 1986; Strugnell 1997). For example, Pangborn (1960) found that green-colored pear nectar was rated as being less sweet than colorless pear nectar; however, Pangborn and Hansen (1963) failed to replicate this result.

Regardless of the appropriateness of the color-odor match, the perceived intensity of tastes and odors can be influenced when colors are added to the solutions. In some studies, the perceived sweetness of these solutions was affected (Johnson and Clydesdale 1982; Johnson et al. 1983).

The influence of color on flavor perception might be affected by the taster status of participants. For example, taster status can be determined by their sensitivity to 6-n-propylthiouracil (PROP; e.g., Bartoshuk et al. 1992) as well as to a variety of other tastants. PROP sensitivity is typically measured using paper strips (see Bartoshuk et al. 1994).

Supertasters' flavor identification responses were less influenced by visual cues than were those of non-tasters. This suggests that visual dominance may be reduced in supertasters.

Multisensory Integration

Learned Associations Between Colors and Flavors

Learned associations between specific colors and particular flavors also play a crucial role. Many of these associations are widespread because of common experiences with fruits (Maga 1974; see also Morrot et al. 2001), but they can also vary across cultures. For example, lemons are typically yellow in Europe, whereas in Colombia they are mostly dark green, so a color-flavor pairing that seems congruent in one country may seem incongruent to those who live elsewhere (cf. Demattè et al. 2010; Spence 2002; Wheatley 1973).

In a recent study, Levitan et al. (2008; see also Shankar et al. 2009) asked participants to assess the flavor of colored sugar-coated chocolate sweets, Smarties (Nestlé). The researchers found that color information can significantly influence people's perception of the flavor of food.

In another recent study, Shankar et al. (2010) examined the independent effects of color and label on multisensory flavor perception. They found that a dark green color was associated with a more “minty” flavor than the lighter green color alone. These findings highlight the complex interplay between color, expectations, and flavor perception.

Table 37.1: Flavors Most Frequently Associated with Each Colored Solution

Zampini et al. (2007, Experiment 1) conducted a study to determine the flavors most frequently associated with each colored solution. The table below summarizes their findings:

Colored solution     Most frequent flavor association
----------------     --------------------------------
Red                  Strawberry, raspberry, cherry
Green                Lime
Orange               Orange
Colorless            Variable

Expectancy and Prior Beliefs

Expectations regarding the flavor that is about to be experienced can dramatically impact flavor perception and evaluation in humans. These expectations are often based on prior associations between the visual aspect and the experienced flavor (see Shankar et al. 2010).

Color-flavor interactions may rely on a common mechanism in which food properties that have co-occurred in prior experience are systematically combined. This highlights the importance of prior beliefs in shaping multisensory interactions.
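One way to make this "prior experience" idea concrete is a minimal Bayesian cue-combination sketch. This is purely an illustration of how a color-driven expectation could bias a flavor judgment; the flavors, probabilities, and combination rule are hypothetical and are not taken from any of the studies cited above.

```python
# Hypothetical sketch: a color-based prior over flavors is combined
# with ambiguous gustatory evidence via Bayes' rule. All numbers are
# illustrative, not data from the cited studies.

def posterior(prior, likelihood):
    """Combine a prior over flavors with gustatory evidence (Bayes' rule)."""
    unnormalized = {f: prior[f] * likelihood[f] for f in prior}
    total = sum(unnormalized.values())
    return {f: p / total for f, p in unnormalized.items()}

# Seeing a red drink raises the prior on red-associated flavors.
color_prior = {"cherry": 0.6, "lime": 0.1, "orange": 0.3}

# The taste itself is ambiguous: the gustatory evidence only weakly
# favors lime over the alternatives.
taste_likelihood = {"cherry": 0.3, "lime": 0.4, "orange": 0.3}

belief = posterior(color_prior, taste_likelihood)
best = max(belief, key=belief.get)
# The color-driven prior dominates: "cherry" ends up most probable
# even though the taste evidence alone favored "lime".
```

Under this toy model, a strong color-based expectation can outweigh weak taste evidence, which is qualitatively consistent with the misidentification effects described above.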


Impact of Sounds on Food Perception

Auditory cues also play a role in the perception of food. Many foodstuffs produce characteristic sounds when we eat them, and these sounds are closely associated with pleasantness, especially for crunchy foods such as crisps (e.g., Vickers 1983).

The sounds of crispy foods are typically higher in pitch than those of crunchy foods (Vickers 1979). Manipulating these auditory properties can influence the perception or evaluation of particular stimuli. For example, altering the overall sound level, or just the high-frequency components, can affect perceived freshness (see also Chen et al. 2008; Varela et al. 2006).

The Science of Sound and Taste

Auditory Cues and Freshness

The sounds produced during the biting action contribute to the perceived freshness of food. Zampini et al. (2004) conducted a study where participants rated the freshness of potato chips while the auditory cues produced during the biting action were selectively amplified (see Figure 37.2).

The study found that participants consistently rated the chips accompanied by amplified auditory cues as both fresher and crisper. This demonstrates that auditory cues can modulate the perceived texture of food.

Auditory Cues and Carbonation

Auditory cues also influence the evaluation of carbonation of water. The sounds produced by the bubbles can affect the perception of carbonation (e.g., Chandrashekar et al. 2009; Vickers 1991; Yau and McDaniel 1992).

Studies have shown that the perception of carbonation is different when samples are assessed in a cup versus when they are assessed in the mouth. This suggests that auditory cues, such as the sound of bubbles entering the oral cavity, contribute to the overall perception of carbonation (see Koza et al. 2002).

Conclusion

In summary, the multisensory perception of food is a complex process involving interactions between vision, audition, touch, taste, and smell. Visual cues, such as color, can significantly influence flavor perception. Auditory cues, particularly those associated with biting or carbonation, also play a crucial role in shaping our sensory experience of food. Understanding these interactions is essential for a comprehensive understanding of overall food perception.