HEARING IS CRUCIAL FOR LOCALIZING and identifying sound; for humans, it is particularly important because of its role in the understanding and production of speech. The auditory system has several noteworthy features. Its subcortical pathway is longer than that of other sensory systems. Unlike visual stimuli, sounds can enter the auditory system from all directions, day and night, whether we are asleep or awake. The auditory system processes not only sounds emanating from outside the body (environmental sounds, sounds generated by others) but also self-generated sounds (vocalizations and chewing sounds). The location of sound stimuli in space is not conveyed by the spatial arrangement of sensory afferent neurons but is instead computed by the auditory system from representations of the physical cues.
Sounds Convey Multiple Types of Information to Hearing Animals
Hearing helps to alert animals to the presence of unseen dangers or opportunities and, in many species, also serves as a means for communication. Information about where sounds arise and what they mean must be extracted from the representations of the physical characteristics of sound at each of the ears. To understand how animals process sound, it is useful first to consider which cues are available.
Most vertebrates take advantage of having two ears for localizing sounds in the horizontal plane. Sound sources at different positions in that plane affect the two ears differentially: Sound arrives earlier and is more intense at the ear nearer the source (Figure 28–1A). Interaural time and intensity differences carry information about where sounds arise.
Figure 28–1 Cues for localizing sound sources in the horizontal plane.
A. Interaural time and intensity differences are cues for localizing sound sources in the horizontal plane, or azimuth. A sound arising in the horizontal plane arrives differently at the two ears: It arrives earlier and is louder at the ear nearer the source. A sound that arises directly in front or directly behind travels the same distance to the right and left ears and thus arrives at both ears simultaneously. Interaural time and intensity differences do not vary with the movement of sound sources in the vertical plane, so a pure sinusoidal tone cannot be localized in the vertical plane. In humans, the maximal interaural time difference is approximately 600 μs. High-frequency sounds, which have short wavelengths, are deflected by the head, producing a sound shadow on the far side. (Adapted, with permission, from Geisler 1998.)
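The magnitude of the interaural time cue can be estimated with a simple geometric model. The sketch below is an illustrative approximation, not from the text: it uses the Woodworth spherical-head formula, ITD ≈ r(θ + sin θ)/c, with an assumed head radius of about 8.75 cm and a speed of sound of 343 m/s. The head radius and the formula itself are modeling assumptions; at 90° azimuth the model yields a value in the same range as the roughly 600 μs maximum cited above.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, in air at about 20 degrees C (assumed)
HEAD_RADIUS = 0.0875    # m, an assumed average human head radius

def interaural_time_difference(azimuth_deg: float) -> float:
    """Woodworth spherical-head approximation of the ITD, in seconds.

    azimuth_deg: source direction; 0 = straight ahead, 90 = directly
    to one side. Valid for distant sources and azimuths in [0, 90].
    """
    theta = math.radians(azimuth_deg)
    # Extra path to the far ear: the arc around the head (r * theta)
    # plus the straight-line segment (r * sin(theta)).
    return HEAD_RADIUS * (theta + math.sin(theta)) / SPEED_OF_SOUND

# A source straight ahead produces no time difference; a source at
# 90 degrees produces the maximal ITD, on the order of hundreds of
# microseconds.
print(interaural_time_difference(0) * 1e6)   # 0.0 μs
print(interaural_time_difference(90) * 1e6)  # roughly 650 μs
```

With these assumed constants the model slightly overestimates the ~600 μs figure; the exact maximum depends on head size and on how the path around the head is modeled.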
B. Mammals can localize broadband sounds in both the vertical and horizontal planes on the basis of spectral filtering. When a noise with equal energy at all frequencies over the human hearing range (white noise) is presented through a speaker, the ear, head, and shoulders attenuate energy at some frequencies and enhance it at others. The white noise emitted from the speaker has a flat power spectrum, but ...