
Physics of the Ear

The physics of the ear involves the study of how sound waves are collected, transmitted, and processed by the ear. This includes the mechanics of the outer, middle, and inner ear, as well as the conversion of sound waves into electrical signals by the cochlea. Understanding the physics of the ear is crucial for developing hearing aids and other auditory technologies.

Written by Perlego with AI-assistance

9 Key excerpts on "Physics of the Ear"

  • Physics of the Body (Revised 2nd Edition)
    The inner ear, along with the additional attachment of the vestibular (balance) sensors, is called the membranous labyrinth. […] malfunction. While they all involve physics, we know much more about the physics of the mechanical system than about the physics of the other parts. In this chapter, we deal with the sense of hearing only up to the auditory nerve.

    11.1 The Ear and Hearing

    The ear is a cleverly designed converter of very weak mechanical sound waves in air into electrical pulses in the auditory nerve. Figure 11.1 shows most of the structures of the ear that are involved with hearing. The ear is usually divided into three areas: the outer ear, the middle ear, and the inner ear. What we commonly call the ear (the appendage we use to help hold up our glasses) has no essential role in hearing. The outer ear consists of the ear canal, which terminates at the eardrum (tympanic membrane). The middle ear includes the three small bones (ossicles) and an opening to the mouth (Eustachian tube). The inner ear consists of the fluid-filled, spiral-shaped cochlea containing the organ of Corti. Hair cells in the organ of Corti convert vibrations of sound waves hitting the eardrum into nerve pulses that inform the auditory cortex of our brain of these sound waves. The inner ear is part of the labyrinth, which also includes the sensors of the vestibular (sense of balance) system. The latter provides the brain with electrical signals that contain positional information of the head with respect to the direction of gravity and angular and linear motion information (see Section 11.10). One of the first medical physicists to study the physics of the ear and hearing was Hermann von Helmholtz (1821–1894). He developed the first modern theory of how the ear works. His work was expanded and extended by Georg von Békésy (1899–1972), a communications engineer who became interested in the function of the ear as part of the communication system.
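
    The mechanical conversion described above is often summarized as the middle ear's impedance-matching transformer. The short sketch below estimates its pressure gain from the eardrum-to-stapes area ratio and the ossicular lever ratio; the numerical values are typical textbook figures assumed for illustration, not taken from this excerpt.

```python
import math

# Assumed, textbook-typical values (not quoted from the excerpt above)
eardrum_area_mm2 = 55.0     # effective area of the tympanic membrane
footplate_area_mm2 = 3.2    # area of the stapes footplate
lever_ratio = 1.3           # mechanical advantage of the ossicular chain

# Pressure gain ~ (area ratio) x (lever ratio); expressed in dB as 20*log10(gain)
area_ratio = eardrum_area_mm2 / footplate_area_mm2
pressure_gain = area_ratio * lever_ratio
gain_db = 20 * math.log10(pressure_gain)

print(f"area ratio = {area_ratio:.1f}, pressure gain = {pressure_gain:.0f}x ({gain_db:.0f} dB)")
# roughly 17 x 1.3 = 22x, i.e. about 27 dB of impedance-matching gain
```
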
  • Human Information Processing: An Introduction to Psychology
    • Peter H. Lindsay, Donald A. Norman (Authors)
    • 2013 (Publication Date)
    • Academic Press (Publisher)
    Chapter contents: Preview; The Ear; The Physics of Sound (the frequency of sound; the intensity of sound); Decibels; The Mechanics of the Ear (the inner ear; movements of the basilar membrane; the hair cells); Electrical Responses to Sound (tuning curves; temporal coding in neural responses; coding of intensity information); Review of Terms and Concepts (terms and concepts you should know; the parts of the ear; sound); Suggested Readings.

    4 The auditory system

    PREVIEW

    This is the first chapter of the set of two on the auditory system. The ear is a fascinating piece of machinery. It is composed of tiny bones and membranes, with spiral-shaped tubes filled with fluid. When sound waves arrive at the ear, they are directed down precisely shaped passageways through a complex series of membranes and bones, which transform the sound waves to pressure variations in a liquid-filled cavity. These pressure variations cause bulges in a membrane, and the bulges act on a set of hairs that run along the length of the membrane. Each hair is connected to a cell, and when the hair is bent, the cell sends a signal along the acoustic nerve into the brain. Thus, what we hear is determined by which hairs are bent. By comparison with the ear, all the other sensory systems are much simpler. In the eye, for example, the mechanical parts are fairly straightforward, and all the complexity resides in the interactions of the nerve cells at the back of the eye, in the retina. With the ear, the neural connections in the ear itself are relatively simple, and all the complexity is put into the mechanical structures that transform sound waves into particular patterns of bulges along the basilar membrane.
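
    The statement that "what we hear is determined by which hairs are bent" is a description of place coding along the basilar membrane. As a hedged illustration, the sketch below uses the Greenwood place-frequency function with its commonly quoted human parameters; these values are assumptions for demonstration and do not come from this excerpt.

```python
# Greenwood place-frequency map for the human cochlea (assumed standard parameters):
# f = A * (10**(a*x) - k), with x the fractional distance from apex (0.0) to base (1.0).
A, a, k = 165.4, 2.1, 0.88

def characteristic_frequency_hz(x: float) -> float:
    """Approximate frequency (Hz) that most strongly bends hair cells at position x."""
    return A * (10 ** (a * x) - k)

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x = {x:.2f} -> {characteristic_frequency_hz(x):,.0f} Hz")
# Low frequencies peak near the apex (x ~ 0); high frequencies near the base (x ~ 1),
# spanning roughly 20 Hz to 20 kHz across the length of the membrane.
```
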
  • Clinical Otology
    • Myles L. Pensak, Daniel I. Choo (Authors)
    • 2014 (Publication Date)
    • Thieme (Publisher)
    Complex sounds may be described in terms of spectra. The spectrum of a sound identifies the frequencies and the relative amplitudes of the various components of a sound.

    2.3 Physiology of Hearing

    The anatomy of the auditory system is addressed in Chapter 1. This chapter traces the course of the auditory stimulus from its generator to the auditory cortex, as shown in ▶ Fig. 2.4. Although the many relay stations in the auditory system all contribute to signal processing in a unique way, we highlight those areas pertinent to the practicing otologist and to those interested in the complexity of this exciting and mysterious sensory system. The natural or usual manner by which humans detect sound is via an airborne or acoustic signal. Once the sound is generated, it travels through the air in a disturbance called a sound wave. This sound is slightly modified by the body and head, specifically the head and shoulders, which affect the frequencies below 1,500 Hz by shadowing and reflection.3 The flange and concha of the pinna collect, amplify, and direct the sound wave to the tympanic membrane by the external auditory meatus. At the tympanic membrane, several transformations of the signal occur: (1) the acoustic signal becomes mechanoacoustic; (2) it is faithfully reproduced; and (3) it is passed along to the ossicular chain, is amplified, or, under certain conditions, is attenuated.

    2.4 Transmission and Natural Resonance of the External Ear

    Natural resonance refers to inherent anatomic and physiological properties of the external and middle ear that allow certain …

    [Fig. 2.3 Two tones that are 180 degrees out of phase; note that the starting points for the two waves are the same, creating a mirror image.]
    [Fig. 2.4 Block diagram of the auditory system. The sound leaves its generator and travels through the external, middle, and inner ear and travels to the auditory cortex.]
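
    The excerpt's two ideas, a spectrum as the frequencies and relative amplitudes of a sound's components, and two tones 180 degrees out of phase, can be made concrete numerically. The signal parameters in the sketch below are arbitrary assumptions for illustration only.

```python
import numpy as np

fs = 16_000                    # assumed sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)  # 100 ms of signal

# A complex sound: two components with different frequencies and relative amplitudes
tone = 1.0 * np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)

# Its spectrum: which frequencies are present, and their relative amplitudes
amps = np.abs(np.fft.rfft(tone)) / len(t) * 2
freqs = np.fft.rfftfreq(len(t), 1 / fs)
for f in (440, 880):
    idx = int(np.argmin(np.abs(freqs - f)))
    print(f"{f} Hz component, relative amplitude = {amps[idx]:.2f}")

# Two identical tones 180 degrees out of phase cancel when added
a = np.sin(2 * np.pi * 440 * t)
b = np.sin(2 * np.pi * 440 * t + np.pi)
print("max |a + b| =", float(np.max(np.abs(a + b))))  # ~0: complete cancellation
```
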
  • The Sounds of Language: An Introduction to Phonetics and Phonology
    "The chief function of the body is to carry the brain around." (Thomas Edison)

    Chapter outline: 9.1 Anatomy and physiology of the ear; 9.2 Neuro-anatomy (9.2.1 Studying the brain; 9.2.2 Primary auditory pathways); 9.3 Speech perception (9.3.1 Non-linearity; 9.3.2 Variability and invariance; 9.3.3 Cue integration; 9.3.4 Top-down processing; 9.3.5 Units of perception); Chapter summary; Further reading; Review exercises; Further analysis and discussion; Go online; References.

    9 Hearing and Speech Perception

    The preceding chapters have covered the topic of how speech sounds are created by the vocal tract – articulatory phonetics. They have also covered the topic of how the speech waves thus created propagate through the air, and how they can be mathematically analyzed – acoustic phonetics. This chapter is about how speech sounds are analyzed by the human ear, and the human brain. What happens in the ear is hearing; what happens in the brain is speech perception. Hearing refers to the physiological process of transferring energy from sound waves to nerve impulses. Speech perception refers to the mapping of sounds into linguistic representations. We begin with the anatomy and physiology of the ear, then cover a little bit of neuro-anatomy, and finally discuss some current issues in speech perception, including the nonlinear relationship between acoustic and perceptual measures, problems of variability and invariance (including normalization and categorical perception), cue integration, top-down processing, and the question of the units of perception and thus of linguistic representation.

    9.1 Anatomy and physiology of the ear

    The ear has three parts: outer, middle, and inner.
  • Auditory Physiology
    CHAPTER Anatomy and General Physiology of the Ear

    Introduction

    This chapter is concerned with the general function of the ear and the auditory system. The auditory system is divided into the ear and the nervous system; the ear may be divided into the outer, middle, and inner ear as shown in the cross-sectional drawing of the human ear in Figure 1.1. The location of the ear in the skull is shown in Figure 1.2, whereas Figure 1.3 gives a schematic diagram of the ear as a whole. The outer and middle ears constitute the sound-conducting part of the ear, which transmits sound from air to the fluid of the inner ear. Thus, sound led through the external auditory canal sets the tympanic membrane into vibration, and these vibrations are transferred to the inner ear via the three small bones of the middle ear when the vibrations of the footplate of the stapes set the fluid in the cochlea into vibratory motion. As discussed later in this chapter, the cochlea, and particularly the basilar membrane in the cochlea, plays an important role in analyzing the sound and converting it into a neural code. That code, after being modified in the different brain nuclei of the ascending auditory pathway, is transferred to the part of the cerebral cortex that receives auditory information. These transformations have not been completely studied, but our knowledge to date indicates that the information in the sound that reaches our ears undergoes substantial transformations. These matters are considered in Chapter 2.

    [FIGURE 1.1. Cross-section of the human ear. (Brodel, 1946.)]
    [FIGURE 1.2. Location of the ear in the skull. (Based on Melloni, 1957.)]

    External Ear and Head

    The external ear consists of the auricle and the ear canal. The external ear is shown in a schematic drawing in Figure 1.4. The groove called the concha is acoustically the most important part of the outer ear, whereas the flange that surrounds the concha is of little importance. Together with …
  • Phonetics: Transcription, Production, Acoustics, and Perception
    • Henning Reetz, Allard Jongman (Authors)
    • 2011 (Publication Date)
    • Wiley-Blackwell (Publisher)

    12 Physiology and Psychophysics of Hearing

    For historic reasons, speech sounds are categorized according to the way they are produced. This approach to investigating speech, focusing on the production of sounds, was motivated by the limited set of methods that were available for the measurement and representation of speech. The speech production apparatus was more or less easily accessible to investigation, and the chain of events during speech production was studied by means of introspection and careful observation. The ear, on the other hand, is small and inaccessible. Examining what happens within the ear in living subjects requires advanced technical equipment. Furthermore, much of the task of “hearing” relates to processes performed in the brain, where eventually the perception of “sounds” leads to the perception of “speech.” The scientific investigation of the process of hearing is therefore relatively young, and our knowledge still incomplete in many domains.
    With very recent brain imaging techniques (including electro-encephalography [EEG], magneto-encephalography [MEG], and functional magnetic resonance imaging [fMRI]), it has become possible to observe brain activity to a limited extent when a listener hears, for example, an [ɑ] or an [i]. Other aspects of the hearing process are investigated by indirect methods, allowing only for inferences: for example, by asking a subject if she hears an [ɑ] or an [i], and relating the answer to the acoustic properties of the speech signal.
    The observation that a given phoneme can be produced with different articulatory gestures suggests that, in addition to an articulatory description, auditory and perceptual descriptions of speech sounds are important as well. For example, vowels are produced with rather different tongue positions by individual speakers, and the production of “rounded” vowels (like [u]) does not necessarily require a lip gesture after all: just stand in front of a mirror and try to produce an [u] without pursing the lips. This suggests that auditory targets (for example, the distribution of energy in a spectrum) rather than articulatory targets (like the position of the tongue) play the major role in speech perception. In other words, although it is possible to describe speech sounds in articulatory terms, and although the existing articulatory categorizations are generally quite effective, it may be advantageous to use auditory or perceptual categories.
  • Physical and Applied Acoustics
    One must know which objective improvements in the transmission characteristics will be subjectively perceptible to the ear and which will result only in an unnecessary increase of technical effort and monetary expenditure. For telephone transmission, one must know how speech sounds are formed, for example, and what frequency band is needed for adequate intelligibility. A second, obvious reason for concerning ourselves with these topics naturally is that we constantly carry voice and ears around with us as sound source and receiver and thus we ought to be familiar with their principal characteristics. Finally, the characteristics of our organ of hearing are not without interest in connection with the problems of noise abatement. Physiological acoustics is concerned with the anatomy and function of the human ear and voice; psychological acoustics, with the question of how humans perceive an acoustical stimulus.

    7.1. The Ear

    7.1.1. Anatomical Structure of the Human Ear

    The structure of the human ear is shown in the simplified diagram of Fig. 7.1. Because the external ear (not shown in Fig. 7.1) is small in comparison with the wavelengths of the essential components of audible sound, it has only a weak directional effect; however, it does cause the frequency response curve for sound coming from the front of the head to differ from that for sound coming from behind. If the palm of the hand is cupped properly around the external ear, the directional selectivity, loudness, and therewith the intelligibility of speech can be increased. The auditory canal that begins at the external ear is, essentially, an acoustically hard tube open at one end and closed at the other end by a compliant membrane, the eardrum; at the mid-range frequencies, the eardrum (about 1 cm²) forms an almost reflection-free termination for the auditory canal. The space behind the eardrum (middle ear) is filled with air and connected by the eustachian tube with the throat.
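
    The excerpt's model of the auditory canal as a tube open at one end and closed by the eardrum at the other implies a quarter-wavelength resonance. The back-of-the-envelope sketch below assumes a typical canal length and speed of sound; neither number is given in the excerpt.

```python
# Quarter-wave resonance of a tube open at one end and closed at the other: f1 = c / (4 * L)
c = 343.0   # speed of sound in air, m/s (assumed, ~20 °C)
L = 0.025   # assumed adult ear-canal length, about 2.5 cm

f1 = c / (4 * L)
print(f"first resonance = {f1:.0f} Hz")  # ~3,400 Hz
# This falls in the 2-4 kHz region where the external ear boosts incoming sound
# and where human hearing is most sensitive.
```
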
  • Speech and Audio Processing: A MATLAB®-based Approach
    4 The human auditory system

    A study of human hearing and the biomechanical processes involved in hearing reveals several non-linear steps, or stages, in the perception of sound. Each of these stages contributes to the eventual unequal distribution of subjective features against purely physical ones in human hearing. Put simply, what we think we hear is quite significantly different from the physical sounds that may be present in reality (which in turn differs from what might be recorded onto a computer, given the imperfections of microphones and recording technology). By taking into account the various non-linearities in the hearing process, and some of the basic physical characteristics of the ear, nervous system, and brain, it becomes possible to begin to account for these discrepancies between perception and physical measurements. Over the years, science and technology have incrementally improved our ability to understand and model the hearing process using purely physical data. One simple example is that of A-law compression (or the similar μ-law used in some regions of the world), where approximately logarithmic amplitude quantisation replaces the linear quantisation of PCM (pulse coded modulation): humans tend to perceive amplitude logarithmically rather than linearly, thus A-law quantisation using 8 bits to represent each sample sounds better than linear PCM quantisation using 8 bits (in truth, it can sound better than speech quantised linearly with 12 bits). It thus achieves a higher degree of subjective speech quality than PCM – for a given bitrate [4].

    4.1 Physical processes

    A cut-away diagram of the human ear (outer, middle and inner) is shown in Figure 4.1. The outer ear includes the pinna, which filters sound and focuses it into the external auditory canal.
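
    The A-law example in this excerpt can be sketched numerically. The code below uses the continuous G.711 A-law companding curve (not the full segmented codec) to compare quantisation error for a quiet signal against plain 8-bit linear PCM; the test signal and bit depth are assumptions for illustration.

```python
import numpy as np

A = 87.6  # standard A-law compression parameter (ITU-T G.711)

def a_law_compress(x):
    """Continuous A-law companding of samples x in [-1, 1]."""
    ax = np.abs(x)
    y = np.where(ax < 1 / A,
                 A * ax / (1 + np.log(A)),
                 (1 + np.log(np.maximum(ax, 1 / A) * A)) / (1 + np.log(A)))
    return np.sign(x) * y

def a_law_expand(y):
    """Inverse of a_law_compress."""
    ay = np.abs(y)
    x = np.where(ay < 1 / (1 + np.log(A)),
                 ay * (1 + np.log(A)) / A,
                 np.exp(ay * (1 + np.log(A)) - 1) / A)
    return np.sign(y) * x

def quantise(v, bits=8):
    """Uniform quantisation of v in [-1, 1] to the given bit depth."""
    levels = 2 ** (bits - 1)
    return np.round(v * levels) / levels

# A quiet signal shows the benefit: logarithmic companding spends its quantisation
# levels where small amplitudes live, so the error is far below plain 8-bit PCM.
x = 0.01 * np.sin(2 * np.pi * np.linspace(0.0, 1.0, 1000))
err_pcm = float(np.max(np.abs(quantise(x) - x)))
err_alaw = float(np.max(np.abs(a_law_expand(quantise(a_law_compress(x))) - x)))
print(f"max error, 8-bit linear PCM: {err_pcm:.6f}")
print(f"max error, 8-bit A-law:      {err_alaw:.6f}")
```
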
  • Occupational Hearing Loss
    • Robert Thayer Sataloff, Joseph Sataloff (Authors)
    • 2006 (Publication Date)
    • CRC Press (Publisher)
    2 The Physics of Sound

    Robert T. Sataloff, Drexel University College of Medicine, American Institute for Voice and Ear Research, and Graduate Hospital, Philadelphia, Pennsylvania, USA; Joseph Sataloff, Thomas Jefferson University, Philadelphia, Pennsylvania, USA

    Contents: 1. Sound; 2. Sound Waves (2.1. Characteristics of Sound Waves; 2.1.1. Components of Sound); 3. Measuring Sound (3.1. Intensity; 3.2. Decibel; 3.2.1. A Unit of Comparison; 3.2.2. Two Reference Levels; 3.2.3. Formula for the Decibel; 3.2.4. Important Points); 4. dBA Measurement; References.

    Fortunately, one need not be a physicist in order to function well in professions involved with hearing and sound. However, a fundamental understanding of the nature of sound and terms used to describe it is essential to comprehend the language of otologists, audiologists, and engineers. Moreover, studying basic physics of sound helps one recognize complexities and potential pitfalls in measuring and describing sound and helps clarify the special difficulties encountered in trying to modify sources of noise.

    1. SOUND

    Sound is a form of motion. Consequently, the laws of physics that govern actions of all moving bodies apply to sound. Because sound and all acoustic conditions behave consistently as described by the laws of physics, we are able to predict and analyze the nature of a sound and its interactions. Sound measurement is not particularly simple. The study of physics helps us to understand many practical aspects of our daily encounters with sound. For example, why does an audiologist or otologist use a different baseline for decibels in his office from that used by an engineer or industrial physician who measures noise in a factory? Why is it that when hearing at high frequencies is tested, a patient may hear nothing and then suddenly hear a loud tone? Yet, all the examiner did was move the earphone a fraction of an inch.
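
    The question of why an audiologist and an engineer use different decibel baselines comes down to the reference value in the decibel formula. The sketch below uses the standard sound-pressure-level reference of 20 micropascals; the specific numbers are illustrative and not quoted from this chapter.

```python
import math

# Decibels are a logarithmic comparison against a reference value.
# Sound pressure level: dB SPL = 20 * log10(p / p_ref), with p_ref = 20 µPa.
P_REF_PA = 20e-6

def spl_db(pressure_pa: float) -> float:
    """Sound pressure level in dB SPL for an RMS pressure in pascals."""
    return 20 * math.log10(pressure_pa / P_REF_PA)

print(f"{spl_db(20e-6):.0f} dB SPL")  # 0 dB: the reference pressure itself
print(f"{spl_db(2e-3):.0f} dB SPL")   # 100x the reference -> 40 dB
print(f"{spl_db(1.0):.0f} dB SPL")    # 1 Pa -> about 94 dB SPL

# An audiologist instead reports dB HL (hearing level), where 0 dB is the average
# normal hearing threshold at each test frequency, so the same physical sound gets
# different numbers on the two scales: same formula, different baseline.
```
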
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.