The Effect of Residual Acoustic Hearing and Adaptation to Uncertainty on Speech Perception in Cochlear Implant Users: Evidence From Eye-Tracking (2016)

Bob McMurray, Ashley Farris-Trimble, Michael Seedorff, and Hannah Rigler
1Departments of Psychological and Brain Sciences, Communication Sciences and Disorders, and Linguistics, University of Iowa, Iowa City, Iowa, USA; 2Department of Linguistics, Simon Fraser University, Burnaby, British Columbia, Canada; 3Department of Biostatistics, University of Iowa, Iowa City, Iowa, USA; and 4Department of Psychological and Brain Sciences, University of Iowa, Iowa City, Iowa, USA.

OBJECTIVE: While outcomes with cochlear implants (CIs) are generally good, performance can be fragile. The authors examined two factors that are crucial for good CI performance. First, while there is a clear benefit of adding residual acoustic hearing to CI stimulation (typically in the low frequencies), it is unclear whether this contributes directly to phonetic categorization. Thus, the authors examined perception of voicing (which uses low-frequency acoustic cues) and fricative place of articulation (s/∫, which does not) in CI users with and without residual acoustic hearing. Second, in speech categorization experiments, CI users typically show shallower identification functions, which are usually interpreted as deriving from noisy encoding of the signal. However, psycholinguistic work suggests that shallow slopes may also be a useful way to adapt to uncertainty. The authors therefore employed an eye-tracking paradigm to examine this in CI users.

METHODS: Participants were 30 CI users (with a variety of device configurations) and 22 age-matched normal-hearing (NH) controls. Participants heard tokens from six b/p and six s/∫ continua (eight steps each) spanning real words (e.g., beach/peach, sip/ship). Participants selected the picture corresponding to the word they heard from a screen containing four items (a b-, p-, s-, and ∫-initial item). Eye movements to each object were monitored as a measure of how strongly participants were considering each interpretation in the moments leading up to their final percept.

RESULTS: Mouse-click results (analogous to phoneme identification) for voicing showed a shallower slope for CI users than for NH listeners, but no difference between CI users with and without residual acoustic hearing. For fricatives, CI users also showed a shallower slope, but unexpectedly, acoustic + electric listeners showed an even shallower slope. Eye movements showed a gradient response to fine-grained acoustic differences for all listeners. Even considering only trials on which a participant clicked "b" (for example), and accounting for variation in the category boundary, participants made more looks to the competitor ("p") as the voice onset time neared the boundary. CI users showed a similar pattern, but looked to the competitor more than NH listeners overall, and this increase did not differ across continuum steps.

CONCLUSIONS: Residual acoustic hearing did not improve voicing categorization, suggesting that it may not help listeners identify these phonetic cues. The poorer fricative performance of acoustic + electric users was unexpected, as these listeners usually show a benefit on standardized perception measures and as sibilants contain little energy in the low-frequency (acoustic) range. The authors hypothesize that these listeners may overweight acoustic input and have problems when it is unavailable (as in fricatives). Thus, the benefit (or cost) of acoustic hearing for phonetic categorization may be complex. The eye movements suggest that, in both CI and NH listeners, phoneme categorization is not a process of mapping continuous cues onto discrete categories. Rather, listeners preserve gradiency as a way to deal with uncertainty. CI listeners appear to adapt to their implant (in part) by amplifying competitor activation to preserve flexibility in the face of potential misperceptions.
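The "slope" of an identification function reported above is typically estimated by fitting a logistic (psychometric) function to the proportion of one response category at each continuum step and comparing the fitted slopes across groups. The sketch below is a minimal illustration of that kind of fit, not the authors' analysis code; the response proportions and starting values are hypothetical.

```python
# Minimal sketch: estimate a category boundary and identification-function slope
# by fitting a logistic psychometric function to mouse-click identification data.
# Illustrative only; the proportions below are hypothetical, not the study's data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(step, boundary, slope):
    # Probability of a "voiceless" (e.g., /p/) response as a function of continuum step.
    return 1.0 / (1.0 + np.exp(-slope * (step - boundary)))

steps = np.arange(1, 9)  # the eight continuum steps
p_voiceless = np.array([0.02, 0.05, 0.12, 0.35,   # hypothetical proportion of /p/
                        0.68, 0.88, 0.95, 0.98])  # responses at each step

(boundary, slope), _ = curve_fit(logistic, steps, p_voiceless, p0=[4.5, 1.0])
print(f"boundary ~ step {boundary:.2f}, slope = {slope:.2f}")
# A smaller fitted slope corresponds to a shallower (more gradient) identification
# function, the pattern the study reports for CI users relative to NH listeners.
```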

MeSH Terms

D008297 Male. Entry terms: Males
D008875 Middle Aged: An adult aged 45-64 years. Entry terms: Middle Age
D010700 Phonetics: The science or study of speech sounds and their production, transmission, and reception, and their analysis, classification, and transcription (Random House Unabridged Dictionary, 2d ed). Entry terms: Speech Sounds; Sound, Speech; Sounds, Speech; Speech Sound
D003054 Cochlear Implants: Electronic hearing devices typically used for patients with normal outer and middle ear function, but defective inner ear function. In the COCHLEA, the hair cells (HAIR CELLS, VESTIBULAR) may be absent or damaged but there are residual nerve fibers. The device electrically stimulates the COCHLEAR NERVE to create sound sensation. Entry terms: Auditory Prosthesis; Cochlear Prosthesis; Implants, Cochlear; Auditory Prostheses; Cochlear Implant; Cochlear Prostheses; Implant, Cochlear; Prostheses, Auditory; Prostheses, Cochlear; Prosthesis, Auditory; Prosthesis, Cochlear
D003638 Deafness: A general term for the complete loss of the ability to hear from both ears. Entry terms: Deafness Permanent; Hearing Loss Permanent; Prelingual Deafness; Deaf Mutism; Deaf-Mutism; Deafness, Acquired; Hearing Loss, Complete; Hearing Loss, Extreme; Acquired Deafness; Complete Hearing Loss; Deafness, Prelingual; Extreme Hearing Loss; Permanent, Deafness; Permanent, Hearing Loss; Permanents, Deafness
D005260 Female. Entry terms: Females
D006801 Humans: Members of the species Homo sapiens. Entry terms: Homo sapiens; Man (Taxonomy); Human; Man, Modern; Modern Man
D000222 Adaptation, Physiological: The non-genetic biological changes of an organism in response to challenges in its ENVIRONMENT. Entry terms: Adaptation, Physiologic; Adaptations, Physiologic; Adaptations, Physiological; Adaptive Plasticity; Phenotypic Plasticity; Physiological Adaptation; Physiologic Adaptation; Physiologic Adaptations; Physiological Adaptations; Plasticity, Adaptive; Plasticity, Phenotypic
D000328 Adult: A person having attained full growth or maturity. Adults are of 19 through 44 years of age. For a person between 19 and 24 years of age, YOUNG ADULT is available. Entry terms: Adults
D000368 Aged: A person 65 years of age or older. For a person older than 79 years, AGED, 80 AND OVER is available. Entry terms: Elderly
