Tehran University of Medical Sciences

Science Communicator Platform

Neural Correlates of Audiotactile Phonetic Processing in Early-Blind Readers: An fMRI Study



Pishnamazi M1, 2 ; Nojaba Y1 ; Ganjgahi H1, 3 ; Amousoltani A1 ; Oghabian MA1
Affiliations
  1. Neuroimaging and Analysis Group, Research Center for Molecular and Cellular Imaging, Tehran University of Medical Sciences, Imam Khomeini Hospital Complex, Keshavarz Blvd., PO Box 14185-171, Tehran, Iran
  2. Students’ Scientific Research Center, Tehran University of Medical Sciences, Tehran, Iran
  3. Department of Statistics, University of Warwick, Coventry, United Kingdom

Source: Experimental Brain Research, Published: 2016


Abstract

Reading is a multisensory function that relies on arbitrary associations between auditory speech sounds and symbols from a second modality. Studies of bimodal phonetic perception have mostly investigated the integration of visual letters and speech sounds. Blind readers perform an analogous task by using tactile Braille letters instead of visual letters. The neural underpinnings of audiotactile phonetic processing have not been studied before. We used functional magnetic resonance imaging to reveal the neural correlates of audiotactile phonetic processing in 16 early-blind Braille readers. Braille letters and corresponding speech sounds were presented in unimodal and congruent/incongruent bimodal configurations. We also used a behavioral task to measure the speed of blind readers in identifying letters presented via tactile and/or auditory modalities. Reaction times for tactile stimuli were faster. The reaction times for bimodal stimuli were equal to those for the slower auditory-only stimuli. fMRI analyses revealed the convergence of unimodal auditory and unimodal tactile responses in areas of the right precentral gyrus and bilateral crus I of the cerebellum. The left and right planum temporale fulfilled the ‘max criterion’ for bimodal integration, but the activity of these areas was not sensitive to the phonetic congruency between sounds and Braille letters. Nevertheless, congruency effects were found in regions of the frontal lobe and cerebellum. Our findings suggest that, unlike sighted readers, who are assumed to have amodal phonetic representations, blind readers probably process letters and sounds separately. We discuss whether this distinction reflects maldevelopment of multisensory neural circuits in early-blind individuals or inherent differences between Braille and print reading mechanisms. © 2015, Springer-Verlag Berlin Heidelberg.