Researchers who study how babies acquire language have demonstrated that babies begin learning very early which sounds are relevant to the language that surrounds them. Babies learn to categorize sounds so early, in fact, that infants raised bilingually can even differentiate between the sounds of two languages they have never before heard.

One theory of how babies recognize a sound as belonging to one language or another is that they use not only their ears but also their eyes to understand language. They pay attention to visual cues: the movement of the face, especially the lips, as a person speaks.

The idea that visual information affects speech perception is not new and is probably something you've noticed. Watching a dubbed film, for example, presents a problem to most viewers at first because they cannot reconcile the sounds they are hearing with the shapes of the mouths speaking the film's original language. Such incongruity makes words hard to understand. About 35 years ago, in fact, researchers Harry McGurk and John MacDonald demonstrated that humans use both visual and auditory information in speech perception. For most of us, vision triumphs; what we see can alter what we hear.

It's called the McGurk effect.

SPEAKING UP

What does the McGurk effect mean to you as a presenter? Primarily that it's important to help your audience receive the visual signals that accompany your speech. In short, make sure that audience members can see your face.

Maintaining eye focus will keep your face up and directed out so that listeners can see its movement and expression as they listen to your voice. Then articulate clearly, forming each syllable of each word as distinctly as possible without exaggerating. If you are using a script, practice the Read-Speak technique so that the script becomes the source for the words that you deliver with your head up and visible. Keep gestures large and away from your face, and don't hide it even for a moment by scratching or rubbing at it.

HEADING OFF

In a few situations, you'll want to be especially alert to the McGurk effect. Even in a relatively quiet room, listeners seated below a buzzing light fixture or a whooshing air-conditioning vent may have difficulty hearing some words. Use your face and its movements to help them overcome such competing ambient noise.

Audience members listening in a second language may often rely upon facial cues, especially lip movements, for fuller comprehension. So will listeners who are less familiar with a specialized or technical vocabulary. And some words are simply difficult to discern: "fifteen" and "fifty," as well as "can" and "can't," are often indistinguishable. While substituting "cannot" for the contraction may help with the latter pair, supplying adequate visual information can clarify other problematic word pairs. The ramifications of the McGurk effect may also pop up when your company films a presentation. Filling the video frame with the speaker's face could make a positive difference in message comprehension. And a speaker with a moustache should make sure it's trimmed so that it doesn't obscure the mouth.

Communication succeeds in many ways, and aligning visual cues with spoken words contributes to that success. So as you rehearse and as you present, keep the channels of speech perception clear: your audience has been tuned to their messages, and receptive to them, since infancy. Those messages signal understanding.