The Colgate Scene
January 2004

Hand-eye coordination
How do gestures impact language perception?

Wearing a sensor net with 128 electrodes attached to his scalp, Dan Wakeman '05 works through a pilot test that measures his responses to verbal stimuli and accompanying gestures on a video screen by interpreting the signals picked up by the electrode net. [Photos by Timothy D. Sofranko]

In a fluorescent-lit classroom on the first floor of Olin Hall, Assistant Professor of Psychology Spencer Kelly holds a headdress of 128 electrodes and asks if anyone wants to volunteer to wear the apparatus and have their brainwaves analyzed.

The question is met with laughter. Dangling the electrode net, Kelly turns to the class of 10 high school students and nods towards a girl sitting in the back row.

"Kathryn, how about you?"

"Okay," she says, walking towards Kelly and his two student assistants, seniors Nikki Pratt and Corey Kravitz.

The electrode net has been soaking in an electrolyte fluid. Kelly quickly pulls the net over Kathryn's head and nestles the electrodes on her scalp. Pratt and Kravitz immediately douse each electrode with pipettes full of electrolyte solution. If the electrodes dry up, it will be difficult to detect brain waves.

Kelly leads Kathryn and the rest of the class, students at various local high schools, to a small electromagnetically shielded laboratory behind a heavy steel door. Everyone is sandwiched inside. He starts a video on the television in front of Kathryn and instructs her to press "one" when she hears the word "tall" and "two" when she hears the word "wide." The student in the video gestures as he speaks.

"Wait, when he says tall or when he does it?" she asks.

"I know it is hard to ignore the gesture, but press the button only when he says the word," Kelly replies.

The computer will record each time Kathryn presses a button. Kelly will later analyze the minute changes in her brainwaves at the moments she presses it.

Everyone is crowded around Kathryn and the computer monitor. The wires from each of the 128 electrodes connect to a small box that Kelly plugs into a steel arm protruding from the computer. Once she is plugged in, Kathryn's brain waves float across the screen.

"Now that," another student remarks, "is the ultimate hair net."


Assistant Professor of Psychology Spencer Kelly conducts the test on Wakeman.

Event-related potentials
Kelly's research on the relationship between hand gestures and the brain's comprehension of language has potentially far-reaching educational and sociological implications. He argues that gesture fundamentally changes the brain's perception of language at a very early stage. Gesture can alter one's understanding of a word, and Kelly theorizes that gestures can help children with learning impairments learn language faster.

Specifically, he is interested in how and when hand gestures influence language comprehension and whether gestures influence language learning.

"Language is very limited. It's good when you're reading a book, or talking on the phone: it can get the job done. But in face-to-face interactions, there's so much more there," he said.

Many psychologists have researched the role of gesture in language comprehension, and results are mixed. Some speculate that gesture plays an integral part, and that gestures directly influence one's processing of speech at a very early stage. Others argue that gesture and speech are independent of one another, and that gesture serves as an "add-on" source of information after speech has been processed.

Researchers have not been able to conclude one way or another, Kelly said, because their studies rely on behavioral responses to speech and gesture rather than the underlying neurological response. Kelly, whose recent research focuses on brain responses in addition to behavioral responses, began to consider the possibility that gesture and speech are fundamentally integrated in the brain while he was in graduate school at the University of Chicago.

"I had this gut sense that nonverbal behavior such as gesture isn't merely this add-on information," he said. "At the deep cognitive psychological level, when you're understanding someone's speech, that gesture is influencing your basic understanding of the speech."

Further postgraduate work supported his view, and he is now testing his theory at Colgate. He is using event-related potentials (ERPs), which are the brain's electrical responses to specific stimuli. Kelly's is one of the first ERP studies to use online audiovisual technology to measure when and how gesture influences speech.

Using the electrode net, Kelly measures the subject's brain response at the exact instant each word is presented on the video screen. In the video, each word is preceded by a gesture. The crucial manipulation in the experiment is whether the meanings of the gesture and the word correlate: when they do, the pair is called a "match," and disparate words and gestures are called "mismatches." He presents each stimulus repeatedly, he says, to average out background noise and other interference.
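The repetition Kelly describes is the standard ERP signal-averaging technique: background EEG activity is random with respect to stimulus onset, so averaging many time-locked trials cancels the noise while the consistent brain response survives. A minimal sketch with synthetic data (all numbers here are illustrative assumptions, not values from Kelly's study):

```python
import numpy as np

rng = np.random.default_rng(0)

n_trials = 200    # repeated presentations of the same stimulus
n_samples = 500   # e.g. 500 ms of one EEG channel at 1 kHz

# A small, fixed brain response buried in much larger random noise:
# a Gaussian bump peaking around 150 ms after word onset.
t = np.arange(n_samples)
true_erp = 2.0 * np.exp(-((t - 150) ** 2) / (2 * 30 ** 2))
noise = rng.normal(0.0, 10.0, size=(n_trials, n_samples))
trials = true_erp + noise

# A single trial is dominated by noise; the mean across trials
# recovers the event-related potential.
erp_estimate = trials.mean(axis=0)

single_trial_error = np.abs(trials[0] - true_erp).max()
average_error = np.abs(erp_estimate - true_erp).max()
print(single_trial_error > average_error)
```

Averaging N trials shrinks the noise by a factor of roughly the square root of N, which is why a response only a couple of microvolts tall can be pulled out of much larger ongoing EEG activity.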

Kelly's latest research project is broken into two yearlong phases. This past fall, he concluded the first part of phase one, working with adults, and confirmed that gestures do influence ERPs to speech. He plans to repeat the experiments in the second part of phase one with children beginning in the spring.

Kelly has been surprised by his initial results.

"The brain integrates these gestures early on in the processing of speech. Within the first couple hundred milliseconds after hearing a word, the gesture that preceded that word changes how you're even hearing that word," he said. "It's as if you're not even hearing the same thing phonologically when you've got these different gestural contexts. It's an earlier effect than I anticipated, so that's really exciting news. It has big epistemological and philosophical implications: is our understanding of reality pure? Is it affected by context? Clearly, context is infiltrating even the lowest level of processing."

What's at stake?
Kelly has conducted pilot studies with adults for the second phase.

He is using fabricated words with objects the subject has never seen before to measure how gesture influences language learning. As with the first phase, he will repeat these experiments in phase two with children.

In an article he co-authored with Kravitz and former student assistant Mike Hopkins '03, Kelly published his pilot data from the first phase in the scientific journal Brain and Language. Kelly expanded on that data for a National Institutes of Health grant proposal, which is currently under review. The proposal is partially based on research gathered by former student Melinda Schwoegler '03, whose findings from her yearlong independent study influenced the direction of Kelly's research.

"I was interested in what goes on at the neural level when we see gestures. [Kelly] was interested in language learning. We married the two ideas together to look at the neural effects of gesture on language learning," said Schwoegler, who is planning to attend veterinary school.

Kelly presented pilot data from the proposal at a recent conference of the Cognitive Development Society held in Utah. Many researchers present, he said, were surprised at his neuroscientific view of language learning.

"There were 300 people there, and I was one of five who even mentioned the word 'brain.' So a lot of people were looking at me askance," Kelly said. "There was some surprise and skepticism, but others thought this was a cool way of getting at some important questions about language learning."

Kelly hopes his research will impact teaching techniques, given that the evidence suggests that gestures help children learn language. Gestures may help "strengthen the memory traces of that word," he said, allowing the child to retain knowledge. Kelly posits that if gestures fundamentally change how the brain learns a word, then gestures could help visual- and tactile-oriented learners, as well as children with specific language impairments and learning disabilities, learn language faster.

"Using these multiple modalities, how can you go wrong? If you're a verbal learner, you pay attention to speech, but if you're a visual or tactile learner, you pay attention to gesture. So, throw the kitchen sink at the kid, and then they'll use whatever it takes to help them learn," Kelly said.

Teaching and learning processes are what attracted Kelly to Colgate in the first place.

"The reason I ended up at Colgate and not a big research university was that Colgate takes teaching and learning very seriously. In addition, it seriously values research. This makes it a great place to do research on the teaching and learning process," Kelly said. "This has obvious implications for educational policy. Unfortunately, there are some educational policies out there that aren't driven by research. It's 'well, that's what we've been doing for 15 years,' or 'that just makes sense.' That's a bad way to drive educational policy, because what's at stake here are kids, and they're more important than reputation, money, and ego."
