Projects

Project Description

When we speak, we often use hand gestures to convey information. In neuroatypical adults, such as those with post-stroke aphasia (a language disorder that affects producing and understanding speech), co-speech manual gestures are often used to disambiguate or replace speech, helping to compensate for speech and language production difficulties. Individuals produce various types of hand gestures during speech. Iconic gestures have a close relationship to the semantic content of speech, which is why we are particularly interested in their use by individuals with and without aphasia during storytelling. Our ongoing research aims to identify (1) how often individuals with and without brain injury employ gesture during storytelling, (2) how gesturing changes according to the type of story being told (for example, stories about oneself versus a fictional story), and (3) how consistent gestures are across retellings (i.e., from one day to another). This research project will give students skills in identifying and coding gestures using software called ELAN, as well as the experience of being part of a much larger study team working on this project. It will also enable a self-motivated student to develop their own research questions and explore other interesting directions with the data.

Technology or Computational Component

We use ELAN, a free annotation tool, to analyze pre-collected video samples of individuals telling stories. We will provide training in ELAN; no prior experience is necessary.
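
As an optional illustration (not a project requirement), the sketch below shows how annotations coded in ELAN might later be analyzed programmatically. It assumes Python with the third-party pympi-ling library and a hypothetical annotation tier named "IconicGestures"; the project itself only requires working within ELAN.

    # Minimal sketch: read an ELAN .eaf file and summarize coded gestures.
    # Assumes the pympi-ling package (pip install pympi-ling) and a tier
    # named "IconicGestures" created during coding; both are assumptions,
    # not part of the project description.
    import pympi

    # Load an ELAN annotation file (hypothetical file name).
    eaf = pympi.Eaf("storytelling_session.eaf")

    # Each annotation on a tier is returned as (start_ms, end_ms, label, ...).
    gestures = eaf.get_annotation_data_for_tier("IconicGestures")

    print(f"Number of iconic gestures coded: {len(gestures)}")
    total_ms = sum(end - start for start, end, *_ in gestures)
    print(f"Total gesturing time: {total_ms / 1000:.1f} seconds")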