Multisensory Emotion Perception and American Sign Language Proficiency
Open Access
- Author:
- Mingle, Sydney
- Area of Honors:
- Psychology
- Degree:
- Bachelor of Science
- Document Type:
- Thesis
- Thesis Supervisors:
- Kathryn Suzanne Scherf, Thesis Supervisor
- Jeff M Love, Thesis Honors Advisor
- Keywords:
- American Sign Language
- Emotion Perception
- Emotion Recognition
- Abstract:
- Emotion recognition is a crucial element of communication that underlies interpersonal skills and empathy. Emotions can be expressed through facial expressions, posture, gesture, and tone of voice. American Sign Language (ASL) is a visual-manual language that relies heavily on facial expressions and gestures both to convey emotional information during communication and to mark grammatical and syntactic properties. Thus, we suspect that frequent use of and exposure to ASL may sharpen users' perceptual ability to identify emotions from facial expressions. Previous studies have found that ASL experience provides an advantage in both encoding and decoding facial expressions of emotion (Goldstein & Feldman, 1996; Goldstein, Sexton, & Feldman, 2000). However, the existing research examines emotion recognition from the face only. No study has compared ASL users' performance on both facial and vocal emotion recognition tasks, nor has the perception of affect been investigated more generally. The present study compares the emotion recognition performance of ASL users and non-ASL users in two affective conditions. Given ASL users' experience in extracting affective information from visual input, we hypothesize that they will outperform non-ASL users on the face task but show no advantage on the voice task.