Breaking Barriers: Study Uses AI to Interpret American Sign Language in Real Time

A first-of-its-kind study uses computer vision to recognize American Sign Language (ASL) alphabet gestures. The researchers built a custom dataset of 29,820 static images of ASL hand gestures, annotating each image with 21 key landmarks that capture detailed spatial information about the hand's structure and position. Their method combines MediaPipe's hand tracking with a YOLOv8 deep learning model they trained, fine-tuning its hyperparameters for the best accuracy, a groundbreaking approach that had not been explored in previous research.
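For readers curious what the landmark-annotation step might look like in practice, here is a minimal Python sketch using MediaPipe's Hands solution to extract the 21 keypoints from a single static image. The file path is an illustrative assumption, and the article does not publish the study's code, so this is only a sketch of the general technique, not the researchers' implementation.

```python
import cv2
import mediapipe as mp

# Hypothetical input path; the study's 29,820-image dataset is not linked here.
IMAGE_PATH = "asl_letter_a.jpg"

mp_hands = mp.solutions.hands

# static_image_mode=True treats each image independently, which suits a
# dataset of static gesture photos rather than a live video stream.
with mp_hands.Hands(static_image_mode=True,
                    max_num_hands=1,
                    min_detection_confidence=0.5) as hands:
    image = cv2.imread(IMAGE_PATH)
    # MediaPipe expects RGB input; OpenCV loads images as BGR.
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    if results.multi_hand_landmarks:
        # Each detected hand yields the 21 landmarks the article describes,
        # as normalized (x, y, z) coordinates.
        for hand_landmarks in results.multi_hand_landmarks:
            keypoints = [(lm.x, lm.y, lm.z) for lm in hand_landmarks.landmark]
            print(f"Extracted {len(keypoints)} landmarks")  # 21
```

Landmark annotations like these could then serve as training labels for the fine-tuned YOLOv8 model the study describes; the exact integration between the two components is not detailed in the article.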

Language of Care: University of Utah Health Researchers Co-Design Health Care With the Deaf Community

Navigating health care is hard enough when English is your first language; imagine the difficulty when American Sign Language is your first language. How can we bridge the linguistic and cultural gaps to better care for these patients? University of Utah Health is proud to present Language of Care, an incredible short film about how a community of Deaf patients is breaking barriers by co-designing their own care with U of U Health researchers.