A study is the first of its kind to recognize American Sign Language (ASL) alphabet gestures using computer vision. The researchers built a custom dataset of 29,820 static images of ASL hand gestures, annotating each image with 21 key landmarks on the hand to capture detailed spatial information about its structure and position. Their approach combines MediaPipe for hand-landmark extraction with a YOLOv8 deep learning model they trained, fine-tuning its hyperparameters for the best accuracy, a combination the authors describe as groundbreaking and unexplored in previous research.
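For readers who want a sense of how such a pipeline fits together, the minimal sketch below shows how the two components the study mentions are typically combined in Python. It is not the authors' code: the image path and the fine-tuned weights file are illustrative assumptions, and only the standard MediaPipe Hands and Ultralytics YOLOv8 APIs are used.

```python
# Sketch only: extract the 21 MediaPipe hand landmarks from a static ASL image
# and run a YOLOv8 model on the same image. The file names ("asl_letter_a.jpg",
# "asl_yolov8.pt") are hypothetical placeholders, not assets from the study.
import cv2
import mediapipe as mp
from ultralytics import YOLO

image_bgr = cv2.imread("asl_letter_a.jpg")            # one static gesture image
image_rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)

# 21 hand landmarks via MediaPipe Hands (static-image mode, one hand).
with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    result = hands.process(image_rgb)
    if result.multi_hand_landmarks:
        landmarks = [(lm.x, lm.y, lm.z)
                     for lm in result.multi_hand_landmarks[0].landmark]
        print(f"{len(landmarks)} landmarks extracted")  # -> 21

# YOLOv8 inference with a hypothetical fine-tuned ASL weights file.
model = YOLO("asl_yolov8.pt")
prediction = model(image_bgr)[0]
print(prediction.boxes)  # detected gesture classes and bounding boxes
```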
Tag: human-computer interaction
Just believing that an AI is helping boosts your performance
New research suggests that people perform better if they think they have an AI assistant – even when they’ve been told it’s unreliable and won’t help them.
JMIR Neurotechnology Invites Submissions on Brain-Computer Interfaces (BCIs)
JMIR Publications is pleased to announce a new theme issue in JMIR Neurotechnology exploring brain-computer interfaces (BCIs), which represent a transformative convergence of neuroscience, engineering, and technology.
Designers find better solutions with computer assistance, but sacrifice creative touch
A computer-guided approach to design can propose more solutions and balance out human inexperience and design fixation.