A new study recognizes American Sign Language (ASL) alphabet gestures using computer vision, in what its authors describe as a first-of-its-kind approach. The researchers built a custom dataset of 29,820 static images of ASL hand gestures, annotating each image with 21 key landmarks that capture the hand's structure and spatial position. They then trained a deep learning model that combines MediaPipe with YOLOv8, fine-tuning hyperparameters for the best accuracy, a pairing the authors report has not been explored in previous research.
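MediaPipe Hands emits 21 landmarks per detected hand, with index 0 at the wrist. As an illustrative sketch only (not the study's actual code), a common preprocessing step for landmark-based gesture classifiers is to make the coordinates translation- and scale-invariant by re-centering on the wrist and normalizing:

```python
from typing import List, Tuple

Landmark = Tuple[float, float]  # (x, y) image coordinates

def normalize_landmarks(landmarks: List[Landmark]) -> List[Landmark]:
    """Translate the 21 hand landmarks so the wrist (index 0) sits at the
    origin, then scale so the largest coordinate magnitude is 1.

    Hypothetical helper for illustration; MediaPipe Hands returns 21
    landmarks per hand, with landmark 0 being the wrist.
    """
    if len(landmarks) != 21:
        raise ValueError("expected 21 hand landmarks")
    wx, wy = landmarks[0]
    # Re-center on the wrist so position in the frame no longer matters.
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    # Scale so hand size (distance from wrist) no longer matters.
    scale = max(max(abs(x), abs(y)) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]
```

A classifier trained on such normalized vectors sees the same input for the same hand shape regardless of where the hand appears in the image or how close it is to the camera, which is one reason landmark features pair well with an image-level detector like YOLOv8.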
Tag: Sign Language
The Self-Taught Vocabulary of Homesigning Deaf Children Supports Universal Constraints on Language
Thousands of languages spoken throughout the world draw on many of the same fundamental linguistic abilities and reflect universal aspects of how humans categorize events. Some aspects of language may also be universal to people who create their own sign languages.
3D hand-sensing wristband signals future of wearable tech
In a potential breakthrough in wearable sensing technology, researchers from Cornell University and the University of Wisconsin, Madison, have designed a wrist-mounted device that continuously tracks the entire human hand in 3D.