Transfer learning for sign language recognition
Date
2023
Thesis (Ph.D.) - Bogazici University. Institute for Graduate Studies in Science and Engineering, 2023.
Abstract
Sign languages are visual languages that use the hands, arms, and face to communicate concepts. In the last decade, sign language recognition (SLR) research has made significant progress but still requires massive amounts of data to recognize signs. Despite efforts to create large annotated sign language datasets, applications that can translate for ordinary users in daily settings have yet to be produced. Most SLR research focuses on a few popular sign languages, leaving most others, especially Turkish Sign Language (TID), under-resourced for sign language technology development. This dissertation addresses several open research questions about the development of SLR technology for TID from several perspectives. We generated BosphorusSign22k, an isolated SLR dataset for TID with 22k videos, and benchmarked state-of-the-art techniques on it. We proposed aligned temporal accumulative features (ATAF) to efficiently model sign language movements as dynamic and static subunits; combined with methods using other modalities, ATAF achieves state-of-the-art performance on BosphorusSign22k. We then used regularized regression-based multi-task learning and presented task-aware canonical time warping for isolated SLR, which aligns and groups signs to minimize discrepancies across different sources and to emphasize class differences. Finally, we established a benchmark for cross-dataset transfer learning in isolated SLR and evaluated supervised transfer learning algorithms using a temporal graph convolution-based SLR method. Experiments with closed-set and partial-set cross-dataset transfer learning reveal a substantial improvement over combined-training and fine-tuning-based baseline techniques.
Keywords: Convolutional neural networks, Image processing -- Computer assisted.
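The cross-dataset transfer setup mentioned in the abstract can be pictured with a short sketch: a temporal graph convolution backbone over skeleton keypoints is trained on a source sign vocabulary and then transferred to a target dataset by replacing the classifier head and fine-tuning only the new parameters (the fine-tuning baseline referred to above). The code below is an illustrative PyTorch sketch, not the dissertation's implementation; the block structure, joint graph, class counts, and layer sizes are assumptions.
```python
# Minimal sketch (illustrative only): a simplified temporal graph convolution
# backbone over skeleton keypoints, with a fine-tuning-style transfer step
# where the classifier head is swapped for a new target sign vocabulary.
import torch
import torch.nn as nn


class TemporalGraphConvBlock(nn.Module):
    """One spatial graph convolution followed by a temporal convolution."""

    def __init__(self, in_channels, out_channels, adjacency):
        super().__init__()
        # Normalized adjacency over skeleton joints (V x V), kept as a buffer.
        self.register_buffer("A", adjacency)
        self.spatial = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.temporal = nn.Conv2d(out_channels, out_channels,
                                  kernel_size=(9, 1), padding=(4, 0))
        self.relu = nn.ReLU()

    def forward(self, x):                             # x: (N, C, T, V)
        x = self.spatial(x)                           # mix channels per joint
        x = torch.einsum("nctv,vw->nctw", x, self.A)  # aggregate over joints
        return self.relu(self.temporal(x))            # convolve over time


class SignClassifier(nn.Module):
    def __init__(self, adjacency, num_classes, in_channels=3):
        super().__init__()
        self.backbone = nn.Sequential(
            TemporalGraphConvBlock(in_channels, 64, adjacency),
            TemporalGraphConvBlock(64, 128, adjacency),
        )
        self.head = nn.Linear(128, num_classes)

    def forward(self, x):                          # x: (N, C, T, V)
        feats = self.backbone(x).mean(dim=(2, 3))  # global average pooling
        return self.head(feats)


if __name__ == "__main__":
    V = 27                                # hypothetical number of joints
    A = torch.eye(V)                      # placeholder skeleton graph
    # "Source" model, e.g. trained on a large isolated SLR dataset.
    model = SignClassifier(A, num_classes=744)

    # Fine-tuning-based transfer: freeze the backbone, replace the head
    # for the target dataset's vocabulary, and train only the new layer.
    for p in model.backbone.parameters():
        p.requires_grad = False
    model.head = nn.Linear(128, 226)      # e.g. a smaller target vocabulary

    dummy = torch.randn(2, 3, 64, V)      # (batch, xyz, frames, joints)
    print(model(dummy).shape)             # torch.Size([2, 226])
```
In a closed-set setting the source and target vocabularies share all classes, whereas in the partial-set setting only a subset overlaps; the sketch above only illustrates the plain fine-tuning baseline against which the dissertation reports its improvements.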