AI RECOGNITION OF JAPANESE SIGN LANGUAGE AND ITS APPLICATION
Tsutomu Kimura, Teppei Miura, Kazuyuki Kanda
Pages: 1-10
Published: 22 Oct 2024
DOI: 10.62991/LIS1996383664
Abstract: We are researching machine recognition of Japanese Sign Language (JSL) using deep learning. JSL consists of manual parameters, i.e., handshape, location, and movement of the hands, as well as non-manual markers such as facial expressions and posture. JSL also has unique morphemes called CL (classifiers), and some signs use fingerspelling. Understanding JSL requires recognizing all of these elements, which makes the task complex. We therefore limit our first target to simple signs and simple sentences. In our research, we use a Conformer (a combination of CNN and Transformer) together with CTC, both commonly used in speech recognition. The system was trained on 170 types of signs with 100 tokens each, and the recognition accuracy was about 95%. For sign language sentences, unlike isolated sign recognition, the transitions between words make recognition harder; our system achieved over 58% accuracy on sentences composed of 96 types of signs.
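The abstract mentions CTC, which lets a frame-by-frame classifier emit a sign sequence without frame-level alignment. As a minimal illustration only (the paper does not specify its decoder, and the integer sign labels below are hypothetical), greedy CTC decoding collapses repeated per-frame labels and drops the blank symbol:

```python
# Greedy CTC decoding sketch: collapse repeated labels, then drop blanks.
# Label 0 is the CTC blank; other integers stand in for sign classes.
BLANK = 0

def ctc_greedy_decode(frame_labels):
    """Collapse per-frame argmax labels into a sign sequence."""
    decoded = []
    prev = None
    for lab in frame_labels:
        # Emit a label only when it differs from the previous frame
        # and is not the blank symbol.
        if lab != prev and lab != BLANK:
            decoded.append(lab)
        prev = lab
    return decoded

# e.g. blank, sign 3 held for 3 frames, blank, sign 7 held for 2 frames
frames = [0, 3, 3, 3, 0, 7, 7]
print(ctc_greedy_decode(frames))  # → [3, 7]
```

The blank symbol is what allows the same sign to appear twice in a row (blank-separated repeats are kept, adjacent repeats are merged), which is one way CTC handles the word-transition ambiguity the abstract describes.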
Keywords: sign language, ai recognition, sign language engineering, sign language dictionary
Cite this article: Tsutomu Kimura, Teppei Miura, Kazuyuki Kanda. AI RECOGNITION OF JAPANESE SIGN LANGUAGE AND ITS APPLICATION. Journal of International Scientific Publications: Language, Individual & Society 18, 1-10 (2024). https://doi.org/10.62991/LIS1996383664
© 2025 The Author(s). This is an open access article distributed under the terms of the
Creative Commons Attribution License https://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. This permission does not cover any third party copyrighted material which may appear in the work requested.
Disclaimer: The Publisher and/or the editor(s) are not responsible for the statements, opinions, and data contained in any published works. These are solely the views of the individual author(s) and contributor(s). The Publisher and/or the editor(s) disclaim any liability for injury to individuals or property arising from the ideas, methods, instructions, or products mentioned in the content.