
Recent Developments in Tibetan NLP

Published: 23 April 2021

References

  1. C. Faggionato and M. Meelen. 2019. Developing the Old Tibetan treebank. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP'19). 304–312. DOI: 10.26615/978-954-452-056-4_035
  2. N. W. Hill and D. Jiang. 2016. Introduction (to special issue on Tibetan natural language processing). Himal. Ling. 15, 1 (2016), 1–11.
  3. N. W. Hill and M. Meelen. 2017. Segmenting and POS tagging Classical Tibetan using a memory-based tagger. Himal. Ling. 16, 2 (2017), 64–86.
  4. T. Jiang, H. Sun, Y. G. Dai, and D. Liu. 2020. Tibetan-Chinese neural machine translation combining attention mechanism. Journal of Physics: Conference Series. IOP Publishing.
  5. Y. Li, J. Li, M. Zhang, Y. Li, and P. Zou. 2020. Improving neural machine translation with linear interpolation of a short-path unit. ACM Trans. Asian Low-Resour. Lang. Inf. Process. 19, 3 (2020). DOI: 10.1145/3377851
  6. H. Liu and C. Long. 2018. CTTC: A collection of Tibetan text corpora. In Proceedings of the 11th International Conference on Language Resources and Evaluation (LREC'18), E. Yang and L. Sun (Eds.).
  7. W. Ma and K. Zhao. 2019. Tibetan location name recognition using Tibetan-Chinese cross-lingual word embeddings. In Proceedings of the International Conference on Machine Learning, Big Data and Business Intelligence (MLBDBI'19). 384–387. DOI: 10.1109/MLBDBI48998.2019.00087
  8. L. Ma, C. Long, L. Duan, X. Zhang, Y. Li, and Q. Zhao. 2020. Segmentation and recognition for historical Tibetan document images. IEEE Access 8 (2020), 52641–52651. DOI: 10.1109/ACCESS.2020.2975023
  9. M. Meelen and É. Roux. 2020. Meta-dating the PArsed Corpus of Tibetan (PACTib). In Proceedings of the 19th International Workshop on Treebanks and Linguistic Theories. Association for Computational Linguistics, 31–42.
  10. M. Meelen, É. Roux, and N. Hill. 2021. Optimisation of the largest annotated Tibetan corpus combining rule-based, memory-based and deep-learning methods. ACM Trans. Asian Low-Resour. Lang. Inf. Process. DOI: 10.17863/CAM.57600
  11. D. Schmidt. 2020. Grading Tibetan children's literature: A test case using the NLP readability tool "Dakje". ACM Trans. Asian Low-Resour. Lang. Inf. Process. 19, 6 (2020). DOI: 10.1145/3392046
  12. F. Wan and J. Xia. 2017. Tibetan information extraction technology integrated with event feature and semantic role labelling. MATEC Web Conf. 128 (2017), 01016. DOI: 10.1051/matecconf/201712801016
  13. L. Wang and H. Yang. 2018. Tibetan word segmentation method based on BiLSTM-CRF model. In Proceedings of the International Conference on Asian Language Processing (IALP'18). 297–302. DOI: 10.1109/IALP.2018.8629257
  14. X. Wang, W. Wang, Z. Li, Y. Wang, Y. Han, and Z. Hao. 2018. A recognition method of the similarity character for Uchen-script Tibetan historical documents based on DNN. In Pattern Recognition and Computer Vision, Z. Lai, C.-L. Liu, X. Chen, J. Zhou, T. Tan, N. Zheng, and H. Zha (Eds.). Springer International Publishing, 52–62.
  15. W. Xia and C. Huaque. 2019. Dependency-tree-based Tibetan semantic dependency analysis. J. Tsing. Univ. (Sci. Technol.) 59, 9 (2019), 750–756.
  16. Y. Zhao, J. Yue, X. Xu, L. Wu, and X. Li. 2019. End-to-end-based Tibetan multitask speech recognition. IEEE Access 7 (2019), 162519–162529. DOI: 10.1109/ACCESS.2019.2952406
  17. Y. Zhao, X. Xu, J. Yue, W. Song, Z. Li, L. Wu, and Q. Ji. 2020. An open speech resource for Tibetan multi-dialect and multitask recognition. Int. J. Comput. Sci. Eng. 22, 2/3 (2020), 297. DOI: 10.1504/ijcse.2020.107351
  18. X. Zhu and H. Huang. 2020. End-to-end Amdo-Tibetan speech recognition based on knowledge transfer. IEEE Access 8 (2020), 170991–171000. DOI: 10.1109/ACCESS.2020.3023783



Published in

ACM Transactions on Asian and Low-Resource Language Information Processing, Volume 20, Issue 2 (March 2021), 313 pages
ISSN: 2375-4699; EISSN: 2375-4702
DOI: 10.1145/3454116

Copyright © 2021 held by the owner/author(s).

Publisher

Association for Computing Machinery, New York, NY, United States


              Qualifiers

              • research-article
              • Refereed
