Abstract
Deep neural network models for Uyghur personal pronoun resolution learn semantic information about pronouns and their candidate antecedents, but they tend to be short-sighted: they treat all features as equally important. In this article, we propose a Uyghur personal pronoun resolution model based on an attention mechanism, convolutional neural networks, and gated recurrent units (ATCG). Drawing on the grammatical structure and semantic properties of Uyghur, our model extracts 11 key features for the resolution task. The attention mechanism focuses on the importance of individual words in a sentence, while the gated recurrent unit (GRU) captures interdependent features over long distances. The ATCG model thus compensates for the limitations of relying on content-level features alone and achieves better classification performance. Experimental results on a Uyghur resolution dataset show that our model surpasses state-of-the-art models.
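The abstract does not include code, but the core idea it describes, weighting GRU hidden states by learned word importance before classification, can be sketched in a minimal, dependency-free form. The dot-product scoring, the function names, and the toy vectors below are illustrative assumptions, not the paper's actual implementation:

```python
import math

def softmax(scores):
    # Numerically stable softmax: turns raw attention scores into
    # importance weights that sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, query):
    # hidden_states: one vector per token (e.g., GRU outputs);
    # query: a vector that scores each token's relevance.
    # Returns a single pooled vector emphasizing high-scoring tokens.
    scores = [sum(h_d * q_d for h_d, q_d in zip(h, query))
              for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    pooled = [sum(w * h[d] for w, h in zip(weights, hidden_states))
              for d in range(dim)]
    return pooled, weights

# Toy example: the first token scores much higher than the second,
# so the pooled vector is dominated by the first hidden state.
pooled, weights = attention_pool([[1.0, 0.0], [0.0, 1.0]], [10.0, 0.0])
```

In the full ATCG pipeline, such a pooled representation would be concatenated with CNN-derived local features and the 11 hand-crafted features before the final classification layer; the sketch above covers only the attention step.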
Attention Mechanism for Uyghur Personal Pronouns Resolution