Abstract
Applying artificial intelligence to Chinese language translation in computational linguistics is of practical significance for economic development and cultural exchange. In the present work, a bi-directional long short-term memory (BiLSTM) network is employed to extract Chinese text features, addressing the overlapping semantic roles in Chinese language translation and the hard-to-converge training of high-dimensional text word vectors during text classification. In addition, AlexNet is optimized to extract local text features while updating and learning the parameters of the deep network. An attention mechanism is then introduced to build a forecasting algorithm for Chinese language translation based on BiLSTM and the improved AlexNet. Finally, the forecasting algorithm is simulated to validate its performance, and several state-of-the-art algorithms are selected for comparison: long short-term memory, regions with convolutional neural network features, AlexNet, and support vector machine. Results demonstrate that the proposed forecasting algorithm achieves a feature identification accuracy of 90.55%, at least 4.24% higher than the other algorithms. It also yields an area under the curve above 90%, a training duration of about 54.21 seconds, and a test duration of about 19.07 seconds. Regarding translation performance, the proposed algorithm achieves a bilingual evaluation understudy (BLEU) score of 28.21 on the training set, with a performance gain ratio of 111.55%; on the test set, its BLEU score reaches 40.45, with a performance gain ratio of 129.80%. Hence, the proposed forecasting algorithm is notably superior to the other algorithms and can enhance machine translation performance.
Through these experiments, the Chinese language translation algorithm constructed here improves translation performance while maintaining a high correct identification rate, providing an experimental reference for the subsequent intelligent development of Chinese language translation in computational linguistics.
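The abstract describes fusing a BiLSTM's sequence features with AlexNet-style local features via an attention mechanism. The following is a minimal NumPy sketch of that fusion pattern only; it is not the authors' implementation, and all dimensions, weight matrices, and the additive (Bahdanau-style) attention variant are assumptions chosen for illustration. The convolutional branch is stood in for by a simple max-pool over time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): sequence length T,
# BiLSTM output size d (forward + backward states already concatenated).
T, d = 6, 8

# Stand-in for BiLSTM outputs over the word-vector sequence of a sentence.
H = rng.normal(size=(T, d))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Additive attention over time steps: score each hidden state, normalize,
# and form an attention-weighted summary (context vector).
W = rng.normal(size=(d, d))
v = rng.normal(size=(d,))
scores = np.tanh(H @ W) @ v          # (T,) unnormalized alignment scores
alpha = softmax(scores)              # attention weights, sum to 1
context = alpha @ H                  # (d,) global attention summary

# Stand-in for the AlexNet branch: max-pooling over time, mimicking how a
# convolutional branch would summarize local n-gram features of the text.
local = H.max(axis=0)                # (d,)

# Fuse the two views and score with a simple linear head
# (binary classification shown purely as an example).
features = np.concatenate([context, local])   # (2d,)
w_out = rng.normal(size=(2 * d,))
prob = 1.0 / (1.0 + np.exp(-(features @ w_out)))
print(alpha.round(3), prob)
```

In practice the BiLSTM states and convolutional features would come from trained layers, and the attention parameters `W` and `v` would be learned jointly with the rest of the network rather than drawn at random.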
Index Terms
Deep Learning in Computational Linguistics for Chinese Language Translation