Browsing by Author "Shevchenko, Olga"
Item Open Access
AI Language Learning Apps (Igor Sikorsky Kyiv Polytechnic Institute, 2024-05-15) Shevchenko, Olga; Ogurtsova, Olga; Uzhakov, Nikita
AI-powered language learning apps have revolutionized the way people acquire new languages. By using artificial intelligence, these apps offer personalized and interactive learning experiences that take individual proficiency levels and learning styles into account. They provide immediate feedback, adaptive exercises, and innovative methods in language learning. With the continuous development of AI technologies, these apps are becoming increasingly effective tools for language learners. AI language learning apps will continue to reshape language education, making it more accessible, engaging, and efficient.

Item Open Access
Basic Principles and Limitations of Neural Machine Translation (Східноєвропейський національний університет ім. Лесі Українки, 2022) Shevchenko, Olga; Ogurtsova, Olga
The article describes some problems of Neural Machine Translation (NMT) and the role of neural networks in the translation process. The mechanism of neural machine translation, its specific features, its differences from other machine translation systems, and the system's limitations are also analyzed. NMT systems use artificial neural networks that are trained on a large number of pairs of parallel sentences (‘parallel corpora’). These networks can read a word or a sentence in the source language and translate it into a target language. However, word matching and breakdown into phrases are no longer needed. This seems to be the main difference between NMT systems and other machine translation systems, such as Rule-based or Statistical MT. To create an NMT system, one must have several million pairs of sentences translated by human translators. All modern NMT systems are equipped with encoder-decoder and ‘attention’ mechanisms. The unique role of the ‘attention’ mechanism is to predict subsequent words during the translation process: while focusing on one or more words of the original sentence, it adds this information to the encoded full text. This process is similar to the behavior of a human translator who first reads the entire sentence and then looks at individual source words and at phrases already translated or yet to be translated. Despite their advantages, such as fluency, NMT systems have a number of drawbacks. The most frequent are adequacy errors, as well as omissions and additions of content. Transfer of semantic content from the source to the target language often produces mistranslations. The source phrases need to be very clear, coherent, and free of ambiguity to prevent low-quality output.
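
As a rough illustration of the ‘attention’ idea described in the second abstract (a minimal sketch, not code from the article; the function names, dimensions, and random data are hypothetical), the decoder scores every encoded source word against its current state and forms a weighted summary of the source before predicting the next target word:

# Illustrative sketch of attention over encoded source words (assumed setup, not from the article).
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(query, encoder_states):
    # query: (d,) current decoder state; encoder_states: (src_len, d) encoded source words
    scores = encoder_states @ query / np.sqrt(query.shape[0])  # one relevance score per source word
    weights = softmax(scores)                                  # how strongly each source word is attended to
    context = weights @ encoder_states                         # weighted summary passed to the decoder
    return context, weights

# Toy example: 5 source words, 8-dimensional hidden states (random values for illustration only).
rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))
ctx, w = attention(rng.normal(size=8), enc)
print(w.round(3), ctx.shape)

The resulting weights indicate which source words the model focuses on at each step, much like the human translator in the abstract glancing back at particular words of the source sentence while producing the translation.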