BERT+vnKG: Using Deep Learning and Knowledge Graph to Improve Vietnamese Question Answering System
M.Sc. Phan Hồ Viết Trường, Do Phuc
Faculty of Information Technology
Category: Article
Abstract: Question answering (QA) systems based on natural language processing and deep learning are a prominent and widely researched area. The Long Short-Term Memory (LSTM) model, a variant of the Recurrent Neural Network (RNN), used to be popular in machine translation and question answering. However, that model has certain limitations, and a newer model, Bidirectional Encoder Representations from Transformers (BERT), emerged to address them. BERT offers more advanced capabilities than LSTM and has achieved state-of-the-art results in many tasks over the past few years, especially in multilingual question answering. Nevertheless, when we applied the multilingual BERT model to a Vietnamese QA system, we found that it still has limitations in terms of response time and precision when returning a Vietnamese answer. The purpose of this study is to propose a method that addresses these restrictions of multilingual BERT and to apply it to a question answering system about tourism in Vietnam. Our method combines BERT with a knowledge graph to improve accuracy and speed up answer retrieval. We evaluated three models on our hand-crafted QA dataset about Vietnam tourism: LSTM, multilingual BERT fine-tuned for QA (BERT for QA), and BERT+vnKG. Our model outperformed the two previous models in terms of both accuracy and response time. This research can also be applied to other fields such as finance and e-commerce.
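The abstract does not describe the implementation details of BERT+vnKG, but one common way to combine a knowledge graph with a BERT reader is to let the graph retrieve a short, entity-specific context passage and then run extractive QA only over that passage. The sketch below illustrates this idea under stated assumptions; the toy triples, the helper kg_context, and the model checkpoint name are placeholders, not the authors' system.

```python
# Minimal sketch (not the authors' implementation): a small knowledge graph
# retrieves an entity-specific context, and a BERT QA model extracts the
# answer span from that short context instead of scanning a whole corpus.
from transformers import pipeline

# Toy "knowledge graph" about Vietnam tourism: entity -> relation -> object.
# These triples are illustrative placeholders.
KG = {
    "Hạ Long Bay": {
        "located in": "Quảng Ninh Province",
        "recognized as": "a UNESCO World Heritage Site",
    },
    "Hội An": {
        "located in": "Quảng Nam Province",
        "famous for": "its well-preserved ancient town",
    },
}

def kg_context(question: str) -> str:
    """Return only the triples whose entity appears in the question,
    verbalized as short sentences that the BERT reader can scan quickly."""
    sentences = []
    for entity, relations in KG.items():
        if entity.lower() in question.lower():
            for relation, value in relations.items():
                sentences.append(f"{entity} is {relation} {value}.")
    return " ".join(sentences)

# A multilingual BERT checkpoint fine-tuned for extractive QA; the name below
# is a placeholder and must be replaced with an actual fine-tuned model.
qa = pipeline("question-answering", model="path/to/multilingual-bert-qa")

question = "Which province is Hạ Long Bay located in?"
context = kg_context(question)  # the knowledge graph narrows the search space
print(qa(question=question, context=context)["answer"])
```

Restricting the reader to a graph-derived context is one plausible explanation for the reported gains in both speed (less text to encode) and precision (fewer distractor passages); the paper's actual mechanism may differ.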