Abstract

Question Answering (QA) is a crucial task in Natural Language Processing (NLP) and information retrieval. Users often hope such systems can assist in everyday life by answering questions the way a person would. QA applies NLP techniques to generate a correct answer to a given question based on a given context or on knowledge drawn from a massive unstructured corpus. Binary question answering (Binary QA) involves providing binary answers (yes/no, true/false) to questions posed in natural language. Deep learning technologies have played a pivotal role in advancing the state of the art in QA systems, enabling them to understand and respond to questions. This paper proposes a hybrid attention mechanism-based binary question answering model that integrates two deep learning techniques: Bi-LSTM and Bi-GRU. The attention mechanism is applied to the outputs of the Bi-LSTM and Bi-GRU so that the model pays more or less attention to different words in the question and passage, allowing the question to focus on a specific part of the candidate answer. Experiments were conducted on the BoolQ dataset. The hybrid of Bi-LSTM and Bi-GRU with attention achieves an accuracy of 0.8783, improving on the accuracy obtained when using only Bi-LSTM or only Bi-GRU.
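The core idea of applying attention to the encoder outputs can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the scoring vector, dimensions, and random stand-ins for the Bi-LSTM and Bi-GRU outputs are assumptions for illustration. Each time step's hidden state is scored, the scores are normalized with softmax into attention weights, and the weighted sum gives a context vector; the two context vectors are concatenated into a hybrid feature for the yes/no classifier.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(hidden, w):
    """Attention pooling over encoder outputs.

    hidden: (seq_len, dim) outputs of a (bidirectional) RNN
    w:      (dim,) learned scoring vector (assumed; one of several
            common attention parameterizations)
    """
    scores = hidden @ w          # one relevance score per time step
    alpha = softmax(scores)      # attention weights, sum to 1
    context = alpha @ hidden     # weighted sum of hidden states
    return context, alpha

rng = np.random.default_rng(0)
seq_len, dim = 12, 8
lstm_out = rng.standard_normal((seq_len, dim))  # stand-in for Bi-LSTM outputs
gru_out = rng.standard_normal((seq_len, dim))   # stand-in for Bi-GRU outputs
w = rng.standard_normal(dim)

ctx_lstm, a_lstm = attention_pool(lstm_out, w)
ctx_gru, a_gru = attention_pool(gru_out, w)

# Hybrid feature: concatenated context vectors, fed to a binary classifier
features = np.concatenate([ctx_lstm, ctx_gru])
```

In the full model, `w` would be trained jointly with the encoders, and `features` would pass through a dense layer with a sigmoid to produce the yes/no prediction.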

Keywords

Attention Mechanism, Bi-GRU, Bi-LSTM, NLP, Question Answering, RNN, Textual Question

Subject Area

Computer Science

Article Type

Article

First Page

2402

Last Page

2411

Creative Commons License

Creative Commons Attribution 4.0 International License
This work is licensed under a Creative Commons Attribution 4.0 International License.
