Event Extraction Method Based on Dual Attention Mechanism
Author: Zhu Min, Mao Yingchi, Cheng Yong, Chen Chengjun, Wang Longbao

CLC Number: TP18

    Abstract:

    To address the underuse of syntactic relations and the problem of missing argument roles in event extraction, an event extraction method based on a dual attention mechanism (EEDAM) is proposed to improve the accuracy and recall of event extraction. First, sentences are encoded with four embedding vectors, and dependency relations are introduced to build a dependency graph, so that the deep neural network can fully exploit syntactic relations. Then, a graph transformer attention network generates new dependency arcs and aggregates node information to capture long-range dependencies and potential interactions; a weighted attention network is integrated to capture the key semantic information in each sentence, and sentence-level event arguments are extracted to improve the model's predictive ability. Finally, key-sentence detection and similarity ranking are used to fill in document-level arguments. Experimental results show that, on the ACE 2005 dataset, EEDAM improves accuracy, recall, and F1-score by 17.82%, 4.61%, and 9.80%, respectively, over the strongest baseline, the joint multiple Chinese event extractor (JMCEE). On a dataset of dam safety operation records, accuracy, recall, and F1-score are 18.08%, 4.41%, and 9.93% higher than JMCEE, respectively.
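The two attention stages the abstract describes — attention over dependency-graph arcs to aggregate node information, followed by weighted attention to pool the key semantic content of a sentence — can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the authors' implementation; the parameters `W`, `a`, and `v` stand in for learned weights.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(h, adj, W, a):
    """One graph-attention layer: each token attends only to its
    dependency-graph neighbours (masked by `adj`) and aggregates
    their transformed features."""
    z = h @ W                                   # (n, d') transformed node states
    n = z.shape[0]
    # pairwise attention logits e_ij = a^T [z_i ; z_j]
    logits = np.array([[a @ np.concatenate([z[i], z[j]]) for j in range(n)]
                       for i in range(n)])
    logits = np.where(adj > 0, logits, -1e9)    # keep only dependency arcs
    alpha = softmax(logits, axis=-1)            # attention over neighbours
    return alpha @ z                            # aggregated node representations

def attention_pool(h, v):
    """Weighted-attention pooling: score each token against vector `v`
    and return the attention-weighted sentence representation."""
    scores = softmax(h @ v)
    return scores @ h

# toy example: 3 tokens, 4-dim embeddings, tridiagonal dependency graph
rng = np.random.default_rng(0)
h = rng.normal(size=(3, 4))
adj = np.eye(3) + np.diag([1, 1], 1) + np.diag([1, 1], -1)
nodes = graph_attention(h, adj, rng.normal(size=(4, 3)), rng.normal(size=6))
sentence = attention_pool(nodes, rng.normal(size=3))
```

In a full model, the dependency graph would come from a parser and both attention stages would be trained end to end; here random weights only demonstrate the data flow.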

    References
    [1] Sha L, Qian F, Chang BB, Sui ZF. Jointly extracting event triggers and arguments by dependency-bridge RNN and tensor-based argument interaction. In: Proc. of the 32nd AAAI Conf. on Artificial Intelligence. New Orleans: AAAI, 2018. 5916–5923.
    [2] Balali A, Asadpour M, Campos R, Jatowt A. Joint event extraction along shortest dependency paths using graph convolutional networks. Knowledge-based Systems, 2020, 210: 106492. [doi: 10.1016/j.knosys.2020.106492]
    [3] Kipf TN, Welling M. Semi-supervised classification with graph convolutional networks. In: Proc. of the 5th Int’l Conf. on Learning Representations. Toulon: OpenReview.net, 2017.
    [4] Ma J, Wang S, Anubhai R, Ballesteros M, Al-Onaizan Y. Resource-enhanced neural model for event argument extraction. In: Proc. of the Findings of the Association for Computational Linguistics: EMNLP 2020. Online: ACL, 2020. 3554–3559.
    [5] Ahmad WU, Peng NY, Chang KW. GATE: Graph attention transformer encoder for cross-lingual relation and event extraction. In: Proc. of the 35th AAAI Conf. on Artificial Intelligence. Online: AAAI, 2021. 12462–12470.
    [6] Liu J, Chen YB, Liu K, Zhao J. Event detection via gated multilingual attention mechanism. In: Proc. of the AAAI Conf. on Artificial Intelligence. New Orleans: AAAI, 2018. 4865–4872.
    [7] Yang H, Chen YB, Liu K, Xiao Y, Zhao J. DCFEE: A document-level Chinese financial event extraction system based on automatically labeled training data. In: Proc. of the 56th Annual Meeting of the Association for Computational Linguistics-system Demonstration. Melbourne: ACL, 2018. 50–55.
    [8] Chen P, Yang H, Liu K, Huang RH, Chen YB, Wang TF, Zhao J. Reconstructing event regions for event extraction via graph attention networks. In: Proc. of the 1st Conf. of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th Int’l Joint Conf. on Natural Language Processing. Suzhou: AACL, 2020. 811–820.
    [9] Chen YB, Xu LH, Liu K, Zeng DJ, Zhao J. Event extraction via dynamic multi-pooling convolutional neural networks. In: Proc. of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th Int’l Joint Conf. on Natural Language Processing. Beijing: ACL, 2015. 167–176.
    [10] Zeng Y, Yang HH, Feng YS, Wang Z, Zhao DY. A convolution BiLSTM neural network model for Chinese event extraction. In: Proc. of the 5th CCF Conf. on Natural Language Processing and Chinese Computing, NLPCC 2016, and 24th Int’l Conf. on Computer Processing of Oriental Languages. Kunming: Springer, 2016. 275–287.
    [11] Wang XZ, Wang ZQ, Han X, Liu ZY, Li JZ, Li P, Sun MS, Zhou J, Ren X. HMEAE: Hierarchical modular event argument extraction. In: Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. Hong Kong: ACL, 2019. 5776–5782.
    [12] Veyseh APB, Nguyen TN, Nguyen TH. Graph transformer networks with syntactic and semantic structures for event argument extraction. In: Proc. of the Findings of the Association for Computational Linguistics: EMNLP 2020. Online: ACL, 2020. 3651–3661.
    [13] Yan H, Qiu XP, Huang XJ. A graph-based model for joint Chinese word segmentation and dependency parsing. Trans. of the Association for Computational Linguistics, 2020, 8: 78–92. [doi: 10.1162/tacl_a_00301]
    [14] Wu Y, Zhang JY. Chinese event extraction based on attention and semantic features: A bidirectional circular neural network. Future Internet, 2018, 10(10): 95. [doi: 10.3390/fi10100095]
    [15] Liu X, Luo ZC, Huang HY. Jointly multiple events extraction via attention-based graph information aggregation. In: Proc. of the 2018 Conf. on Empirical Methods in Natural Language Processing. Brussels: ACL, 2018. 1247–1256.
    [16] Nguyen TH, Cho K, Grishman R. Joint event extraction via recurrent neural networks. In: Proc. of the 2016 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. San Diego: ACL, 2016. 300–309.
    [17] Xu N, Xie HH, Zhao DY. A novel joint framework for multiple Chinese events extraction. In: Proc. of the 19th Chinese National Conf. on Chinese Computational Linguistics. Haikou: Springer, 2020. 174–183.
    [18] Kim Y. Convolutional neural networks for sentence classification. In: Proc. of the 2014 Conf. on Empirical Methods in Natural Language Processing. Doha: ACL, 2014. 1746–1751.
    [19] Ji YZ, Lin YF, Gao JW, Wan HY. Exploiting the entity type sequence to benefit event detection. In: Proc. of the 23rd Conf. on Computational Natural Language Learning. Hong Kong: ACL, 2019. 613–623.
    [20] Devlin J, Chang MW, Lee K, Toutanova K. BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proc. of the 2019 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis: ACL, 2019. 4171–4186.
    [21] Lan ZZ, Chen MD, Goodman S, Gimpel K, Sharma P, Soricut R. ALBERT: A lite BERT for self-supervised learning of language representations. In: Proc. of the 8th Int’l Conf. on Learning Representations. Addis Ababa: ICLR, 2020.
    [22] Feng XC, Qin B, Liu T. A language-independent neural network for event detection. Science China Information Sciences, 2018, 61(9): 092106. [doi: 10.1007/s11432-017-9359-x]
    [23] Cui SY, Yu BW, Liu TW, Zhang ZY, Wang XB, Shi JQ. Edge-enhanced graph convolution networks for event detection with syntactic relation. In: Proc. of the Findings of the Association for Computational Linguistics: EMNLP 2020. Online: ACL, 2020. 2329–2339.
    [24] Yun S, Jeong M, Kim R, Kang J, Kim HJ. Graph transformer networks. In: Proc. of the 33rd Int’l Conf. on Neural Information Processing Systems. Vancouver: NIPS, 2019. 32.
    [25] Veličković P, Cucurull G, Casanova A, Romero A, Liò P, Bengio Y. Graph attention networks. In: Proc. of the Int’l Conf. on Learning Representations. Vancouver: OpenReview.net, 2018.
    [26] Lin TY, Goyal P, Girshick R, He KM, Dollár P. Focal loss for dense object detection. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2020, 42(2): 318–327. [doi: 10.1109/TPAMI.2018.2858826]
    [27] Mikolov T, Chen K, Corrado G, Dean J. Efficient estimation of word representations in vector space. In: Proc. of the 1st Int’l Conf. on Learning Representations. Scottsdale: ICLR, 2013.
    [28] Mueller J, Thyagarajan A. Siamese recurrent architectures for learning sentence similarity. In: Proc. of the 30th AAAI Conf. on Artificial Intelligence. Phoenix: AAAI, 2016. 2789–2792.
    [29] Xiang W, Wang B. A survey of event extraction from text. IEEE Access, 2019, 7: 173111–173137. [doi: 10.1109/ACCESS.2019.2956831]
Get Citation

Zhu M, Mao YC, Cheng Y, Chen CJ, Wang LB. Event extraction method based on dual attention mechanism. Journal of Software, 2023, 34(7): 3226–3240 (in Chinese).
History
  • Received: June 05, 2021
  • Revised: August 05, 2021
  • Online: May 24, 2022
  • Published: July 06, 2023
Copyright: Institute of Software, Chinese Academy of Sciences. Beijing ICP No. 05046678-4