Survey on Event Extraction Based on Deep Learning (基于深度学习的事件抽取研究综述)

Authors: Wang Haochang (王浩畅), Zhou Chenlian (周郴莲), Marius Gabriel PETRESCU

Author biographies:

Wang Haochang (1974-), female, Ph.D., professor. Her main research interests include artificial intelligence, natural language processing, data mining, and bioinformatics. Zhou Chenlian (1995-), female, M.S. Her main research interest is natural language processing. Marius Gabriel PETRESCU (1966-), male, Ph.D., professor. His main research interest is artificial intelligence.

Corresponding author:

Zhou Chenlian, E-mail: chenlian_zhou@163.com

Funding:

National Natural Science Foundation of China (61402099, 61702093)



    Abstract:

    Event extraction aims to automatically extract the event information that users are interested in from unstructured natural language text and to represent it in a structured form. It is an important direction in natural language processing and understanding, with high application value in fields such as government management of public affairs, financial business, and biomedicine. According to the degree of dependence on manually labeled data, current deep learning-based event extraction methods fall into two main categories: supervised and distantly supervised learning. This article provides a comprehensive survey of event extraction techniques in deep learning. Centering on supervised methods built on CNNs, RNNs, GANs, and GCNs, as well as on distantly supervised methods, it systematically summarizes the research of recent years and compares and analyzes the performance of different deep learning models in detail. Finally, the challenges facing event extraction are analyzed, and future research trends are discussed.
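
    To make the task definition above concrete, the following is a minimal Python sketch, not taken from any of the surveyed systems, of the structured form that event extraction is expected to produce for one sentence. The example sentence and the Life.Die / Conflict.Attack event types follow the ACE 2005-style schema commonly used in this literature; the class and field names (EventMention, Argument, trigger, role) are illustrative assumptions rather than an existing API.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Argument:
    text: str  # argument mention as it appears in the sentence
    role: str  # role defined by the event schema, e.g., Victim or Place


@dataclass
class EventMention:
    trigger: str      # word that most directly expresses the event
    event_type: str   # type drawn from a predefined schema such as ACE 2005
    arguments: List[Argument] = field(default_factory=list)


# Unstructured input: a sentence of the kind used in the ACE-style literature.
sentence = ("In Baghdad, a cameraman died when an American tank "
            "fired on the Palestine Hotel.")

# Structured output an event extraction system is expected to return:
# two events, each with a trigger, an event type, and role-labeled arguments.
events = [
    EventMention(
        trigger="died",
        event_type="Life.Die",
        arguments=[Argument("a cameraman", "Victim"),
                   Argument("Baghdad", "Place")],
    ),
    EventMention(
        trigger="fired",
        event_type="Conflict.Attack",
        arguments=[Argument("an American tank", "Instrument"),
                   Argument("the Palestine Hotel", "Target"),
                   Argument("Baghdad", "Place")],
    ),
]

for e in events:
    print(f"{e.event_type} <- '{e.trigger}': "
          + ", ".join(f"{a.role}={a.text}" for a in e.arguments))
```

    In these terms, the supervised and distantly supervised methods covered by the survey differ mainly in how sentences paired with such structured annotations are obtained: from manually labeled corpora, or by automatically aligning text with existing knowledge resources.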

    References
    [1] Grishman R, Sundheim B. Message understanding conference-6: A brief history. In: Proc. of the 16th Conf. on Computational Linguistics. Copenhagen: Association for Computational Linguistics, 1996. 466–471.
    [2] Missingham R. Access to Australian Government information: A decade of change 1997–2007. Government Information Quarterly, 2008, 25(1): 25–37. [doi: 10.1016/j.giq.2007.07.001]
    [3] Ahn D. The stages of event extraction. In: Proc. of the 2006 Workshop on Annotating and Reasoning about Time and Events. Sydney: Association for Computational Linguistics, 2006. 1–8.
    [4] Malakasiotis P. AUEB at TAC 2009. In: Proc. of the 2nd Text Analysis Conf. Gaithersburg: NIST, 2009. 1–6.
    [5] Mitamura T, Liu ZZ, Hovy EH. Overview of TAC KBP 2015 event nugget track. In: Proc. of the 2015 Text Analysis Conf. Gaithersburg: NIST, 2015. 1–11.
    [6] LeCun Y, Bengio Y. Convolutional networks for images, speech, and time series. In: Arbib MA, ed. The Handbook of Brain Theory and Neural Networks. Cambridge: MIT Press, 1995. 255–258.
    [7] Kombrink S, Mikolov T, Karafiát M, Burget L. Recurrent neural network based language modeling in meeting recognition. In: Proc. of the 12th Annual Conf. of the Int’l Speech Communication Association. Florence: ISCA, 2011. 2877–2880.
    [8] Goodfellow IJ, Pouget-Abadie J, Mirza M, Xu B, Warde-Farley D, Ozair S, Courville A, Bengio Y. Generative adversarial nets. In: Proc. of the 27th Int’l Conf. on Neural Information Processing Systems. Montreal: MIT Press, 2014. 2672–2680.
    [9] Kipf TN, Welling M. Semi-supervised classification with graph convolutional networks. In: Proc. of the 5th Int’l Conf. on Learning Representations. Toulon: OpenReview.net, 2017. 1–14.
    [10] Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate. In: Proc. of the 3rd Int’l Conf. on Learning Representations. San Diego: ICLR, 2015. 1–15.
    [11] Zeng DJ, Liu K, Lai SW, Zhou GY, Zhao J. Relation classification via convolutional deep neural network. In: Proc. of the 25th Int’l Conf. on Computational Linguistics. Dublin: Dublin City University and Association for Computational Linguistics, 2014. 2335–2344.
    [12] Tang DY, Qin B, Liu T. Document modeling with gated recurrent neural network for sentiment classification. In: Proc. of the 2015 Conf. on Empirical Methods in Natural Language Processing. Lisbon: Association for Computational Linguistics, 2015. 1422–1432.
    [13] Schuster M, Paliwal KK. Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 1997, 45(11): 2673–2681. [doi: 10.1109/78.650093]
    [14] Chen YB, Xu LH, Liu K, Zeng DJ, Zhao J. Event extraction via dynamic multi-pooling convolutional neural networks. In: Proc. of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th Int’l Joint Conf. on Natural Language Processing. Beijing: Association for Computational Linguistics, 2015. 167–176.
    [15] Nguyen TH, Grishman R. Event detection and domain adaptation with convolutional neural networks. In: Proc. of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th Int’l Joint Conf. on Natural Language Processing. Beijing: Association for Computational Linguistics, 2015. 365–371.
    [16] Nguyen TH, Grishman R. Modeling skip-grams for event detection with convolutional neural networks. In: Proc. of the 2016 Conf. on Empirical Methods in Natural Language Processing. Austin: Association for Computational Linguistics, 2016. 886–891.
    [17] Wang XZ, Wang ZQ, Han X, Liu ZY, Li JZ, Li P, Sun MS, Zhou J, Ren X. HMEAE: Hierarchical modular event argument extraction. In: Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019. 5777–5783.
    [18] Feng XC, Huang LF, Tang DY, Ji H, Qin B, Liu T. A language-independent neural network for event detection. In: Proc. of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin: Association for Computational Linguistics, 2016. 66–71.
    [19] Nguyen TH, Cho K, Grishman R. Joint event extraction via recurrent neural networks. In: Proc. of the 2016 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. San Diego: Association for Computational Linguistics, 2016. 300–309.
    [20] Chen YB, Liu SL, He SZ, Liu K, Zhao J. Event extraction via bidirectional long short-term memory tensor neural networks. In: Proc. of the 15th China National Conf. on Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data. Yantai: Springer, 2016. 190–203.
    [21] Sha L, Qian F, Chang BB, Sui ZF. Jointly extracting event triggers and arguments by dependency-bridge RNN and tensor-based argument interaction. In: Proc. of the 32nd AAAI Conf. on Artificial Intelligence (AAAI2018), the 30th Innovative Applications of Artificial Intelligence (IAAI2018), and the 8th AAAI Symp. on Educational Advances in Artificial Intelligence (EAAI2018). New Orleans: AAAI, 2018. 5916–5923.
    [22] Orr JW, Tadepalli P, Fern X. Event detection with neural networks: A rigorous empirical evaluation. In: Proc. of the 2018 Conf. on Empirical Methods in Natural Language Processing. Brussels: Association for Computational Linguistics, 2018. 999–1004.
    [23] Chen YB, Yang H, Liu K, Zhao J, Jia YT. Collective event detection via a hierarchical and bias tagging networks with gated multi-level attention mechanisms. In: Proc. of the 2018 Conf. on Empirical Methods in Natural Language Processing. Brussels: Association for Computational Linguistics, 2018. 1267–1276.
    [24] Ding N, Li ZR, Liu ZY, Zheng HT, Lin ZB. Event detection with trigger-aware lattice neural network. In: Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019. 347–356.
    [25] Hong Y, Zhou WX, Zhang JL, Zhou GD, Zhu QM. Self-regulation: Employing a generative adversarial network to improve event detection. In: Proc. of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne: Association for Computational Linguistics, 2018. 515–526.
    [26] Wang R, Zhou DY, He YL. Open event extraction from online text using a generative adversarial network. In: Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019. 282–291.
    [27] Nguyen TH, Grishman R. Graph convolutional networks with argument-aware pooling for event detection. In: Proc. of the 32nd AAAI Conf. on Artificial Intelligence and the 30th Innovative Applications of Artificial Intelligence Conf. and the 8th AAAI Symp. on Educational Advances in Artificial Intelligence. New Orleans: AAAI, 2018. 5900–5907.
    [28] Yan HR, Jin XL, Meng XB, Guo JF, Cheng XQ. Event detection with multi-order graph convolution and aggregated attention. In: Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019. 5766–5770.
    [29] Collobert R, Weston J, Bottou L, Karlen M, Kavukcuoglu K, Kuksa P. Natural language processing (almost) from scratch. The Journal of Machine Learning Research, 2011, 12: 2493–2537. [doi: 10.5555/1953048.2078186]
    [30] Li Q, Ji H, Huang L. Joint event extraction via structured prediction with global features. In: Proc. of the 51st Annual Meeting of the Association for Computational Linguistics. Sofia: Association for Computational Linguistics, 2013. 73–82.
    [31] Lei T, Barzilay R, Jaakkola T. Molding CNNs for text: Non-linear, non-consecutive convolutions. In: Proc. of the 2015 Conf. on Empirical Methods in Natural Language Processing. Lisbon: Association for Computational Linguistics, 2015. 1565–1575.
    [32] Zhang TT, Whitehead S, Zhang HW, Li HZ, Ellis J, Huang LF, Liu W, Ji H, Chang SF. Improving event extraction via multimodal integration. In: Proc. of the 25th ACM Int’l Conf. on Multimedia. Mountain View: ACM, 2017. 270–278.
    [33] Liu WY, Nguyen TH. Similar but not the same: Word sense disambiguation improves event detection via neural representation matching. In: Proc. of the 2018 Conf. on Empirical Methods in Natural Language Processing. Brussels: Association for Computational Linguistics, 2018. 4822–4828.
    [34] Lin HY, Lu YJ, Han XP, Sun L. Cost-sensitive regularization for label confusion-aware event detection. In: Proc. of the 57th Annual Meeting of the Association for Computational Linguistics. Florence: Association for Computational Linguistics, 2019. 5278–5283.
    [35] Li LS, Liu Y, Qin MY. Extracting biomedical events with parallel multi-pooling convolutional neural networks. IEEE/ACM Transactions on Computational Biology and Bioinformatics, 2020, 17(2): 599–607. [doi: 10.1109/TCBB.2018.2868078]
    [36] Pyysalo S, Ohta T, Miwa M, Cho HC, Tsujii J, Ananiadou S. Event extraction across multiple levels of biological organization. Bioinformatics, 2012, 28(18): i575–i581. [doi: 10.1093/bioinformatics/bts407]
    [37] Zhu LX, Zheng HR. Biomedical event extraction with a novel combination strategy based on hybrid deep neural networks. BMC Bioinformatics, 2020, 21(1): 47. [doi: 10.1186/s12859-020-3376-2]
    [38] Ramponi A, Plank B, Lombardo R. Cross-domain evaluation of edge detection for biomedical event extraction. In: Proc. of the 12th Language Resources and Evaluation Conf. Marseille: European Language Resources Association, 2020. 1982–1989.
    [39] Goller C, Kuchler A. Learning task-dependent distributed representations by backpropagation through structure. In: Proc. of the 1996 Int’l Conf. on Neural Networks. Washington: IEEE, 1996. 347–352.
    [40] Hochreiter S, Schmidhuber J. Long short-term memory. Neural Computation, 1997, 9(8): 1735–1780. [doi: 10.1162/neco.1997.9.8.1735]
    [41] Cho K, van Merriënboer B, Gulcehre C, Bahdanau D, Bougares F, Schwenk H, Bengio Y. Learning phrase representations using RNN encoder-decoder for statistical machine translation. In: Proc. of the 2014 Conf. on Empirical Methods in Natural Language Processing. Doha: Association for Computational Linguistics, 2014. 1724–1734.
    [42] Duan SY, He RF, Zhao WL. Exploiting document level information to improve event detection via recurrent neural networks. In: Proc. of the 8th Int’l Joint Conf. on Natural Language Processing. Taipei: Asian Federation of Natural Language Processing, 2017. 352–361.
    [43] Nguyen TM, Nguyen TH. One for all: Neural joint modeling of entities and events. Proc. of the AAAI Conf. on Artificial Intelligence, 2019, 33(1): 6851–6858.
    [44] Wu WT, Zhu XX, Tao JM, Li PF. Event detection via recurrent neural network and argument prediction. In: Proc. of the 7th CCF Int’l Conf. on Natural Language Processing and Chinese Computing. Hohhot: Springer, 2018. 235–245.
    [45] Ding RX, Li ZJ. Event extraction with deep contextualized word representation and multi-attention layer. In: Proc. of the 14th Int’l Conf. on Advanced Data Mining and Applications. Nanjing: Springer, 2018. 189–201.
    [46] Zhang JL, Zhou WX, Hong Y, Yao JM, Zhang M. Using entity relation to improve event detection via attention mechanism. In: Proc. of the 7th CCF Int’l Conf. on Natural Language Processing and Chinese Computing. Hohhot: Springer, 2018. 171–183.
    [47] Liu W, Yang ZY, Liu ZT. Chinese event recognition via ensemble model. In: Proc. of the 25th Int’l Conf. on Neural Information Processing. Siem Reap: Springer, 2018. 255–264.
    [48] Zhao Y, Jin XL, Wang YZ, Cheng XQ. Document embedding enhanced event detection with hierarchical and supervised attention. In: Proc. of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne: Association for Computational Linguistics, 2018. 414–419.
    [49] Chen JL, Hong Y, Zhang JL, Yao JM. Using mention segmentation to improve event detection with multi-head attention. In: Proc. of the 2019 Int’l Conf. on Asian Language Processing. Shanghai: IEEE, 2019. 367–372.
    [50] Mehta S, Islam MR, Rangwala H, Ramakrishnan N. Event detection using hierarchical multi-aspect attention. In: Proc. of the 2019 World Wide Web Conf. San Francisco: ACM, 2019. 3079–3085.
    [51] 黄细凤. 基于动态掩蔽注意力机制的事件抽取. 计算机应用研究, 2020, 37(7): 1964–1968. [doi: 10.19734/j.issn.1001-3695.2018.12.0927]
    Huang XF. Event extraction based on dynamic masked attention. Application Research of Computers, 2020, 37(7): 1964–1968 (in Chinese with English abstract). [doi: 10.19734/j.issn.1001-3695.2018.12.0927]
    [52] Zhang TT, Ji H. Event extraction with generative adversarial imitation learning. arXiv:1804.07881, 2018.
    [53] Liu J, Chen YB, Liu K. Exploiting the ground-truth: An adversarial imitation based knowledge distillation approach for event detection. Proc. of the 2019 AAAI Conf. on Artificial Intelligence, 2019, 33(1): 6754–6761.
    [54] Liu X, Luo ZC, Huang HY. Jointly multiple events extraction via attention-based graph information aggregation. In: Proc. of the 2018 Conf. on Empirical Methods in Natural Language Processing. Brussels: Association for Computational Linguistics, 2018. 1247–1256.
    [55] Cui SY, Yu BW, Liu TW, Zhang ZY, Wang XB, Shi JQ. Edge-enhanced graph convolution networks for event detection with syntactic relation. arXiv:2002.10757, 2020.
    [56] Zeng DJ, Liu K, Chen YB, Zhao J. Distant supervision for relation extraction via piecewise convolutional neural networks. In: Proc. of the 2015 Conf. on Empirical Methods in Natural Language Processing. Lisbon: Association for Computational Linguistics, 2015. 1753–1762.
    [57] Chen YB, Liu SL, Zhang X, Liu K, Zhao J. Automatically labeled data generation for large scale event extraction. In: Proc. of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver: Association for Computational Linguistics, 2017. 409–419.
    [58] Bollacker K, Evans C, Paritosh P, Sturge T, Taylor J. Freebase: A collaboratively created graph database for structuring human knowledge. In: Proc. of the 2008 ACM SIGMOD Int’l Conf. on Management of Data. Vancouver: ACM, 2008. 1247–1250.
    [59] Baker CF, Fillmore CJ, Lowe JB. The Berkeley FrameNet project. In: Proc. of the 36th Annual Meeting of the Association for Computational Linguistics and the 17th Int’l Conf. on Computational Linguistics. Montreal: Association for Computational Linguistics, 1998. 86–90.
    [60] Zeng Y, Feng YS, Ma R, Wang Z, Yan R, Shi CD, Zhao DY. Scale up event extraction learning via automatic training data generation. In: Proc. of the 32nd AAAI Conf. on Artificial Intelligence and the 30th Innovative Applications of Artificial Intelligence Conf. and the 8th AAAI Symp. on Educational Advances in Artificial Intelligence. New Orleans: AAAI, 2018. 742.
    [61] Keith KA, Handler A, Pinkham M, Magliozzi C, McDuffie J, O’Connor B. Identifying civilians killed by police with distantly supervised entity-event extraction. In: Proc. of the 2017 Conf. on Empirical Methods in Natural Language Processing. Copenhagen: Association for Computational Linguistics, 2017. 1547–1557.
    [62] Rao S, Marcu D, Knight K, Daumé III H. Biomedical event extraction using abstract meaning representation. In: Proc. of the 2017 BioNLP. Vancouver: Association for Computational Linguistics, 2017. 126–135.
    [63] Kim JD, Wang Y, Takagi T, Yonezawa A. Overview of genia event task in BioNLP shared task 2011. In: Proc. of the 2011 BioNLP Shared Task Workshop. Portland: Association for Computational Linguistics, 2011. 7–15.
    [64] Liu SL, Chen YB, He SZ, Liu K, Zhao J. Leveraging framenet to improve automatic event detection. In: Proc. of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin: Association for Computational Linguistics, 2016. 2134–2143.
    [65] Wadden D, Wennberg U, Luan Y, Hajishirzi H. Entity, relation, and event extraction with contextualized span representations. In: Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019. 5784–5789.
    [66] Yang BS, Mitchell TM. Joint extraction of events and entities within a document context. In: Proc. of the 2016 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. San Diego: Association for Computational Linguistics, 2016. 289–299.
    [67] Han RJ, Ning Q, Peng NY. Joint event and temporal relation extraction with shared representations and structured prediction. In: Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019. 434–444.
    [68] Zhang JC, Qin YX, Zhang Y, Liu MC, Ji DH. Extracting entities and events as a single task using a transition-based neural model. In: Proc. of the 28th Int’l Joint Conf. on Artificial Intelligence. Macao: IJCAI.org, 2019. 5422–5428.
    [69] Lu YJ, Lin HY, Han XP, Sun L. Distilling discrimination and generalization knowledge for event detection via delta-representation learning. In: Proc. of the 57th Annual Meeting of the Association for Computational Linguistics. Florence: Association for Computational Linguistics, 2019. 4366–4376.
    [70] Deng SM, Zhang NY, Kang JJ, Zhang YC, Zhang W, Chen HJ. Meta-learning with dynamic-memory-based prototypical network for few-shot event detection. In: Proc. of the 13th Int’l Conf. on Web Search and Data Mining. Houston: ACM, 2020. 151–159.
    [71] Deng SM, Zhang NY, Li LQ, Chen H, Tou HX, Chen MS, Huang F, Chen HJ. OntoED: Low-resource event detection with ontology embedding. In: Proc. of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th Int’l Joint Conf. on Natural Language Processing. Association for Computational Linguistics, 2021. 2828–2839.
    [72] Yang S, Feng DW, Qiao LB, Kan ZG, Li DS. Exploring pre-trained language models for event extraction and generation. In: Proc. of the 57th Annual Meeting of the Association for Computational Linguistics. Florence: Association for Computational Linguistics, 2019. 5284–5294.
    [73] Du XY, Cardie C. Event extraction by answering (almost) natural questions. In: Proc. of the 2020 Conf. on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, 2020. 671–683.
    [74] Gangal V, Hovy E. BERTERING RAMS: What and how much does BERT already know about event arguments?—A study on the RAMS dataset. In: Proc. of the 3rd BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP. Association for Computational Linguistics, 2020. 1–10.
    [75] Zhang ZS, Kong X, Liu ZZ, Ma XZ, Hovy E. A two-step approach for implicit event argument detection. In: Proc. of the 58th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics, 2020. 7479–7485.
    [76] Huang KH, Yang M, Peng NY. Biomedical event extraction with hierarchical knowledge graphs. In: Proc. of the 2020 Findings of the Association for Computational Linguistics. Association for Computational Linguistics, 2020. 1277–1285.
    [77] Beltagy I, Lo K, Cohan A. SciBERT: A pretrained language model for scientific text. In: Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019. 3615–3620.
    [78] Lin HY, Lu YJ, Han XP, Sun L. Nugget proposal networks for Chinese event detection. In: Proc. of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne: Association for Computational Linguistics, 2018. 1565–1574.
    [79] Xu N, Xie HH, Zhao DY. A novel joint framework for multiple Chinese events extraction. In: Proc. of the 19th China National Conf. on Chinese Computational Linguistics. Springer, 2020. 174–183.
    [80] Liu SB, Cheng R, Yu XM, Cheng XQ. Exploiting contextual information via dynamic memory network for event detection. In: Proc. of the 2018 Conf. on Empirical Methods in Natural Language Processing. Brussels: Association for Computational Linguistics, 2018. 1030–1035.
    [81] 李培峰, 周国栋, 朱巧明. 基于语义的中文事件触发词抽取联合模型. 软件学报, 2016, 27(2): 280–294. http://www.jos.org.cn/1000-9825/4833.htm
    Li PF, Zhou GD, Zhu QM. Semantics-based joint model of Chinese event trigger extraction. Ruan Jian Xue Bao/Journal of Software, 2016, 27(2): 280–294 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/4833.htm
    [82] Zeng Y, Yang HH, Feng YS, Wang Z, Zhao DY. A convolution BiLSTM neural network model for Chinese event extraction. In: Proc. of the 5th CCF Conf. on Natural Language Processing and Chinese Computing, NLPCC 2016, and the 24th Int’l Conf. on Computer Processing of Oriental Languages. Kunming: Springer, 2016. 275–287.
    [83] Subburathinam A, Lu D, Ji H, May J, Chang SF, Sil A, Voss C. Cross-lingual structure transfer for relation and event extraction. In: Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019. 313–325.
    [84] Yang H, Chen YB, Liu K, Xiao Y, Zhao J. DCFEE: A document-level Chinese financial event extraction system based on automatically labeled training data. In: Proc. of the 2018 ACL System Demonstrations. Melbourne: Association for Computational Linguistics, 2018. 50–55.
    [85] 仲伟峰, 杨航, 陈玉博, 刘康, 赵军. 基于联合标注和全局推理的篇章级事件抽取. 中文信息学报, 2019, 33(9): 88–95, 106. [doi: 10.3969/j.issn.1003-0077.2019.09.011]
    Zhong WF, Yang H, Chen YB, Liu K, Zhao J. Document-level event extraction based on joint labeling and global reasoning. Journal of Chinese Information Processing, 2019, 33(9): 88–95, 106 (in Chinese with English abstract). [doi: 10.3969/j.issn.1003-0077.2019.09.011]
    [86] 朱培培, 王中卿, 李寿山, 王红玲. 基于篇章信息和Bi-GRU的中文事件检测. 计算机科学, 2020, 47(12): 233–238. [doi: 10.11896/jsjkx.191100031]
    Zhu PP, Wang ZQ, Li SS, Wang HL. Chinese event detection based on document information and Bi-GRU. Computer Science, 2020, 47(12): 233–238 (in Chinese with English abstract). [doi: 10.11896/jsjkx.191100031]
    [87] 孟环建. 突发事件领域事件抽取技术的研究 [硕士学位论文]. 上海: 上海大学, 2015.
    Meng HJ. Research on event extraction technology in the field of unexpected events [MS. Thesis]. Shanghai: Shanghai University, 2015 (in Chinese with English abstract).
    [88] Nédellec C, Bossy R, Kim JD, Kim JJ, Ohta T, Pyysalo S, Zweigenbaum P. Overview of BioNLP shared task 2013. In: Proc. of the 2013 BioNLP Shared Task Workshop. Sofia: Association for Computational Linguistics, 2013. 1–7.
    [89] Pustejovsky J, Hanks P, Sauri R, See A, Gaizauskas R, Setzer A, Radev DR, Sundheim B, Day D, Ferro L, Lazo M. The timebank corpus. Corpus Linguistics, 2003, 2003: 647–656.
    [90] Sundheim BM, Chinchor NA. Survey of the message understanding conferences. In: Proc. of the 1993 Workshop on Human Language Technology. Plainsboro: Association for Computational Linguistics, 1993. 56–60.
    [91] 丁效, 宋凡, 秦兵, 刘挺. 音乐领域典型事件抽取方法研究. 中文信息学报, 2011, 25(2): 15–20. [doi: 10.3969/j.issn.1003-0077.2011.02.003]
    Ding X, Song F, Qin B, Liu T. Research on typical event extraction method in the field of music. Journal of Chinese Information Processing, 2011, 25(2): 15–20 (in Chinese with English abstract). [doi: 10.3969/j.issn.1003-0077.2011.02.003]
    [92] 杨航. 面向非结构化中文文本的篇章级事件抽取研究 [硕士学位论文]. 哈尔滨: 哈尔滨理工大学, 2019.
    Yang H. Research on the methods for document-level event extraction from unstructured Chinese texts [MS. Thesis]. Harbin: Harbin University of Science and Technology, 2019 (in Chinese with English abstract).
    [93] Zhang ZK, Xu WR, Chen QQ. Joint event extraction based on skip-window convolutional neural networks. In: Proc. of the 5th CCF Conf. on Natural Language Processing and Chinese Computing, NLPCC 2016, and the 24th Int’l Conf. on Computer Processing of Oriental Languages. Kunming: Springer, 2016. 324–334.
    [94] Chen C, Ng V. Joint modeling for Chinese event extraction with rich linguistic features. In: Proc. of the 2012 COLING. Mumbai: The COLING 2012 Organizing Committee, 2012. 529–544.
Cite this article:

Wang HC, Zhou CL, Petrescu MG. Survey on event extraction based on deep learning. Ruan Jian Xue Bao/Journal of Software, 2023, 34(8): 3905–3923 (in Chinese with English abstract).

Article history:
  • Received: 2020-12-22
  • Revised: 2021-06-28
  • Available online: 2022-05-24
  • Published: 2023-08-06