Aspect-level Sentiment Classification Combining Aspect Modeling and Curriculum Learning

About the authors:

Ye Jing (2000-), female, master's degree, main research interests: natural language processing, sentiment analysis. Xiang Lu (1988-), female, PhD, assistant researcher, main research interests: natural language processing, text generation, human-machine dialogue systems. Zong Chengqing (1963-), male, PhD, researcher, doctoral supervisor, CCF Fellow, main research interests: natural language processing, machine translation, sentiment analysis.

Corresponding author:

Zong Chengqing, E-mail: cqzong@nlpr.ia.ac.cn

CLC number:

TP18



Abstract:

Aspect-level sentiment classification, which aims to determine the sentiment polarity of a sentence toward a given aspect, has attracted increasing attention due to its broad applications. The key to this task is to identify the contextual descriptions relevant to the given aspect and to predict the author's sentiment orientation toward that aspect according to the context. Statistics show that close to 30% of reviews convey a clear sentiment orientation without any explicit sentiment description of the given aspect, which is called implicit sentiment expression. In recent years, attention-based neural network methods have achieved great success in sentiment analysis. However, such methods can only capture explicit aspect-related sentiment descriptions and fail to effectively mine and analyze implicit sentiment; moreover, they often model aspect words and sentence context separately, so the representations of aspect words lack contextual semantics. To address these two problems, this study proposes an aspect-level sentiment classification method that cross-fuses local aspect information with global sentence context, and it further applies curriculum learning, based on the different classification difficulties of implicit and explicit sentiment sentences, to improve classification performance. Experimental results show that the proposed method not only achieves high accuracy in identifying the aspect-related sentiment orientation of explicit sentiment sentences but also effectively learns the sentiment categories of implicit sentiment sentences.
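The abstract notes that modeling aspect words separately from the sentence leaves their representations without contextual semantics. The paper's exact architecture is not reproduced on this page; the sketch below only illustrates the general idea under stated assumptions: the aspect query is read out of the contextualized encoding of the whole sentence (rather than from an isolated aspect embedding) and then attends back over all tokens to fuse local aspect and global sentence information. All names, dimensions, and the random "encoder output" are invented for illustration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def aspect_context_attention(ctx, aspect_idx):
    """Fuse local aspect and global sentence information.

    ctx        -- contextualized token vectors for the sentence, shape (n, d)
    aspect_idx -- positions of the aspect's tokens within the sentence
    """
    # Aspect query taken FROM the sentence encoding, so it already
    # carries contextual semantics.
    query = ctx[aspect_idx].mean(axis=0)
    # Scaled dot-product attention of the aspect query over all tokens.
    scores = ctx @ query / np.sqrt(ctx.shape[1])
    weights = softmax(scores)
    # Weighted sum: a sentence representation focused on the aspect.
    return weights @ ctx, weights

# Toy stand-in for an encoder's output: 6 tokens, hidden size 8.
rng = np.random.default_rng(0)
ctx = rng.normal(size=(6, 8))
fused, w = aspect_context_attention(ctx, aspect_idx=[2, 3])
```

Because the query is derived from the full-sentence encoding, tokens far from the aspect can still receive high attention weight when they carry the relevant (possibly implicit) sentiment signal.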

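The curriculum schedule itself is not specified in the abstract. As a minimal sketch of the standard staged curriculum of Bengio et al. (2009), the code below orders training data from easy to hard, treating explicit sentiment sentences as easy and implicit ones as hard, which is the difficulty split the abstract describes. The function name, stage scheme, and toy data are illustrative assumptions, not the paper's implementation.

```python
import random

def curriculum_batches(samples, difficulty, num_stages=3, batch_size=2):
    """Yield (stage, batch) pairs, exposing the easiest k/num_stages
    fraction of the data at stage k (easy examples first)."""
    ranked = sorted(samples, key=difficulty)  # easy -> hard
    for stage in range(1, num_stages + 1):
        pool = ranked[: len(ranked) * stage // num_stages]
        random.shuffle(pool)                  # shuffle within the stage
        for i in range(0, len(pool), batch_size):
            yield stage, pool[i : i + batch_size]

# Toy data: explicit sentiment sentences marked easy (implicit=0),
# implicit ones hard (implicit=1).
data = [
    {"text": "The pasta was delicious.",       "implicit": 0},
    {"text": "Great battery life.",            "implicit": 0},
    {"text": "I returned it after two days.",  "implicit": 1},
    {"text": "My wallet is much lighter now.", "implicit": 1},
]
for stage, batch in curriculum_batches(data, difficulty=lambda s: s["implicit"]):
    print(stage, [s["text"] for s in batch])
```

Early stages therefore contain only explicit sentiment sentences; implicit ones enter the training stream only once the model has fit the easier cases.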
Cite this article:

Ye J, Xiang L, Zong CQ. Aspect-level sentiment classification combining aspect modeling and curriculum learning. Journal of Software, 2024, 35(9): 4377-4389 (in Chinese).

History:
  • Received: 2022-09-14
  • Revised: 2022-11-03
  • Published online: 2023-09-06
  • Published: 2024-09-06