Improving AMR-to-text Generation with Multi-task Pre-training
Author: XU Dong-Qin, LI Jun-Hui, ZHU Mu-Hua, ZHOU Guo-Dong
Affiliation:

CLC Number: TP18

Fund Project: National Key Research and Development Program of China (2017YFB1002101); National Natural Science Foundation of China (61876120)

Abstract:

Given an AMR (abstract meaning representation) graph, AMR-to-text generation aims to produce text that conveys the same meaning. Previous studies show that the performance of AMR-to-text generation suffers severely from the limited size of manually annotated datasets. To alleviate this dependence on manual annotation, this study proposes a novel multi-task pre-training approach for AMR-to-text generation. In particular, based on a large-scale automatically annotated AMR dataset, three related pre-training tasks are defined: an AMR denoising auto-encoder, a sentence denoising auto-encoder, and AMR-to-text generation itself. In addition, to fine-tune the pre-trained models, the vanilla fine-tuning method is extended to multi-task learning fine-tuning, which enables the final model to maintain performance on both AMR-to-text generation and the pre-training tasks. With an automatically annotated dataset of 0.39M sentences, detailed experiments on two AMR benchmarks show that the proposed pre-training approach significantly improves the performance of AMR-to-text generation, with improvements of 12.27 BLEU on AMR 2.0 and 7.57 BLEU on AMR 3.0. This greatly advances the state of the art, reaching 40.30 BLEU on AMR 2.0 and 38.97 BLEU on AMR 3.0. To the best of our knowledge, this is the best result reported so far on AMR 2.0, and the first reported AMR-to-text generation result on AMR 3.0.
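To make the approach concrete, below is a minimal sketch of the multi-task objective the abstract describes: the three tasks (AMR denoising auto-encoder, sentence denoising auto-encoder, and AMR-to-text generation) trained jointly on the same automatically annotated (AMR, sentence) pairs through one shared sequence-to-sequence model. Everything here is an illustrative assumption rather than the paper's actual implementation: the `model.loss(src, tgt)` interface, the `make_noisy` token-dropping noise, and the equal weighting of the three losses are all hypothetical.

```python
import random

# Hypothetical sketch of the multi-task pre-training objective described
# above; `model` is any sequence-to-sequence model exposing a PyTorch-style
# loss(src_tokens, tgt_tokens) method that returns a differentiable scalar.

def make_noisy(tokens, drop_prob=0.15):
    """Corrupt a token sequence by randomly dropping tokens (one simple
    choice of noise; masking or shuffling would also fit a DAE setup)."""
    kept = [t for t in tokens if random.random() > drop_prob]
    return kept if kept else tokens  # never return an empty sequence

# Each pre-training task maps an automatically annotated (AMR, sentence)
# pair to a (source, target) training instance for the shared model.
TASKS = {
    "amr_dae":  lambda amr, sent: (make_noisy(amr), amr),    # AMR denoising auto-encoder
    "sent_dae": lambda amr, sent: (make_noisy(sent), sent),  # sentence denoising auto-encoder
    "amr2text": lambda amr, sent: (amr, sent),               # AMR-to-text generation itself
}

def multi_task_step(model, optimizer, amr, sent):
    """One update: sum the three task losses on the same (AMR, sentence)
    pair and back-propagate through the shared parameters."""
    optimizer.zero_grad()
    total = sum(model.loss(*task(amr, sent)) for task in TASKS.values())
    total.backward()
    optimizer.step()
    return float(total)
```

Under the extended fine-tuning scheme, the same loop would be reused on the gold benchmark data, keeping the auxiliary denoising losses so that the final model maintains performance on both AMR-to-text generation and the pre-training tasks.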

Get Citation

Xu Dong-Qin, Li Jun-Hui, Zhu Mu-Hua, Zhou Guo-Dong. Improving AMR-to-text generation with multi-task pre-training. Journal of Software, 2021, 32(10): 3036-3050.

History
  • Received: July 30, 2020
  • Revised: October 19, 2020
  • Accepted:
  • Online: December 02, 2020
  • Published: October 06, 2021