Transformer-based Multi-Aspect Modeling for MAMS

Transformer-based Multi-Aspect Modeling for Multi-Aspect Multi-Sentiment Analysis. Zhen Wu, Chengcan Ying, Xinyu Dai, Shujian Huang, Jiajun Chen. NLPCC 2020.

Requirements

  • pytorch=1.7.1
  • python=3.7
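
If you want to confirm your environment matches these versions before training, a minimal check (not part of the repo) is:

```python
# Minimal environment sanity check (not repo code).
import sys

import torch

print("Python:", sys.version.split()[0])    # expected 3.7.x
print("PyTorch:", torch.__version__)        # expected 1.7.1
print("CUDA available:", torch.cuda.is_available())
```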

Usage

Download the pretrained RoBERTa model (link, password:2fv2) and unzip it into the folder pretrained.
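
If the unzipped checkpoint follows the Hugging Face transformers layout (config.json, pytorch_model.bin, vocab files — an assumption, since the repo may load the weights with its own code), you can smoke-test the folder like this:

```python
# Assumes the checkpoint in pretrained/ is in Hugging Face format;
# skip this check if the repo loads RoBERTa differently.
from transformers import RobertaModel, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("pretrained")
model = RobertaModel.from_pretrained("pretrained")
print(model.config.hidden_size)  # e.g. 768 for a base-size RoBERTa
```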

The MAMS data is preprocessed by the script preprocess.py. The original and preprocessed versions of the data are provided in the folder data.
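
For reference, the raw MAMS files use the SemEval-2014-style XML layout (a sentence plus its aspect terms or aspect categories). A rough sketch of reading the ATSA portion, assuming that layout and a hypothetical path such as data/MAMS-ATSA/train.xml (preprocess.py remains the authoritative pipeline):

```python
# Illustrative reader for MAMS-ATSA XML in the SemEval-2014 style:
# each <sentence> holds <text> and <aspectTerm term/polarity/from/to>.
import xml.etree.ElementTree as ET

def read_atsa(path):
    """Yield (sentence, aspect_term, polarity, start, end) tuples."""
    root = ET.parse(path).getroot()
    for sentence in root.iter("sentence"):
        text = sentence.find("text").text
        for term in sentence.iter("aspectTerm"):
            yield (text, term.get("term"), term.get("polarity"),
                   int(term.get("from")), int(term.get("to")))

if __name__ == "__main__":
    # The path is a placeholder; match it to the files shipped in data/.
    for example in read_atsa("data/MAMS-ATSA/train.xml"):
        print(example)
        break
```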

For the ATSA subtask

Run the command python main_ATSA.py to train and test the ATSA model.

You can change training settings in the file configs.py.
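
The option names in configs.py are whatever the repo defines; as a rough idea of the kind of settings you would edit there (all names below are illustrative, not the actual ones):

```python
# Hypothetical settings block; the real option names live in configs.py.
config = {
    "pretrained_path": "pretrained",  # folder with the RoBERTa checkpoint
    "max_seq_length": 128,
    "batch_size": 32,
    "learning_rate": 2e-5,
    "num_epochs": 10,
    "seed": 42,
}
```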

For the ACSA subtask

Run the command python main_ACSA.py to train and test the ACSA model.

You can change training settings in the file configs.py.
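
ACSA differs from ATSA only in what an aspect is: a predefined category (e.g. food, service) rather than a term that appears in the sentence. The paper's central idea, encoding a sentence together with all of its aspects in a single transformer pass, can be sketched at the input level as follows (a toy illustration; the actual special tokens and encoding are defined by the repo):

```python
# Toy sketch of the multi-aspect input: one sentence paired with ALL of
# its aspects, so sentiments for every aspect are predicted jointly.
def build_multi_aspect_input(sentence, aspects):
    parts = ["<s>", sentence, "</s>"]
    for aspect in aspects:            # aspect terms (ATSA) or categories (ACSA)
        parts += [aspect, "</s>"]
    return " ".join(parts)

print(build_multi_aspect_input(
    "The food was great but the service was slow.",
    ["food", "service"],              # ACSA-style aspect categories
))
```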

Citation

If you use the code, please cite our paper:

@InProceedings{10.1007/978-3-030-60457-8_45,
author="Wu, Zhen
and Ying, Chengcan
and Dai, Xinyu
and Huang, Shujian
and Chen, Jiajun",
editor="Zhu, Xiaodan
and Zhang, Min
and Hong, Yu
and He, Ruifang",
title="Transformer-Based Multi-aspect Modeling for Multi-aspect Multi-sentiment Analysis",
booktitle="Natural Language Processing and Chinese Computing",
year="2020",
publisher="Springer International Publishing",
address="Cham",
pages="546--557",
isbn="978-3-030-60457-8"
}

Reference

[1]. Zhen Wu, Chengcan Ying, Xinyu Dai, Shujian Huang, Jiajun Chen. Transformer-based Multi-Aspect Modeling for Multi-Aspect Multi-Sentiment Analysis. NLPCC 2020.

[2]. Qingnan Jiang, Lei Chen, Ruifeng Xu, Xiang Ao, Min Yang. A Challenge Dataset and Effective Models for Aspect-Based Sentiment Analysis. EMNLP-IJCNLP 2019.
