
PriSA

Pytorch implementation for Multimodal Sentiment Analysis with Preferential Fusion and Distance-aware Contrastive Learning (ICME 2023 Oral).

Dependencies

Environments

We provide an Anaconda environment file to help you build a runnable environment.

conda env create -f environments.yml
conda activate prisa

Datasets

Please download the MOSEI dataset into ./data for training and evaluation.
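As a quick sanity check before training, the sketch below verifies that the data directory exists and is non-empty. The exact MOSEI file layout is not specified here, so the helper name and the check itself are illustrative assumptions, not part of this repository:

```python
from pathlib import Path

def mosei_ready(data_dir: str = "data") -> bool:
    """Return True if the (assumed) MOSEI data directory exists and is non-empty."""
    p = Path(data_dir)
    return p.is_dir() and any(p.iterdir())

if __name__ == "__main__":
    print("MOSEI data found:", mosei_ready())
```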

Pretrained Models

Please download bert-base-uncased from Hugging Face into ./pretrained_models.
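One way to script that download is with the huggingface_hub package (an assumption; any method that places the checkpoint under ./pretrained_models/bert-base-uncased works). A minimal sketch:

```python
from pathlib import Path

def pretrained_dir(root: str = "pretrained_models",
                   model: str = "bert-base-uncased") -> Path:
    """Local directory the training code is assumed to read the checkpoint from."""
    return Path(root) / model

if __name__ == "__main__":
    # Requires `pip install huggingface_hub` and network access.
    from huggingface_hub import snapshot_download
    snapshot_download(repo_id="bert-base-uncased",
                      local_dir=str(pretrained_dir()))
```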

Training

You can train the model using the following command. The output will be saved at /tmp/log/; you can modify .runx to change the path of the training log.

python -m runx.runx mosei.yml -i

Alternatively, you can run the following command directly without runx; the output will be saved at ./log.

python main.py

Feel free to contact us (mafp@foxmail.com) if you have any problems.

Citation

@inproceedings{ma2023multimodal,
  title={Multimodal Sentiment Analysis with Preferential Fusion and Distance-aware Contrastive Learning},
  author={Ma, Feipeng and Zhang, Yueyi and Sun, Xiaoyan},
  booktitle={2023 IEEE International Conference on Multimedia and Expo (ICME)},
  pages={1367--1372},
  year={2023},
  organization={IEEE}
}
