Awesome Multimodality

A collection of resources on multimodal learning research.

Contents

1. Description

🐌 Markdown Format:

  • (Conference/Journal Year) [Task/Keywords] Title, First Author et al. [Paper] [Code] [Project]

2. Papers with Code

  • Survey

    • (arXiv preprint 2021) [Survey] A Survey on Multi-modal Summarization, Anubhav Jangra et al. [Paper] (v1, 2021.09.11)
  • 2021

    • (ICCV 2021 Oral) [Text-guided Image Manipulation] StyleCLIP: Text-Driven Manipulation of StyleGAN Imagery, Or Patashnik et al. [Paper] [Code] [Play]
    • (ICCV 2021) [Facial Editing] Talk-to-Edit: Fine-Grained Facial Editing via Dialog, Yuming Jiang et al. [Paper] [Code] [Project] [Dataset Project] [Dataset (CelebA-Dialog Dataset)]
    • (arXiv preprint 2021) [Video Action Recognition] ActionCLIP: A New Paradigm for Video Action Recognition, Mengmeng Wang et al. [Paper]
  • 2020

3. Courses

Contact Me
