
Edge-MoE: Memory-Efficient Multi-Task Vision Transformer Architecture with Task-level Sparsity via Mixture-of-Experts

Rishov Sarkar<sup>1</sup>, Hanxue Liang<sup>2</sup>, Zhiwen Fan<sup>2</sup>, Zhangyang Wang<sup>2</sup>, Cong Hao<sup>1</sup>

<sup>1</sup>School of Electrical and Computer Engineering, Georgia Institute of Technology
<sup>2</sup>Department of Electrical and Computer Engineering, University of Texas at Austin

Overview

*Figure: Edge-MoE overall architecture.*

This is Edge-MoE, the first end-to-end FPGA accelerator for multi-task Vision Transformers (ViTs), built around a rich collection of architectural innovations for memory efficiency.

Currently, this repository contains a prebuilt bitstream for the AMD/Xilinx ZCU102 FPGA and a video demo. Our HLS code will be open-sourced upon acceptance.
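While the HLS source is not yet released, the sketch below illustrates what "task-level sparsity via Mixture-of-Experts" means at a high level: a task-conditioned gate scores a pool of expert FFNs, and only the top-k experts selected for the active task are executed and blended. This is a minimal, self-contained C++ example for exposition only; the dimensions, gate scores, and names (`run_expert`, `gate`, `moe_layer`) are assumptions, not the repository's implementation.

```cpp
// Illustrative sketch of task-level Mixture-of-Experts routing.
// NOT the Edge-MoE HLS code; all sizes and names are assumed for exposition.
#include <algorithm>
#include <array>
#include <cmath>
#include <cstddef>
#include <cstdio>

constexpr std::size_t kDim = 8;      // token embedding size (assumed)
constexpr std::size_t kExperts = 4;  // number of expert FFNs (assumed)
constexpr std::size_t kTopK = 2;     // experts executed per task (assumed)

using Token = std::array<float, kDim>;

// One "expert": a stand-in for an FFN block, kept trivial so the sketch is self-contained.
Token run_expert(std::size_t expert_id, const Token& x) {
    Token y{};
    for (std::size_t i = 0; i < kDim; ++i)
        y[i] = x[i] * (1.0f + 0.1f * static_cast<float>(expert_id));
    return y;
}

// Task-conditioned gate: softmax over experts from a per-task score table.
// In a real design these scores would come from a learned gating network.
std::array<float, kExperts> gate(std::size_t task_id) {
    static const float scores[2][kExperts] = {
        {2.0f, 0.5f, 1.5f, 0.1f},   // hypothetical task 0 (e.g., segmentation)
        {0.2f, 1.8f, 0.4f, 2.2f}};  // hypothetical task 1 (e.g., depth)
    std::array<float, kExperts> p{};
    float sum = 0.0f;
    for (std::size_t e = 0; e < kExperts; ++e) {
        p[e] = std::exp(scores[task_id][e]);
        sum += p[e];
    }
    for (auto& v : p) v /= sum;
    return p;
}

// Task-level sparse MoE layer: only the top-k experts chosen for this task
// are computed; their outputs are blended by renormalized gate weights.
Token moe_layer(std::size_t task_id, const Token& x) {
    const auto probs = gate(task_id);

    // Pick the k highest-scoring experts for this task.
    std::array<std::size_t, kExperts> order{};
    for (std::size_t e = 0; e < kExperts; ++e) order[e] = e;
    std::partial_sort(order.begin(), order.begin() + kTopK, order.end(),
                      [&](std::size_t a, std::size_t b) { return probs[a] > probs[b]; });

    float norm = 0.0f;
    for (std::size_t k = 0; k < kTopK; ++k) norm += probs[order[k]];

    Token out{};
    for (std::size_t k = 0; k < kTopK; ++k) {
        const Token y = run_expert(order[k], x);  // only selected experts run
        const float w = probs[order[k]] / norm;
        for (std::size_t i = 0; i < kDim; ++i) out[i] += w * y[i];
    }
    return out;
}

int main() {
    Token x{};
    x.fill(1.0f);
    const Token seg = moe_layer(0, x);  // task 0 routes through its own top-k experts
    const Token dep = moe_layer(1, x);  // task 1 routes through a different subset
    std::printf("task0: %f  task1: %f\n", seg[0], dep[0]);
    return 0;
}
```

In a hardware setting, the appeal of this routing is that only the weights of the experts selected for the active task need to be computed on (and, presumably, fetched), which is the kind of task-level sparsity the memory-efficiency claim in the title refers to.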
