Chongyu-Liu's starred repositories
geektime-books
:books: Geek Time e-books
imagen-pytorch
Implementation of Imagen, Google's text-to-image neural network, in PyTorch
LLM-Agent-Paper-List
The paper list of the 86-page paper "The Rise and Potential of Large Language Model Based Agents: A Survey" by Zhiheng Xi et al.
LLaMA2-Accessory
An Open-source Toolkit for LLM Development
InternImage
[CVPR 2023 Highlight] InternImage: Exploring Large-Scale Vision Foundation Models with Deformable Convolutions
awesome_LLMs_interview_notes
LLMs interview notes and answers: this repository mainly collects interview questions and reference answers for large language model (LLM) algorithm engineers
Awesome-LLMs-Datasets
A summary of existing representative LLM text datasets.
LLM-in-Vision
Recent LLM-based CV and related works. Welcome to comment/contribute!
CLIP_benchmark
CLIP-like model evaluation
FontDiffuser
[AAAI2024] FontDiffuser: One-Shot Font Generation via Denoising Diffusion with Multi-Scale Content Aggregation and Style Contrastive Learning
EEG-Transformer
i. A practical application of the Transformer (ViT) to 2-D physiological signal (EEG) classification tasks; it can also be applied to EMG, EOG, ECG, etc. ii. Includes attention over both the spatial dimension (channel attention) and the temporal dimension. iii. Also implements common spatial pattern (CSP), an efficient feature-enhancement method, in Python.
Recommendations-Diffusion-Text-Image
A collection of recent papers on diffusion models for text-image generation tasks, e.g., visual text generation, font generation, text removal, text image super-resolution, text editing, handwritten text generation, scene text recognition, and scene text detection.
GPT-4V_OCR
Evaluation of the Optical Character Recognition (OCR) capabilities of GPT-4V(ision)
ESTextSpotter
(ICCV 2023) ESTextSpotter: Towards Better Scene Text Spotting with Explicit Synergy in Transformer
SCUT-EnsExam
SCUT-EnsExam is a real-world handwritten text erasure dataset for examination-paper scenarios, consisting of 545 examination paper images randomly divided into a training set of 430 images and a test set of 115 images.