Linhui Xiao's repositories
Awesome-Visual-Grounding
A Survey on Visual Grounding
adapter-transformers
Huggingface Transformers + Adapters = ❤️
Awesome-Multimodal-Large-Language-Models
:sparkles::sparkles: Latest Papers and Datasets on Multimodal Large Language Models, and Their Evaluation.
awesome-RLHF
A curated list of reinforcement learning with human feedback resources (continually updated)
awesome-described-object-detection
A curated list of papers and resources related to Described Object Detection, Open-Vocabulary/Open-World Object Detection and Referring Expression Comprehension. Updated frequently and pull requests welcomed.
Awesome-Open-Vocabulary
(TPAMI 2024) A Survey on Open Vocabulary Learning
Books
My book list
CVinW_Readings
A collection of papers on the topic of "Computer Vision in the Wild (CVinW)"
detectron2
Detectron2 is a platform for object detection, segmentation and other visual recognition tasks.
DN-DETR
[CVPR 2022 Oral] Official implementation of DN-DETR
GLIP
Grounded Language-Image Pre-training
Grounded-Segment-Anything
Marrying Grounding DINO with Segment Anything & Stable Diffusion & BLIP & Whisper & ChatBot - Automatically Detect, Segment, and Generate Anything with Image, Text, and Speech Inputs
GroundingDINO
Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection
llama
Inference code for Llama models
mae
PyTorch implementation of MAE (https://arxiv.org/abs/2111.06377)
mmdetection
OpenMMLab Detection Toolbox and Benchmark
NLP-Interview-Notes
This repository mainly collects interview questions for NLP algorithm engineers.
open_clip
An open source implementation of CLIP.
OV-DETR
[Under preparation] Code repo for "Open-Vocabulary DETR with Conditional Matching" (ECCV 2022)
ovr-cnn
A new framework for open-vocabulary object detection, based on maskrcnn-benchmark
paper-reading
Paragraph-by-paragraph close readings of classic and recent deep learning papers
unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
zsgnet-pytorch
Official implementation of the ICCV 2019 oral paper "Zero-Shot Grounding of Objects from Natural Language Queries" (https://arxiv.org/abs/1908.07129)