Teli Ma's starred repositories
Awesome-LLM-Robotics
A comprehensive list of papers using large language/multi-modal models for Robotics/RL, including papers, code, and related websites
ICCV-2023-Papers
ICCV 2023 Papers: discover cutting-edge research from ICCV 2023, the leading computer vision conference. Stay updated on the latest in computer vision and deep learning, with code included. ⭐ Support visual intelligence development!
Awesome-Embodied-Agent-with-LLMs
This is a curated list of "Embodied AI or robot with Large Language Models" research. Watch this repository for the latest updates! 🔥
Everything-LLMs-And-Robotics
The world's largest GitHub Repository for LLMs + Robotics
diffusion-literature-for-robotics
A summary of key papers and blog posts for learning about diffusion models, with a detailed list of all published diffusion robotics papers.
awesome-vision-language-navigation
A curated list for vision-and-language navigation, accompanying the ACL 2022 paper "Vision-and-Language Navigation: A Survey of Tasks, Methods, and Future Directions"
SEED-Bench
(CVPR 2024) A benchmark for evaluating Multimodal LLMs using multiple-choice questions.
Awesome-Robot-Learning
This repo contains a curated list of robot learning (mainly manipulation) resources.
Awesome-Text2X-Resources
An open collection of state-of-the-art (SOTA) and novel Text-to-X (X can be anything) methods: papers, code, and datasets.
UDR-S2Former_deraining
[ICCV'23] Sparse Sampling Transformer with Uncertainty-Driven Ranking for Unified Removal of Raindrops and Rain Streaks
hab-mobile-manipulation
Mobile manipulation in Habitat
Meta-Learning-Papers-with-Code
🎉🎨 This repository contains a reading list of papers with code on **Meta-Learning** and **Meta-Reinforcement-Learning**.
visual_gpt_score
VisualGPTScore for visio-linguistic reasoning
evaluations
[AAAI 2024] ConceptBed Evaluations for Personalized Text-to-Image Diffusion Models
MultiTrain
Code and model for "Multi-dataset Training of Transformers for Robust Action Recognition", NeurIPS 2022 Spotlight
ManiSkill2
This repo has moved to https://github.com/haosulab/ManiSkill