Teli Ma's starred repositories

fiftyone

Refine high-quality datasets and visual AI models

Language: Python | License: Apache-2.0 | Stargazers: 8711 | Issues: 55 | Issues: 1513

Awesome-LLM-Robotics

A comprehensive list of papers using large language/multi-modal models for Robotics/RL, including papers, code, and related websites

Emu

Emu Series: Generative Multimodal Models from BAAI

Language: Python | License: Apache-2.0 | Stargazers: 1622 | Issues: 21 | Issues: 86

ICCV-2023-Papers

ICCV 2023 Papers: Discover cutting-edge research from ICCV 2023, the leading computer vision conference. Stay updated on the latest in computer vision and deep learning, with code included. ⭐ Support visual intelligence development!

Language: Python | License: MIT | Stargazers: 921 | Issues: 13 | Issues: 10

Awesome-Embodied-Agent-with-LLMs

This is a curated list of "Embodied AI or robot with Large Language Models" research. Watch this repository for the latest updates! 🔥

Everything-LLMs-And-Robotics

The world's largest GitHub Repository for LLMs + Robotics

License: BSD-3-Clause | Stargazers: 754 | Issues: 21 | Issues: 0

diffusion-literature-for-robotics

A summary of key papers and blog posts for learning about diffusion models, along with a detailed list of published diffusion-robotics papers.

License: MIT | Stargazers: 583 | Issues: 20 | Issues: 0

tidybot

TidyBot: Personalized Robot Assistance with Large Language Models

Language: Python | License: MIT | Stargazers: 533 | Issues: 11 | Issues: 4

eai-vc

The repository for the largest and most comprehensive empirical study of visual foundation models for Embodied AI (EAI).

Language: Python | License: NOASSERTION | Stargazers: 456 | Issues: 20 | Issues: 18

awesome-vision-language-navigation

A curated list for vision-and-language navigation, accompanying the ACL 2022 paper "Vision-and-Language Navigation: A Survey of Tasks, Methods, and Future Directions"

License: MIT | Stargazers: 356 | Issues: 15 | Issues: 0

mae_st

Official Open Source code for "Masked Autoencoders As Spatiotemporal Learners"

Language: Python | License: NOASSERTION | Stargazers: 312 | Issues: 7 | Issues: 23

SEED-Bench

[CVPR 2024] A benchmark for evaluating multimodal LLMs using multiple-choice questions.

Language: Python | License: NOASSERTION | Stargazers: 310 | Issues: 4 | Issues: 27

RVT

Official Code for RVT-2 and RVT

Language: Jupyter Notebook | License: NOASSERTION | Stargazers: 265 | Issues: 9 | Issues: 56

Awesome-Robot-Learning

This repo contains a curated list of robot learning (mainly manipulation) resources.

License: MIT | Stargazers: 145 | Issues: 5 | Issues: 0

Awesome-Text2X-Resources

This is an open collection of state-of-the-art (SOTA) and novel Text-to-X (X can be anything) methods: papers, code, and datasets.

UDR-S2Former_deraining

[ICCV'23] Sparse Sampling Transformer with Uncertainty-Driven Ranking for Unified Removal of Raindrops and Rain Streaks

GNFactor

[CoRL 2023 Oral] GNFactor: Multi-Task Real Robot Learning with Generalizable Neural Feature Fields

Language: Python | License: MIT | Stargazers: 114 | Issues: 2 | Issues: 11

hab-mobile-manipulation

Mobile manipulation in Habitat

Meta-Learning-Papers-with-Code

🎉🎨 This repository contains a reading list of papers with code on Meta-Learning and Meta-Reinforcement Learning.

polarnet

[CoRL 2023] Official PyTorch implementation of PolarNet: 3D Point Clouds for Language-Guided Robotic Manipulation

Language: Python | License: MIT | Stargazers: 29 | Issues: 2 | Issues: 5

visual_gpt_score

VisualGPTScore for visio-linguistic reasoning

evaluations

[AAAI 2024] ConceptBed Evaluations for Personalized Text-to-Image Diffusion Models

Language: Python | License: MIT | Stargazers: 24 | Issues: 1 | Issues: 3

MultiTrain

Code and model for "Multi-dataset Training of Transformers for Robust Action Recognition", NeurIPS 2022 Spotlight

Language: Python | License: MIT | Stargazers: 19 | Issues: 6 | Issues: 3

MODE

An Examination of the Compositionality of Large Generative Vision-Language Models

License: MIT | Stargazers: 17 | Issues: 1 | Issues: 0

ManiSkill2

This repo has moved to https://github.com/haosulab/ManiSkill

Language: HTML | License: Apache-2.0 | Stargazers: 2 | Issues: 3 | Issues: 0