Chan Hee (Luke) Song (chanhee-luke)

Company: The Ohio State University

Home Page: chanh.ee

Twitter: @luke_ch_song

Organizations
dki-lab

Chan Hee (Luke) Song's starred repositories

robot_sugar

Official implementation of "SUGAR: Pre-training 3D Visual Representations for Robotics" (CVPR'24).

Language: Python · Stars: 19 · Issues: 0

3DSceneGraph

The data skeleton from "3D Scene Graph: A Structure for Unified Semantics, 3D Space, and Camera" http://3dscenegraph.stanford.edu

Language: Python · License: NOASSERTION · Stars: 231 · Issues: 0

open-eqa

OpenEQA: Embodied Question Answering in the Era of Foundation Models

Language: Jupyter Notebook · License: MIT · Stars: 193 · Issues: 0

SceneVerse

Official implementation of ECCV24 paper "SceneVerse: Scaling 3D Vision-Language Learning for Grounded Scene Understanding"

Language: Python · License: MIT · Stars: 146 · Issues: 0

3RScan

3RScan Toolkit

Language: C++ · License: MIT · Stars: 180 · Issues: 0

multiscan

MultiScan: Scalable RGBD scanning for 3D environments with articulated objects

Language: Python · License: MIT · Stars: 110 · Issues: 0

LASA

[CVPR 2024] LASA: Instance Reconstruction from Real Scans Using a Large-scale Aligned Shape Annotation Dataset

Language: Python · Stars: 69 · Issues: 0

drogozhang.github.io

Github Pages template for academic personal websites, forked from academicpages/academicpages.github.io

Language: JavaScript · License: MIT · Stars: 2 · Issues: 0

SQA3D

[ICLR 2023] SQA3D for embodied scene understanding and reasoning

Language: Python · License: Apache-2.0 · Stars: 109 · Issues: 0

3D-GRAND

Official Implementation of 3D-GRAND: Towards Better Grounding and Less Hallucination for 3D-LLMs

Stars: 25 · Issues: 0

Grounded_3D-LLM

Code & data for Grounded 3D-LLM with Referent Tokens

Language: Python · Stars: 66 · Issues: 0

Awesome-LLM-3D

Awesome-LLM-3D: a curated list of resources on multi-modal large language models in the 3D world

License: MIT · Stars: 884 · Issues: 0

3D-VisTA

Official implementation of ICCV 2023 paper "3D-VisTA: Pre-trained Transformer for 3D Vision and Text Alignment"

Language: Python · License: MIT · Stars: 176 · Issues: 0

BridgeQA

[AAAI 24] Official Codebase for BridgeQA: Bridging the Gap between 2D and 3D Visual Question Answering: A Fusion Approach for 3D VQA

Language: Python · License: NOASSERTION · Stars: 11 · Issues: 0

3D-CLR-Official

[CVPR 2023] Code for "3D Concept Learning and Reasoning from Multi-View Images"

Language: Python · Stars: 71 · Issues: 0

3D-LLM

Code for 3D-LLM: Injecting the 3D World into Large Language Models

Language: Python · License: MIT · Stars: 871 · Issues: 0

Awesome-Embodied-Agent-with-LLMs

A curated list of research on embodied AI and robots with large language models. Watch this repository for the latest updates! 🔥

Stars: 774 · Issues: 0

LLMTaskPlanning

LoTa-Bench: Benchmarking Language-oriented Task Planners for Embodied Agents (ICLR 2024)

Language: Jupyter Notebook · Stars: 43 · Issues: 0

Thinking-VLN

Ideas and thoughts about the fascinating field of vision-and-language navigation

Stars: 129 · Issues: 0

awesome-language-agents

List of language agents based on paper "Cognitive Architectures for Language Agents"

Language: TeX · Stars: 679 · Issues: 0

arnold

[ICCV 2023] Official code repository for ARNOLD benchmark

Language: Jupyter Notebook · License: MIT · Stars: 122 · Issues: 0

TypeChat

TypeChat is a library that makes it easy to build natural language interfaces using types.

Language: TypeScript · License: MIT · Stars: 8098 · Issues: 0

LLMAgentPapers

Must-read Papers on LLM Agents.

Stars: 1537 · Issues: 0

EnvInteractiveLMPapers

Collections of papers on methods that use language to interact with an environment, whether the real world, a simulated world, or the web (🏄).

License: MIT · Stars: 120 · Issues: 0

awesome-vision-language-navigation

A curated list for vision-and-language navigation, accompanying the ACL 2022 paper "Vision-and-Language Navigation: A Survey of Tasks, Methods, and Future Directions"

License: MIT · Stars: 314 · Issues: 0

teach

TEACh is a dataset of human-human interactive dialogues to complete tasks in a simulated household environment.

Language: Python · Stars: 130 · Issues: 0

Awesome-Neural-Logic

Awesome Neural Logic and Causality: MLN, NLRL, NLM, etc. Frontier areas of causal inference, neural logic, and logical reasoning toward strong AI.

Stars: 129 · Issues: 0

awesome-vision-language-pretraining-papers

Recent Advances in Vision and Language PreTrained Models (VL-PTMs)

Stars: 1134 · Issues: 0

pytorch-styleguide

An unofficial style guide and best-practices summary for PyTorch

Language: Python · License: GPL-3.0 · Stars: 1881 · Issues: 0