There are 18 repositories under the language-grounding topic.
awesome grounding: A curated list of research papers in visual grounding
[NeurIPS 2022] đź›’WebShop: Towards Scalable Real-World Web Interaction with Grounded Language Agents
Train an RL agent to execute natural-language instructions in a 3D environment (PyTorch)
A curated list of “Temporally Language Grounding” and related areas
Implementation of EMNLP 2017 Paper "Natural Language Does Not Emerge 'Naturally' in Multi-Agent Dialog" using PyTorch and ParlAI
A PyTorch implementation of several state-of-the-art models for "Temporally Language Grounding in Untrimmed Videos"
NeurIPS 2022 Paper "VLMbench: A Compositional Benchmark for Vision-and-Language Manipulation"
Official code for NeurIPS 2020 paper "Rel3D: A Minimally Contrastive Benchmark for Grounding Spatial Relations in 3D"
This framework provides out-of-the-box implementations of Referential Game variants for studying the emergence of artificial languages with deep learning, built on PyTorch (https://www.pytorch.org).
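To illustrate what a referential game is (this is a minimal self-contained sketch, not the framework's actual API), one round can be modeled as: a sender observes a target object and emits a discrete message, and a receiver must pick the target out of a set of candidates from the message alone. Here the lexicon is fixed for simplicity; in the framework, sender and receiver are neural agents that learn the mapping.

```python
import random

# Objects are feature tuples; this fixed lexicon stands in for a learned
# sender policy (hypothetical example data, not from the framework).
LEXICON = {("red", "circle"): 0, ("blue", "square"): 1, ("green", "triangle"): 2}

def sender(target):
    # Encode the target object as a discrete symbol.
    return LEXICON[target]

def receiver(message, candidates):
    # Pick the candidate whose encoding matches the received message.
    for obj in candidates:
        if LEXICON[obj] == message:
            return obj
    return random.choice(candidates)  # guess randomly if nothing matches

def play_round(target, distractors):
    # One referential-game episode: success when the receiver picks the target.
    candidates = [target] + list(distractors)
    random.shuffle(candidates)
    guess = receiver(sender(target), candidates)
    return guess == target
```

With a learned (rather than fixed) lexicon, the success signal from `play_round` would serve as the reward for training both agents.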
[ICLR 2022 Spotlight] Multi-Stage Episodic Control for Strategic Exploration in Text Games
Implementation of the Hierarchical and Interpretable Skill Acquisition in Multi-task Reinforcement Learning by Tianmin Shu, Caiming Xiong, and Richard Socher
Accompanying code and experimental results for "Task-Oriented Language Grounding for Language Input with Multiple Sub-Goals of Non-Linear Order"
This repo contains an implementation of simple language grounding in Python using the Pepper robot, with Docker containers running the servers for language grounding and speech recognition.
Spatial Preposition Annotation Tool for Virtual Environments
Fourier Transform Enhanced Vision Language Multi-goal Navigation
A package that uses BabyAI in a sender/receiver setup.
Training Anki robots to carry out simple commands, e.g. "go to the green cup"