Yongxin Guo's repositories

VTG-LLM

[Preprint] VTG-LLM: Integrating Timestamp Knowledge into Video LLMs for Enhanced Video Temporal Grounding

Language: Python · License: Apache-2.0 · Stargazers: 37 · Issues: 0

25-fall-recruit

An attempt to summarize information about Fall 2025 recruiting

Stargazers: 6 · Issues: 0

huarongdao

A small Huarongdao (Klotski) sliding-block puzzle game

Language: Java · Stargazers: 1 · Issues: 0

Academic-project-page-template

A project page template for academic papers. Demo at https://eliahuhorwitz.github.io/Academic-project-page-template/

Language: JavaScript · Stargazers: 0 · Issues: 0

AutoGPT

AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools so that you can focus on what matters.

Language: JavaScript · License: MIT · Stargazers: 0 · Issues: 0

Awesome-FL

Comprehensive and timely academic information on federated learning (papers, frameworks, datasets, tutorials, workshops)

Language: Python · License: CC-BY-SA-4.0 · Stargazers: 0 · Issues: 0

EMoE

Official PyTorch implementation of EMoE, from "Emergent Mixture-of-Experts: Can Dense Pre-trained Transformers Benefit from Emergent Modular Structures?"

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

fastmoe

A fast Mixture-of-Experts (MoE) implementation for PyTorch

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

FedEM

Official code for "Federated Multi-Task Learning under a Mixture of Distributions" (NeurIPS'21)

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

FedGMM_ICML2023

Personalized Federated Learning under Mixture of Distributions

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

FedLab

A flexible Federated Learning Framework based on PyTorch, simplifying your Federated Learning research.

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 0

FL-bench

A benchmark of federated learning methods. Dedicated to the community. 🤗

Language: Python · License: GPL-2.0 · Stargazers: 0 · Issues: 0

Generalizable-Mixture-of-Experts

GMoE could be the next backbone model for many kinds of generalization tasks.

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

imagenet-r

ImageNet-R(endition) and DeepAugment (ICCV 2021)

License: MIT · Stargazers: 0 · Issues: 0

robustness

Corruption and Perturbation Robustness (ICLR 2019)

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

trustworthy_GCs

Materials for the paper https://arxiv.org/pdf/2007.15036.pdf

Stargazers: 0 · Issues: 0

tutel

Tutel MoE: An Optimized Mixture-of-Experts Implementation

License: MIT · Stargazers: 0 · Issues: 0

VideoLLaMA2

VideoLLaMA 2: Advancing Spatial-Temporal Modeling and Audio Understanding in Video-LLMs

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

XAgent

An Autonomous LLM Agent for Complex Task Solving

Language: TypeScript · License: Apache-2.0 · Stargazers: 0 · Issues: 0

yongxinguo

GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes

Language: JavaScript · License: MIT · Stargazers: 0 · Issues: 0