waltersharpWEI's repositories

timesfm

TimesFM (Time Series Foundation Model) is a pretrained time-series foundation model developed by Google Research for time-series forecasting.

License: Apache-2.0

videollm-online

VideoLLM-online: Online Video Large Language Model for Streaming Video (CVPR 2024)

License: Apache-2.0

TerDiT

TerDiT: Ternary Diffusion Models with Transformers

License: MIT

3DGStream

[CVPR 2024 Highlight] Official repository for the paper "3DGStream: On-the-fly Training of 3D Gaussians for Efficient Streaming of Photo-Realistic Free-Viewpoint Videos".

License: MIT

rag-tutorial-v2

An Improved Langchain RAG Tutorial (v2) with local LLMs, database updates, and testing.


unimatch

[TPAMI'23] Unifying Flow, Stereo and Depth Estimation

License: MIT

pykan

Kolmogorov-Arnold Networks (KAN)

License: MIT
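
For orientation, a conceptual sketch of the Kolmogorov-Arnold Network idea is given below: every edge carries its own learnable univariate function instead of a fixed node activation. This is not the pykan API; the layer sizes, the Gaussian basis (a stand-in for pykan's B-splines), and all names are assumptions for illustration.

```python
# Conceptual sketch of a Kolmogorov-Arnold Network layer (NOT the pykan API):
# each edge has a learnable univariate function expanded in a Gaussian basis.
import torch
import torch.nn as nn

class KANLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, num_basis: int = 8):
        super().__init__()
        # coefficients of phi_{oi}(x) = sum_k coef[o, i, k] * B_k(x)
        self.coef = nn.Parameter(0.1 * torch.randn(out_dim, in_dim, num_basis))
        self.register_buffer("centers", torch.linspace(-1.0, 1.0, num_basis))

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, in_dim)
        # evaluate the basis at each input, then each edge's learned function,
        # and sum the contributions arriving at every output node
        basis = torch.exp(-((x.unsqueeze(-1) - self.centers) ** 2) / 0.1)
        return torch.einsum("bik,oik->bo", basis, self.coef)

model = nn.Sequential(KANLayer(2, 5), KANLayer(5, 1))
x = torch.rand(16, 2) * 2 - 1          # inputs roughly in the basis range [-1, 1]
print(model(x).shape)                  # torch.Size([16, 1])
```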

OpenPCDet

OpenPCDet Toolbox for LiDAR-based 3D Object Detection.

License: Apache-2.0

UnityGaussianSplatting

Toy Gaussian Splatting visualization in Unity

License: MIT

AliceMind

ALIbaba's Collection of Encoder-decoders from MinD (Machine IntelligeNce of Damo) Lab

License: Apache-2.0

PointCloudCompensation

Applies error compensation to point clouds captured with a terrestrial laser scanner (TLS).

License: GPL-3.0

DiT

Official PyTorch Implementation of "Scalable Diffusion Models with Transformers"

License: NOASSERTION

pytorch-transformer-ts

Repository of Transformer based PyTorch Time Series Models

License: MIT

SO2

[AAAI2024] A Perspective of Q-value Estimation on Offline-to-Online Reinforcement Learning

License: Apache-2.0

BSCV-Dataset

Official repository for the paper "Bitstream-corrupted Video Recovery: A Novel Benchmark Dataset and Method", accepted to the NeurIPS 2023 Datasets and Benchmarks Track


langchain-rag-tutorial

A simple Langchain RAG application.

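Both rag-tutorial-v2 and langchain-rag-tutorial above follow the same retrieve-then-generate pattern. Below is a minimal, library-agnostic Python sketch of that pattern; the embedding function, documents, and prompt format are placeholders rather than code from either repository.

```python
# Library-agnostic sketch of retrieve-then-generate (RAG); embed() and the
# documents are placeholders, not code from either tutorial repository.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder: a real application would call an embedding model here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(64)

documents = [
    "Chunk about point cloud compression.",
    "Chunk about Gaussian splatting.",
    "Chunk about retrieval-augmented generation.",
]
doc_vecs = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    # rank chunks by cosine similarity to the query embedding
    q = embed(query)
    scores = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

context = "\n".join(retrieve("What is RAG?"))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: What is RAG?"
# The assembled prompt would then be sent to a (local) LLM for the final answer.
print(prompt)
```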

PointCRT

PointCRT: Detecting Backdoor in 3D Point Cloud via Corruption Robustness (MM '23)

License: MIT

iTransformer

Implementation of iTransformer - SOTA time series forecasting using attention networks, from Tsinghua / Ant Group

License: MIT
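
The inverted design embeds each variate's full lookback window as a single token and applies self-attention across variates rather than across time steps. The sketch below is a conceptual PyTorch illustration of that idea, not the repository's code; the dimensions and class name are placeholders.

```python
# Conceptual PyTorch sketch of "inverted" attention for multivariate forecasting
# (NOT the repository code): one token per variate, attention across variates.
import torch
import torch.nn as nn

class InvertedAttentionForecaster(nn.Module):
    def __init__(self, lookback=96, horizon=24, d_model=128, heads=4, layers=2):
        super().__init__()
        self.embed = nn.Linear(lookback, d_model)            # series -> variate token
        layer = nn.TransformerEncoderLayer(d_model, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, layers)  # attention over variates
        self.head = nn.Linear(d_model, horizon)              # variate token -> forecast

    def forward(self, x):                          # x: (batch, lookback, num_variates)
        tokens = self.embed(x.transpose(1, 2))     # (batch, num_variates, d_model)
        tokens = self.encoder(tokens)
        return self.head(tokens).transpose(1, 2)   # (batch, horizon, num_variates)

model = InvertedAttentionForecaster()
print(model(torch.randn(8, 96, 7)).shape)          # torch.Size([8, 24, 7])
```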

docker-openvpn

🔒 OpenVPN server in a Docker container complete with an EasyRSA PKI CA

License: MIT

llama2.mojo

Inference Llama 2 in one file of pure Mojo 🔥

License: MIT

instant-ngp

Instant neural graphics primitives: lightning fast NeRF and more

License: NOASSERTION

ultralytics

NEW - YOLOv8 🚀 in PyTorch > ONNX > OpenVINO > CoreML > TFLite

License: AGPL-3.0
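
A minimal usage sketch assuming the published ultralytics Python API; the weights file and image path are placeholder names, not files referenced by this listing.

```python
# Minimal usage sketch of the ultralytics YOLO API; "yolov8n.pt" and
# "example.jpg" are placeholders for a pretrained checkpoint and a test image.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")             # load a small pretrained detection model
results = model("example.jpg")         # run inference on a single image
for r in results:
    print(r.boxes.xyxy)                # predicted bounding boxes (x1, y1, x2, y2)
    print(r.boxes.cls)                 # predicted class indices
```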

GroundingDINO

Official implementation of the paper "Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection"

License: Apache-2.0

vit-pytorch

Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch

License: MIT
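
A minimal usage sketch following the vit-pytorch README-style API; the exact constructor arguments are assumptions and may differ between package versions.

```python
# Usage sketch of the vit-pytorch ViT class; hyperparameters are illustrative.
import torch
from vit_pytorch import ViT

model = ViT(
    image_size=256, patch_size=32, num_classes=1000,
    dim=1024, depth=6, heads=16, mlp_dim=2048,
)
img = torch.randn(1, 3, 256, 256)      # a single random RGB image
preds = model(img)                     # (1, 1000) class logits
print(preds.shape)
```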

lidar-point-cloud-error-concealment

An error concealment system for LiDAR point clouds that can be used to reproduce the experimental results. To generate your own input dataset, use our co-simulator.


SparsePCGC

Sparse Tensor-based Multiscale Representation for Point Cloud Geometry Compression

License: MIT

NDCR

A Neural Divide-and-Conquer Reasoning Framework for Multimodal Reasoning on Linguistically Complex Text and Similar Images

License: Apache-2.0

Crossformer

Official implementation of our ICLR 2023 paper "Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting"
