MetaKing (kwuking)

User data from GitHub: https://github.com/kwuking

Followers: 0 · Following: 0 · Stars: 0

MetaKing's repositories

TimeMixer

[ICLR 2024] Official implementation of "TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting"

Language: Python · License: Apache-2.0 · Stargazers: 1750 · Issues: 85 · Issues: 107

AutoTimes

Official implementation for "AutoTimes: Autoregressive Time Series Forecasters via Large Language Models"

Language: Python · License: MIT · Stargazers: 6 · Issues: 0

Autoformer

Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

gill

🐟 Code and models for the NeurIPS 2023 paper "Generating Images with Multimodal Language Models".

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 0

google-research

Google Research

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 0

iTransformer

Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

Koopa

Code release for "Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors" (NeurIPS 2023), https://arxiv.org/abs/2305.18803

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

ModaVerse

[CVPR2024] ModaVerse: Efficiently Transforming Modalities with LLMs

Language: Python · Stargazers: 0 · Issues: 0

NExT-GPT

Code and models for NExT-GPT: Any-to-Any Multimodal Large Language Model

Language: Python · License: BSD-3-Clause · Stargazers: 0 · Issues: 0

Time-Series-Library

A Library for Advanced Deep Time Series Models.

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

PatchTST

An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers" (ICLR 2023), https://arxiv.org/abs/2211.14730

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

Time-MoE

Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0