xinyyhu's starred repositories

Midas2OpenSEES

Converts a Midas model into an OpenSEES model, with real-time display.

Language: Tcl · Stargazers: 8 · Issues: 0

practice-in-paddle

Case studies and hands-on practice for the book Neural Networks and Deep Learning (《神经网络与深度学习》).

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 723 · Issues: 0

Reactive-Resume

A one-of-a-kind resume builder that keeps your privacy in mind. Completely secure, customizable, portable, open-source and free forever. Try it out today!

Language: TypeScript · License: MIT · Stargazers: 20991 · Issues: 0

Sap2OpenSees

Data Transmission from Sap2000 to OpenSees

Language: Tcl · Stargazers: 10 · Issues: 0

RTSF

Revisiting Long-term Time Series Forecasting: An Investigation on Linear Mapping

Language: Jupyter Notebook · Stargazers: 60 · Issues: 0

nlp_notes

Natural language processing study notes: principles and worked examples of machine learning and deep learning, built on the TensorFlow and PyTorch frameworks; detailed source-code walkthroughs of recent pretrained models such as Transformer, BERT, and ALBERT; applying pretrained models to various NLP tasks; model deployment.

Language: Jupyter Notebook · Stargazers: 307 · Issues: 0

Time-Series-Forecasting-and-Deep-Learning

Resources about time series forecasting and deep learning.

Stargazers: 490 · Issues: 0

iTransformer

Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah

Language: Python · License: MIT · Stargazers: 1019 · Issues: 0
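A minimal sketch of the "inverted" tokenization the iTransformer title refers to (shapes and names here are illustrative assumptions, not the repo's API): each variate's entire look-back series is embedded as one token, so attention operates across variates rather than across time steps.

```python
import numpy as np

rng = np.random.default_rng(0)
B, L, N, D = 4, 96, 7, 32          # batch, look-back length, variates, model dim

x = rng.normal(size=(B, L, N))     # conventional layout: time along axis 1
inverted = x.transpose(0, 2, 1)    # (B, N, L): one row per variate

W_embed = rng.normal(size=(L, D)) * 0.1
tokens = inverted @ W_embed        # (B, N, D): N variate tokens, not L time tokens
print(tokens.shape)
```

The downstream Transformer then sees N tokens per sample instead of L, which is the paper's central change.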

PatchTST

An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers" (ICLR 2023), https://arxiv.org/abs/2211.14730

Language: Python · License: Apache-2.0 · Stargazers: 1438 · Issues: 0
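The "64 words" in the PatchTST title come from cutting the look-back window into overlapping patches that serve as Transformer tokens. A hedged sketch of that patching step (patch length and stride are illustrative defaults, not the repo's exact configuration):

```python
import numpy as np

def patch_series(x, patch_len=16, stride=8):
    """Split a 1-D series into overlapping patches (PatchTST-style tokens)."""
    n = (len(x) - patch_len) // stride + 1
    return np.stack([x[i * stride : i * stride + patch_len] for i in range(n)])

series = np.arange(336, dtype=float)   # a common long-term look-back length
patches = patch_series(series)
print(patches.shape)                    # (number of patches, patch length)
```

Each row then gets a linear embedding before entering the Transformer, so sequence length drops from 336 time steps to 41 patch tokens.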

LTSF-Linear

[AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"

Language: Python · License: Apache-2.0 · Stargazers: 1889 · Issues: 0
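The baseline behind the LTSF-Linear paper's question is strikingly small: a single linear map from the look-back window straight to the forecast horizon. A self-contained sketch under assumed sizes (this is the idea, not the repo's code, and it fits the map with least squares rather than gradient descent):

```python
import numpy as np

rng = np.random.default_rng(0)
L, T = 96, 24                       # look-back length, forecast horizon

# Synthetic daily-periodic series with noise.
t = np.arange(2000, dtype=float)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size)

# Build (window, horizon) training pairs.
idx = range(len(series) - L - T)
X = np.stack([series[i : i + L] for i in idx])        # (samples, L)
Y = np.stack([series[i + L : i + L + T] for i in idx])  # (samples, T)

W, *_ = np.linalg.lstsq(X, Y, rcond=None)   # the entire "model": one (L, T) matrix
pred = X[-1] @ W                             # forecast from the last window
mae = float(np.abs(pred - Y[-1]).mean())
print(mae)
```

On this periodic toy series the in-sample MAE lands near the noise level, which illustrates why such linear maps are a competitive reference point.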

LightTS

Code for the paper "LightTS: Lightweight Time Series Classification with Adaptive Ensemble"

Language: Python · Stargazers: 12 · Issues: 0

ETSformer

PyTorch code for ETSformer: Exponential Smoothing Transformers for Time-series Forecasting

Language: Python · License: BSD-3-Clause · Stargazers: 255 · Issues: 0

Nonstationary_Transformers

Code release for "Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting" (NeurIPS 2022), https://arxiv.org/abs/2205.14415

Language: Python · License: MIT · Stargazers: 451 · Issues: 0

Autoformer

Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008

Language: Jupyter Notebook · License: MIT · Stargazers: 1846 · Issues: 0

Informer2020

The GitHub repository for the paper "Informer", accepted at AAAI 2021.

Language: Python · License: Apache-2.0 · Stargazers: 5184 · Issues: 0

Koopa

Code release for "Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors" (NeurIPS 2023), https://arxiv.org/abs/2305.18803

Language: Python · License: MIT · Stargazers: 169 · Issues: 0

NeurIPS2022-FiLM

Source code of NeurIPS'22 paper: FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting

Language: Python · License: MIT · Stargazers: 32 · Issues: 0

MICN

Code release of paper "MICN: Multi-scale Local and Global Context Modeling for Long-term Series Forecasting" (ICLR 2023)

Language: Python · Stargazers: 94 · Issues: 0

Crossformer

Official implementation of our ICLR 2023 paper "Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting"

Language: Python · License: Apache-2.0 · Stargazers: 408 · Issues: 0

app_deep_learning

T81-558: PyTorch - Applications of Deep Neural Networks @Washington University in St. Louis

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 336 · Issues: 0

Deep-Learning-Experiments

Videos, notes and experiments to understand deep learning

Language: Jupyter Notebook · License: MIT · Stargazers: 1092 · Issues: 0

transformer

Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.

Language: Jupyter Notebook · License: GPL-3.0 · Stargazers: 831 · Issues: 0

LSTM-time-series-forecasting

Predicting the behavior of $BTC-USD by training a memory-based neural net on historical data

Language: Jupyter Notebook · Stargazers: 40 · Issues: 0

Pytorch-Transfomer

My implementation of the Transformer architecture from the "Attention Is All You Need" paper, applied to time series.

Language: Jupyter Notebook · Stargazers: 306 · Issues: 0

influenza_transformer

PyTorch implementation of Transformer model used in "Deep Transformer Models for Time Series Forecasting: The Influenza Prevalence Case"

Language: Python · Stargazers: 238 · Issues: 0

blog

Public repo for HF blog posts

Language: Jupyter Notebook · Stargazers: 2191 · Issues: 0

DeepLearningExamples

State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure.

Language: Jupyter Notebook · Stargazers: 13079 · Issues: 0

Transformer_Time_Series

Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting (NeurIPS 2019)

Language: Jupyter Notebook · Stargazers: 522 · Issues: 0