Dylan-Inventor's starred repositories
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
google-research
Google Research
attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
nlp-competitions-list-review
A review of the top solutions from all NLP competitions; focused exclusively on NLP competitions, continuously updated!
reformer-pytorch
Reformer, the efficient Transformer, in PyTorch
stock-knowledge-graph
Build a small securities knowledge graph / knowledge base from publicly available web data.
transformer
Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.
spacetimeformer
Multivariate Time Series Forecasting with efficient Transformers. Code for the paper "Long-Range Transformers for Dynamic Spatiotemporal Forecasting."
CrossSection
Code to accompany our paper Chen and Zimmermann (2020), "Open source cross-sectional asset pricing"
Transformer_Time_Series
Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting (NeurIPS 2019)
BERT-for-Sequence-Labeling-and-Text-Classification
Template code for using BERT for sequence labeling and text classification, making it easier to apply BERT to more tasks. The template currently covers CoNLL-2003 named entity recognition, Snips slot filling, and intent prediction.
TFC-pretraining
Self-supervised contrastive learning for time series via time-frequency consistency
linformer-pytorch
My take on a practical implementation of Linformer for PyTorch.
Quant-using-Python
A code library storing the program files mentioned in my blog posts.