ClarenceKe's starred repositories
SSVEP-Impulse-Response
Some demos about how to decompose and reconstruct an SSVEP based on the superposition model
fucking-algorithm
Cracking algorithm problems is all about patterns; labuladong is all you need! English version supported! Crack LeetCode, not only how, but also why.
vision-lstm
xLSTM as Generic Vision Backbone
gpt-computer-assistant
GPT-4o assistant for Windows, macOS, and Linux
awesome-kan
A comprehensive collection of KAN (Kolmogorov-Arnold Network) resources, including libraries, projects, tutorials, and papers, for researchers and developers in the field.
Deep-Learning-for-BCI
Resources for the book "Deep Learning for EEG-Based Brain-Computer Interface: Representations, Algorithms and Applications"
Large-Time-Series-Model
Official code, datasets and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024)
Time-Series-Library
A Library for Advanced Deep Time Series Models.
FusionMamba
FusionMamba: Dynamic Feature Enhancement for Multimodal Image Fusion with Mamba
pytorch-grad-cam
Advanced AI explainability for computer vision. Supports CNNs, Vision Transformers, classification, object detection, segmentation, image similarity, and more.
brainscanner
Real-time EEG source localization based on the Smarting mBrainTrain EEG headset, implemented in MATLAB.
uniBrain-Speller
uniBrain Speller: a one-stop, user-friendly, open-source brain-computer interface speller developed by Prof. Gao Xiaorong's team at Tsinghua University, China, designed for patients, researchers, and practitioners alike.
ept_TFCE-matlab
Advanced EEG Statistics
tuning_playbook_zh_cn
Chinese edition of the playbook for systematically maximizing the performance of deep learning models.
tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
Awesome-Transformer-Attention
A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites.
nixtla
TimeGPT-1: a production-ready, pre-trained time series foundation model for forecasting and anomaly detection. A generative pretrained transformer for time series, trained on over 100B data points, that accurately forecasts across domains such as retail, electricity, finance, and IoT with just a few lines of code 🚀.
transformer-models
Deep Learning Transformer models in MATLAB