KJ_Kwanjai's starred repositories
basic-dataset
A collection of datasets from various sources
Deep-Learning-Algorithms
CNN, LSTM, RNN, GRU, DNN, BERT, Transformer, ULMFiT
t81_558_deep_learning
T81-558: Keras - Applications of Deep Neural Networks @Washington University in St. Louis
attention-is-all-you-need-keras
A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need
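Since the Transformer of "Attention Is All You Need" has no recurrence, it injects token order through sinusoidal positional encodings. A minimal NumPy sketch of that formula (assuming an even `d_model`; not taken from this repository's code):

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encoding from "Attention Is All You Need":
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(max_len)[:, None]                       # (max_len, 1)
    div_terms = np.power(10000.0, np.arange(0, d_model, 2) / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(positions / div_terms)                   # even columns
    pe[:, 1::2] = np.cos(positions / div_terms)                   # odd columns
    return pe

pe = positional_encoding(50, 16)
```

Each position gets a unique pattern of sines and cosines at geometrically spaced wavelengths, so relative offsets are expressible as linear functions of the encodings.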
non-coding-DNA-classifier
Deep learning multi-label classifier of non-coding DNA sequences
transformer_soc
Transformer neural network for state-of-charge estimation in TensorFlow
python_for_microscopists
https://www.youtube.com/channel/UC34rW-HtPJulxr5wp2Xa04w?sub_confirmation=1
Seq2SeqSharp
Seq2SeqSharp is a tensor-based, fast and flexible deep neural network framework written in .NET (C#). Highlights include automatic differentiation, multiple network types (Transformer, LSTM, BiLSTM, and so on), multi-GPU support, cross-platform operation (Windows, Linux; x86, x64, ARM), and multimodal models for text and images.
transformer
Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.
Speech-Emotion-Analysis
Human emotions are one of the strongest forms of communication: even someone who does not understand a language can understand the emotions an individual conveys. The idea behind the project is to develop a Speech Emotion Analyzer that uses deep learning to correctly classify different human emotions, such as neutral, angry, and surprised speech. We deployed three network architectures, namely 1-D CNNs, LSTMs, and Transformers, to carry out the classification task. We also used two feature extraction methods (MFCCs and Mel spectrograms) to capture the features of a given voice signal, and compared their ability to produce high-quality results, especially in deep-learning models.
Coursera_Deep_Learning_Specialization
Implementation of logistic regression, MLP, CNN, RNN, and LSTM from scratch in Python. Training of deep learning models for image classification, object detection, and sequence processing (including a Transformer implementation) in TensorFlow.
multigraph_transformer
IEEE TNNLS 2021. Official code of the paper "Multi-Graph Transformer for Free-Hand Sketch Recognition". Topics: transformer, multi-graph transformer, graph classification, sketch recognition and classification, free-hand sketch.
machine-learning-articles
🧠💬 Articles I wrote about machine learning, archived from MachineCurve.com.
ML-assignments
About regression, classification, CNN, RNN, explainable AI, adversarial attacks, network compression, Seq2Seq, GAN, transfer learning, meta-learning, life-long learning, and reinforcement learning.
Transfer-learning
Transfer Knowledge Learned from Multiple Domains for Time-series Data Prediction
flow-forecast
Deep learning PyTorch library for time series forecasting, classification, and anomaly detection (originally for flood forecasting).
transfer_learning_music
Transfer learning for music classification and regression tasks
ConvTransformerTimeSeries
Convolutional Transformer for time series
transformer-time-series-prediction
Proof of concept for a transformer-based time series prediction model
learning-wavelets
Learning wavelet transforms for audio compression
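As background for wavelet-based audio compression, the simplest fixed (non-learned) wavelet is the Haar transform, which splits a signal into coarse averages and fine details; compression comes from quantizing or discarding small detail coefficients. A minimal one-level sketch in NumPy (not the learned wavelets of this repository; assumes an even-length input):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and differences (detail), scaled by 1/sqrt(2)
    so the transform is orthonormal."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    """Invert one level of the Haar transform (perfect reconstruction)."""
    out = np.empty(approx.size * 2)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out
```

The round trip is lossless; a codec would recurse on the approximation band and entropy-code the (mostly near-zero) detail coefficients.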
PhysicsAnalysis
Converts a CSV file containing linear acceleration data into a set of line graphs
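A pipeline of this shape (CSV of time-stamped linear acceleration in, one line graph per axis out) could be sketched in Python as follows; the column names `time, ax, ay, az` are an assumption, and the repository's actual schema and language may differ:

```python
import csv
import matplotlib
matplotlib.use("Agg")  # render to file without a display
import matplotlib.pyplot as plt

def plot_acceleration(csv_path, out_path="acceleration.png"):
    """Read linear-acceleration samples from a CSV file (assumed
    columns: time, ax, ay, az) and save a line graph per axis."""
    times, xs, ys, zs = [], [], [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            times.append(float(row["time"]))
            xs.append(float(row["ax"]))
            ys.append(float(row["ay"]))
            zs.append(float(row["az"]))
    fig, axes = plt.subplots(3, 1, sharex=True, figsize=(8, 6))
    for axis, series, label in zip(axes, (xs, ys, zs), ("ax", "ay", "az")):
        axis.plot(times, series)
        axis.set_ylabel(f"{label} (m/s²)")
    axes[-1].set_xlabel("time (s)")
    fig.savefig(out_path)
    plt.close(fig)
    return out_path
```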
accelerometer_data_filtering
Using median and low-pass filters from the SciPy library
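The two-stage idea (a median filter to knock out impulsive spikes, then a low-pass filter to remove high-frequency jitter) can be sketched with SciPy on a synthetic trace; the sample rate, cutoff, and kernel size below are illustrative assumptions, not the repository's settings:

```python
import numpy as np
from scipy.signal import medfilt, butter, filtfilt

# Synthetic noisy accelerometer trace: 1 Hz motion sampled at 100 Hz
fs = 100.0
t = np.arange(0, 5, 1 / fs)
clean = np.sin(2 * np.pi * 1.0 * t)
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=t.size)

# Step 1: median filter suppresses isolated spikes (kernel size must be odd)
despiked = medfilt(noisy, kernel_size=5)

# Step 2: 4th-order Butterworth low-pass at 5 Hz removes remaining jitter;
# filtfilt runs it forward and backward for zero phase shift
b, a = butter(4, 5.0 / (fs / 2), btype="low")
smoothed = filtfilt(b, a, despiked)
```

The low-pass cutoff is passed to `butter` normalized by the Nyquist frequency (`fs / 2`), and `filtfilt` avoids the phase lag a single forward pass would introduce.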
Accelerometer-Filtering
Filter accelerometer data to produce a clear trace