Giorgos Paraskevopoulos's repositories
optimistic-adam
PyTorch implementation of Optimistic Adam, proposed in "Training GANs with Optimism" (https://arxiv.org/pdf/1711.00141.pdf)
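The optimistic update doubles the current Adam step and retracts the previous one, which damps the cycling behavior seen in GAN training. A minimal pure-Python sketch of the core rule on a scalar quadratic (illustrative only; the repository implements this as a PyTorch optimizer, and all names here are made up):

```python
# Sketch of the Optimistic Adam update (Daskalakis et al., 2017) on f(x) = x^2.
# Illustrative only -- not the repository's actual PyTorch API.
import math

def optimistic_adam_scalar(grad_fn, x0, lr=0.05, beta1=0.9, beta2=0.999,
                           eps=1e-8, steps=300):
    """x_{t+1} = x_t - 2*lr*d_t + lr*d_{t-1}, where d_t is the
    bias-corrected Adam direction m_hat / (sqrt(v_hat) + eps)."""
    x, m, v = x0, 0.0, 0.0
    prev_update = 0.0  # Adam direction from the previous step
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1 - beta1) * g          # first moment
        v = beta2 * v + (1 - beta2) * g * g      # second moment
        m_hat = m / (1 - beta1 ** t)             # bias correction
        v_hat = v / (1 - beta2 ** t)
        update = m_hat / (math.sqrt(v_hat) + eps)
        x = x - 2 * lr * update + lr * prev_update
        prev_update = update
    return x

x_final = optimistic_adam_scalar(lambda x: 2 * x, x0=5.0)
```

With `prev_update` dropped and the factor of 2 removed, this reduces to plain Adam; the extra terms are the "optimism".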
kaldi-grpc-server
Deploy Kaldi models using gRPC for bidirectional streaming.
grnet_guide
Guide for the SLP group on how to use the GRNET cluster
gmmhmm-pytorch
PyTorch implementations of GMM-HMM models
python-lab
Preparatory Lab for NTUA Pattern Recognition and Speech and Language Processing
omegaconf-argparse
Integration between OmegaConf and argparse for mixing config-file and CLI arguments
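The pattern such an integration automates can be sketched with the stdlib alone: argparse supplies defaults, a config file overrides them, and flags the user actually typed override the config. This is a minimal sketch of that precedence rule, not the repository's actual API (the flag names and helper are invented):

```python
# Stdlib sketch of config-file + CLI merging with precedence
# CLI flag > config file > argparse default. Illustrative only --
# not omegaconf-argparse's actual API.
import argparse

def merge_config_and_cli(config: dict, argv: list) -> dict:
    parser = argparse.ArgumentParser()
    parser.add_argument("--lr", type=float, default=1e-3)
    parser.add_argument("--epochs", type=int, default=10)
    args = parser.parse_args(argv)

    # Detect which flags were explicitly typed, so config values are
    # not clobbered by argparse defaults. (parser._actions is a private
    # attribute; acceptable for a sketch.)
    passed = {a.dest for a in parser._actions
              if any(opt in argv for opt in a.option_strings)}

    return {**vars(args),
            **{k: v for k, v in config.items() if k not in passed}}

cfg = {"lr": 0.01}  # pretend this was loaded from a YAML file
merged = merge_config_and_cli(cfg, ["--epochs", "5"])
# lr comes from the config file, epochs from the CLI
```

The subtlety the sketch highlights is distinguishing "default value" from "value the user typed", which plain `parse_args` does not report.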
cookiecutter-pytorch-slp
Project template for models based on https://github.com/georgepar/slp
react-transcript-editor
A React component that makes correcting automated transcriptions of audio and video easier and faster. By BBC News Labs. Work in progress.
text-baselines
Utilities for text processing using sklearn. Useful for baselining small datasets.
blur-diffusion
Official PyTorch implementation of the paper "Progressive Deblurring of Diffusion Models for Coarse-to-Fine Image Synthesis"
CMU-MultimodalSDK
CMU MultimodalSDK is a machine learning platform for developing advanced multimodal models and for easily accessing and processing multimodal datasets.
docstr_coverage
Docstring coverage analysis and rating for Python
dotfiles
My dotfiles
georgepar.github.io
GitHub Pages site
kaldi
kaldi-asr/kaldi is the official location of the Kaldi project.
lanarky
FastAPI framework to build production-grade LLM applications
mmf
A modular framework for vision & language multimodal research from Facebook AI Research (FAIR)
models
Models and examples built with TensorFlow
ntuaslp
Website for NTUA/SLP group
pudb
Full-screen console debugger for Python
pudb-torch
Debug PyTorch with pudb
pykaldi
A Python wrapper for Kaldi
sherpa
Speech-to-text server framework with next-gen Kaldi
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
x-transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers