Oleg Ovcharenko's starred repositories
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
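A minimal usage sketch of the library's `pipeline` API (the default sentiment-analysis checkpoint is assumed for illustration):

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline with the library's default checkpoint.
classifier = pipeline("sentiment-analysis")

# Returns a list of dicts like [{"label": "POSITIVE", "score": 0.99}].
print(classifier("Transformers makes state-of-the-art NLP easy to use."))
```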
pytorch-image-models
The largest collection of PyTorch image encoders / backbones, including training, evaluation, inference, and export scripts, plus pretrained weights -- ResNet, ResNeXt, EfficientNet, NFNet, Vision Transformer (ViT), MobileNetV4, MobileNet-V3 & V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more
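A short sketch of instantiating a pretrained backbone with `timm` (the `resnet50` name and ImageNet-1k head size are assumptions for illustration):

```python
import torch
import timm

# Create a pretrained classifier; any architecture name listed by timm works here.
model = timm.create_model("resnet50", pretrained=True)
model.eval()

x = torch.randn(1, 3, 224, 224)  # dummy ImageNet-sized input
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # expected: torch.Size([1, 1000]) for an ImageNet-1k head
```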
instant-ngp
Instant neural graphics primitives: lightning fast NeRF and more
Megatron-LM
Ongoing research training transformer models at scale
uvadlc_notebooks
Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2023
zarr-python
An implementation of chunked, compressed, N-dimensional arrays for Python.
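A minimal sketch of creating and slicing a chunked array with `zarr` (in-memory store; the shape and chunk sizes are arbitrary):

```python
import numpy as np
import zarr

# Create a chunked, compressed 2-D array (stored in memory by default).
z = zarr.zeros((10_000, 10_000), chunks=(1_000, 1_000), dtype="f4")

# Writes touch only the chunks that overlap the selection.
z[0, :] = np.arange(10_000)

print(z.shape, z.chunks, z[0, :5])
```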
deepstream_python_apps
DeepStream SDK Python bindings and sample applications
seismic-zfp
Compress and decompress seismic data
Kirchhoff-Seismic-Migration
Kirchhoff seismic migration
storseismic
StorSeismic: An approach to pre-train a neural network to store seismic data features
Transform2022_SelfSupervisedDenoising
A hands-on introduction to self-supervised, blind-spot denoising
segysak-t21-tutorial
Transform 21 Tutorial for SEGYSAK
pytorch-pl-hydra-templates
Template code for deep learning projects with PyTorch.
dda_pytorch
Direct Domain Adaptation Through Reciprocal Linear Transformations