Arthur's repositories
RecvisProject
In this project, we study Vision Transformers trained with the Barlow Twins self-supervised method and compare the results with DINO. We demonstrate the effectiveness of Barlow Twins by showing that networks pretrained on the small PASCAL VOC 2012 dataset generalize well. Authors: Apavou Clément & Zucker Arthur
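For context, the Barlow Twins objective drives the cross-correlation matrix between the embeddings of two augmented views of a batch toward the identity. A minimal PyTorch sketch of that loss (the function name and the default lambda are illustrative, not taken from this project's code):

```python
import torch

def barlow_twins_loss(z1, z2, lambd=5e-3):
    # z1, z2: (N, D) projector outputs for two augmented views of the same batch.
    n, _ = z1.shape
    # Standardize each embedding dimension across the batch.
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-6)
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-6)
    # Empirical cross-correlation matrix, shape (D, D).
    c = (z1.T @ z2) / n
    # Invariance term: diagonal entries should be 1 (the views agree).
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    # Redundancy-reduction term: off-diagonal entries should be 0.
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + lambd * off_diag
```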
DeepLearningProject
Deep Learning Project: Combining Self-Supervised Objective Functions in Computer Vision
PytorchTemplate
This PyTorch template brings together all of my favorite tools, making it easy for anyone to implement, develop, train, and test models, and to log results.
SH_Propagation
Game project in Godot
accelerate
🚀 A simple way to train and use PyTorch models with multi-GPU, TPU, mixed-precision
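The core of the Accelerate API is `Accelerator.prepare()` plus `accelerator.backward()`; a minimal training-loop sketch with a toy model (the model and data are placeholders):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

model = torch.nn.Linear(10, 2)  # toy model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loader = DataLoader(TensorDataset(torch.randn(64, 10),
                                  torch.randint(0, 2, (64,))), batch_size=8)

accelerator = Accelerator()  # device placement, DDP, AMP come from the launch config
model, optimizer, loader = accelerator.prepare(model, optimizer, loader)

for x, y in loader:
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(x), y)
    accelerator.backward(loss)  # instead of loss.backward(); handles grad scaling
    optimizer.step()
```

Launched with `accelerate launch script.py`, the same script runs unchanged on one GPU, several GPUs, or a TPU.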
arthurzucker
My secret page with fun and interesting features
HighPerformanceComputing
Lab work and a project using MPI and OpenMP parallel programming.
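The labs themselves presumably use C/C++; as a self-contained illustration of the MPI pattern they cover, here is the rank/reduce idiom via `mpi4py` (the partial-sum workload is made up):

```python
# Run with: mpirun -n 4 python sum.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank computes a disjoint partial sum; rank 0 collects the total.
local = sum(range(rank * 100, (rank + 1) * 100))
total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(f"sum over {size} ranks: {total}")
```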
SecondYear
This repository contains all of the material from my classes.
transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
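The quickest way into the library is the `pipeline` API; a tiny sketch (the checkpoint it downloads is whatever default the library ships for the task, and the printed score is indicative only):

```python
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default checkpoint
print(classifier("Transformers makes state-of-the-art NLP easy to use."))
# e.g. [{'label': 'POSITIVE', 'score': 0.999...}]
```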
tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
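Training a tokenizer from scratch takes a few lines; a sketch following the library's quicktour (the file path and vocabulary size are placeholders):

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer
from tokenizers.pre_tokenizers import Whitespace

tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()
trainer = BpeTrainer(vocab_size=5000, special_tokens=["[UNK]", "[PAD]"])
tokenizer.train(files=["data.txt"], trainer=trainer)  # plain-text corpus

enc = tokenizer.encode("Fast tokenizers in Rust, usable from Python.")
print(enc.tokens)
print(enc.ids)
```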
adept-inference
Inference code for Persimmon-8B
AdvancedHighPerformanceComputing
GPU programming project in CUDA
minbpe
Minimal, clean, educational code for the Byte Pair Encoding (BPE) algorithm commonly used in LLM tokenization.
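The BPE training loop fits in a few lines: start from raw bytes and repeatedly replace the most frequent adjacent pair with a new token id. A minimal sketch of the idea (the helper names are mine, not minbpe's):

```python
from collections import Counter

def count_pairs(ids):
    # Frequency of each adjacent pair in the sequence.
    return Counter(zip(ids, ids[1:]))

def merge(ids, pair, new_id):
    # Replace every occurrence of `pair` with `new_id`.
    out, i = [], 0
    while i < len(ids):
        if i + 1 < len(ids) and (ids[i], ids[i + 1]) == pair:
            out.append(new_id)
            i += 2
        else:
            out.append(ids[i])
            i += 1
    return out

ids = list("aaabdaaabac".encode("utf-8"))  # start from raw bytes
merges = {}
for new_id in range(256, 259):             # three merges, for illustration
    pair = count_pairs(ids).most_common(1)[0][0]
    ids = merge(ids, pair, new_id)
    merges[pair] = new_id
print(ids, merges)
```

Encoding new text replays `merges` in order; decoding inverts the table back to bytes.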