There are 19 repositories under the mixed-precision topic.
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
Toolkit for efficient experimentation with Speech Recognition, Text2Speech and NLP
[CVPR 2019, Oral] HAQ: Hardware-Aware Automated Quantization with Mixed Precision
Simple Pose: Rethinking and Improving a Bottom-up Approach for Multi-Person Pose Estimation
Training with FP16 weights in PyTorch
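For context on entries like this one, here is a minimal sketch of mixed-precision training using PyTorch's automatic mixed precision (`torch.autocast` plus `GradScaler`); the linear model and random data are placeholders for illustration, not code from the repository:

```python
import torch

# Placeholder model and data (illustrative only).
model = torch.nn.Linear(16, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
device_type = "cuda" if torch.cuda.is_available() else "cpu"
# GradScaler guards against FP16 gradient underflow; it is a no-op on CPU.
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())

x = torch.randn(8, 16)
y = torch.randint(0, 2, (8,))

for _ in range(3):
    optimizer.zero_grad()
    # autocast runs eligible forward ops in reduced precision
    # (FP16 on GPU, bfloat16 on CPU).
    with torch.autocast(device_type=device_type):
        loss = torch.nn.functional.cross_entropy(model(x), y)
    scaler.scale(loss).backward()  # backprop on the scaled loss
    scaler.step(optimizer)         # unscales gradients, then steps
    scaler.update()                # adapts the loss-scale factor
```

The key pattern is scaling the loss before `backward()` so small FP16 gradients do not flush to zero, then letting the scaler unscale before the optimizer step.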
A tool for debugging and assessing floating point precision and reproducibility.
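As an illustrative demo (not taken from the tool itself) of the kind of precision drift such debuggers surface: naively accumulating 10,000 copies of 0.1 in a float16 accumulator stalls once the running sum's representable spacing exceeds the addend, while a wider accumulator stays accurate.

```python
import numpy as np

values = np.full(10000, 0.1, dtype=np.float16)

# Naive left-to-right accumulation, rounding to float16 after every add.
acc = np.float16(0.0)
for v in values:
    acc = np.float16(acc + v)

# Accurate reference: accumulate in float64.
ref = float(np.sum(values.astype(np.float64)))

# The float16 sum stalls far below the reference (~999.76) because once
# the sum is large enough, adding ~0.1 rounds back to the same value.
print(f"float16 running sum: {float(acc):.2f}, reference: {ref:.2f}")
```

This gap between accumulation orders and precisions is exactly what reproducibility tooling needs to detect and attribute.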
Pretrained-model wrapper based on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision. Flexible training, validation, and prediction.
:dart: Accumulated Gradients for TensorFlow 2
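Gradient accumulation simulates a larger effective batch by summing gradients over several micro-batches before one optimizer step. A minimal sketch of the idea, written in PyTorch for brevity (the repository above targets TensorFlow 2), with a placeholder model and random data:

```python
import torch

# Placeholder model and optimizer (illustrative only).
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
accum_steps = 4  # effective batch = accum_steps * micro-batch size

optimizer.zero_grad()
for step in range(8):
    x = torch.randn(2, 4)
    y = torch.randn(2, 1)
    loss = torch.nn.functional.mse_loss(model(x), y)
    # Scale the loss so accumulated gradients average, not sum.
    (loss / accum_steps).backward()
    if (step + 1) % accum_steps == 0:
        optimizer.step()       # apply the accumulated update
        optimizer.zero_grad()  # reset for the next accumulation window
```

Dividing the loss by `accum_steps` keeps the update magnitude comparable to training with the large batch directly.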
Code repository for Deep Learning from the Creator of Keras, 2nd Edition (the Korean edition of Deep Learning with Python, Second Edition)
PyCon SG 2019 Tutorial: Optimizing TensorFlow Performance
Extremely simple and understandable GPT2 implementation with minor tweaks
This repository contains notebooks showing how to perform mixed precision training in tf.keras 2.0
Let's train CIFAR-10 in PyTorch with half precision!
Hybrid-Precision Analysis on CG Solver (H.A.C.S): combining single and double precision to build a fast yet accurate conjugate-gradient (CG) solver
PyTorch RNet implementation with Distributed and Mixed-Precision training support.
Deep learning solution for Cassava Leaf Disease Classification, a Kaggle research code competition, using TensorFlow.
A Post-Training Quantizer for the Design of Mixed Low-Precision DNNs with Dynamic Fixed-Point Representation for Efficient Hardware Acceleration on Edge Devices
Experiments in accelerating PyTorch training on GPUs