Hao Gu's repositories
extreme-bert
ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on customized datasets, described in the paper “ExtremeBERT: A Toolkit for Accelerating Pretraining of Customized BERT”.
dont-stop-pretraining
Code associated with the Don't Stop Pretraining ACL 2020 paper
Language: Python
icemoon-creative
Config files for my GitHub profile.
lihang-code
Code implementations for the book Statistical Learning Methods (《统计学习方法》)
Language: Jupyter Notebook
matlab_demos
Contains all the MATLAB demo code associated with my machine learning notes.
Language: MATLAB