baoyufuyou / fastaiv2keras

This is an implementation of the fastai part1 v2 course in Keras


fastaiv2keras

This is an implementation of the fastai part1 v2 course in Keras. The lesson1 and lesson1-finetune2 Jupyter notebooks work through the dogs vs. cats dataset.

  • lesson1 uses a finetune function that simply freezes the early layers and replaces the fully connected layer so that it outputs predictions for 2 classes
  • lesson1-finetune2 uses the finetune2 function, which adds a few extra layers to enhance the model: it applies average and max pooling to the convolutional output, concatenates the two, and follows them with batchnorm, dropout, and dense layers (see the sketch after this list). Note: our function for finding the optimal learning rate (LR_FIND) does not seem to work quite as well when we use finetune2.
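
Below is a minimal sketch of what a finetune2-style head could look like in Keras, based on the description above. The function name finetune2_sketch, the choice of ResNet50 as the base, the input size, and the dropout rate are assumptions for illustration; see the lesson1-finetune2 notebook for the actual implementation.

```python
from keras.applications.resnet50 import ResNet50
from keras.layers import (GlobalAveragePooling2D, GlobalMaxPooling2D,
                          Concatenate, BatchNormalization, Dropout, Dense)
from keras.models import Model

def finetune2_sketch(num_classes=2, dropout=0.5):
    # Pretrained convolutional base with frozen weights (assumed ResNet50 here)
    base = ResNet50(weights='imagenet', include_top=False,
                    input_shape=(224, 224, 3))
    for layer in base.layers:
        layer.trainable = False

    # Average pooling and max pooling over the last conv features, concatenated
    avg = GlobalAveragePooling2D()(base.output)
    mx = GlobalMaxPooling2D()(base.output)
    x = Concatenate()([avg, mx])

    # Batchnorm -> dropout -> dense head producing the class predictions
    x = BatchNormalization()(x)
    x = Dropout(dropout)(x)
    out = Dense(num_classes, activation='softmax')(x)

    model = Model(inputs=base.input, outputs=out)
    model.compile(optimizer='adam', loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model
```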

Learning rate finder

The learning rate finder is implemented in utils.py as the LR_find callback.

Blog:

Estimating an Optimal Learning Rate For a Deep Neural Network

https://towardsdatascience.com/estimating-optimal-learning-rate-for-a-deep-neural-network-ce32f2556ce0
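
For reference, a minimal sketch of how such a learning-rate-finding callback can work in Keras: the learning rate is increased multiplicatively after every batch while the loss is recorded, and you then pick a rate from the region where the loss is still falling steeply. The class name LRFindSketch and the default bounds are assumptions; the repo's actual callback is LR_find in utils.py.

```python
import keras.backend as K
from keras.callbacks import Callback

class LRFindSketch(Callback):
    """Exponentially increase the learning rate each batch and record the loss."""

    def __init__(self, min_lr=1e-5, max_lr=1e-1, num_batches=100):
        super(LRFindSketch, self).__init__()
        self.min_lr = min_lr
        # Multiplicative step so the LR sweeps from min_lr to max_lr in num_batches steps
        self.mult = (max_lr / min_lr) ** (1.0 / num_batches)
        self.lrs, self.losses = [], []

    def on_train_begin(self, logs=None):
        K.set_value(self.model.optimizer.lr, self.min_lr)

    def on_batch_end(self, batch, logs=None):
        lr = float(K.get_value(self.model.optimizer.lr))
        self.lrs.append(lr)
        self.losses.append(logs.get('loss'))
        K.set_value(self.model.optimizer.lr, lr * self.mult)
```

Run one epoch with this callback attached, then plot losses against lrs and choose a learning rate slightly below the point where the loss starts to diverge.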

Paper to read

Unsupervised Domain Adaptation by Backpropagation

https://arxiv.org/pdf/1409.7495.pdf

