irapha / replayed_distillation

Implementation of Data-free Knowledge Distillation for Deep Neural Networks (on arXiv!)

Distill procedure: create student model

BomboButt opened this issue

Dear @irapha:
When creating the student model, shouldn't this line
https://github.com/iRapha/replayed_distillation/blob/943169e3b37f5453b386ae652af81db935efc7bd/procedures/distill.py#L28

be:
outputs, _, feed_dicts = m.get(f.student_model).create_model(inputs_reshape, output_size)
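For context, here is a minimal, self-contained sketch of the mix-up I think is happening. The registry, the model names, and the two build functions below are all made up for illustration; only `m.get(...).create_model(...)` and the `f.model` / `f.student_model` flags come from the repo. If line 28 looks up the registry with `f.model`, the student graph would silently be built with the teacher's architecture:

```python
import types

# Hypothetical stand-ins for the teacher and student build functions.
def create_teacher_model(inputs, output_size):
    return ('teacher_outputs', None, {'teacher': True})

def create_student_model(inputs, output_size):
    return ('student_outputs', None, {'student': True})

# Toy registry mimicking m.get(name) returning a module with create_model.
_MODELS = {'lenet': create_teacher_model, 'lenet_half': create_student_model}

def get(name):
    return types.SimpleNamespace(create_model=_MODELS[name])

class _Flags:  # stand-in for the parsed command-line flags object `f`
    model = 'lenet'               # the trained teacher architecture
    student_model = 'lenet_half'  # the (smaller) student architecture

f = _Flags()
inputs_reshape, output_size = 'inputs_reshape', 10

# As (apparently) written, the student graph gets the teacher net:
outputs, _, feed_dicts = get(f.model).create_model(inputs_reshape, output_size)
assert outputs == 'teacher_outputs'

# The proposed fix selects the student architecture instead:
outputs, _, feed_dicts = get(f.student_model).create_model(inputs_reshape, output_size)
assert outputs == 'student_outputs'
```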

Hm, I remember there being something weird about the way TensorFlow loaded models that made the flags behave unintuitively, so there's a chance this is correct as is. Unfortunately, I don't remember exactly what the unintuitive thing was (it's been a year since I wrote this code).

The codebase also went through major refactoring right before we released it, so there's also a chance that this should, in fact, be student_model.

Do you see it failing as is? Does changing this fix it? If so, please let me know and I'll fix it here too.
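If it helps to check: one quick sanity test (TF1-style, matching the era of this codebase; the snippet assumes it runs right after the `create_model` call in distill.py) is to count the trainable parameters of whatever graph actually got built:

```python
import numpy as np
import tensorflow as tf  # TF 1.x graph-mode API

# Count trainable parameters in the current default graph. If the count
# matches the teacher's size rather than the student's, the wrong flag is
# being passed on that line.
n_params = sum(
    int(np.prod(v.get_shape().as_list())) for v in tf.trainable_variables())
print('trainable params: {}'.format(n_params))
```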