chenxin061 / pdarts

Code for our paper "Progressive Differentiable Architecture Search: Bridging the Depth Gap between Search and Evaluation"

What does 'operation-level Dropout' mean?

dk-hong opened this issue · comments

As I understand it, 'operation-level Dropout' means zeroing every value of an operation's output at once with probability p.
But in your code, dropout doesn't work at the operation level, but at the element level.

Is it correct that 'operation-level Dropout' actually means typical element-level dropout?
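To make the distinction concrete, here is a minimal pure-Python sketch of the two interpretations. This is purely illustrative and not code from your repository; the function names and the inverted-dropout rescaling are my own assumptions.

```python
import random

def element_level_dropout(x, p):
    # Element-level (what standard Dropout layers do): each value is
    # zeroed independently with probability p; survivors are rescaled
    # by 1/(1-p) (inverted dropout).
    return [0.0 if random.random() < p else v / (1.0 - p) for v in x]

def operation_level_dropout(x, p):
    # Operation-level (my reading of the paper): the WHOLE output of
    # the operation is zeroed at once with probability p; otherwise
    # the full output is kept and rescaled.
    if random.random() < p:
        return [0.0] * len(x)
    return [v / (1.0 - p) for v in x]
```

With operation-level dropout the output is either entirely zero or entirely kept, whereas element-level dropout zeroes individual values independently.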

And I would like to ask one more question.

In your paper, page 4, Section 3.2.2, you say: "gradually decay the Dropout rate during the training process in each search stage, thus the straightforward path through skip-connection is blocked at the beginning and treated equally afterward when parameters of other operations are well learned."

I take this to mean 'set the Dropout rate large at the start of training and reduce it during training', and I also think that would work well.
But in your search script on GitHub, you set the Dropout rate small at first and increase it during training.
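For reference, the decaying schedule I understood from the paper's description could be sketched like this. The function name and the linear decay are my assumptions for illustration, not code from your repository:

```python
def decayed_dropout_rate(initial_rate, epoch, total_epochs):
    # Start at initial_rate and decay linearly toward 0 over training,
    # following the paper's "gradually decay the Dropout rate" wording:
    # skip-connections are heavily blocked early, barely blocked late.
    return initial_rate * (1.0 - epoch / float(total_epochs))
```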

Why did you schedule the Dropout rate like that?
Is it better than the first approach?