chenxin061 / pdarts

Code for our paper "Progressive Differentiable Architecture Search: Bridging the Depth Gap between Search and Evaluation"


Can I control the FLOPs of the searched models?

zyc4me opened this issue · comments

Thank you for the code. I have a problem: how can I control the FLOPs? For example, I want a model with input size 224×224 and 60~80M FLOPs. Can you tell me how to set the search parameters? Thanks!


This is currently not an option in our code.
Actually, most differentiable NAS methods do not support constraining the FLOPs to a specific range.
However, you can adjust the hyper-parameters in the evaluation stage. For example, you can use smaller --layers and --initial_channels values to bring the FLOPs into the 60~80M range with a 224×224 input.
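For reference, here is a minimal sketch of how one might check whether a given setting lands in the target range, assuming the `thop` package and a DARTS-style ImageNet evaluation network. The class name, constructor arguments, and genotype name below are my assumptions, not guaranteed to match this repo exactly; adapt them to the actual code.

```python
# Sketch only: verify FLOPs of an evaluation model before training it fully.
import torch
from thop import profile            # pip install thop

from model import NetworkImageNet   # assumed module/class name (DARTS-style layout)
import genotypes                     # assumed to contain the searched genotype

# Constructor arguments are assumptions; reduce `C` (initial channels) and
# `layers` to shrink the FLOPs, as suggested above.
model = NetworkImageNet(C=24, num_classes=1000, layers=8,
                        auxiliary=False, genotype=genotypes.PDARTS)
model.eval()

dummy = torch.randn(1, 3, 224, 224)
macs, params = profile(model, inputs=(dummy,))  # thop reports MACs, often quoted as FLOPs
print(f"MACs: {macs / 1e6:.1f}M, params: {params / 1e6:.2f}M")
```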


A nice question!

Here I provide another answer. Note that when you perform a weighted average over a few operators, you can also compute the expected FLOPs of the overall network. If you want to reduce the FLOPs, you can add this number as an additional loss term. This does not affect the differentiability of the framework, so the only thing you have to do is set a proper balancing parameter.

Hope this helps.
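To make the idea concrete, below is a small, self-contained sketch of such a differentiable FLOPs penalty. The per-operator costs, the helper name `expected_flops`, and the way the architecture parameters are accessed are all illustrative assumptions, not this repo's API.

```python
import torch
import torch.nn.functional as F

# Hypothetical per-operator costs (in MFLOPs) for the candidate ops on one edge,
# e.g. none, skip, convolutions, poolings. In practice you would measure these
# once per edge from its input resolution and channel count.
OP_FLOPS = torch.tensor([0.0, 0.0, 2.1, 4.3, 8.6, 1.1, 1.1, 3.2])

def expected_flops(alphas, op_flops=OP_FLOPS):
    """Differentiable expected FLOPs of a set of mixed edges.

    alphas:   (num_edges, num_ops) architecture parameters (DARTS-style alpha).
    op_flops: (num_ops,) cost of each candidate operator.
    Returns a scalar: softmax-weighted operator costs summed over all edges.
    """
    weights = F.softmax(alphas, dim=-1)   # mixing weights per edge
    return (weights * op_flops).sum()     # still differentiable w.r.t. alphas

# Usage sketch (placeholder names, not the repo's actual objects):
# logits = model(inputs)
# ce_loss = F.cross_entropy(logits, targets)
# penalty = expected_flops(arch_alphas)   # arch_alphas: the architecture parameters
# lam = 1e-3                              # the balancing parameter mentioned above; tune it
# loss = ce_loss + lam * penalty
# loss.backward()
```

Because the penalty is a smooth function of the architecture parameters, it simply adds one more gradient signal during the search; the balancing parameter trades accuracy against the FLOPs target.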


You may also look at the paper below:

You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization

where the authors propose an interesting way to constrain FLOPs and memory.