[ layer ] Erasing `_FP16` code block in each layer
skykongkong8 opened this issue · comments
For quick implementation of the half-precision layers, I could not avoid adding `_FP16` code blocks, as in #2355, #2280, and many other PRs.
However, such blocks hurt clean code maintenance.
To avoid this coding style, I believe we should use only Tensor ops inside layer functions, and let the Tensor do the type-specific work automatically for the given `tensor_type`.
Thus from now on:
- when implementing new layers, it is highly recommended to avoid using Tensor member function templates inside layer member functions.
- we should discuss how to delete the `#ifdef` code blocks inside the existing layer member function implementations.
We are going to use a different approach to get rid of the if/else code blocks for data types. Closing this issue accordingly.