[Question]: Can tiny-cuda-nn build a network with a layer's bias = 0?
zyc-bit opened this issue
Zhang Yuchang commented
Hello everyone, I have a question; I would be very grateful for a reply.
In PyTorch, I can do the following:

```python
self.mlp = torch.nn.Sequential(
    layer1,
    torch.nn.ReLU(inplace=True),
    layer2,
    torch.nn.ReLU(inplace=True),
    layer3,
)
if bias_enable:
    torch.nn.init.constant_(self.mlp[-1].bias, 0)
```

Here I set the last layer's bias to 0 using `torch.nn.init.constant_()`.
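For context, a minimal self-contained version of the pattern above (the layer sizes here are made up purely for illustration; `layer1`–`layer3` in the snippet above are not defined in the issue):

```python
import torch

# Hypothetical MLP with illustrative layer sizes.
mlp = torch.nn.Sequential(
    torch.nn.Linear(3, 64),
    torch.nn.ReLU(inplace=True),
    torch.nn.Linear(64, 64),
    torch.nn.ReLU(inplace=True),
    torch.nn.Linear(64, 1),
)

# Zero the bias of the last Linear layer in place.
torch.nn.init.constant_(mlp[-1].bias, 0)
```

After this call, `mlp[-1].bias` is an all-zero tensor that still participates in training (it remains a `Parameter` and will receive gradients).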
Can tiny-cuda-nn do the same thing? How?
By the way, I build the network with tiny-cuda-nn as follows:

```python
network_config = {
    "otype": "CutlassMLP",
    "activation": "ReLU",
    "output_activation": "Sigmoid",
    "n_neurons": layer_width,
    "n_hidden_layers": num_layers - 1,
}
self.tcnn_encoding = tcnn.Network(
    n_input_dims=in_dim,
    n_output_dims=out_dim,
    network_config=network_config,
)
```