aunum / goro

A High-level Machine Learning Library for Go

Using `Init: gorgonia.Zeroes()` in every layer does not make output always zero

bandreghetti opened this issue

I am initializing the network with the following code:

qModel, err := m.NewSequential("qLearning")
if err != nil {
	return nil, err
}

xShape := []int{1, 71}
yShape := []int{1, 16}
in := m.NewInput("state", xShape)
out := m.NewInput("actionValue", yShape)

qModel.AddLayers(
	layer.FC{Input: in.Squeeze()[0], Output: 256, Init: gorgonia.Zeroes()},
	layer.FC{Input: 256, Output: 128, Init: gorgonia.Zeroes()},
	layer.FC{Input: 128, Output: 64, Init: gorgonia.Zeroes()},
	layer.FC{Input: 64, Output: 32, Init: gorgonia.Zeroes()},
	layer.FC{Input: 32, Output: out.Squeeze()[0], Activation: layer.Linear, Init: gorgonia.Zeroes()},
)

err = qModel.Compile(in, out,
	m.WithBatchSize(1),
)
if err != nil {
	return nil, err
}

I want the initial output of the network to be zero for any input. What am I doing wrong?

I think I figured it out: `InitBias` was not set for the layers, so the biases were still created with the default (non-zero) initializer. Passing `gorgonia.Zeroes()` to `InitBias` as well makes the initial output zero.
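To see why zeroing only the weights is not enough: a fully connected layer computes `out = activation(W*x + b)`, so with `W = 0` the output is still `activation(b)`, which is non-zero whenever the bias is initialized to anything other than zero. The sketch below demonstrates this with a plain-Go dense layer; `denseForward` is a hypothetical helper written for illustration, not part of goro or Gorgonia.

```go
package main

import "fmt"

// denseForward computes a fully connected layer with identity
// activation: out[i] = sum_j w[i][j]*x[j] + b[i].
func denseForward(w [][]float64, b, x []float64) []float64 {
	out := make([]float64, len(b))
	for i := range b {
		sum := b[i]
		for j, xj := range x {
			sum += w[i][j] * xj
		}
		out[i] = sum
	}
	return out
}

func main() {
	x := []float64{1.5, -2.0, 3.0} // arbitrary input

	// Zero weights but a non-zero bias: the output is exactly the bias,
	// not zero -- this is the situation with Init: gorgonia.Zeroes() alone.
	w := [][]float64{{0, 0, 0}, {0, 0, 0}}
	b := []float64{0.7, -0.3}
	fmt.Println(denseForward(w, b, x)) // prints [0.7 -0.3]

	// Zeroing the bias as well gives a zero output for any input.
	bZero := []float64{0, 0}
	fmt.Println(denseForward(w, bZero, x)) // prints [0 0]
}
```

This matches the fix above: with both the weight and bias initializers set to `gorgonia.Zeroes()`, every layer's pre-activation is zero, and a linear final layer therefore outputs zero regardless of the input.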