tensorflow / lattice

Lattice methods in TensorFlow


calibration of lattice output

vishnuapp opened this issue · comments

We need to train layers of lattices, and want to calibrate the outputs of the lattices.
I set it up with bounds on the lattice output ([0, 1]), and the output calibration then covers that range with uniform keypoints.

The issue is that the outputs of the lattice drift outside the bounds while training and the projection ops clip the output. Eventually the output of the lattice covers a very small range of values and the output calibration isn't useful.

I tried an unbounded lattice output passed through a sigmoid to bring it into the [0, 1] range, which seems to work better, but that doesn't use the full [0, 1] range of the calibration unless I carefully scale the lattice output.
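The scaling issue above can be sketched numerically: a sigmoid keeps values in (0, 1), but if the lattice outputs sit in a narrow band, the sigmoid only exercises a narrow slice of the calibrator's keypoint range. The values and the scale factor below are hypothetical, just to illustrate the effect.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical unbounded lattice outputs clustered in a narrow band.
lattice_out = np.array([-0.5, -0.2, 0.0, 0.3, 0.6])

# Plain sigmoid: stays in (0, 1) but covers only a narrow sub-range.
narrow = sigmoid(lattice_out)

# Hand-tuned scaling before the sigmoid spreads the outputs over most
# of [0, 1], so the output calibrator's keypoints are actually used.
scale = 8.0
wide = sigmoid(scale * lattice_out)

print(narrow.max() - narrow.min())  # small spread
print(wide.max() - wide.min())      # close to the full [0, 1] range
```

This is why the sigmoid approach only works well with careful scaling: the useful spread depends on a factor that has to be tuned by hand.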

I finally had better success by modifying the projection ops in lattice_layer to linearly rescale the outputs to [min, max] instead of clipping them.
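The difference between the two projections can be sketched as follows. This is not the actual lattice_layer projection op, just a numpy illustration with made-up drifted parameter values: clipping piles drifted values onto the bounds, while linear rescaling maps the observed range onto [min, max] and preserves the relative spread.

```python
import numpy as np

def clip_project(params, lo, hi):
    # Clipping projection: values outside [lo, hi] pile up at the bounds,
    # and the interior spread is whatever happens to remain.
    return np.clip(params, lo, hi)

def rescale_project(params, lo, hi):
    # Linear rescaling projection: map [params.min(), params.max()]
    # onto [lo, hi], preserving relative distances between values.
    span = params.max() - params.min()
    if span == 0.0:
        return np.full_like(params, (lo + hi) / 2.0)
    return lo + (params - params.min()) * (hi - lo) / span

# Hypothetical lattice outputs that drifted outside the [0, 1] bounds.
drifted = np.array([-0.4, 0.1, 0.5, 0.9, 1.3])

clipped = clip_project(drifted, 0.0, 1.0)
rescaled = rescale_project(drifted, 0.0, 1.0)
```

With rescaling, the projected outputs always span the full [0, 1] range, so the output calibrator's uniform keypoints stay useful.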

Maybe I should also frame this as a question.
What is the recommended way to calibrate the output of a lattice layer?
Add an activation on top of it? Using bounds and letting the lattice layer clip its parameters seems to not converge well.

You can try adding an output calibration layer: bound the lattice output to [0, 1] and add a PWL calibration layer on top of it. This can be particularly helpful for regression models with heavy-tailed label distributions.
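The suggested setup can be sketched in numpy without depending on the TF Lattice library: a piecewise-linear (PWL) output calibrator with uniform input keypoints over the bounded lattice range [0, 1]. The output keypoint values below are hypothetical stand-ins for learned parameters, not trained values.

```python
import numpy as np

# Uniform input keypoints over the bounded lattice output range [0, 1].
input_keypoints = np.linspace(0.0, 1.0, 5)

# Hypothetical learned output keypoints; a warp like this can stretch the
# upper range, which is the kind of shape useful for heavy-tailed labels.
output_keypoints = np.array([0.0, 0.05, 0.2, 0.6, 1.0])

def pwl_calibrate(x):
    # np.interp performs the piecewise-linear interpolation between
    # consecutive keypoints, which is exactly what a PWL calibrator computes.
    return np.interp(x, input_keypoints, output_keypoints)

print(pwl_calibrate(0.5))   # 0.2 (lands exactly on the middle keypoint)
```

In TF Lattice itself the equivalent would be a trainable PWL calibration layer stacked after a lattice layer with `output_min=0` and `output_max=1`; the sketch above only shows the forward computation.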