tensorflow / lattice

Lattice methods in TensorFlow

Train calibrator/lattice jointly with DNN with monotonicity

x-cloud opened this issue · comments

This may be related to the closed issue #23, but it may need an update.
I wonder whether the calibration or lattice layers can be used as the upper layers (closer to the output) jointly with a DNN to ensure monotonicity. At the end of this tutorial, under "Other potential use cases...", it mentions a few possibilities for integrating calibration or lattice layers with other types of networks. Are there any examples out there with guaranteed monotonicity?
Alternatively, is it possible to use a lattice to constrain or supervise the training of conventional DNN layers such that the outputs are monotonic? This would be in contrast to adding extra post-processing layers at the output. Thanks for your reply.

In general, it is not possible to make common DNN layers monotonic simply by mixing in monotonic lattice and calibrator layers. One possible way to achieve monotonicity is to use a monotonic linear layer from this library combined with a non-linear, monotonic activation (e.g. a sigmoid).
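For illustration, here is a minimal sketch of that idea. It uses core Keras non-negative weight constraints plus sigmoid activations in place of this library's monotonic linear layer, and the layer sizes are arbitrary:

```python
import tensorflow as tf

num_features = 4  # illustrative

# Non-negative weights combined with a non-decreasing activation (sigmoid)
# make the whole network non-decreasing in every input dimension.
monotonic_dnn = tf.keras.Sequential([
    tf.keras.layers.Dense(
        16, activation='sigmoid',
        kernel_constraint=tf.keras.constraints.NonNeg(),
        input_shape=(num_features,)),
    tf.keras.layers.Dense(
        16, activation='sigmoid',
        kernel_constraint=tf.keras.constraints.NonNeg()),
    tf.keras.layers.Dense(
        1, kernel_constraint=tf.keras.constraints.NonNeg()),
])
```

The bias terms do not affect monotonicity, so they are left unconstrained; the price of the hard guarantee is that every hidden unit must be monotonic in every input.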

You might want to look at the isotonic regression literature for methods that do not strictly enforce the constraints, but rather train models that are (likely) monotonic on the training data.
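As a very rough sketch of such a "soft" approach (not an API from this library), one could penalize negative partial derivatives of the model output with respect to the monotonic features on training batches:

```python
import tensorflow as tf

def monotonicity_penalty(model, x, monotonic_dims):
    """Mean magnitude of negative partial derivatives for selected features."""
    with tf.GradientTape() as tape:
        tape.watch(x)
        y = model(x)
    grads = tape.gradient(y, x)  # shape [batch_size, num_features]
    # Only negative slopes along the chosen dimensions count as violations.
    violations = tf.nn.relu(-tf.gather(grads, monotonic_dims, axis=1))
    return tf.reduce_mean(violations)

# e.g. add `lambda_mono * monotonicity_penalty(model, x_batch, [0, 2])`
# to the training loss; note that a zero penalty on the training data
# does not prove monotonicity everywhere.
```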