Collaborating-Networks

This repository contains the code for the examples in the paper *Estimating Uncertainty Intervals from Collaborating Networks* by Zhou, Li, Yuan, and Carlson, available at https://jmlr.org/papers/volume22/20-1100/20-1100.pdf. The examples illustrate how to use collaborating networks (CN) to estimate the conditional distribution Y | X = x of a continuous outcome. Specifically, one network (g) approximates the cumulative distribution function, and a second network (f) approximates its inverse. A simple example can be found in CN_example.ipynb.
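As a rough sketch of the idea (not the repository's implementation: the architectures, the uniform quantile sampling, and the squared-error f-loss below are simplifying assumptions), the two networks and their losses could be wired up in PyTorch as follows:

```python
# Minimal sketch of collaborating networks (CN), assuming a PyTorch setup.
# g(y, x) approximates the conditional CDF F(y | x); f(q, x) approximates
# its inverse, the conditional quantile function.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GNet(nn.Module):
    """g: (y, x) -> estimated CDF value in (0, 1)."""
    def __init__(self, x_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim + 1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1), nn.Sigmoid())
    def forward(self, y, x):
        return self.net(torch.cat([y, x], dim=-1))

class FNet(nn.Module):
    """f: (q, x) -> estimated q-th conditional quantile of Y."""
    def __init__(self, x_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim + 1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))
    def forward(self, q, x):
        return self.net(torch.cat([q, x], dim=-1))

def cn_losses(g, f, x, y):
    """Losses for one mini-batch (x: [n, d], y: [n, 1])."""
    q = torch.rand_like(y)                       # q ~ Uniform(0, 1)
    # g-loss: binary cross-entropy on the event {y <= y_hat}, where the
    # proposal y_hat = f(q, x) is treated as a fixed query point.
    y_hat = f(q, x).detach()
    loss_g = F.binary_cross_entropy(g(y_hat, x), (y <= y_hat).float())
    # f-loss: push g(f(q, x), x) back toward q. Freezing g's parameters
    # during this forward pass keeps this loss from updating g itself.
    for p in g.parameters():
        p.requires_grad_(False)
    loss_f = ((g(f(q, x), x) - q) ** 2).mean()
    for p in g.parameters():
        p.requires_grad_(True)
    return loss_g, loss_f
```

With this setup, `(loss_g + loss_f).backward()` sends gradients to g only through the cross-entropy term and to f only through the squared-error term, so the two networks collaborate: g learns a calibrated CDF at the query points f proposes, and f learns to invert g.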

Methods for Comparison

We include six other methods that can also estimate a full predictive distribution:

[1] Gal, Yarin, and Zoubin Ghahramani. ["Dropout as a Bayesian approximation: Representing model uncertainty in deep learning."](http://proceedings.mlr.press/v48/gal16.pdf) In International Conference on Machine Learning, 2016.

[2] Kuleshov, Volodymyr, Nathan Fenner, and Stefano Ermon. ["Accurate uncertainties for deep learning using calibrated regression."](http://proceedings.mlr.press/v80/kuleshov18a/kuleshov18a.pdf) In International Conference on Machine Learning, 2018.

[3] Gal, Yarin, Jiri Hron, and Alex Kendall. ["Concrete dropout."](https://proceedings.neurips.cc/paper/2017/file/84ddfb34126fc3a48ee38d7044e87276-Paper.pdf) In International Conference on Neural Information Processing Systems, 2017.

[4] Jankowiak, Martin, Geoff Pleiss, and Jacob Gardner. ["Parametric Gaussian process regressors."](http://proceedings.mlr.press/v119/jankowiak20a/jankowiak20a.pdf) In International Conference on Machine Learning, 2020.

[5] Romano, Yaniv, Evan Patterson, and Emmanuel J. Candès. ["Conformalized quantile regression."](https://proceedings.neurips.cc/paper/2019/file/5103c3584b063c431bd1268e9b5e76fb-Paper.pdf) In International Conference on Neural Information Processing Systems, 2019.

[6] Lakshminarayanan, Balaji, Alexander Pritzel, and Charles Blundell. ["Simple and scalable predictive uncertainty estimation using deep ensembles."](http://papers.nips.cc/paper/7219-simple-and-scalable-predictive-uncertainty-estimation-using-deep-ensembles.pdf) In International Conference on Neural Information Processing Systems, 2017.

Experimental Details

  1. `property_of_learning_f`: CN's stability under overparameterization, and the benefit of learning g and f jointly rather than learning g alone with a fixed f.
  2. `synthetic_examples`: two synthetic examples simulated from Gaussian and Weibull distributions.
  3. `real_data`: five real-data examples.
  4. `2d_example`: extending CN to two-dimensional outcomes.

Overall:

  • CN closely recovers the ground-truth distribution in the synthetic examples:

(Figure: Gaussian sample)

  • CN yields faithful interval coverage (calibration).

  • CN improves interval sharpness (narrower intervals at the same coverage); see the evaluation sketch after the figures below.

  • CN can also be extended to multi-output problems.

Scatter plot of the estimated distribution (left) versus the true distribution (right).

CDF of the estimated distribution (left) versus the true distribution (right).
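
To make the coverage and sharpness claims concrete, here is a small evaluation sketch (assuming the `f(q, x)` signature from the sketch above; the 95% central interval is an illustrative choice, not a prescription from the paper):

```python
import torch

def interval_metrics(f, x, y, alpha=0.05):
    """Empirical coverage and sharpness of the central (1 - alpha) interval.

    Calibration means coverage lands close to 1 - alpha; sharpness is the
    average interval width (narrower is better at the same coverage).
    """
    with torch.no_grad():
        lo = f(torch.full_like(y, alpha / 2), x)      # lower quantile
        hi = f(torch.full_like(y, 1 - alpha / 2), x)  # upper quantile
        coverage = ((y >= lo) & (y <= hi)).float().mean().item()
        sharpness = (hi - lo).mean().item()
    return coverage, sharpness
```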

License

This project is licensed under the MIT License - see the LICENSE file for details.
