probtorch / pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration

Home Page: http://pytorch.org

Implement TransformDistribution with multiple base distributions?

vishwakftw opened this issue

I would find this very useful when implementing certain distributions. I think this would be a bit non-trivial, but is it actually possible to do?
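
For reference, a minimal sketch (using the existing torch.distributions API) of how TransformedDistribution currently wraps a single base distribution; the question is whether it could accept several base distributions at once:

```python
import torch
from torch.distributions import Normal, TransformedDistribution
from torch.distributions.transforms import ExpTransform

# Today, TransformedDistribution takes exactly one base distribution
# plus a list of transforms. For example, a log-normal:
base = Normal(torch.zeros(3), torch.ones(3))
log_normal = TransformedDistribution(base, [ExpTransform()])

sample = log_normal.sample()          # shape (3,)
log_p = log_normal.log_prob(sample)   # change of variables handled internally
```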

Well, it might make mathematical sense to implement a StackDistribution in analogy to torch.stack. This is theoretically nice because it is a categorical product. But in practice I don't see it simplifying code or improving correctness, so I'd opt to simply write those cases out by hand. Also, in many cases you'd want to transform multiple distributions of different tensor shapes, and by the time you add all the .contiguous().view() reshaping logic needed to make this work as a transformed distribution, I suspect the code would be harder to maintain than simply writing out the transformation manually.
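
As an illustration, here is a minimal sketch of what "writing it out by hand" might look like for two base distributions pushed through a shared bijector. The specific distributions, shapes, and choice of ExpTransform are illustrative assumptions, not part of the original discussion:

```python
import torch
from torch.distributions import Normal, Gamma
from torch.distributions.transforms import ExpTransform

# Two base distributions with matching batch shape and a shared bijector.
base_a = Normal(torch.zeros(3), torch.ones(3))
base_b = Gamma(torch.ones(3), torch.ones(3))
transform = ExpTransform()

# Sampling: draw from each base, stack, then push through the transform.
x = torch.stack([base_a.sample(), base_b.sample()])   # shape (2, 3)
y = transform(x)

# log_prob: apply the change of variables manually, per base distribution.
x_inv = transform.inv(y)
log_prob = (
    torch.stack([base_a.log_prob(x_inv[0]), base_b.log_prob(x_inv[1])])
    - transform.log_abs_det_jacobian(x_inv, y)
)
```

With different tensor shapes per base distribution, the stacking above no longer works directly, which is where the reshaping logic mentioned in the comment would come in.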

Thanks @fritzo. Considering the difficulty of this task, I will close it.