IshmaelBelghazi / ALI

Adversarially Learned Inference


softplus activation

edgarriba opened this issue · comments

@vdumoulin I'm a bit confused about why you are using the softplus activation** for the discriminator, since there is no mention of it in the paper's formulas, or at least I cannot recognize it.

**https://github.com/IshmaelBelghazi/ALI/blob/master/ali/bricks.py#L69-L72

BTW, @vdumoulin, we met at your talk at the CVC during the break :D

Nice to talk to you again! :) This is a numerically stable version of -log(sigmoid(x)):

-log(sigmoid(x)) = -log(1 / (1 + exp(-x))) = log(1 + exp(-x)) = softplus(-x)
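As a small illustration (not from the repo), here's a NumPy sketch comparing the naive expression with the softplus form; the function names are made up for the example:

```python
import numpy as np

def naive_neg_log_sigmoid(x):
    # Direct computation: for very negative x, exp(-x) overflows,
    # 1 / (1 + exp(-x)) becomes 0, and log(0) gives -inf, so the result is inf.
    return -np.log(1.0 / (1.0 + np.exp(-x)))

def stable_neg_log_sigmoid(x):
    # Equivalent softplus(-x) = log(1 + exp(-x)), computed with
    # np.logaddexp(0, -x) = log(exp(0) + exp(-x)) to avoid overflow.
    return np.logaddexp(0.0, -x)

x = np.array([-1000.0, -10.0, 0.0, 10.0, 1000.0])
print(naive_neg_log_sigmoid(x))   # inf for very negative x
print(stable_neg_log_sigmoid(x))  # finite everywhere
```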

Normally this substitution would happen automatically during the optimization stage of compiling the Theano function, but I have encountered situations in the past where it was not triggered, so I use the numerically stable version directly.
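For reference, a minimal Theano sketch of writing the discriminator loss with softplus directly, rather than relying on the graph optimizer to perform the rewrite; the variable names are illustrative and not the ones used in `ali/bricks.py`:

```python
import theano
import theano.tensor as T

# Hypothetical discriminator pre-activations (logits) on real and generated pairs.
d_real = T.vector('d_real')
d_fake = T.vector('d_fake')

# Using the identities:
#   -log(sigmoid(d_real))     = softplus(-d_real)
#   -log(1 - sigmoid(d_fake)) = softplus(d_fake)
discriminator_loss = (T.nnet.softplus(-d_real) + T.nnet.softplus(d_fake)).mean()

loss_fn = theano.function([d_real, d_fake], discriminator_loss)
```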