google-deepmind / neural-processes

This repository contains notebook implementations of the following Neural Process variants: Conditional Neural Processes (CNPs), Neural Processes (NPs), Attentive Neural Processes (ANPs).

Question for latent encoder of attentive neural processes script

jsikyoon opened this issue · comments

Hi,

First of all, thank you for opening your great script!

I have a question about latent encoder.

In your code, the posterior and prior use the same latent encoder. Sharing the (x, y) pair encoder itself is no problem, but I think the linear layers that produce the latent variable's distribution parameters should be different functions for the prior and the posterior.

If my understanding of the code is wrong, could you point out what I missed?

Thanks.

Best Regards,
Jaesik Yoon.

Hi Jaesik, I'm not quite sure what you mean when you say

> linear layers to sample the latent variable would be different functions for prior and posterior

The code uses the same linear function for both; using the notation of Equation (3) in the Attentive Neural Processes paper, the prior corresponds to q(z|s_C) and the posterior corresponds to q(z|s_T), and the same encoder parameters are used to compute samples from each (as well as the KL between them).
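To make the weight sharing concrete, here is a minimal numpy sketch (not the repository's actual TensorFlow code; the names, shapes, and initialization are hypothetical). A single set of parameters maps an aggregated (x, y) representation to a Gaussian's mean and scale, and the same function is applied to the context set (giving the prior) and to the full target set (giving the posterior):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration only.
d_in, d_hidden, d_z = 3, 16, 4

# ONE shared set of latent-encoder parameters.
W_enc = rng.normal(scale=0.1, size=(d_in, d_hidden))
W_mu = rng.normal(scale=0.1, size=(d_hidden, d_z))
W_sigma = rng.normal(scale=0.1, size=(d_hidden, d_z))

def latent_dist(xy_pairs):
    """Encode (x, y) pairs, mean-aggregate, and map to (mu, sigma).

    The SAME linear layers are used no matter which set is passed in,
    mirroring how the prior q(z|s_C) and posterior q(z|s_T) share
    encoder parameters.
    """
    h = np.tanh(xy_pairs @ W_enc)   # per-pair encoding
    s = h.mean(axis=0)              # permutation-invariant aggregate
    mu = s @ W_mu
    # Bound sigma away from zero, as is common in NP implementations.
    sigma = 0.1 + 0.9 / (1.0 + np.exp(-(s @ W_sigma)))
    return mu, sigma

context = rng.normal(size=(5, d_in))                             # context pairs
target = np.concatenate([context, rng.normal(size=(7, d_in))])   # context is a subset of target

prior_mu, prior_sigma = latent_dist(context)   # q(z | s_C)
post_mu, post_sigma = latent_dist(target)      # q(z | s_T)

# KL( q(z|s_T) || q(z|s_C) ) between two diagonal Gaussians,
# computed from the same encoder's two outputs.
kl = np.sum(np.log(prior_sigma / post_sigma)
            + (post_sigma**2 + (post_mu - prior_mu)**2) / (2.0 * prior_sigma**2)
            - 0.5)
```

The point of the sketch is that `latent_dist` closes over one set of weights, so the prior and posterior differ only in their input set, exactly as in Equation (3) of the ANP paper.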