Neural Processes (NPs) combine the strengths of neural networks and Gaussian processes to achieve both flexible learning and fast prediction for stochastic processes. There is extensive research on overcoming NP's issues, such as underfitting to context data and scalability to high dimensions. I'll introduce some important papers in the NP family and provide implementations so that researchers who are new to NPs can easily see how NP and its variants work.
Model Name | Paper | Key Idea | Implementation |
---|---|---|---|
NP | Neural Processes | Combine GP and NN to achieve both flexibility and fast prediction | Completed |
ANP | Attentive Neural Processes | Insert an attention module to prevent underfitting to context data | Completed |
BNP | Bootstrapping Neural Processes | Improve robustness of NP using bootstrapping | Completed |
FNP | The Functional Neural Process | Learn a graph of dependencies instead of a global latent variable | |
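To give a feel for how the basic NP works, below is a minimal latent Neural Process sketch in PyTorch: a permutation-invariant encoder aggregates the context pairs into a global latent variable, and a decoder maps target inputs plus a latent sample to a predictive Gaussian. The class and variable names (`ContextEncoder`, `NeuralProcess`, etc.) are illustrative and do not necessarily match the classes in this repository.

```python
# Minimal latent Neural Process sketch (illustrative; not the repo's exact API).
import torch
import torch.nn as nn


class ContextEncoder(nn.Module):
    """Maps each (x, y) context pair to a representation, then mean-pools."""
    def __init__(self, x_dim, y_dim, r_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim, 128), nn.ReLU(),
            nn.Linear(128, r_dim),
        )

    def forward(self, x, y):                       # x: (B, N, x_dim), y: (B, N, y_dim)
        r_i = self.net(torch.cat([x, y], dim=-1))  # per-point representations
        return r_i.mean(dim=1)                     # permutation-invariant aggregation


class NeuralProcess(nn.Module):
    """Latent NP: context -> global latent z -> predictive distribution at targets."""
    def __init__(self, x_dim=1, y_dim=1, r_dim=128, z_dim=64):
        super().__init__()
        self.encoder = ContextEncoder(x_dim, y_dim, r_dim)
        self.to_z = nn.Linear(r_dim, 2 * z_dim)    # mean and log-variance of z
        self.decoder = nn.Sequential(
            nn.Linear(x_dim + z_dim, 128), nn.ReLU(),
            nn.Linear(128, 2 * y_dim),             # predictive mean and log-variance
        )

    def forward(self, x_ctx, y_ctx, x_tgt):
        r = self.encoder(x_ctx, y_ctx)
        mu_z, logvar_z = self.to_z(r).chunk(2, dim=-1)
        z = mu_z + torch.exp(0.5 * logvar_z) * torch.randn_like(mu_z)  # reparameterized sample
        z_rep = z.unsqueeze(1).expand(-1, x_tgt.size(1), -1)           # broadcast z to every target
        out = self.decoder(torch.cat([x_tgt, z_rep], dim=-1))
        mu_y, logvar_y = out.chunk(2, dim=-1)
        return mu_y, logvar_y.exp().sqrt()         # predictive mean and std


# Usage: predict at 50 target points given 10 context points from a 1-D function.
model = NeuralProcess()
x_ctx, y_ctx = torch.rand(4, 10, 1), torch.rand(4, 10, 1)
x_tgt = torch.rand(4, 50, 1)
mean, std = model(x_ctx, y_ctx, x_tgt)
print(mean.shape, std.shape)  # torch.Size([4, 50, 1]) torch.Size([4, 50, 1])
```

The variants in the table modify parts of this pipeline: ANP adds attention between targets and context, BNP replaces the single latent sample with bootstrapped ensembles, and FNP replaces the global latent variable with a learned dependency graph.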