We extend 'Conditional Neural Processes' so that they can generate different samples given the same context points. We achieve this by introducing a global latent variable that ensures consistency across samples.
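A toy sketch of the global-latent-variable idea (not the paper's actual architecture — the decoder weights and latent size here are made up for illustration): the latent z is sampled once per draw and shared by every target point in that draw, so each draw is an internally consistent function, while different draws give different functions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy, untrained decoder weights (illustrative only).
W = rng.normal(size=(3, 16))
V = rng.normal(size=(16, 1))

def decode(z, xs):
    """Decode all target inputs xs under one shared global latent z."""
    inp = np.stack([xs,
                    np.full_like(xs, z[0]),
                    np.full_like(xs, z[1])], axis=-1)
    return (np.tanh(inp @ W) @ V).ravel()

xs = np.linspace(0.0, 1.0, 4)
sample_a = decode(rng.normal(size=2), xs)  # one coherent function draw
sample_b = decode(rng.normal(size=2), xs)  # a different draw over the same targets
```

Because z is fixed within a draw, the predictions at different target points covary; resampling z is what produces the distinct samples for the same context.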
CONDITIONAL NEURAL PROCESSES
We introduce conditional neural processes, a generalisation of the GQN framework to a range of few-shot tasks such as regression and classification. Conditional neural processes are flexible, computationally efficient at test time, and learn to estimate uncertainty given some context points.
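The conditioning step above can be sketched in a few lines. This is a minimal illustration, assuming tiny random (untrained) MLPs in place of learned networks: each context pair is encoded, the encodings are aggregated with a permutation-invariant mean, and every target prediction is conditioned on the same aggregated representation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(in_dim, out_dim, hidden=16):
    """Build a tiny two-layer MLP with random (untrained) weights."""
    W1 = rng.normal(size=(in_dim, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(size=(hidden, out_dim)); b2 = np.zeros(out_dim)
    return lambda x: np.tanh(x @ W1 + b1) @ W2 + b2

encoder = mlp(2, 8)  # encodes one (x, y) context pair into a representation
decoder = mlp(9, 2)  # maps (representation, target x) to mean and log-variance

def cnp_predict(context_x, context_y, target_x):
    # Encode each context pair, then aggregate with a mean so the result
    # is invariant to the order of the context points.
    pairs = np.stack([context_x, context_y], axis=-1)
    r = encoder(pairs).mean(axis=0)
    # Condition every target prediction on the same representation r.
    inp = np.concatenate([np.tile(r, (len(target_x), 1)),
                         target_x[:, None]], axis=-1)
    out = decoder(inp)
    mean, log_var = out[:, 0], out[:, 1]
    return mean, np.exp(log_var)

mean, var = cnp_predict(np.array([0.0, 1.0]), np.array([0.5, -0.3]),
                        np.linspace(-1.0, 2.0, 5))
```

Note the predictions are deterministic given the context — which is exactly the limitation the latent-variable extension above addresses.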
GENERATIVE QUERY NETWORKS
S M A Eslami, D J Rezende, F Besse, F Viola, A S Morcos, M Garnelo, A Ruderman, A A Rusu, I Danihelka, K Gregor, D P Reichert, L Buesing, T Weber, O Vinyals, D Rosenbaum, N Rabinowitz, H King, C Hillier, M Botvinick, D Wierstra, K Kavukcuoglu, D Hassabis
Generative query networks learn to predict what a 3D scene looks like from a new viewpoint, given some context observations of that scene from other viewpoints. As part of my internship I used the representations learned by GQN to learn a reaching task with a simulated robot arm.
TOWARDS DEEP SYMBOLIC REINFORCEMENT LEARNING
We propose a reinforcement learning architecture that comprises a neural back end and a symbolic front end with the potential to overcome some of the shortcomings associated with both symbolic and neural methods.
Bonus paper from a past life:
Interaction between tumour-infiltrating B cells and T cells controls the progression of hepatocellular carcinoma
Have you ever wondered what role the immune system plays in cancer? Wonder no longer!