Jaeyun-Song / AM3-MAML

Meta-learning from the initialization induced by word embeddings


AM3-MAML

MAML with the initialization induced by word embeddings.
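To make the idea concrete, here is a minimal, hypothetical sketch of the core mechanism: a linear classifier whose initial weights are induced from (stand-in) word embeddings of the class names, adapted with a MAML-style inner step, with the outer loss differentiated back through the adaptation. All names, dimensions, and the projection layer are illustrative assumptions, not this repository's actual API.

```python
import torch

def inner_adapt(w, x_s, y_s, lr=0.1):
    # One MAML inner step on the support set for a linear classifier.
    # create_graph=True keeps the graph so the outer loss can
    # differentiate through this update (second-order MAML).
    logits = x_s @ w
    loss = torch.nn.functional.cross_entropy(logits, y_s)
    (grad,) = torch.autograd.grad(loss, w, create_graph=True)
    return w - lr * grad

torch.manual_seed(0)
feat_dim, n_way, emb_dim = 64, 5, 300
word_emb = torch.randn(n_way, emb_dim)            # stand-in for GloVe vectors
proj = torch.randn(emb_dim, feat_dim, requires_grad=True)
w0 = (word_emb @ proj).t()                        # embedding-induced init

# Toy 5-way task: 5 support and 5 query examples per class.
x_support = torch.randn(25, feat_dim)
y_support = torch.arange(n_way).repeat(5)
x_query = torch.randn(25, feat_dim)
y_query = torch.arange(n_way).repeat(5)

w_adapted = inner_adapt(w0, x_support, y_support)
outer_loss = torch.nn.functional.cross_entropy(x_query @ w_adapted, y_query)
outer_loss.backward()                             # gradients flow into proj
print(proj.grad is not None)                      # True
```

The point of the sketch is that the meta-learned quantity is the projection from word embeddings to classifier weights, so the initialization itself is a function of the class semantics rather than a free parameter.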

Our code is based on https://github.com/sungyubkim/GBML

  • AM3-MAML

```
python3 main.py --download False
```

Results on miniImageNet

  • Without a pre-trained encoder (64 channels by default; exceptions in parentheses)

|          | 5-way 1-shot | 5-way 1-shot (ours) | 5-way 5-shot | 5-way 5-shot (ours) |
|----------|--------------|---------------------|--------------|---------------------|
| MAML     | -            | -                   | 63.11 (64)   | -                   |
| AM3-MAML | -            | -                   | -            | 66.41 (64)          |

How to run AM3-MAML

  1. Download the GloVe Common Crawl vectors (840B tokens, 2.2M vocab, cased, 300d vectors, 2.03 GB download) from the GloVe GitHub repository.
  2. Extract the files and run "mml/preprocess.py" to build word embeddings from the pretrained GloVe vectors.
  3. Install Higher and Torchmeta (links in Dependencies).
  4. Set "--download True" if you still need to download miniImageNet, then run the command above.
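For step 2, a hedged sketch of what a GloVe preprocessing script typically does: parse the plain-text `glove.840B.300d.txt` format (a token followed by its vector on each line) into a word-to-vector dict. The helper name and parsing details are illustrative assumptions; see `mml/preprocess.py` for the repository's actual logic.

```python
from io import StringIO

def load_glove(lines, dim=300):
    # Parse GloVe text lines into {word: vector}. The vector is always
    # the last `dim` space-separated fields; everything before is the token.
    emb = {}
    for line in lines:
        parts = line.rstrip().split(" ")
        word = " ".join(parts[:-dim])
        emb[word] = [float(v) for v in parts[-dim:]]
    return emb

# Tiny demo with 3-d vectors instead of the real 300-d ones.
demo = StringIO("cat 0.1 0.2 0.3\ndog 0.4 0.5 0.6\n")
vectors = load_glove(demo, dim=3)
print(sorted(vectors))    # ['cat', 'dog']
print(vectors["cat"])     # [0.1, 0.2, 0.3]
```

Taking the last `dim` fields, rather than splitting on the first space, matters for the cased Common Crawl file because some tokens themselves contain unusual characters.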

Related work

[1] Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
[2] Adaptive Cross-Modal Few-Shot Learning
[3] GloVe: Global Vectors for Word Representation
[4] Meta-Dataset: A Dataset of Datasets for Learning to Learn from Few Examples
[5] Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks

Code references

[1] GBML for the base framework
[2] Torchmeta for the datasets
[3] glove_pretrain for the pretrained GloVe embeddings
[4] SetTransformer for the Set Transformer modules

Dependencies

Acknowledgement

This work was supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2019-0-01371, Development of brain-inspired AI with human-like intelligence).


License: MIT

