DRSY / LAMP

[NAACL 2022 Findings] Specializing Pre-trained Language Models for Better Relational Reasoning via Network Pruning

Exploring and Exploiting Latent Commonsense Knowledge in Pretrained Masked Language Models

Codebase for the paper "Exploring and Exploiting Latent Commonsense Knowledge in Pretrained Masked Language Models".

Note: this codebase is under maintenance and will be completed soon.

Currently supported models (see the loading sketch after this list):

  • DistilBERT-base
  • BERT (base, large, etc.)
  • RoBERTa (base, large, etc.)
  • MPNet
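
For reference, each supported checkpoint maps onto a standard Hugging Face masked-LM class. Below is a minimal loading sketch; the Hub model identifiers are assumptions for illustration, not names taken from the repo's configs.

from transformers import AutoModelForMaskedLM, AutoTokenizer

# Hub identifiers assumed for illustration; LAMP's own configs may differ.
SUPPORTED = {
    "distilbert-base": "distilbert-base-uncased",
    "bert-base": "bert-base-uncased",
    "bert-large": "bert-large-uncased",
    "roberta-base": "roberta-base",
    "roberta-large": "roberta-large",
    "mpnet": "microsoft/mpnet-base",
}

def load_mlm(key: str):
    # Load the tokenizer and masked-LM head for a supported model.
    name = SUPPORTED[key]
    return AutoTokenizer.from_pretrained(name), AutoModelForMaskedLM.from_pretrained(name)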

Prepare the codebase

git clone https://github.com/DRSY/LAMP.git && cd LAMP
pip install -r requirements.txt

Run pruning and probing

Specify the parameters for the probing experiments in a separate params file, then run:

make -f Makefile probe

Detailed hyperparameters can be found in probe.sh.
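
probe.sh drives the repo's actual pruning-and-probing pipeline. As a rough illustration of the two ingredients only, here is a hedged sketch that applies generic global magnitude pruning (via PyTorch's pruning utilities, not the paper's criterion) and then runs a LAMA-style cloze probe with an illustrative prompt:

import torch.nn as nn
import torch.nn.utils.prune as prune
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_name = "bert-base-uncased"  # illustrative choice, not fixed by the repo
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Globally prune 30% of all linear-layer weights by L1 magnitude. This is
# the generic PyTorch mechanism, not LAMP's own pruning criterion.
targets = [(m, "weight") for m in model.modules() if isinstance(m, nn.Linear)]
prune.global_unstructured(targets, pruning_method=prune.L1Unstructured, amount=0.3)
for module, param_name in targets:
    prune.remove(module, param_name)  # bake the pruning masks into the weights

# Cloze-style probe of the pruned model; the prompt is illustrative, not
# taken from the paper's probing set.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for pred in fill("A bird can [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))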

Run GLUE

Specify the parameters for the GLUE experiments in a separate params file, then run:

make -f Makefile glue
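
For orientation, the sketch below shows the bare shape of a GLUE-style evaluation loop on SST-2; glue.sh is the authoritative runner, and the freshly initialized classification head here makes this a smoke test rather than a meaningful score.

import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumptions for illustration: bert-base-uncased and the SST-2 task;
# the repo's model/task choices are set in the params file and glue.sh.
name = "bert-base-uncased"
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)
model.eval()

val = load_dataset("glue", "sst2", split="validation")
correct = 0
for ex in val:
    inputs = tok(ex["sentence"], return_tensors="pt", truncation=True)
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    correct += int(pred == ex["label"])
# With an untuned head this prints roughly chance accuracy; fine-tune
# first for a real GLUE number.
print(f"accuracy: {correct / len(val):.3f}")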

Detailed hyperparameters can be found in glue.sh.

Clean the log files

make -f Makefile clean
