lxuechen / private-transformers

A codebase that makes differentially private training of transformers easy.

Home Page: https://arxiv.org/abs/2110.05679

v0.3.0 fixes

lxuechen opened this issue

Non-structural fixes.

  • Convert to the make_private style to avoid bad syntax highlighting during static analysis
  • Improve the cleanliness of the examples
  • Refactor the test file and use functorch to simplify the ground-truth gradient logic
  • Don't compute per-sample gradients for weights that don't require gradients
  • Use the new smart resizer for the tokenizer and model
  • Refactor decoding to use the new left-padding-based construction
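On the functorch point above: ground-truth per-sample gradients for tests can be computed by vectorizing a per-example gradient function over the batch, rather than looping and calling backward once per example. A minimal sketch of that idea, using the `torch.func` API (the successor to functorch) and a toy linear model rather than anything from this repo:

```python
import torch
from torch.func import functional_call, vmap, grad

# Toy model standing in for a transformer; parameters are made functional.
model = torch.nn.Linear(3, 1)
params = {name: p.detach() for name, p in model.named_parameters()}

def per_example_loss(params, x, y):
    # x, y are single examples here; vmap adds the batch dimension.
    pred = functional_call(model, params, (x,))
    return ((pred - y) ** 2).sum()

# grad differentiates w.r.t. params; vmap maps over the batch of (x, y).
xs = torch.randn(4, 3)
ys = torch.randn(4, 1)
per_sample_grads = vmap(grad(per_example_loss), in_dims=(None, 0, 0))(params, xs, ys)

# Each entry gains a leading batch dimension of size 4.
print(per_sample_grads["weight"].shape)  # torch.Size([4, 1, 3])
print(per_sample_grads["bias"].shape)    # torch.Size([4, 1])
```

This gives one gradient per example in a single vectorized call, which is convenient as a reference against which a DP engine's accumulated per-sample gradients can be checked.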
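On the left-padding point: decoder-only models generate by appending tokens on the right, so when prompts of different lengths are batched, padding must go on the left so that every prompt's final token sits immediately before the generated positions. A self-contained sketch of such a construction (the helper name and shapes are illustrative, not this repo's API):

```python
def left_pad(sequences, pad_id):
    """Pad token-id lists on the left to a common length.

    Returns the padded ids and an attention mask (1 = real token,
    0 = padding) so padded positions can be masked out.
    """
    max_len = max(len(s) for s in sequences)
    ids = [[pad_id] * (max_len - len(s)) + list(s) for s in sequences]
    mask = [[0] * (max_len - len(s)) + [1] * len(s) for s in sequences]
    return ids, mask

ids, mask = left_pad([[5, 6, 7], [8]], pad_id=0)
print(ids)   # [[5, 6, 7], [0, 0, 8]]
print(mask)  # [[1, 1, 1], [0, 0, 1]]
```

With right padding, the short prompt would end in pad tokens and generation would continue from a pad position; left padding avoids that.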