schwartz-lab-NLP's repositories
fight-bias-with-bias
Code for paper 'Fighting Bias with Bias: Promoting Model Robustness by Amplifying Dataset Biases'
label-bias
Evaluating Label Bias in LLMs
model_size_and_gender_bias
Code for "Fewer Errors, but More Stereotypes? The Effect of Model Size on Gender Bias" by Yarden Tal, Inbal Magar and Roy Schwartz, 4th Workshop on Gender Bias in Natural Language Processing, NAACL 2022.