Lukas-Justen / Law-OMNI-BERT-Project

Directly applying a general-domain pre-trained model such as BERT to domain-specific areas like law yields poor accuracy, because the word distribution shifts from general-domain corpora to domain-specific corpora. In this project, we demonstrate how the pre-trained language model BERT can be adapted to additional domains, such as contract law or court judgments.
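Such adaptation is typically done by continuing BERT's masked language modeling (MLM) pretraining on in-domain text before fine-tuning on a downstream task. The sketch below illustrates this with the Hugging Face transformers library; the corpus file legal_corpus.txt and the hyperparameters are illustrative assumptions, not part of this repository.

```python
# Minimal sketch: domain-adaptive pretraining of BERT on legal text via
# masked language modeling (MLM). Paths and hyperparameters are illustrative.
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    LineByLineTextDataset,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical corpus: one legal passage (e.g. a contract clause or an
# excerpt from a court judgment) per line.
dataset = LineByLineTextDataset(
    tokenizer=tokenizer,
    file_path="legal_corpus.txt",
    block_size=128,
)

# Randomly mask 15% of tokens -- the standard BERT pretraining objective.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="law-bert",
        num_train_epochs=3,
        per_device_train_batch_size=16,
    ),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()
trainer.save_model("law-bert")
```

The resulting checkpoint can then be fine-tuned on a labeled legal task in place of the general-domain weights, which is where the accuracy gains from domain adaptation show up.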
