nikitakit / self-attentive-parser

High-accuracy NLP parser with models for 11 languages.

Home Page: https://parser.kitaev.io/

CTB result

HongyanJiao opened this issue · comments

I trained with the default parameters and an 8-layer BERT on CTB 5.1,
and got 'FScore=90.50, CompleteMatch=28.82'.
But the paper 'Cross-Domain Generalization of Neural Constituency Parsers'
reports F1=92.14 and exact match=44.42.
I'm wondering whether something went wrong in my training. Did you fine-tune BERT?
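For reference, the two numbers being compared here are labeled-bracket F1 and complete (exact) match. A minimal sketch of how they are computed, assuming each parse is reduced to a set of `(label, start, end)` brackets (this is illustrative only, not the repo's actual EVALB-based evaluation pipeline):

```python
def bracket_scores(gold_trees, pred_trees):
    """Labeled-bracket F1 and complete-match percentage.

    gold_trees / pred_trees: parallel lists, one entry per sentence,
    each entry a set of (label, start, end) constituent brackets.
    """
    matched = gold_total = pred_total = 0
    complete = 0
    for gold, pred in zip(gold_trees, pred_trees):
        matched += len(gold & pred)      # brackets identical in label and span
        gold_total += len(gold)
        pred_total += len(pred)
        complete += int(gold == pred)    # exact match: whole tree identical
    precision = matched / pred_total
    recall = matched / gold_total
    f1 = 2 * precision * recall / (precision + recall)
    return 100 * f1, 100 * complete / len(gold_trees)
```

Note that F1 can stay high while complete match drops sharply: a single wrong bracket costs one sentence only a little F1 but all of its exact-match credit, which is why the two metrics can diverge the way they do in the numbers above.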

Hey, I am using this model with CTB 5.1 too.
Could I have your email? I'd like to ask you some questions. :-)