cvbrandoe / NEREval

Implementation of evaluation measures for the Named Entity Recognition task, for use in the Matriciel project (PEPS)

Presentation slides: https://docs.google.com/presentation/d/1WMa8Sc1RudVV2qeW7YxiXan1WUEWk_FISczUQA62bPE/edit?usp=sharing

NER Evaluation

Here you will find several pieces of source code used in the Matriciel project (PEPS CNRS), in particular:

  • an implementation of evaluation measures for the Named Entity Recognition task (classification is not considered), following the state of the art (Nouvel et al. 2015). The algorithm aligns named-entity annotations (and their context) given two input texts.

  • a client for annotating named entities in French texts using DBpedia Spotlight (either the remote public server or a self-hosted one).
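The evaluation measures in the first item can be illustrated with a minimal sketch. The repository's own algorithm aligns annotations between two texts following Nouvel et al. (2015); the class and method names below are hypothetical, and the sketch only shows the simplest case of exact-match precision, recall, and F1 over character-offset spans:

```java
import java.util.HashSet;
import java.util.Set;

// Minimal exact-match NER evaluation sketch (hypothetical names;
// the repository's actual measures follow Nouvel et al. 2015 and
// also handle partial alignments and context).
public class NerEvalSketch {
    // An entity mention identified by its character offsets in the text.
    record Span(int start, int end) {}

    // Returns {precision, recall, F1} for predicted spans against gold spans.
    static double[] evaluate(Set<Span> gold, Set<Span> predicted) {
        Set<Span> hits = new HashSet<>(predicted);
        hits.retainAll(gold);                                   // true positives
        double p = predicted.isEmpty() ? 0.0 : (double) hits.size() / predicted.size();
        double r = gold.isEmpty() ? 0.0 : (double) hits.size() / gold.size();
        double f = (p + r == 0.0) ? 0.0 : 2 * p * r / (p + r);  // harmonic mean
        return new double[]{p, r, f};
    }

    public static void main(String[] args) {
        Set<Span> gold = Set.of(new Span(0, 5), new Span(10, 17));
        Set<Span> pred = Set.of(new Span(0, 5), new Span(20, 25));
        double[] prf = evaluate(gold, pred);
        System.out.printf("P=%.2f R=%.2f F1=%.2f%n", prf[0], prf[1], prf[2]);
        // One of two predictions is correct, one of two gold spans is found:
        // P=0.50 R=0.50 F1=0.50
    }
}
```

Span-level exact matching is deliberately strict: a prediction that overlaps a gold mention without identical boundaries counts as both a false positive and a false negative, which is why alignment-based measures such as those of Nouvel et al. are useful.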
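For the second item, a Spotlight annotation request can be sketched with the JDK's built-in HTTP client. The public French endpoint URL and the `text`/`confidence` parameters below follow DBpedia Spotlight's documented REST API; a self-hosted server would swap in its own base URL, and the class name here is hypothetical:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Sketch of a DBpedia Spotlight annotation call (public French
// endpoint assumed; replace BASE for a self-hosted server).
public class SpotlightClientSketch {
    static final String BASE = "https://api.dbpedia-spotlight.org/fr/annotate";

    // Builds a GET request asking Spotlight to annotate the given text.
    static HttpRequest buildRequest(String text, double confidence) {
        String query = "text=" + URLEncoder.encode(text, StandardCharsets.UTF_8)
                     + "&confidence=" + confidence;
        return HttpRequest.newBuilder(URI.create(BASE + "?" + query))
                .header("Accept", "application/json")   // request JSON annotations
                .GET()
                .build();
    }

    public static void main(String[] args) throws Exception {
        HttpRequest req = buildRequest("Paris est la capitale de la France.", 0.5);
        HttpResponse<String> resp = HttpClient.newHttpClient()
                .send(req, HttpResponse.BodyHandlers.ofString());
        System.out.println(resp.body());   // JSON listing the recognized DBpedia resources
    }
}
```

The response is a JSON document whose `Resources` array carries, for each recognized mention, its surface form, character offset, and linked DBpedia URI.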

For any inquiry, please contact us.

Magali Capeyron, Catherine Domingues, Carmen Brando



Languages

Java 100.0%