EmilyAlsentzer / clinicalBERT

Repository for Publicly Available Clinical BERT Embeddings


Known issue with section splitting in heuristic_tokenize.py

EmilyAlsentzer opened this issue

There are two bugs in the `sent_tokenize_rules` function in `heuristic_tokenize.py`.

We have not fixed them in this repo because we want to preserve the reproducibility of the code as it was when the work was published. However, anyone extending this work should make the following changes in `heuristic_tokenize.py`:

  1. Fix a bug on line 168 by escaping the period in the regex, i.e. replace `.` with `\.` so the line reads `while re.search('\n\s*%d\.' % n, segment):`. Without the escape, `.` matches any character, so text like `20` is mistaken for the numbered-list marker `2.` (see the first sketch after this list).
  2. Add an `else` branch (`else: new_segments.append(segments[i])`) to the `if` statement at line 287, `if (i == N-1) or is_title(segments[i+1]):`. This fixes a bug where lists that have a title header lose their first entry (see the second sketch below).
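
For fix 1, here is a minimal runnable sketch of what escaping the period changes. The variables `segment` and `n` belong to the original loop; the example strings below are made up for illustration:

```python
import re

n = 2

# Buggy pattern from line 168: the unescaped '.' matches ANY character,
# so "20" is mistaken for the numbered-list marker "2."
print(bool(re.search(r'\n\s*%d.' % n, '\n  20 mg')))        # True  (false positive)

# Fixed pattern: '\.' matches a literal period only
print(bool(re.search(r'\n\s*%d\.' % n, '\n  20 mg')))       # False (correct)
print(bool(re.search(r'\n\s*%d\.' % n, '\n  2. aspirin')))  # True  (real list item)
```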
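For fix 2, the following toy is a hypothetical reconstruction of the failure mode, not the original merge loop (the real function has more branches), and `is_title` here is a simplified stand-in for the repo's heuristic. It shows how an `if` with no `else` silently drops the segment that holds a title header and the first list entry:

```python
def is_title(segment):
    # Simplified stand-in: treat a segment whose first line ends in ':'
    # as starting a new titled section.
    lines = segment.strip().splitlines()
    return bool(lines) and lines[0].rstrip().endswith(':')

def merge(segments, patched=True):
    N = len(segments)
    new_segments = []
    for i in range(N):
        if (i == N - 1) or is_title(segments[i + 1]):
            new_segments.append(segments[i])
        elif patched:
            # The added else branch: keep the segment instead of
            # silently dropping it.
            new_segments.append(segments[i])
    return new_segments

note = ['MEDICATIONS:\n1. aspirin 81mg',  # title header + first list entry
        '2. lisinopril 10mg',
        'ALLERGIES:\nNKDA']

print(merge(note, patched=False))
# ['2. lisinopril 10mg', 'ALLERGIES:\nNKDA'] -- the title and first
# entry are lost without the else branch
print(merge(note, patched=True))
# all three segments survive
```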