kermitt2/delft
a Deep Learning Framework for Text
Stargazers: 385 | Watchers: 26 | Issues: 83 | Forks: 65
kermitt2/delft Issues
- Last tensorflow-addons (Updated 6 months ago, 1 comment)
- Print additional information when training (Updated 10 months ago, 3 comments)
- I am trying to build a model that reads problems and solutions, then generates a solution when given a problem, but I am stuck (Closed a year ago, 2 comments)
- Sub-tokenization with certain transformers (Updated a year ago, 22 comments)
- Support for Pre-trained ELMo Representations for Many Languages (Closed a year ago, 3 comments)
- Transformer configuration should be passed to the tokenizer? (Closed a year ago, 4 comments)
- Hardcoded padding tokens (Closed a year ago, 3 comments)
- Allow uniform modification of nb_workers (Updated a year ago)
- Implement feature channel in classification (Updated a year ago)
- Bad performance producing ELMo embeddings (Closed 2 years ago, 2 comments)
- Comparison of versions 0.2.6 and 0.3.0 with SciBERT (Closed 2 years ago, 10 comments)
- Fix --output option to dump the model in a specific directory (Updated 2 years ago, 3 comments)
- add parameter to optionally output raw results in the evaluation (Updated 2 years ago, 8 comments)
- Print minimal configuration information when running training or train/eval (Closed 2 years ago, 3 comments)
- Classification and transformers (Closed 2 years ago, 3 comments)
- improve directory management for training/nfold-training (Updated 2 years ago)
- [suggestion] add transformer name / embedding names in the model name (Closed 2 years ago, 3 comments)
- Switch to TensorFlow 2.0 and tf.keras (Closed 2 years ago, 12 comments)
- Load transformer config and tokenizer from disk when n>1 for n-fold training (Closed 2 years ago)
- Better support SentencePiece tokenizer(s) (Updated 2 years ago)
- Training callbacks are ignored when using a BERT architecture for sequence labelling (Closed 2 years ago)
- Feature-based approach with BERT for seq. labelling is super slow (Closed 2 years ago)
- Automatically download embeddings (Closed 2 years ago, 1 comment)
- Eval metrics per class (Closed 3 years ago, 1 comment)
- Command for saving the metrics in a file (Closed 3 years ago, 2 comments)
- Allow multiple tokens per feature data row (Updated 3 years ago, 3 comments)
- Unable to train NER using custom ELMo embedding model (Closed 3 years ago, 1 comment)
- sequence labelling: n-fold training should use separate preprocessors (Updated 3 years ago, 1 comment)
- header training data: mismatching feature columns? (Closed 3 years ago, 6 comments)
- Implement sliding window (Updated 4 years ago, 6 comments)
- Annoying warning due to joblib (Closed 4 years ago, 2 comments)
- Cannot have 2 models with ELMo embeddings of 2 different languages at the same time (Closed 4 years ago)
- Suggestion: save preprocessor in a more transferable format (Closed 4 years ago, 7 comments)
- <PAD> tags should be filtered out from the output of the Tagger (Updated 4 years ago, 17 comments)
- Find a way to disable the ELMo/BERT caching mechanism in "production" mode (Closed 4 years ago, 1 comment)
- Add a way to disable multiprocessing in Classifier training options as it is for Sequence (Closed 4 years ago)
- Some links are broken in the documentation (Closed 4 years ago, 1 comment)
- grobidTagger: make model an optional argument (Updated 4 years ago)
- reader.py misses the <EX_ENAMEX> annotated entities in the LeMonde corpus (Closed 4 years ago, 3 comments)
- Sorting of fields in the sequence labelling evaluation report (Closed 4 years ago, 1 comment)
- new multiprocessing parameter for Sequence and TrainingConfig (Closed 4 years ago, 5 comments)
- sequenceLabelling.Trainer.train method fails if validation set is None (Closed 4 years ago, 3 comments)
- Be able to pass an additional callbacks argument to the train method of Sequence/Classifier objects (Closed 4 years ago, 3 comments)
- Incompatible array dimensions when using ELMo and input is of length 1 (only 1 word) (Closed 4 years ago, 2 comments)
- Average precision/recall/f1 per label (Closed 4 years ago, 2 comments)
- --use-BERT option is ignored by nerTagger eval (Closed 5 years ago, 1 comment)
- 'patience' training_config parameter is ignored by the Trainer class (Closed 5 years ago, 1 comment)
- bert-{lang} / bert-base-{lang} discrepancy in Embeddings (Closed 5 years ago, 1 comment)
- LMDB embeddings creation is very slow on a spinning drive (Closed 5 years ago, 2 comments)
- Error when checkpointing the model if f1 is not available (yet) (Closed 5 years ago, 1 comment)