NbAiLab / notram

Norwegian Transformer Model


Couldn't fine-tune ('NbAiLab/nb-bert-large')

MarieNoelChanthavong opened this issue · comments

Hello, I tried to run the Colab notebook 'How to finetune a classification model (advanced)'. However, I got the following error while fine-tuning the model:

```
Epoch 1/5

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
in ()
     11
     12 # Start training
---> 13 history = model.fit(train_dataset, validation_data=dev_dataset, epochs=num_epochs, batch_size=batch_size)
     14
     15 print(f'\nThe training has finished training after {num_epochs} epochs.')

1 frames
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/func_graph.py in autograph_handler(*args, **kwargs)
   1145   except Exception as e:  # pylint:disable=broad-except
   1146     if hasattr(e, "ag_error_metadata"):
-> 1147       raise e.ag_error_metadata.to_exception(e)
   1148     else:
   1149       raise

AttributeError: in user code:

    File "/usr/local/lib/python3.7/dist-packages/keras/engine/training.py", line 1021, in train_function  *
        return step_function(self, iterator)
    File "/usr/local/lib/python3.7/dist-packages/transformers/modeling_tf_utils.py", line 1043, in compute_loss  *
        return super().compute_loss(*args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/training.py", line 919, in compute_loss  **
        y, y_pred, sample_weight, regularization_losses=self.losses)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/compile_utils.py", line 199, in __call__
        y_t, y_p, sw = match_dtype_and_rank(y_t, y_p, sw)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/compile_utils.py", line 684, in match_dtype_and_rank
        if ((y_t.dtype.is_floating and y_p.dtype.is_floating) or

AttributeError: 'NoneType' object has no attribute 'dtype'
```

I have the exact same problem as @MarieNoelChanthavong.
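For anyone hitting the same traceback: this is commonly reported as an incompatibility between older `transformers` releases and Keras ≥ 2.8, whose new `compute_loss` method collides with the `compute_loss` in `modeling_tf_utils.py` shown in the stack trace; upgrading `transformers` is the usual suggestion. The identical error also appears when the `tf.data` pipeline yields only a feature dict with no label tensor, so Keras receives `y=None` inside `match_dtype_and_rank`. A minimal sketch of the second case (toy tensors, not the notebook's actual data) showing how to yield `(features, labels)` pairs so `model.fit` gets labels:

```python
import tensorflow as tf

# Toy stand-ins for tokenizer output and class labels (hypothetical values).
encodings = {
    "input_ids": tf.constant([[101, 2054, 102], [101, 2129, 102]]),
    "attention_mask": tf.constant([[1, 1, 1], [1, 1, 1]]),
}
labels = tf.constant([0, 1])

# Yield (features, labels) tuples. If the dataset yields only the dict,
# Keras's compute_loss sees y=None and match_dtype_and_rank raises
# "'NoneType' object has no attribute 'dtype'".
train_dataset = tf.data.Dataset.from_tensor_slices(
    (dict(encodings), labels)
).batch(2)

for features, y in train_dataset.take(1):
    print(sorted(features.keys()), y.numpy().tolist())
```

If the dataset is already structured this way, the version clash is the more likely culprit and `pip install -U transformers` in the notebook is worth trying first.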