openvinotoolkit / anomalib

An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.

Home Page: https://anomalib.readthedocs.io/en/latest/

[Bug]: No samples to concatenate - AUROC calculation during testing

udarnicus opened this issue · comments

commented

Describe the bug

Hello,

I am training the EfficientAD model with a custom Folder dataset like this:

datamodule = Folder(
    name="mvtec_ad",
    normal_dir="path",
    abnormal_dir="path",
    normal_test_dir="path",
    task="classification",
    num_workers=0,
    train_batch_size=32,
    eval_batch_size=4,
)

engine = Engine(
    normalization=NormalizationMethod.MIN_MAX,
    threshold="F1AdaptiveThreshold",
    task=TaskType.CLASSIFICATION,
    image_metrics=["AUROC", "BinaryPrecision", "BinaryRecall", "F1Score"],
    accelerator="auto",
    check_val_every_n_epoch=1,
    devices=1,
    max_epochs=2,
    num_sanity_val_steps=0,
    val_check_interval=1.0,
    logger=[logger_tensorboard, logger_comet],
    log_every_n_steps=10,
    callbacks=callbacks,
)

The training and validation steps work fine. However, when I run the model on the test set with

predictions = engine.predict(
    dataloaders=datamodule.test_dataloader(),
    model=model,
)

I get the following error:

[screenshot of the error message; the full traceback is reproduced under Logs below]

What could be the cause of this error? The same problem occurs with AUPRO as well.

Dataset

Folder

Model

Other (please specify in the field below)

Steps to reproduce the behavior

  1. Clone anomalib from GitHub
  2. Create conda environment as described in documentation
  3. Calculate the AUROC metric on the test dataset (see the condensed sketch below)
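
For anyone trying to reproduce this, here is a condensed, self-contained sketch of the setup above. The paths are placeholders, the loggers and callbacks are dropped for brevity, the model is assumed to be a default-constructed EfficientAd, and the import paths follow anomalib 1.x, so they may need adjusting:

from anomalib import TaskType
from anomalib.data import Folder
from anomalib.engine import Engine
from anomalib.models import EfficientAd
from anomalib.utils.normalization import NormalizationMethod

# Placeholder paths -- point these at the actual dataset folders.
datamodule = Folder(
    name="mvtec_ad",
    normal_dir="path/to/normal",
    abnormal_dir="path/to/abnormal",
    normal_test_dir="path/to/normal_test",
    task="classification",
    num_workers=0,
    train_batch_size=32,
    eval_batch_size=4,
)

model = EfficientAd()

engine = Engine(
    normalization=NormalizationMethod.MIN_MAX,
    threshold="F1AdaptiveThreshold",
    task=TaskType.CLASSIFICATION,
    image_metrics=["AUROC", "BinaryPrecision", "BinaryRecall", "F1Score"],
    max_epochs=2,
    num_sanity_val_steps=0,
)

engine.fit(model=model, datamodule=datamodule)

# Fails with "ValueError: No samples to concatenate":
predictions = engine.predict(model=model, dataloaders=datamodule.test_dataloader())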

OS information

OS information:

  • OS: Sagemaker instance
  • anomalib==1.1.0.dev0

Expected behavior

Calculate the AUROC metric for the test set.

Screenshots

No response

Pip/GitHub

GitHub

What version/branch did you use?

1.1.0.dev0

Configuration YAML

no configuration file used

Logs

Predicting DataLoader 0: 100%|██████████| 308/308 [11:57<00:00,  0.43it/s]
Traceback (most recent call last):
  File "/home/sagemaker-user/anomalib/run_training_singlerun.py", line 174, in <module>
    predictions = engine.predict(
  File "/home/sagemaker-user/anomalib/src/anomalib/engine/engine.py", line 772, in predict
    return self.trainer.predict(model, dataloaders, datamodule, return_predictions, ckpt_path)
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 864, in predict
    return call._call_and_handle_interrupt(
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/trainer/call.py", line 44, in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 903, in _predict_impl
    results = self._run(model, ckpt_path=ckpt_path)
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 989, in _run
    results = self._run_stage()
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 1030, in _run_stage
    return self.predict_loop.run()
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/loops/utilities.py", line 182, in _decorator
    return loop_run(self, *args, **kwargs)
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/loops/prediction_loop.py", line 128, in run
    return self.on_run_end()
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/loops/prediction_loop.py", line 201, in on_run_end
    self._on_predict_end()
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/loops/prediction_loop.py", line 372, in _on_predict_end
    call._call_callback_hooks(trainer, "on_predict_end")
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/lightning/pytorch/trainer/call.py", line 208, in _call_callback_hooks
    fn(trainer, trainer.lightning_module, *args, **kwargs)
  File "/home/sagemaker-user/anomalib/src/anomalib/callbacks/visualizer.py", line 148, in on_predict_end
    return self.on_test_end(trainer, pl_module)
  File "/home/sagemaker-user/anomalib/src/anomalib/callbacks/visualizer.py", line 121, in on_test_end
    for result in generator(trainer=trainer, pl_module=pl_module):
  File "/home/sagemaker-user/anomalib/src/anomalib/utils/visualization/metrics.py", line 31, in generate
    fig, log_name = metric.generate_figure()
  File "/home/sagemaker-user/anomalib/src/anomalib/metrics/auroc.py", line 81, in generate_figure
    fpr, tpr = self._compute()
  File "/home/sagemaker-user/anomalib/src/anomalib/metrics/auroc.py", line 72, in _compute
    fpr, tpr, _thresholds = super().compute()
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/torchmetrics/classification/roc.py", line 122, in compute
    state = [dim_zero_cat(self.preds), dim_zero_cat(self.target)] if self.thresholds is None else self.confmat
  File "/home/sagemaker-user/.conda/envs/anomalib_env/lib/python3.10/site-packages/torchmetrics/utilities/data.py", line 34, in dim_zero_cat
    raise ValueError("No samples to concatenate")
ValueError: No samples to concatenate

Code of Conduct

  • I agree to follow this project's Code of Conduct
commented

When debugging the line state = [dim_zero_cat(self.preds), dim_zero_cat(self.target)] if self.thresholds is None else self.confmat (in torchmetrics' roc.py, reached via anomalib's auroc.py), it seems both self.preds and self.target are empty, as if the results are not being logged during testing. However, when debugging the same line during a validation epoch, both variables contain tensors.
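
The error itself can be reproduced in isolation: torchmetrics' dim_zero_cat (the function at the bottom of the traceback) raises this exact ValueError when it is handed an empty state list, i.e. when compute() runs without any preceding update() calls. A minimal sketch:

import torch
from torchmetrics.utilities.data import dim_zero_cat

# A metric state that never received any update() calls is an empty list.
empty_state: list[torch.Tensor] = []

# Concatenating an empty state raises the error seen in the traceback.
dim_zero_cat(empty_state)  # ValueError: No samples to concatenate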

commented

I found the problem: I was using predict instead of test. When replacing predict with test, the metrics are calculated correctly.
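
For anyone landing here later, a minimal sketch of the fix, assuming the same model and datamodule objects as above: engine.test() runs the full test loop, which updates the configured image metrics before computing them, whereas engine.predict() only generates predictions and leaves the metric state empty, matching the observation that self.preds and self.target stay empty during predict.

# test() evaluates the configured image metrics (AUROC, AUPRO, ...)
# because the test loop calls update() on them before compute().
results = engine.test(model=model, dataloaders=datamodule.test_dataloader())
print(results)  # e.g. [{"image_AUROC": ..., "image_F1Score": ...}]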