openvinotoolkit / anomalib

An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.

Home Page: https://anomalib.readthedocs.io/en/latest/


[Bug]: Extracting confusion matrix as a test metric

lathashree01 opened this issue

Describe the bug

Hi team,

During testing, I would like to extract the BinaryConfusionMatrix as a metric from the engine.

I can see the value being computed successfully, but the run fails inside the Lightning trainer at:

https://github.com/Lightning-AI/pytorch-lightning/blob/d1949766f8cddd424e2fac3a68b275bebe13d3e4/src/lightning/fabric/utilities/apply_func.py#L123

from typing import Any, Union

from lightning_utilities.core.apply_func import apply_to_collection
from torch import Tensor


def convert_tensors_to_scalars(data: Any) -> Any:
    """Recursively walk through a collection and convert single-item tensors to scalar values.

    Raises:
        ValueError:
            If tensors inside ``metrics`` contains multiple elements, hence preventing conversion to a scalar.

    """

    def to_item(value: Tensor) -> Union[int, float, bool]:
        if value.numel() != 1:
            raise ValueError(
                f"The metric `{value}` does not contain a single element, thus it cannot be converted to a scalar."
            )
        return value.item()

    return apply_to_collection(data, Tensor, to_item)

The conversion fails because the logged confusion-matrix tensor contains more than one element.
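For illustration, a minimal sketch (plain torchmetrics, outside anomalib) showing that BinaryConfusionMatrix.compute() yields a 2x2 tensor, which is exactly what to_item() rejects:

import torch
from torchmetrics.classification import BinaryConfusionMatrix

metric = BinaryConfusionMatrix()
metric.update(torch.tensor([1, 0, 1, 1]), torch.tensor([1, 0, 0, 1]))

result = metric.compute()  # 2x2 tensor: [[tn, fp], [fn, tp]]
print(result.numel())      # 4 -> to_item() raises the ValueError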

How do I resolve this? Any help would be greatly appreciated. Thanks

Dataset

Folder

Model

PatchCore

Steps to reproduce the behavior

Train the model.
Test the model with BinaryConfusionMatrix configured as an image metric (a minimal reproduction sketch follows the metrics snippet below).

metrics = {
    "ConfusionMatrix": {
        "class_path": "torchmetrics.classification.BinaryConfusionMatrix",
        "init_args": {},
    }
}
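A minimal end-to-end reproduction sketch; the Folder datamodule arguments and paths below are placeholders rather than the exact setup used here:

from anomalib.data import Folder
from anomalib.engine import Engine
from anomalib.models import Patchcore

# Placeholder dataset layout; point these at your own Folder-format dataset.
datamodule = Folder(
    name="my_dataset",
    root="./datasets/my_dataset",
    normal_dir="good",
    abnormal_dir="defect",
)

model = Patchcore()

engine = Engine(
    image_metrics={
        "ConfusionMatrix": {
            "class_path": "torchmetrics.classification.BinaryConfusionMatrix",
            "init_args": {},
        }
    },
    accelerator="gpu",
)

engine.fit(model=model, datamodule=datamodule)
engine.test(model=model, datamodule=datamodule)  # raises in convert_tensors_to_scalars()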

OS information

  • OS: [e.g. Ubuntu 20.04]
  • Python version: [e.g. 3.10.0]
  • Anomalib version: [e.g. 0.3.6]
  • PyTorch version: [e.g. 1.9.0]
  • CUDA/cuDNN version: [e.g. 11.1]
  • GPU models and configuration: [e.g. 2x GeForce RTX 3090]
  • Any other relevant information: [e.g. I'm using a custom dataset]

Expected behavior

Able to get the confusion matrix back in the results dict returned by the engine.

Screenshots

No response

Pip/GitHub

pip

What version/branch did you use?

1.1.0

Configuration YAML

# Initialise the Engine; the Patchcore model itself uses default settings
engine = Engine(
    task=task,
    threshold="F1AdaptiveThreshold",
    image_metrics=metrics,
    callbacks=callbacks,
    accelerator="gpu",
)
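A possible workaround sketch (not a confirmed anomalib recipe): keep only scalar metrics in image_metrics and compute the confusion matrix manually from the predictions returned by engine.predict(). The batch keys pred_labels and label are assumptions based on typical anomalib prediction outputs and may differ between versions:

from torchmetrics.classification import BinaryConfusionMatrix

# Assumption: the predict dataloader provides ground-truth labels
# (in recent anomalib versions it defaults to the test split).
predictions = engine.predict(model=model, datamodule=datamodule)

cm = BinaryConfusionMatrix()
for batch in predictions:
    # "pred_labels" / "label" are assumed key names; adjust if your version differs.
    cm.update(batch["pred_labels"].int(), batch["label"].int())

print(cm.compute())  # full 2x2 matrix, never forced through to_item()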

Logs

ValueError: The metric `tensor([[343, 497],
        [  0,   4]])` does not contain a single element, thus it cannot be converted to a scalar.

Code of Conduct

  • I agree to follow this project's Code of Conduct