rachtibat / zennit-crp

An eXplainable AI toolkit with Concept Relevance Propagation and Relevance Maximization

Home Page: https://www.nature.com/articles/s42256-023-00711-8

ReceptiveField calculation misses the last neurons of a layer

maxdreyer opened this issue · comments

Hi all,

when I viewed the top-k most relevant samples of a concept with receptive field "on", I got fewer than k samples as a result.

I spotted a bug in the `analyze_layer` method of the `ReceptiveField` class:

```python
def analyze_layer(self, concept: Concept, layer_name: str, c_indices, canonizer=None, batch_size=16, verbose=True):

    composite = AllFlatComposite(canonizer)
    conditions = [{layer_name: [index]} for index in c_indices]

    batch = 0
    for attr in self.attribution.generate(
            self.single_sample, conditions, composite, [], concept.mask_rf,
            layer_name, 1, batch_size, None, verbose):

        heat = self.norm_rf(attr.heatmap, layer_name)

        try:
            rf_array[batch * len(heat): (batch+1) * len(heat)] = heat
        except UnboundLocalError:
            rf_array = torch.zeros((len(c_indices), *heat.shape[1:]), dtype=torch.uint8)
            rf_array[batch * len(heat): (batch+1) * len(heat)] = heat

        batch += 1

    return rf_array
```

The error occurs in the line

```python
rf_array[batch * len(heat): (batch+1) * len(heat)] = heat
```

For the last batch, `len(heat)` may be smaller than the batch size. In that case the slice start `batch * len(heat)` no longer points at the position right after the previously written samples, so the last heatmaps overwrite earlier entries and the final slots are never filled.
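The effect is easy to reproduce in isolation. The sketch below (a minimal standalone example, not code from the library) fills 5 items in batches of 2, using the same `batch * len(heat)` slice as above; the last batch has length 1, so its slice starts at index 2 instead of 4:

```python
import torch

# Minimal reproduction of the indexing bug: 5 items, batch size 2.
# Batch lengths are 2, 2, 1 -- for the last batch, batch * len(heat)
# evaluates to 2 * 1 = 2 instead of the correct offset 4.
n, batch_size = 5, 2
data = torch.arange(1, n + 1)  # values 1..5; zeros in the output mean "missing"
batches = [data[i:i + batch_size] for i in range(0, n, batch_size)]

out = torch.zeros(n, dtype=data.dtype)
for batch, heat in enumerate(batches):
    out[batch * len(heat): (batch + 1) * len(heat)] = heat

print(out.tolist())  # -> [1, 2, 5, 4, 0]: sample 5 overwrites index 2, slot 4 stays empty
```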

Possible solution: accumulate an offset by `len(heat)` after every batch and use that offset as the slice start instead of `batch * len(heat)`.
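A sketch of that fix, written as a hypothetical standalone helper (`fill_rf_array` is not part of the library; the heatmap generation is replaced by an iterable of batches for illustration):

```python
import torch

def fill_rf_array(heat_batches, n_indices):
    """Concatenate variable-length heatmap batches into one preallocated array."""
    rf_array = None
    offset = 0  # running write position, independent of the batch size
    for heat in heat_batches:
        if rf_array is None:
            # Allocate once the per-sample shape is known from the first batch.
            rf_array = torch.zeros((n_indices, *heat.shape[1:]), dtype=heat.dtype)
        rf_array[offset: offset + len(heat)] = heat
        offset += len(heat)  # advance by the actual batch length
    return rf_array

# Batches of lengths 2, 2, 1 now land at offsets 0, 2 and 4 as intended.
batches = [torch.ones(2, 3), 2 * torch.ones(2, 3), 3 * torch.ones(1, 3)]
result = fill_rf_array(batches, 5)
print(result[:, 0].tolist())  # -> [1.0, 1.0, 2.0, 2.0, 3.0]
```

Because the offset grows by the actual batch length, a short final batch is written to the correct position and all `len(c_indices)` slots end up filled.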

Best,
Max

Solved in pull request #15