chr5tphr / zennit

Zennit is a high-level framework in Python, built on PyTorch, for explaining and exploring neural networks with attribution methods such as LRP.

Flat Rule for Pooling Layers

rachtibat opened this issue

Hi Chris,

I am defining a new composite, for instance:

import torch
from zennit.composites import register_composite, LayerMapComposite
from zennit.layer import Sum
from zennit.rules import Flat, Norm, Pass
from zennit.types import Activation, AvgPool, Linear


@register_composite('all_flat')
class AllFlat(LayerMapComposite):

    def __init__(self, canonizers=None):
        layer_map = [
            (Linear, Flat()),
            (AvgPool, Flat()),
            (Activation, Pass()),
            (Sum, Norm()),
        ]

        super().__init__(layer_map, canonizers=canonizers)

The problem is that the Flat() rule changes the parameters of a layer, and pooling layers do not define a "weight" parameter. As a consequence, a RuntimeError is raised, saying that zennit tries to access the parameter "weight", which is not available.
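The mismatch is easy to see in plain PyTorch, without zennit at all: Linear (and Conv) layers carry a 'weight' parameter that Flat() can overwrite, while pooling layers are parameter-free.

```python
import torch

# Linear layers define a 'weight' parameter that Flat() can rewrite;
# pooling layers define no parameters at all.
linear = torch.nn.Linear(4, 2)
pool = torch.nn.AvgPool2d(2)

print(hasattr(linear, 'weight'))          # True
print(list(pool.named_parameters()))      # [] -- nothing for Flat() to modify
```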
The solution would be to define a new rule that does not have a param_modifier, for instance:


import torch

from zennit.core import LinearHook, stabilize


class FlatPooling(LinearHook):
    '''Parameter-less variant of the Flat LRP rule. It is essentially the same
    as the WSquare rule, but with the input set to ones and the parameters left
    unmodified, so it also works for layers without a "weight" parameter.
    '''
    def __init__(self):
        super().__init__(
            input_modifiers=[torch.ones_like],
            param_modifiers=[None],
            output_modifiers=[lambda output: output],
            gradient_mapper=(lambda out_grad, outputs: out_grad / stabilize(outputs[0])),
            reducer=(lambda inputs, gradients: gradients[0]),
        )
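For average pooling, the effect of such a rule can be checked with plain PyTorch (zennit not required): replace the input by ones, divide the output relevance by the stabilized output, and let the backward pass redistribute the relevance uniformly over each pooling window. A minimal numeric sketch, using a simple epsilon in place of zennit's stabilize:

```python
import torch

# Flat redistribution over a 2x2 average-pooling window, computed by hand:
# forward on an all-ones input, divide the output relevance by the output,
# and read the redistributed relevance off the input gradient.
pool = torch.nn.AvgPool2d(2)

ones = torch.ones(1, 1, 2, 2, requires_grad=True)  # input_modifier: ones_like
out = pool(ones)                                   # output is 1.0

relevance_out = torch.tensor([[[[8.0]]]])
out.backward(relevance_out / (out.detach() + 1e-6))  # gradient_mapper with a
                                                     # plain epsilon stabilizer

print(ones.grad)  # each of the 4 window entries receives ~2.0
```

Note that the relevance sum is conserved: the four input entries together receive the full output relevance of 8.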

What do you think?

Best

Ahh, yes, I think the parameter-less Flat rule makes sense.

Though I actually think the expected behavior of Flat would be that it also works with pooling layers.

Maybe getting rid of the [None] behavior and instead also allowing non-existent parameters in mod_params might be a better approach; then we would not need an additional rule.

I will think about how to modify mod_params. Maybe a flag to ignore non-existent weights/biases would be beneficial, instead of the 'None' behavior.
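A hypothetical sketch of that flag-based idea (the function name and signature here are made up for illustration, not zennit's actual API): a parameter modifier that silently skips parameters the module does not define.

```python
import torch

# Hypothetical sketch (not zennit's actual API): set the listed parameters
# to ones in place, skipping any that the module does not define.
def set_params_to_ones(module, names=('weight',), ignore_missing=True):
    for name in names:
        param = getattr(module, name, None)
        if param is None:
            if not ignore_missing:
                raise RuntimeError(f'{type(module).__name__} has no {name!r}')
            continue  # e.g. AvgPool2d: no weight, nothing to do
        param.data = torch.ones_like(param.data)

linear = torch.nn.Linear(3, 3)
pool = torch.nn.AvgPool2d(2)

set_params_to_ones(linear)  # weight becomes all ones
set_params_to_ones(pool)    # no weight: skipped without error

print(bool(torch.all(linear.weight == 1.0)))  # True
```

With ignore_missing=True, the same layer map entry can then cover both parameterized and parameter-free layers.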

Okay, I fixed this in #16.
Could you try it out and see whether it works for you?

Hi,

yes, now it's working! Good idea to add more control so that the user can choose which parameters to modify.

Thanks