marcotcr / anchor

Code for "High-Precision Model-Agnostic Explanations" paper

Justification for removing bisection for computing KL-confidence regions

jklaise opened this issue · comments

Hi @marcotcr, whilst browsing the repo I noticed that you've removed the bisection part for computing the upper and lower confidence bounds: ff0924e.

The bisection is required to compute the KL-bounds (4) and (5) defined in the bandit paper, so I'm a bit puzzled as to why you've removed it. The new behaviour is not Hoeffding-bound based (3) either; rather, it is equivalent to running a single step of the bisection and returning whatever is found. Note that there is also no guarantee that the returned value will satisfy the inequalities in (4) and (5); in practice I think this will result in looser bounds.
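For reference, the removed computation can be sketched as follows. This is a minimal illustration of the bisection search for a KL upper confidence bound on a Bernoulli mean, not the repo's exact implementation; the function names and the iteration count are illustrative. The bound sought is the largest `q >= p` whose KL divergence from the empirical mean `p` stays within the confidence level:

```python
import math

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q), clipped for stability."""
    eps = 1e-16
    p = min(max(p, eps), 1 - eps)
    q = min(max(q, eps), 1 - eps)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def kl_upper_bound(p, level, n_iter=17):
    """Largest q in [p, 1] with KL(p, q) <= level, found by bisection.

    `level` plays the role of beta / n in the KL-LUCB bounds: the
    confidence budget divided by the number of samples.
    """
    lo, hi = p, 1.0
    for _ in range(n_iter):
        q = (lo + hi) / 2.0
        if kl_bernoulli(p, q) > level:
            hi = q  # q violates the constraint; shrink from above
        else:
            lo = q  # q is feasible; push the bound upward
    return lo
```

Running only one iteration of this loop (the behaviour after the commit) returns the first midpoint rather than the tightest feasible `q`, which is why the resulting bound can be looser and need not satisfy the constraint with equality.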

Based on my understanding, I agree with you, @jklaise. The bounds will be looser, but I suppose that is the trade-off required to scale this method to a larger number of input features.