lasso-net / lassonet

Feature selection in neural networks


Passing in lambda_seq does not change default value of lambda_start, causing bug in path()

p-smirnov opened this issue · comments

This line here:

if self.lambda_start == "auto":

will evaluate to True if lambda_seq is passed in without also setting lambda_start to None. The code then expects self.lambda_start_ to exist, but that attribute is only set when lambda_seq is None.

Note that setting lambda_start to None in __init__ wouldn't work, since lambda_seq can also be passed directly to path().
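
Something like the following (untested sketch; exact signatures may differ) should be enough to surface the problem:

    # Untested sketch of how the bug can surface; names and defaults are
    # assumptions about the public API, not copied from the library.
    import numpy as np
    from lassonet import LassoNetRegressor

    X = np.random.randn(50, 10)
    y = np.random.randn(50)

    model = LassoNetRegressor()  # lambda_start keeps its default value "auto"
    # lambda_seq is given explicitly, so self.lambda_start_ is never computed,
    # yet the "auto" branch still runs and tries to read it.
    model.path(X, y, lambda_seq=np.logspace(-3, 1, 20))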

Hey! I wrote that code quite fast before the NeurIPS deadline.

Here is another way to describe the issue:

  • I added warnings to detect when the starting point is incorrect.
  • There are 3 cases:
  1. lambda_seq is not None
  2. lambda_seq = _lambda_seq(self.lambda_start)
  3. lambda_seq = _lambda_seq(self.lambda_start_)
  • Currently there is no way to capture case 1.
  • I have a solution that covers all 3 cases: set a variable lambda_start during the first iteration (initialize it to None before the loop, then set it with a condition); see the sketch after this list.
  • The condition would simply be current_lambda / lambda_start_ < 2, and the error message would change between cases {1, 2} and case 3.
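
A rough sketch of what that could look like (the helper name, the loop structure, and the exact sparsity check are illustrative assumptions, not the actual lassonet internals):

    import warnings

    def run_path(lambda_seq, fit_at_lambda):
        # fit_at_lambda is a hypothetical helper that fits the model at one
        # lambda value and returns the number of selected features.
        lambda_start = None
        for current_lambda in lambda_seq:
            if lambda_start is None:
                # captured on the first iteration, whichever of the 3 cases
                # produced lambda_seq
                lambda_start = current_lambda
            n_selected = fit_at_lambda(current_lambda)
            # assumed check: warn if the model is already fully sparse while
            # still within a factor of 2 of the starting lambda
            if n_selected == 0 and current_lambda / lambda_start < 2:
                warnings.warn(
                    "lambda_start (or the first value of lambda_seq) "
                    "seems too large"
                )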

What do you think? Could you implement it? I really don't have the bandwidth to code even 3 lines and check whether they work these days...

Also, did you test that new "auto" heuristic? It was something I had been wanting to try for a long time; I implemented it with fairly arbitrary constants (for _ in range(10000):, if torch.abs(beta - new_beta).max() < 1e-5, start = 1e-6) and it seemed to work well on all the examples.

I used this fun trick:

        # extract first value of lambda_seq
        lambda_seq = iter(lambda_seq)
        lambda_start = next(lambda_seq)

        for current_lambda in itertools.chain([lambda_start], lambda_seq):
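
For reference, a tiny self-contained demonstration of the same trick with made-up values:

    import itertools

    lambda_seq = [0.1, 0.2, 0.4, 0.8]

    it = iter(lambda_seq)
    lambda_start = next(it)  # peek at the first value
    # chain() puts the peeked value back in front, so the loop still sees all values
    for current_lambda in itertools.chain([lambda_start], it):
        print(current_lambda)  # prints 0.1, 0.2, 0.4, 0.8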