percentile method: allow fewer points to be thrown out when narrowing bounds
carolyner opened this issue · comments
Line 654:
keep_points <- c(trunc(seq(1.0, improve_min_samplefactor, 0.1)*N_cov_points), current_good_n)
On line 273, we set
improve_min_samplefactor <- 2
The effect of this is to first try to keep just N_cov_points points and then incrementally increase the number retained up to 2*N_cov_points.
If we get to 2*N_cov_points and still can't update the bounds, we give up and keep all of the points.
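To make the problem concrete, here is a minimal sketch of the current schedule. The values N_cov_points = 200 and current_good_n = 5000 are assumptions chosen for illustration, not values from the package:

```r
# Assumed example values for illustration only
N_cov_points <- 200
current_good_n <- 5000
improve_min_samplefactor <- 2

# Current schedule: multiples of N_cov_points from 1.0x up to 2.0x,
# then fall back to keeping every good point
keep_points <- c(trunc(seq(1.0, improve_min_samplefactor, 0.1) * N_cov_points),
                 current_good_n)
keep_points
# 200 220 240 ... 380 400 5000
```

Note the jump from 400 straight to 5000: nothing between 2*N_cov_points and current_good_n is ever tried.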
Revise this so the steps span the full range of retained points, from N_cov_points up to current_good_n.
A simple way to modify this would be to set
improve_step = trunc((current_good_n - N_cov_points)/10)
keep_points <- c(seq(N_cov_points,(current_good_n-improve_step),improve_step),current_good_n)
An even better (recommended) improvement is to set the maximum step size to 100, which allows smaller steps when current_good_n is large:
improve_step = min(trunc((current_good_n - N_cov_points)/10),100)
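A sketch of the proposed schedule with the capped step, using the same assumed example values (N_cov_points = 200, current_good_n = 5000) as above:

```r
# Assumed example values for illustration only
N_cov_points <- 200
current_good_n <- 5000

# Step size: one tenth of the available range, capped at 100
improve_step <- min(trunc((current_good_n - N_cov_points) / 10), 100)
# here (5000 - 200) / 10 = 480, so the cap of 100 applies

# Proposed schedule: even steps from N_cov_points up to current_good_n
keep_points <- c(seq(N_cov_points, current_good_n - improve_step, improve_step),
                 current_good_n)
```

This gives 49 evenly spaced candidate counts (200, 300, ..., 4900, 5000) instead of the 12 in the current schedule. One caveat worth guarding against in the real code: if current_good_n - N_cov_points < 10, improve_step would be 0 and seq() would fail.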
If we make this change, improve_min_samplefactor only controls when we start trying to narrow the bounds: once we have improve_min_samplefactor times the number of points needed for the covariance calculations.
This change could affect convergence of the percentile approach.
@carolyner pushed change to dev