JuliaStats / GLM.jl

Generalized linear models in Julia

Feature request: Support for reuse of pinv computation for multiple Y values in linear models

yoninazarathy opened this issue · comments

For LinearModel, it may be useful to have some sort of API that lets one reapply the model to multiple right-hand sides repeatedly, without redoing the internal QR or Cholesky decomposition (or other pinv computation) each time.

This can improve performance when using the package for simulation with a fixed design matrix and many right-hand sides.

My current workaround for this is of the form:

model = fit(LinearModel, formula ... , data, ...)
model_design_matrix = model.mm.m
model_pseudo_inv_matrix = pinv(model_design_matrix)  # from LinearAlgebra

# now, when looping over many response vectors y:
coef = model_pseudo_inv_matrix * y
...

With this, one can clearly carry out the complete LinearModel fit using only elementary LinearAlgebra operations; still, the support that GLM.jl gives for data frames, contrasts, and model queries is useful.
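A self-contained version of this workaround, using plain LinearAlgebra only (no GLM.jl); the matrix sizes and simulated responses here are illustrative:

```julia
using LinearAlgebra

X = [ones(10) randn(10)]   # fixed design matrix
Xpinv = pinv(X)            # decomposition cost paid once

# Reuse the pseudoinverse for many simulated responses:
betas = map(1:1000) do _
    y = X * ones(2) + randn(10) * 0.1
    Xpinv * y              # least-squares coefficients for this y
end
```

Each iteration then costs only one matrix-vector product, which is the performance win the feature request is after.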

I think this is almost already supported. With #535, I can write:

julia> X = [ones(10) randn(10)];

julia> y1 = X*ones(2) + randn(10)*0.1;

julia> y2 = X*ones(2) + randn(10)*0.1;

julia> myp = GLM.DensePredChol(X, false);

julia> myr1 = GLM.LmResp(y1);

julia> myr2 = GLM.LmResp(y2);

julia> fit!(LinearModel(myr1, myp, nothing))
LinearModel

Coefficients:
───────────────────────────────────────────────────────────────
       Coef.  Std. Error      t  Pr(>|t|)  Lower 95%  Upper 95%
───────────────────────────────────────────────────────────────
x1  0.976384   0.0449526  21.72    <1e-07   0.872723    1.08004
x2  0.977983   0.0427468  22.88    <1e-07   0.879409    1.07656
───────────────────────────────────────────────────────────────


julia> fit!(LinearModel(myr2, myp, nothing))
LinearModel

Coefficients:
───────────────────────────────────────────────────────────────
       Coef.  Std. Error      t  Pr(>|t|)  Lower 95%  Upper 95%
───────────────────────────────────────────────────────────────
x1  0.995871   0.0278513  35.76    <1e-09   0.931646    1.0601
x2  0.965657   0.0264846  36.46    <1e-09   0.904583    1.02673
───────────────────────────────────────────────────────────────
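For reference, what sharing the Cholesky-based predictor saves can be sketched in plain LinearAlgebra: factor X'X once, then each additional right-hand side costs only a matrix-vector product and triangular solves (names and sizes here are illustrative, not GLM.jl internals):

```julia
using LinearAlgebra

X = [ones(10) randn(10)]          # fixed design matrix
F = cholesky(Symmetric(X' * X))   # factor X'X once

# Each new right-hand side reuses F; no refactorization:
y1 = X * ones(2) + randn(10) * 0.1
y2 = X * ones(2) + randn(10) * 0.1
beta1 = F \ (X' * y1)
beta2 = F \ (X' * y2)
```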

Thank you.