cornellius-gp / linear_operator

A LinearOperator implementation to wrap the numerical nuts and bolts of GPyTorch

Representation of a LinearOperator should consist only of Tensors

AndreaBraschi opened this issue · comments

Hi all,

I've written a function to concatenate two square LinearOperator objects of different sizes into a single block-diagonal operator:

from linear_operator.operators import (
    CatLinearOperator,
    KroneckerProductLinearOperator,
    SumLinearOperator,
    ZeroLinearOperator,
)

def cat_LinearOperators(*linear_operators):
    oper_A = linear_operators[0]
    oper_B = linear_operators[1]
    oper_A_size = oper_A.shape[0]
    oper_B_size = oper_B.shape[0]

    # mat_A: oper_A in the bottom-right block, zeros elsewhere
    a = ZeroLinearOperator(oper_A_size, oper_B_size)
    cat_1 = CatLinearOperator(a, oper_A, dim=1)
    c = ZeroLinearOperator(oper_B_size, oper_B_size)
    cat_2 = CatLinearOperator(a.transpose(-1, -2), c, dim=1)
    tensor_A = CatLinearOperator(cat_2, cat_1, dim=0)

    # mat_B: oper_B in the top-left block, zeros elsewhere
    e = ZeroLinearOperator(oper_B_size, oper_A_size)
    cat_3 = CatLinearOperator(oper_B, e, dim=1)
    d = ZeroLinearOperator(oper_A_size, oper_A_size)
    cat_4 = CatLinearOperator(d, e, dim=0)
    tensor_B = CatLinearOperator(cat_3, cat_4.transpose(-1, -2), dim=0)

    cat_operator = SumLinearOperator(tensor_A, tensor_B)

    return cat_operator
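For reference, the dense matrix this function is meant to produce is block diagonal, with oper_B in the top-left block and oper_A in the bottom-right. A minimal sketch of the intended result with plain tensors (sizes chosen only for illustration):

import torch

A = torch.randn(3, 3)
B = torch.randn(2, 2)

expected = torch.zeros(5, 5)
expected[:2, :2] = B  # oper_B fills the top-left block
expected[2:, 2:] = A  # oper_A fills the bottom-right block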

The function works, but I get the following RuntimeError when I evaluate the resulting operator:
Representation of a LazyTensor should consist only of Tensors.
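
If I understand the error correctly, representation() walks the arguments of every operator in the composition and expects each one to be a Tensor (or another LinearOperator). ZeroLinearOperator is constructed from plain integer sizes rather than a Tensor, so my guess (not verified against the source) is that even a bare instance trips the same check:

from linear_operator.operators import ZeroLinearOperator

z = ZeroLinearOperator(3, 3)
z.representation()  # I suspect this raises the same RuntimeError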

You can reproduce the error yourself:

import torch

tensor_A = torch.randn(38, 38)
tensor_B = torch.randn(50, 50)
tensor_C = torch.randn(4, 4)
Kron_A = KroneckerProductLinearOperator(tensor_A, tensor_B)
Kron_B = KroneckerProductLinearOperator(tensor_C, tensor_B)
cat_operator = cat_LinearOperators(Kron_B, Kron_A)
cat_operator.to_dense()

I'm starting to believe that the mistake lies in how I use the ZeroLinearOperator object.
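
If that is the cause, I suspect the zero blocks could instead be backed by real Tensors, e.g. by wrapping torch.zeros in a DenseLinearOperator (a sketch of the idea, assuming DenseLinearOperator simply wraps a plain Tensor; this of course gives up the memory savings of ZeroLinearOperator):

import torch
from linear_operator.operators import DenseLinearOperator

def dense_zero_operator(rows, cols):
    # Tensor-backed zero block, so representation() only ever sees Tensors
    return DenseLinearOperator(torch.zeros(rows, cols))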

Any help or suggestions would be greatly appreciated.

Many thanks,
Andrea

Originally posted by @AndreaBraschi in #78