lucasplagwitz / recon

Primal-Dual Solver for Inverse Problems

Home Page: https://lucasplagwitz.github.io/recon/


Projection

mrava87 opened this issue · comments

Hi,
I was looking at the Projection term and I noticed something that I can't quite explain.
I would expect that, when you take the proximal of ||Grad*x||_1, you sum the squares of the derivatives in the two (or n) directions at the same point and then replicate the resulting vector over both dimensions.

Assuming that what comes in (the vector f) is f = Grad*x (or that part of it contains this term...), this vector contains the derivative along the first dimension in its first half and the derivative along the second dimension in its second half. So I would have expected it to be reshaped as (2, nmodel) and summed over axis 0.
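For concreteness, here is a minimal sketch of that expectation with toy numbers (variable names like n_points are just placeholders, not from the package):

import numpy as np

# toy stacked gradient: 4 pixels, 2 directions, laid out as [all d/dx, all d/dy]
n_points = 4
rng = np.random.default_rng(0)
f = rng.standard_normal(2 * n_points)

# group the two derivatives that belong to the same pixel
grad = f.reshape(2, n_points)          # row i = derivative along direction i

# pointwise (isotropic) magnitude sqrt(dx^2 + dy^2) at every pixel
aux = np.sqrt(np.sum(np.abs(grad) ** 2, axis=0))   # shape (n_points,)

# replicate so both derivative components are scaled by the same magnitude
aux_full = np.tile(aux, 2)             # shape (2 * n_points,)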

Looking at your code:

aux = np.sqrt(np.sum(abs(np.reshape(f, (int(self.shape/self.n_dim), self.n_dim)))**2, axis=1))

it looks to me as if you reshape it the other way around, so that the sum runs over the derivatives of two neighbouring points instead of the two derivatives at the same point. Perhaps it is just a small slip from porting MATLAB code, which uses Fortran array layout, to Python's C layout? Or am I missing something with respect to the theory?
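Here is a small, self-contained example (not your code, just toy numbers) showing how the two reshape orders group different entries:

import numpy as np

# 3 pixels, 2 directions: dx = [1, 2, 3], dy = [10, 20, 30], stacked like f = Grad*x
f = np.array([1.0, 2.0, 3.0, 10.0, 20.0, 30.0])

# reshape as in the quoted line: each row pairs two *neighbouring* entries
by_neighbours = f.reshape(3, 2)   # [[1, 2], [3, 10], [20, 30]]

# reshape that pairs the two derivatives of the *same* pixel
by_pixel = f.reshape(2, 3)        # [[1, 2, 3], [10, 20, 30]]

print(np.sqrt(np.sum(by_neighbours ** 2, axis=1)))  # mixes values across pixels
print(np.sqrt(np.sum(by_pixel ** 2, axis=0)))       # sqrt(dx^2 + dy^2) per pixel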

Hey,
thanks for validating the code! I'll check it out and answer your mail tomorrow, too.

Hi again,
you are definitely right! I pushed a quick fix; it should now match your recommended version. I will try to debug, comment, and clean up the files next week to improve usability.
It is very helpful to have another pair of eyes to check it. Please stay tuned!
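Roughly, the change is along these lines (a sketch with a hypothetical helper, not the exact line in the repo):

import numpy as np

def pointwise_norm(f, n_dim):
    """Euclidean norm of the n_dim stacked derivative components at each point.

    Hypothetical helper mirroring the quoted line, with the reshape flipped so
    that the sum runs over the components belonging to the same point.
    """
    n_points = f.size // n_dim
    return np.sqrt(np.sum(np.abs(f.reshape(n_dim, n_points)) ** 2, axis=0))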

Fantastic! I wasn't sure myself, as I am also new to proximal operators, but it did not seem to agree with the definition in the Chambolle and Pock paper, so it's great that you agree.

I will keep looking at it, and I look forward to the progress of this project :)