Borda / BIRL

BIRL: Benchmark on Image Registration methods with Landmark validations

Home Page: http://borda.github.io/BIRL

bUnwarpJ: imprecise warping landmarks

Borda opened this issue:

Asked (reported) to ImageJ: I have a case - register two images together, then warp the source image to the target image, and in the end generate the raw transformation so I can show some important points in the warped image according to their positions in the source image. (At this moment I don't care much about the quality of the registration.) This works perfectly when both images are the same size, but in the following case the source image is smaller than the target one.

See an example of correctly transformed landmarks:
[figure: registration_landmarks]
Bad warping of the points in the warped image:
[figure: registration_landmarks]
And some more visualization of where they should be:
[figure: registration_visual]

  • the macro I used for the registration, warping the image, and converting the raw transformation

Ignacio Arganda-Carreras
OK, so you have calculated the registration between a reference image (REF) and a moving image (MOV), and you have stored the transformation of MOV to REF. In fact, that transformation stores, for each pixel in REF space, the corresponding coordinates in MOV space to fill it in. So it is the pixel-to-pixel correspondence, but in inverse order (for all pixels in REF). It is done this way so the result image is always complete. That means the correspondence of the points in MOV is not stored directly; you would have to approximate it.
Have a look at how the raw transform is applied: https://github.com/fiji/bUnwarpJ/blob/master/src/main/java/bunwarpj/MiscTools.java#L305
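For intuition, here is a minimal numpy sketch of how such an inverse (backward) mapping is applied; it uses nearest-neighbour lookup only, whereas the linked MiscTools code interpolates, and the function name is purely illustrative.

```python
import numpy as np

def apply_raw_transform(moving, x_field, y_field):
    """Warp `moving` (MOV) into REF space with a bUnwarpJ-style raw transform.

    x_field[i, j] / y_field[i, j] hold the MOV-space coordinates whose
    intensity fills pixel (row i, column j) of the REF-sized output,
    which is why the result image is always complete.
    """
    h, w = moving.shape[:2]
    # nearest-neighbour sampling of the (sub-pixel) MOV coordinates
    xs = np.clip(np.round(x_field).astype(int), 0, w - 1)
    ys = np.clip(np.round(y_field).astype(int), 0, h - 1)
    return moving[ys, xs]
```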

Let me give you an example.
Let's say our REF image is 200x200 and our moving image is 240x240.
The transform then stores 200x200 x-values and 200x200 y-values,
that is, the coordinates of the pixels in MOV space that will fill the result image (in REF space).
So now, if you have one point in MOV space, let's say (134, 175), we don't know directly how to transform it into REF space; you would have to go through the sets of x-values and y-values, see which ones are closest to (134, 175), and then use their positions in the tables as the coordinates in REF.
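What is described here (and essentially what the pseudo-inverse mentioned below does) could be sketched in numpy roughly like this; the function name and the brute-force nearest-neighbour search are only an illustration of the idea, not the actual bUnwarpJ code.

```python
import numpy as np

def approx_point_mov_to_ref(x_mov, y_mov, x_field, y_field):
    """Approximate where a MOV-space point lands in REF space.

    The raw transform only stores, for every REF pixel, which MOV
    coordinates fill it, so the forward mapping of a MOV point has to be
    searched: pick the REF pixel whose stored MOV coordinates are closest.
    """
    dist2 = (x_field - x_mov) ** 2 + (y_field - y_mov) ** 2
    row, col = np.unravel_index(np.argmin(dist2), dist2.shape)
    return col, row  # (x, y) in REF space

# e.g. the point (134, 175) from the example above:
# x_ref, y_ref = approx_point_mov_to_ref(134, 175, x_field, y_field)
```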

In any case, why don't you try to modify load_parse_bunwarpj_displacement_axis so it approximates the values?

It turned out I already had an implementation of the pseudo-inverse of a transform:

https://github.com/fiji/bUnwarpJ/blob/master/src/main/java/bunwarpj/MiscTools.java#L968-L984

warping function in the benchmark - bm_bunwarpj.py#L340
parsing the raw transformation - bm_bunwarpj.py#L136
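For completeness, a rough sketch of reading a bUnwarpJ raw transform exported as text; it assumes the file has `Width=` / `Height=` header lines followed by an `X Trans` block and a `Y Trans` block of height x width values, so it may need adjusting to the actual export and is not the code from bm_bunwarpj.py.

```python
import numpy as np

def load_raw_transform(path):
    """Read a bUnwarpJ raw transform text export into two (H, W) arrays."""
    with open(path) as fp:
        lines = [ln.strip() for ln in fp if ln.strip()]
    width = int(lines[0].split('=')[1])
    height = int(lines[1].split('=')[1])

    def read_block(header):
        # the block starts right after the line beginning with `header`
        start = next(i for i, ln in enumerate(lines) if ln.startswith(header)) + 1
        rows = [[float(v) for v in ln.split()] for ln in lines[start:start + height]]
        return np.asarray(rows).reshape(height, width)

    # MOV-space x / y coordinates for every REF pixel
    return read_block('X Trans'), read_block('Y Trans')
```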