cdiazbas / enhance

Enhance

Issue with np.max() in normalisation procedure

lzivadinovic opened this issue · comments

Hi,

I've been testing enhance on my dataset (again) and I've noticed a problem with the normalization when there are NaNs in the dataset.

So basically, the auto-normalization procedure for the continuum located here https://github.com/cdiazbas/enhance/blob/master/enhance.py#L52 will produce a messed-up prediction because of the NaNs. I've managed to get somewhat decent results using np.nanmax() and np.nanmean() for the normalization, but I wanted to make sure the model will work OK with the mean instead of the max as the normalization factor.

Here is example code and the results for the normalization alone (without enhance):

import numpy as np
import sunpy.map

mapa = sunpy.map.Map(first)  # `first` is the path to the input file
mapa.peek()
sunpy.map.Map(mapa.data / np.max(mapa.data), mapa.meta).peek()
# I used np.nanmean() here, but with np.nanmax() you get similar output (just scaled differently)
sunpy.map.Map(mapa.data / np.nanmean(mapa.data), mapa.meta).peek()
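For reference, the root cause is that a single NaN poisons the plain NumPy reductions, so dividing by np.max() turns the whole array into NaNs, while the nan-aware versions simply ignore the NaN pixels. A minimal standalone illustration:

```python
import numpy as np

data = np.array([1.0, 2.0, np.nan, 4.0])

print(np.max(data))      # nan -- one NaN propagates through the plain reduction
print(np.nanmax(data))   # 4.0 -- the NaN-aware version ignores it
print(np.nanmean(data))  # mean over the finite values only
```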

[image: div_by_np]

Here is the enhance prediction with the normalization factor unchanged:

[image: div_by_npmax]

Here is the enhance prediction with the normalization factor changed to np.nanmean:

[image: nan_mean_as prediction]

Of course there are artefacts on the limb, and there are negative values where the NaNs are, but that is easily fixable.
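One easy fix for the off-limb values is to re-apply the NaN mask after the prediction. A sketch under the assumption that enhance doubles the resolution (so the mask has to be upsampled to the output grid); all names here are illustrative:

```python
import numpy as np

# `data` stands in for the input map's array, with NaNs off the limb.
data = np.array([[np.nan, 1000.0],
                 [950.0,  980.0]])
nan_mask = np.isnan(data)

# Placeholder for the enhance output: the network doubles each axis.
enhanced = np.ones((4, 4))

# Upsample the off-limb mask 2x with np.kron and blank those pixels again.
big_mask = np.kron(nan_mask, np.ones((2, 2))).astype(bool)
enhanced[big_mask] = np.nan
```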

Also, because I want to use enhance as a class in my own code, I've changed bits of the code and more or less packed it as a Python package, but that is a topic for another time. Perhaps I can write something usable and user-friendly and submit a patch later, once I find the time to actually test my code :D

Hi lzivadinovic,
Thanks again for your comments. A few lines before the normalization, I'm using nan_to_num to avoid this problem, so I'm not sure why you still have issues. You can check the line here: https://github.com/cdiazbas/enhance/blob/master/enhance.py#L38 . If you are using a modified version of the script, you can use other normalizations such as nanmax or nanmean (as you already did). The model will work well if the intensity input (after the normalization) is lower than about 1.2. Therefore, if you are close to the limb, I would recommend nanmax instead of nanmean. After getting the output, you can revert the normalization to recover the original units.
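In code form, the workflow described above might look like this (a sketch; `run_enhance` is a hypothetical stand-in for the actual network call, and the tiny array is just for illustration):

```python
import numpy as np

def run_enhance(img):
    # Hypothetical stand-in for the enhance network (identity here).
    return img

data = np.array([[np.nan, 1000.0],
                 [950.0,  980.0]])

# Clean NaNs first (as enhance.py does with nan_to_num), then normalise
# by nanmax so the input stays below ~1.2 even near the limb.
norm = np.nanmax(data)
normalised = np.nan_to_num(data) / norm

output = run_enhance(normalised)

# Revert the normalisation to recover the original intensity units.
restored = output * norm
```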
Best regards,
Carlos D

Hi,
Well, something strange is happening with my version. I will check it out.

Anyhow, I just wanted to confirm the normalization behaviour with you. Thanks for the clarification!

I'm closing the issue.

Cheers,
Lazar