panda-official / TimeSwipe

PANDA Timeswipe driver and firmware

Integration of visualization optimization

i5abell opened this issue · comments

Branch visualization_opt has to be merged into master.
Things that have been implemented:

  • a defined minimum range for resetting the range borders (2*min_dist) - see the sketch after this list
  • sensor recognition when the start range (min_wind) is surpassed
  • visualization only on channels for which a sensor is recognized
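A rough reading of the first point as code (a sketch with illustrative names, not the actual firmware identifiers):

```cpp
// When the range borders are reset, enforce a minimum window of
// 2*min_dist around the current value. All names are illustrative.
void ResetRange(float val, float& low, float& high, float min_dist)
{
    low  = val - min_dist;   // lower border
    high = val + min_dist;   // upper border: window width is 2*min_dist
}
```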

@begodev : Please check. I changed the function m_pLED->get_zerob_ind() in nodeLED.h from protected to public, to activate the visualization on individual channels. Please check if that's ok, or if there's a better way to do it.

@begodev @thomaseichhorn we need to add 2 things:

  1. please try setting an offset of half brightness on the LEDs once they have passed the initial limit - so we can see the direction of change (+/-) and not only the level
  2. when a sensor is disconnected, the LED jumps to the highest level - how do we get back to 0?

> @begodev : Please check. I changed the function m_pLED->get_zerob_ind() in nodeLED.h from protected to public, to activate the visualization on individual channels. Please check if that's ok, or if there's a better way to do it.

The key idea is that a CDataVis class object represents a single visualization channel, so there already exist 4 such objects, each with its own set of variables. That's why, in my opinion, senscon_chan[(m_pLED->get_zerob_ind())] can be replaced with a simple member variable like bool m_senscon_chan (or any other name).
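Something like this (a sketch; SurpassedStartRange and Visualize are hypothetical stand-ins for the actual detection and LED code):

```cpp
class CDataVis {
    bool m_senscon_chan = false;  // per-channel flag instead of a shared array

    bool SurpassedStartRange(float val) const;  // hypothetical detection helper
    void Visualize(float val);                  // hypothetical LED output

    void Update(float val)
    {
        // Each CDataVis object serves one channel, so the flag can be a
        // plain member; no indexing via m_pLED->get_zerob_ind() is needed.
        if (!m_senscon_chan && SurpassedStartRange(val))
            m_senscon_chan = true;
        if (m_senscon_chan)
            Visualize(val);
    }
};
```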

> @begodev @thomaseichhorn we need to add 2 things:
>
>   1. please try setting an offset of half brightness on the LEDs once they have passed the initial limit - so we can see the direction of change (+/-) and not only the level

Maybe the middle brightness should be bound to the middle of the dynamic window. But maybe it is worth visualizing the deviation from the middle point instead? With brightness increasing to both sides? And using different colors, for example mixing green for positive deviation and red for negative (see the sketch below)?
But I think the real question is what exactly is required to visualize: the level is one thing, the deviation from the middle point is another...
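The colour-mixing idea in code (purely illustrative; SetColor and the 0..255 scale are assumptions, not the actual nodeLED API):

```cpp
#include <cmath>
#include <cstdint>

void SetColor(uint8_t r, uint8_t g, uint8_t b);  // hypothetical LED call

// dev is the normalized deviation from the window middle, in [-1, 1].
void ShowDeviation(float dev)
{
    const uint8_t level = static_cast<uint8_t>(255.0f * std::fabs(dev));
    if (dev >= 0.0f)
        SetColor(0, level, 0);   // green grows with positive deviation
    else
        SetColor(level, 0, 0);   // red grows with negative deviation
}
```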

> 2. when a sensor is disconnected, the LED jumps to the highest level - how do we get back to 0?

A possible condition: the signal level drops out of the dynamic window by a distance several times greater than the window width -> reset the window. Roughly like the sketch below.
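A sketch of that condition (kDropFactor is an illustrative tuning constant):

```cpp
constexpr float kDropFactor = 3.0f;  // "several times greater", to be tuned

// True when the signal leaves the dynamic window by several window widths;
// the caller would then reset the window around the current value.
bool OutOfWindow(float val, float low, float high)
{
    const float width = high - low;
    return val < low - kDropFactor * width ||
           val > high + kDropFactor * width;
}
```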

  1. Intensity for the middle point is moved from 0.1 to 0.4. Maybe that was the problem. Now it should be near half of the brightness.

  2. Sensor drop-out detection is added.

I tested branch visualization_opt, and whilst the sensor connected/not-connected 'detection' looks good, the brightness now doesn't really visibly change with increasing/decreasing signal...

edit: sorry that was a cabling issue :)

As it turns out, the lighting does work as it should. However, for maximal signals, the brightness "increases" over the max value, so that the light colour changes from blue to green. Is this intended?

I set the middle window point to 0.4 of brightness. It corresponds to B=2.25.
Maybe the middle brightness should be even lower, something like 0.25 - B=10 gives this value (the value recommended in the article). But then there are not so many degrees of brightness left to display from 0.
Also, I have now put in a minimum possible brightness value to prevent flickering.
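Reconstructing the curve from these numbers, it looks like the usual exponential brightness law; this formula is my assumption, not necessarily the article's exact form:

```cpp
#include <cmath>

// Assumed law: x in [0,1] is the position inside the window, B is the
// curve base; the result is the normalized intensity.
float Brightness(float x, float B)
{
    return (std::pow(B, x) - 1.0f) / (B - 1.0f);
}
// Brightness(0.5f, 2.25f) == 0.40  -> middle point at 0.4 of brightness
// Brightness(0.5f, 10.0f) ~= 0.24  -> close to the 0.25 mentioned above
```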

Another problem: when the measured value is constantly changing in one direction, it will always show a constant brightness, because it always sits at the minimum or maximum border of the range window and pulls the window along with it. The value must oscillate at least once before the window measurement starts working, and it is not clear how to avoid this...
Maybe inflate the window initially by FullMeasurementRange*InflationFactor as soon as the sensor is detected? Then there will be some space between the current value and the window border (see the sketch below).
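A sketch of the inflation idea (both constants are illustrative):

```cpp
constexpr float kFullMeasurementRange = 20.0f;  // e.g. the +-10 V input span
constexpr float kInflationFactor = 0.05f;       // to be tuned

// Widen the window around the current value on sensor detection, so a
// monotonically drifting signal does not stay pinned to a border.
void InflateWindow(float val, float& low, float& high)
{
    const float pad = kFullMeasurementRange * kInflationFactor;
    low  = val - pad;   // space below the current value
    high = val + pad;   // space above it
}
```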

> However, for maximal signals, the brightness "increases" over the max value, so that the light colour changes from blue to green. Is this intended?

Initially the basic color is multiplied by a normalized intensity in [0,1].
The low level is limited; the high level shouldn't be more than 1, but I'll look into it.
I changed the order of calculations a bit. Maybe it should also be limited to 0.9, for example (a sketch below)...
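Roughly this (the bounds are illustrative):

```cpp
#include <algorithm>

constexpr float kMinBright = 0.02f;  // floor to prevent flickering
constexpr float kMaxBright = 0.9f;   // ceiling to avoid the colour shift

// Clamp the normalized intensity before it scales the basic color.
float LimitIntensity(float x)
{
    return std::clamp(x, kMinBright, kMaxBright);
}
```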

What do you think about replacing the currently used range window with a moving average?
It would be good for periodic signals: it would show the current signal relative to the average.
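One cheap way to do it on the microcontroller would be an exponential moving average (a sketch; alpha is an illustrative smoothing factor):

```cpp
struct MovingAverage {
    float avg = 0.0f;
    static constexpr float alpha = 0.01f;  // smoothing factor, to be tuned

    float Update(float val)
    {
        avg += alpha * (val - avg);  // EMA step, no sample buffer needed
        return avg;
    }
};
// Brightness would then follow (val - avg): the signal relative to its
// average, which makes changes in both directions visible.
```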


It will be possible to see changes in both directions, up and down.

Of course, for constant signals it will eventually show the middle brightness, but then it is possible to use the ratio of the signal to the absolute range (+-10V). Then one could choose the mode manually (constant signal/periodic signal), or maybe even combine both automatically:
reserve some degrees of brightness for the absolute range plus some degrees for oscillations, and mix them (a sketch follows)?
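Something like this, perhaps (the 50/50 split and the 10 V normalization are illustrative):

```cpp
#include <algorithm>
#include <cmath>

// Mix the absolute level with the oscillation around the moving average.
float CombinedBrightness(float val, float avg)
{
    const float absPart = std::fabs(val) / 10.0f;        // ratio to the +-10 V range
    const float oscPart = std::fabs(val - avg) / 10.0f;  // deviation from the average
    return std::clamp(0.5f * absPart + 0.5f * oscPart, 0.0f, 1.0f);
}
```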

To detect a sensor drop: check whether there was a sudden drop away from the average region. If such a drop was detected and the moving average then goes to zero, the sensor is out. It is not an ultimate solution, but for sensors that produce a signal in some narrow range above/below zero it should work (sketch below).
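A sketch of that two-step test (both thresholds are illustrative):

```cpp
#include <cmath>

struct DropDetector {
    static constexpr float kDropThreshold = 5.0f;  // "sudden drop", in volts
    static constexpr float kZeroBand = 0.05f;      // "average is at zero" band
    bool dropSeen = false;

    bool Update(float val, float avg)
    {
        if (std::fabs(val - avg) > kDropThreshold)
            dropSeen = true;                            // step 1: sudden drop seen
        return dropSeen && std::fabs(avg) < kZeroBand;  // step 2: average at zero
    }
};
```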

So far, those are all my thoughts on this. What do you think?