senguptaumd / Background-Matting

Background Matting: The World is Your Green Screen

Home Page: https://grail.cs.washington.edu/projects/background-matting/

Why does the hand-held sample video result not look good?

daodaoawaker opened this issue · comments

First of all, great work!!
I want to reproduce the results on the sample videos by following the procedure in the repo, but I cannot get the same effect as yours. I'm confused... The problem seems to be that the _back.png produced by running test_pre_process_video.py looks incorrect, as in the following pictures from sample_video/input.

This is one of the frames extracted from teaser.mov:
[image: 0001_img]

This is the corresponding background generated by running test_pre_process_video.py:
[image: 0001_back]

Obviously something is wrong. As the README.md says, "If there are significant exposure changes between the captured image and the captured background, use bias-gain adjustment to account for that." Should I turn on the bias-gain adjustment part of test_pre_process_video.py?
[screenshot: CAPTURE_202065_160347]

Is that correct? Thanks very much!!
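
For anyone else reading: as I understand it, bias-gain adjustment just fits a per-channel gain and bias so that the captured background matches the exposure of the captured frame. Below is a minimal sketch I wrote for illustration only; the function name `adjust_bias_gain` and the use of a rough foreground mask are my own assumptions, not the repo's code.

```python
import numpy as np

def adjust_bias_gain(back, img, fg_mask):
    """Fit a per-channel gain/bias so the captured background matches the
    exposure of the captured image, using only pixels outside a rough
    foreground mask. Illustrative sketch, not the repo's implementation."""
    back = back.astype(np.float64)
    img = img.astype(np.float64)
    bg_pix = ~fg_mask  # boolean mask of pixels believed to be background
    adjusted = np.empty_like(back)
    for c in range(3):
        b = back[..., c][bg_pix]
        i = img[..., c][bg_pix]
        # Least-squares fit of i ≈ gain * b + bias on background pixels
        A = np.stack([b, np.ones_like(b)], axis=1)
        gain, bias = np.linalg.lstsq(A, i, rcond=None)[0]
        adjusted[..., c] = gain * back[..., c] + bias
    return np.clip(adjusted, 0, 255).astype(np.uint8)
```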

There is alternate Matlab code that seems to be more robust to misalignment and exposure changes; you can try that if you have access to Matlab. I think the Python code for bias-gain adjustment has a bug, which is why I turned it off. The Python code was developed later, just before publishing the repo; internally I was using the Matlab code for alignment and exposure adjustment.
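
For reference, the "alignment" mentioned here typically means warping the captured background onto each frame with a feature-based homography. A rough sketch using OpenCV with ORB features and RANSAC follows; this is only an illustration under those assumptions, not the Matlab or Python code shipped in this repo.

```python
import cv2
import numpy as np

def align_background(back, frame):
    """Warp the captured background onto the frame via an ORB + RANSAC
    homography. Illustration only; the repo's own tools may differ."""
    g1 = cv2.cvtColor(back, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(5000)
    kp1, des1 = orb.detectAndCompute(g1, None)
    kp2, des2 = orb.detectAndCompute(g2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    src = np.float32([kp1[m.queryIdx].pt for m in matches[:500]]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches[:500]]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = frame.shape[:2]
    return cv2.warpPerspective(back, H, (w, h))
```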