schoolpost / PiDNG

Create Adobe DNG RAW files using Python. Works with any Bayer RAW Data including native support for Raspberry Pi cameras.

PyDNG doesn't extract/convert multiple images

nocantsin opened this issue

When converting raw YUV files shot with the Raspberry Pi HQ camera to DNG, PyDNG only detects and converts a single frame, not the multiple images/shots stored in the same file.

Here are some sample files:
https://www.dropbox.com/sh/seddezwwymwnh4v/AACo9aKiX_bPlyAC1NDt_sVaa/rawvid.YUV?dl=0
https://www.dropbox.com/sh/seddezwwymwnh4v/AADjmpDa56KGWZeiF15QHC05a/200524153912.yuv?dl=0

Would be great if future versions of PyDNG could support the decoding of raw image sequences!

I took a look at those files; I'm not quite sure how they would be processed. There is no metadata specifying the frame size or number of frames, and no notable header or EOF marker.
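
If the frame geometry were supplied out of band (e.g. by whoever recorded the file), the file could in principle be chopped into equal-sized chunks, one per frame. Here is a minimal sketch of that idea; the width, height and bytes-per-pixel values are assumptions for illustration, not anything the files themselves declare, and the per-frame DNG conversion step (which depends on PiDNG's raw-conversion API and correct sensor metadata) is not shown:

```python
import os

# Hypothetical values -- the real geometry must come from whoever recorded the file.
WIDTH, HEIGHT = 4056, 3040      # assumed full-resolution HQ camera (IMX477) frame
BYTES_PER_PIXEL = 1.5           # assumed 12-bit packed samples

def split_frames(path, frame_bytes, out_dir="frames"):
    """Split a headerless multi-frame capture into one file per frame."""
    os.makedirs(out_dir, exist_ok=True)
    total = os.path.getsize(path)
    if total % frame_bytes:
        print("Warning: file size is not a whole multiple of the assumed frame size.")
    index = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(frame_bytes)
            if len(chunk) < frame_bytes:
                break
            with open(os.path.join(out_dir, f"frame_{index:05d}.raw"), "wb") as out:
                out.write(chunk)
            index += 1
    return index

frame_bytes = int(WIDTH * HEIGHT * BYTES_PER_PIXEL)
count = split_frames("200524153912.yuv", frame_bytes)
print(f"Extracted {count} frame(s); each could then be handed to PiDNG separately.")
```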

What program did you use to generate those?

Are you sure that is RAW Bayer data? The .yuv extension suggests the data has already been demosaiced and transformed into YUV space. That is not what a DNG wrapper would be used for.

The files weren't created by me, but by somebody else on a video user forum with whom I am in touch. He recorded the image sequences with the tools documented here: https://www.raspberrypi.org/documentation/raspbian/applications/camera.md , specifically raspividyuv in both YUV and RGB modes. The full set of his output files is here, with the last files (the ones with .yuv and .rgb extensions) being the relevant ones: https://www.dropbox.com/sh/seddezwwymwnh4v/AAAU5es-RCWOXCj1Fpkbz3Dba?dl=0
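
One rough way to sanity-check what such a file contains is to compare its size against the per-frame size each candidate format would imply. A small sketch, assuming a 1920x1080 recording and ignoring any row/column padding raspividyuv may add (the resolution here is a guess that must match the actual capture settings):

```python
import os

# Assumed recording resolution -- nothing in the file declares it.
WIDTH, HEIGHT = 1920, 1080

FORMATS = {
    "YUV420 (raspividyuv default)": WIDTH * HEIGHT * 3 // 2,  # Y plane + quarter-size U and V
    "RGB888 (raspividyuv -rgb)":    WIDTH * HEIGHT * 3,       # 3 bytes per pixel
    "12-bit packed Bayer":          WIDTH * HEIGHT * 3 // 2,  # 1.5 bytes per pixel
}

def guess_format(path):
    size = os.path.getsize(path)
    for name, frame_bytes in FORMATS.items():
        frames, leftover = divmod(size, frame_bytes)
        print(f"{name}: {frames} frame(s), {leftover} leftover byte(s)")

guess_format("rawvid.YUV")
```

Note that YUV420 and 12-bit packed Bayer both work out to 1.5 bytes per pixel at a given resolution, so a size check like this can only rule formats out, not confirm one.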

So we have similar end goals, I see 😎 ... I saw your post on the EOSHD forums.

The tool used to generate those files isn't outputting RAW; the data is already demosaiced at that point (in either YUV or RGB space).

I haven't revisited it yet, but the tool needed for RAW video is something like this one:
https://github.com/6by9/raspiraw

I don't think it's officially been updated to work with the new HQ Camera.

Likewise, I don't think a full sensor readout at 24-30 fps is possible; from what was said it's more like 10 fps. However, in the binned mode it can reportedly do 2K at 50 fps, so that might be a bit more feasible.
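
For a feel of why the frame rates are limited, here is the back-of-the-envelope data rate, assuming the HQ camera's IMX477 sensor (4056x3040) with 12-bit packed samples and a 2x2 binned "2K" mode of 2028x1520; treat the numbers as rough, since they ignore line padding, metadata and CSI/ISP overheads:

```python
def raw_rate(width, height, bits_per_sample, fps):
    """Rough uncompressed Bayer data rate in MB/s (ignores padding and overheads)."""
    frame_bytes = width * height * bits_per_sample / 8
    return frame_bytes * fps / 1e6

# Full-resolution readout (roughly 18.5 MB per frame)
for fps in (10, 24, 30):
    print(f"4056x3040 @ {fps:2d} fps: {raw_rate(4056, 3040, 12, fps):6.0f} MB/s")

# Assumed 2x2 binned mode (roughly 4.6 MB per frame)
print(f"2028x1520 @ 50 fps: {raw_rate(2028, 1520, 12, 50):6.0f} MB/s")
```

Even the binned-mode figure (around 230 MB/s) is far more than an SD card can sustain, which is presumably part of why the raspiraw examples buffer to RAM.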

It's worth doing some more investigation. Try asking on the Raspberry Pi forums to find out the current state of the raspiraw tool, whether it can be used with the HQ camera yet, and what the limitations are.

Thanks so much for looking into this!