schellingb / UnityCapture

Streams Unity rendered output to other Windows applications as virtual capture device

Please consider setting the average frame rate to 60 in the video capabilities

euyuil opened this issue · comments

Currently, for each video capability the average frame rate is 30 and the max frame rate is 120, but many old webcam programs only support 30 FPS.

For example, AmCap 9.23 plays the video at 60 FPS, but an older version such as AmCap 9.21 plays it at 30 FPS.

Please consider setting the average frame rate to 60 if possible.

The filter currently reports a MinFrameInterval corresponding to 120 FPS and a MaxFrameInterval corresponding to 30 FPS. Do you think that max value is what you see as 'average'?
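(For readers not familiar with DirectShow: both values are frame durations in 100-nanosecond units, so "120 FPS" and "30 FPS" are the shortest and longest frame interval the filter advertises. Below is a minimal sketch of that mapping, assuming the standard VIDEO_STREAM_CONFIG_CAPS structure; it is illustrative, not the actual UnityCapture source.)

// Sketch only: how 120 FPS / 30 FPS map onto DirectShow frame intervals.
#include <dshow.h>

static void FillFrameIntervals(VIDEO_STREAM_CONFIG_CAPS* caps)
{
    const LONGLONG UNITS_PER_SECOND = 10000000;        // 100 ns units in one second
    caps->MinFrameInterval = UNITS_PER_SECOND / 120;   // shortest frame -> up to 120 FPS
    caps->MaxFrameInterval = UNITS_PER_SECOND / 30;    // longest frame  -> down to 30 FPS
}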

I guess it would be nice if the frame rate in Unity could be used to decide the filter capabilities, but the filter can be started before Unity is running, which is why these min and max values are fixed.

In the end the actual frame rate depends on how quickly Unity sends new frames and how quickly the receiving application reads them. Unity can send frame updates faster than they get read. On the other side, frames can't be read faster than they are sent; the filter waits until a new frame arrives.

I think what I called 'average' probably corresponds to the 'max frame interval' in this project. Here is my code as a user of the virtual camera:

using System;
using System.Linq;
using AForge.Video.DirectShow;

namespace ConsoleApp1016
{
    class Program
    {
        static void Main(string[] args)
        {
            // Find the UnityCapture virtual camera among the DirectShow video input devices
            var videoDevices = new FilterInfoCollection(FilterCategory.VideoInputDevice);
            var monikerString = videoDevices.OfType<FilterInfo>()
                .Single(e => e.Name == "Unity Video Capture").MonikerString;

            // List every capability (frame size plus average/maximum frame rate) the filter advertises
            var videoDevice = new VideoCaptureDevice(monikerString);
            var videoCapabilities = videoDevice.VideoCapabilities;
            foreach (var videoCapability in videoCapabilities)
            {
                Console.WriteLine($"Size={videoCapability.FrameSize.Width}x{videoCapability.FrameSize.Height}\t" +
                                  $"AvgFps={videoCapability.AverageFrameRate}\t" +
                                  $"MaxFps={videoCapability.MaximumFrameRate}");
            }
        }
    }
}

The output is:

Size=1920x1080  AvgFps=30       MaxFps=120
Size=1280x720   AvgFps=30       MaxFps=120
Size=960x540    AvgFps=30       MaxFps=120
Size=640x360    AvgFps=30       MaxFps=120
Size=480x270    AvgFps=30       MaxFps=120
Size=256x144    AvgFps=30       MaxFps=120
Size=2560x1440  AvgFps=30       MaxFps=120
Size=3840x2160  AvgFps=30       MaxFps=120
Size=1440x1080  AvgFps=30       MaxFps=120
Size=960x720    AvgFps=30       MaxFps=120
Size=640x480    AvgFps=30       MaxFps=120
Size=480x360    AvgFps=30       MaxFps=120
Size=320x240    AvgFps=30       MaxFps=120
Size=192x144    AvgFps=30       MaxFps=120
Size=1920x1440  AvgFps=30       MaxFps=120
Size=2880x2160  AvgFps=30       MaxFps=120
Size=1920x1200  AvgFps=30       MaxFps=120
Size=1280x800   AvgFps=30       MaxFps=120
Size=2880x1800  AvgFps=30       MaxFps=120
Size=2560x1600  AvgFps=30       MaxFps=120
Size=1680x1050  AvgFps=30       MaxFps=120
Size=1440x900   AvgFps=30       MaxFps=120

It looks like the 'average' value (which comes from the AForge library) is the same as the 'max frame interval' value.

As I observed, many (obsolete) programs (the ones that read the frames) will treat the camera as 30 FPS even if Unity is sending at a rate of 60 FPS. I changed the code myself and found it useful for these old programs when I am actually using a 60 FPS rate (commit).

The commit above is just for demonstration purposes; I don't think hard-coding 60 is a good solution, it just covers my immediate need. I think a better solution is to add more 'video capabilities' with different 'max frame interval' values, to support some of these old camera-reading programs.

I just did some research. In my previous commit I also changed m_avgTimePerFrame. This value is assigned to pvi->AvgTimePerFrame in GetMediaType. Maybe this is the reason my change worked? A document from MSDN gives some explanation of AvgTimePerFrame: "This value can be used to calculate the authored frame rate, which is the intended frame rate for the video to be rendered."

And in this document from MSDN: "For capture filters, the MinFrameInterval and MaxFrameInterval members define the minimum and maximum duration of each frame, as given in the AvgTimePerFrame member of the VIDEOINFOHEADER or VIDEOINFOHEADER2 structure."
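(Both quoted values are durations in the same 100-nanosecond units, so converting between them and FPS is a single division. A small sketch of that conversion follows; the helper names are mine, not from the filter source.)

// 10,000,000 units of 100 ns make one second, so duration and frame rate are reciprocal.
#include <dshow.h>

static int FpsFromAvgTimePerFrame(REFERENCE_TIME avgTimePerFrame)
{
    return (int)(10000000 / avgTimePerFrame);   // 333333 -> 30 FPS, 166666 -> 60 FPS
}

static REFERENCE_TIME AvgTimePerFrameFromFps(int fps)
{
    return 10000000 / fps;                      // the kind of value GetMediaType writes
                                                // into pvi->AvgTimePerFrame
}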

Thanks for the research and all the details!

When writing my last comment I forgot about that m_avgTimePerFrame, which is actually reported to the application but later gets overwritten by what the application actually requests.
Some applications (like OBS, which is what I mainly test with) seem to basically ignore that initial pvi->AvgTimePerFrame and just select the frame rate based on the min/max frame interval. It makes sense for applications without an option to select the FPS to just use whatever was initially reported as the 'average'.
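(For illustration, this is roughly how a reading application ends up replacing that initial value: it fetches the current format, changes AvgTimePerFrame, and hands it back via SetFormat. A generic DirectShow sketch under those assumptions, not code from OBS or any other application mentioned here.)

// Generic sketch: a capture application requesting its own frame rate through
// IAMStreamConfig, which is what ends up overriding the filter's reported average.
#include <dshow.h>

static void FreeMediaTypeManually(AM_MEDIA_TYPE* mt)         // stand-in for the base
{                                                             // classes' DeleteMediaType
    if (mt == nullptr) return;
    if (mt->cbFormat != 0) CoTaskMemFree(mt->pbFormat);
    if (mt->pUnk != nullptr) mt->pUnk->Release();
    CoTaskMemFree(mt);
}

HRESULT RequestSixtyFps(IAMStreamConfig* streamConfig)
{
    AM_MEDIA_TYPE* mediaType = nullptr;
    HRESULT hr = streamConfig->GetFormat(&mediaType);         // currently selected format
    if (FAILED(hr)) return hr;

    if (mediaType->formattype == FORMAT_VideoInfo && mediaType->pbFormat != nullptr)
    {
        VIDEOINFOHEADER* vih = (VIDEOINFOHEADER*)mediaType->pbFormat;
        vih->AvgTimePerFrame = 10000000 / 60;                  // ask for ~16.7 ms frames
        hr = streamConfig->SetFormat(mediaType);               // the filter adopts this
    }
    FreeMediaTypeManually(mediaType);
    return hr;
}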

Is the change on line 107 to m_avgTimePerFrame = 10000000 / 60; enough for you, or do you need MaxFrameInterval to be 60 as well?

I think it's good to change the initial value from 30 to 60 FPS, since that is the default for most Unity projects as well.
Making this configurable would be quite tricky, but not impossible I guess.

I don't think we should make it configurable. It's probably possible to tell the operating system that the camera has more 'video capability' options; for example, the camera could report that it can output 1920x1080@30Hz, 1920x1080@60Hz, and so on, and the user can choose the one he likes.
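(At the DirectShow level that idea would mean enumerating one stream capability per resolution/frame-rate pair instead of one wide interval range per resolution. The following is only a sketch of that pattern; the capability table and function are assumed for illustration and are not the actual UnityCapture filter code.)

// Illustrative sketch: exposing separate 30 FPS and 60 FPS entries per resolution
// so applications that only pick from fixed entries can still select 60 FPS.
#include <dshow.h>

struct CapEntry { int width; int height; int fps; };

static const CapEntry kCaps[] = {       // assumed example table, not the full list
    { 1920, 1080, 30 },
    { 1920, 1080, 60 },
    { 1280,  720, 30 },
    { 1280,  720, 60 },
};

static void FillCapability(int index, VIDEO_STREAM_CONFIG_CAPS* caps, VIDEOINFOHEADER* vih)
{
    const CapEntry& e = kCaps[index];
    const LONGLONG UNITS_PER_SECOND = 10000000;          // 100 ns units in one second

    caps->MinFrameInterval = UNITS_PER_SECOND / e.fps;   // each entry offers exactly
    caps->MaxFrameInterval = UNITS_PER_SECOND / e.fps;   // one frame rate
    caps->InputSize.cx = caps->MaxOutputSize.cx = e.width;
    caps->InputSize.cy = caps->MaxOutputSize.cy = e.height;

    vih->AvgTimePerFrame    = UNITS_PER_SECOND / e.fps;  // 'authored' rate of this entry
    vih->bmiHeader.biWidth  = e.width;
    vih->bmiHeader.biHeight = e.height;
}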