jscodec / jsvpx

Full JavaScript implementation of the libvpx VP8 decoder.

Question about embedding your decoder

publicocean0 opened this issue · comments

Hi, I'd like to embed your decoder in my player, but I have some questions.

Hypothesis:

- I have a WebM stream ... using chunks.
- The demuxer is seekable ... it permits seeking to a specific position and sending out the right frames.

Is the decoder stateless? When I seek, is there no state to reset in the decoder? Nothing related to keyframes, for example?

commented

The decoder does keep state. You can only seek to a keyframe, but that is how all decoders work.

OK, practically speaking, after a seek I have to call the decode API with compressed frames starting from a keyframe. It seems simple.
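A rough sketch of that seek-then-decode flow (the demuxer.seekToKeyframe(), demuxer.nextFrame() and decoder.decodeFrame() names below are hypothetical placeholders, not the actual jsvpx or demuxer API):

```js
// Hypothetical sketch of seeking, then decoding forward from the keyframe.
// All method names below are placeholders, not the real jsvpx/demuxer API.
function seekTo(targetTimeMs, demuxer, decoder) {
  // 1. Ask the demuxer for the nearest keyframe at or before the target.
  demuxer.seekToKeyframe(targetTimeMs);

  // 2. Feed compressed frames to the decoder starting from that keyframe;
  //    decoded output can be discarded while catching up to the target.
  let frame = demuxer.nextFrame();
  while (frame && frame.pts < targetTimeMs) {
    decoder.decodeFrame(frame.data);
    frame = demuxer.nextFrame();
  }
  return frame; // first frame at/after the target: decode and display this one
}
```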

commented

Yes, this is the case with VP8, you have to seek to the nearest keyframe.

Then for the player part ... I think I just need to refresh the canvas, which is simple, but I have two doubts:

  • In WebM I can't find a sample rate for the video. What is the right frequency for refreshing the canvas?
  • Based on your experience, is setTimeout every X ms the fastest solution?

commented

When it's multiplexed, timestamps should be put on each frame. I don't think a fixed fps will work perfectly. Do you have audio too?

commented

I'm not an expert on A/V sync but perhaps @Brion knows

Indeed, there's no single fps value provided in the WebM structure; each frame's packet lists a presentation timestamp relative to the start time.

Very roughly, a playback loop (such as the one implemented in my ogv.js player) needs to do the following; a rough JS sketch follows the list:

  • demux next frame
  • get the current playback time, something like:
    • playbackTime = audioContext.currentTime if you have audio, or just:
    • playbackTime = Date.now() - startTime, or if you're really fancy and want to handle pauses better:
    • playbackTime = (Date.now() - timeWeLastPlayedAFrame) + ptsOfLastFrame
  • delay until it's time
    • setTimeout(displayFrame, ptsOfCurrentFrame - playbackTime)
  • draw!
  • go back to start
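
Putting those steps together, a minimal sketch in JavaScript (assuming hypothetical demuxer.nextFrame(), decoder.decodeFrame() and drawToCanvas() helpers, which are not the actual jsvpx or ogv.js API):

```js
// Sketch of the playback loop above. demuxer.nextFrame(), decoder.decodeFrame()
// and drawToCanvas() are placeholder names for your demuxer/decoder/renderer.
const startTime = Date.now();

function playbackTimeMs() {
  // With audio, audioContext.currentTime * 1000 is the better clock.
  return Date.now() - startTime;
}

function pump(demuxer, decoder) {
  const frame = demuxer.nextFrame();            // 1. demux next frame
  if (!frame) return;                           // end of stream

  const delay = frame.pts - playbackTimeMs();   // 2.-3. wait until it's due
  setTimeout(() => {
    const pixels = decoder.decodeFrame(frame.data);
    drawToCanvas(pixels);                       // 4. draw!
    pump(demuxer, decoder);                     // 5. go back to start
  }, Math.max(0, delay));
}
```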

Thanks a lot @Brion, I got the idea. I'm thinking now: in the WebM blocks there is a BlockDuration, so BlockDuration / frame count = time per frame. Answering @brianxautumn: no, I still have to see exactly how to do it. I just saw you built an audio stream feeder for doing that.
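
As a tiny worked example of that arithmetic (the numbers below are made up):

```js
// Worked example of BlockDuration / frame count = time per frame (invented values).
const blockDurationMs = 400; // BlockDuration read from the WebM block
const frameCount = 10;       // number of frames in that block
const msPerFrame = blockDurationMs / frameCount; // 40 ms per frame, i.e. 25 fps
```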

The better solution would be to implement an MSE object for browsers that don't support it.
There is also another little problem to think about.
Usually there is a parser/demuxer in the DASH component as well.
The MSE shim might also need a simplified version of a demuxer + decoder.
So there is a redundancy in demuxing twice. I'm asking whether it is better to think of a trick for avoiding it (for example creating a virtual codec 'video/webm codecs="js"' and essentially passing frames directly after the initialization segment), while still ending up with a portable solution.
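
A very rough sketch of what such a shim's surface could look like (the MediaSourceShim class and its methods are purely illustrative names, not an existing API or part of jsvpx):

```js
// Illustrative sketch of a minimal MSE-like shim fed with already-demuxed data,
// as with the "virtual codec" idea above. All names here are hypothetical.
class MediaSourceShim {
  constructor(decoder, drawFrame) {
    this.decoder = decoder;     // e.g. a jsvpx VP8 decoder instance (placeholder)
    this.drawFrame = drawFrame; // callback that paints decoded pixels to a canvas
    this.initialized = false;
  }

  // The initialization segment is appended once, then only raw frames follow,
  // so the player avoids demuxing the stream a second time.
  appendInitSegment(initSegment) {
    // parse codec/track configuration here
    this.initialized = true;
  }

  appendFrame(compressedFrame, pts) {
    if (!this.initialized) throw new Error('init segment required first');
    const pixels = this.decoder.decodeFrame(compressedFrame); // placeholder call
    this.drawFrame(pixels, pts);
  }
}
```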