chrisguttandin / web-codecs

A (not yet) extendable and (not yet) complete drop-in replacement for the native WebCodecs API.

How are you generating the timestamp for AudioData?

guest271314 opened this issue

Hi @guest271314,

I'm not sure if I understand the question. Are you asking how a timestamp is generated when it's not provided?

As far as I know, the timestamp is a mandatory parameter when creating an AudioData object. It should not be necessary to generate it.
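For reference, a minimal sketch of constructing an AudioData with an explicit timestamp (in microseconds); the frame count, channel count, sample rate and format here are placeholder assumptions, not values this library picks for you:

```ts
// Placeholder values: 480 frames of interleaved f32 stereo at 48 kHz.
const numberOfFrames = 480;
const numberOfChannels = 2;
const samples = new Float32Array(numberOfFrames * numberOfChannels);

const audioData = new AudioData({
    data: samples,
    format: 'f32',
    numberOfChannels,
    numberOfFrames,
    sampleRate: 48000,
    timestamp: 0 // microseconds; the caller has to provide this value
});
```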

If you are using MediaStreamTrackGenerator to create an audio MediaStreamTrack to encode a recording, in real time or non-real time, the timestamp is important.

For example, I am streaming raw, interleaved PCM from parec (the audio output to headphones or speakers) to the browser, where I parse the PCM and create an AudioData object that I encode to MP3, or to Opus in a WebM container. Because a MediaStreamTrack of kind audio does not produce silence per the specification on Chromium or Chrome, I create a silent Web Audio API OscillatorNode, connect that node to a MediaStreamAudioDestinationNode, and then read the resulting stream with MediaStreamTrackProcessor, just to get the timestamp for the AudioData that the deinterleaved PCM is passed to.
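A rough sketch of that workaround, assuming Chromium (MediaStreamTrackProcessor is not implemented everywhere) and assuming the oscillator is muted with a GainNode, which is one way to produce the silent track described above:

```ts
const audioContext = new AudioContext();
const oscillatorNode = new OscillatorNode(audioContext);
const gainNode = new GainNode(audioContext, { gain: 0 }); // silence
const destinationNode = new MediaStreamAudioDestinationNode(audioContext);

oscillatorNode.connect(gainNode).connect(destinationNode);
oscillatorNode.start();

const [track] = destinationNode.stream.getAudioTracks();
const reader = new MediaStreamTrackProcessor({ track }).readable.getReader();

(async () => {
    while (true) {
        const { done, value: silentAudioData } = await reader.read();

        if (done) {
            break;
        }

        // Borrow the timestamp of the silent frame for the real AudioData.
        const timestamp = silentAudioData.timestamp; // microseconds

        silentAudioData.close();

        // ... construct the AudioData from the deinterleaved PCM here,
        // passing the borrowed timestamp, and hand it to the encoder.
    }
})();
```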

That is why I am asking how you are generating a timestamp. That is, how are you going to test your implementation without passing a meaningful timestamp to AudioData?

I just looked at your AudioData test. It doesn't look like you attempted to create a timestamp that would actually be used, or test the expected audio output of consecutive deserialized AudioData with the timestamp you pass.

Are you leaving it up to the user to generate the timestamp, without explaining how to generate a valid timestamp, just as the WebCodecs specification does?

Yes, exactly. I don't think there is a way for the AudioData class to generate a meaningful timestamp on its own. Please let me know if I'm wrong.
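For non-real-time encoding, one way a caller could derive a monotonically increasing timestamp is to accumulate the number of frames already emitted and convert that to microseconds. A minimal sketch, assuming a fixed sample rate and gapless, in-order chunks (this is not something the library does for you):

```ts
const sampleRate = 48000;

let totalFrames = 0;

// Returns the timestamp (in microseconds) for a chunk with the given number
// of frames and advances the running frame count.
const nextTimestamp = (numberOfFrames: number): number => {
    const timestamp = Math.round((totalFrames / sampleRate) * 1_000_000);

    totalFrames += numberOfFrames;

    return timestamp;
};
```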

By the way, many of the tests are taken from the Web Platform Tests. That's also the reason why the value 1234 is being used in the test.

> Yes, exactly. I don't think there is a way for the AudioData class to generate a meaningful timestamp on its own. Please let me know if I'm wrong.

Right. That is a glaring omission in the WebCodecs specification. In general, see https://bugs.chromium.org/p/chromium/issues/detail?id=1260519 and https://bugs.chromium.org/p/chromium/issues/detail?id=1199377.

> By the way, many of the tests are taken from the Web Platform Tests. That's also the reason why the value 1234 is being used in the test.

That doesn't make the test meaningful or exhaustive just because it's in the WPT repository.

As I pointed out above, it is non-trivial to generate the timestamp in a way that avoids gaps and glitches in playback; that's why I "piggy-back" on a stream of silence read with MediaStreamTrackProcessor to get a timestamp.

I posted this here to see if you are interested in actually investigating this to fill in the glaring omission in the WebCodecs specification.