edimuj / cordova-plugin-audioinput

This iOS/Android Cordova/PhoneGap plugin enables audio capture from the device microphone by forwarding the audio to the web layer of your application in near real-time. A typical usage scenario for this plugin is to use the captured audio as a source for a Web Audio node chain, where it can then be analyzed, manipulated and/or played.
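For orientation, here is a minimal sketch of that scenario, using only calls that come up later in this thread (audioinput.start, audioinput.getAudioContext, audioinput.connect); the gain node is just an illustrative stage in the chain:

audioinput.start({ streamToWebAudio: true }); // begin capturing from the microphone
var ctx = audioinput.getAudioContext(); // the AudioContext the plugin streams into
var gain = ctx.createGain(); // any Web Audio node chain goes here
gain.gain.value = 0.8; // e.g. attenuate the input slightly
audioinput.connect(gain); // route the captured audio into the chain
gain.connect(ctx.destination); // ...and out to the device speakers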

Home Page: https://github.com/edimuj/app-audioinput-demo


[Question] How to get the mic mediaStream?

lone-cloud opened this issue · comments

Sorry for the noob question. I came across this lib while trying to find an alternative to navigator.mediaDevices.getUserMedia({ audio: true }), which does not work with cordova-android 6.4.0 for me because of "NotReadableError - Could not start source", even with the mic permission enabled. I've been trying to figure out how to get the recording mic's MediaStream from audioinput. Any suggestions?

commented

It should be relatively easy to get the microphone stream using the plugin. Have you tried the example app? The examples in there should show you how to do it: https://github.com/edimuj/cordova-plugin-audioinput

The other issue you are having sounds like it has something to do with permissions. If your targetSdk is 23+, the app will basically ignore the static permissions and instead require permissions to be requested at runtime. One way to quickly test this is to manually allow your app to access the microphone in the settings (see https://support.google.com/googleplay/answer/6270602?hl=en, "Turn permission on or off"), or to set the targetSdk to 22 to go back to the old way of using static permissions.
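If your version of the plugin includes the runtime permission helpers (checkMicrophonePermission / getMicrophonePermission), you can also handle this from JavaScript; a rough sketch, where startCapture is just a placeholder for whatever starts your capture:

audioinput.checkMicrophonePermission(function (hasPermission) {
  if (hasPermission) {
    startCapture(); // already allowed to record
  } else {
    // Ask the user for microphone access at runtime
    audioinput.getMicrophonePermission(function (granted) {
      if (granted) {
        startCapture();
      } else {
        console.warn("User denied microphone permission");
      }
    });
  }
});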

Yes, I'm on 23+. I'll check that out later, thx.
I have looked through all of your examples and I feel like I'm missing something obvious. This lib (with streamToWebAudio: true) gives you an AudioContext (https://developer.mozilla.org/en-US/docs/Web/API/AudioContext), but I need a MediaStream (https://developer.mozilla.org/en-US/docs/Web/API/MediaStream) to achieve parity with navigator.mediaDevices.getUserMedia({ audio: true }). I don't see a way to turn the former into the latter.

commented

There is actually a way:
If you need a MediaStream, you should be able to connect the audioinput to a MediaStreamAudioDestinationNode. In the example for that node an oscillator is connected to it, but in this case you would do something like this instead:

audioinput.start({streamToWebAudio: true}); // Start the capture
var dest = audioinput.getAudioContext().createMediaStreamDestination(); // Create the node
audioinput.connect(dest); // Connect the plugin to the node

And now you should be able to reference the MediaStream by using
dest.stream

So the MediaStreamAudioDestinationNode basically converts the data coming from Web Audio (which this plugin uses) into a MediaStream containing a single audio MediaStreamTrack.
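Putting it together, dest.stream should then be usable wherever a getUserMedia stream would go; a sketch of the consuming side (the audio element and pc are illustrative, not part of the plugin):

// Play the captured audio locally:
var audioEl = document.createElement('audio');
audioEl.srcObject = dest.stream;
audioEl.play();

// Or feed its track into an existing RTCPeerConnection (pc):
dest.stream.getAudioTracks().forEach(function (track) {
  pc.addTrack(track, dest.stream);
});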

Perfect. You're the man.

This is gold, thanks for the input!

commented

Hi @edimuj and @hoboman313, I have been using this plugin for audio recording in my app for a few months. Now I want to use it as the audio source for WebRTC too. (Reason: I have a OnePlus 7 Pro, which has no microphone jack, only USB Type-C input, and getUserMedia doesn't capture audio from USB, not sure why.)
I followed the sample code given in the answer above and copied the audio tracks from createMediaStreamDestination into the getUserMedia stream, roughly as in the sketch below. Everything works without error, but there is no sound on the other side of the WebRTC connection.
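Roughly, this is what I did (gumStream is the stream from getUserMedia and pc is my RTCPeerConnection; the names are just for illustration):

audioinput.start({ streamToWebAudio: true });
var dest = audioinput.getAudioContext().createMediaStreamDestination();
audioinput.connect(dest);

dest.stream.getAudioTracks().forEach(function (track) {
  gumStream.addTrack(track); // copy the plugin's audio track into the getUserMedia stream
  pc.addTrack(track, gumStream); // and send it over the peer connection
});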

@hoboman313, did you get this working in your WebRTC project?