sony / flutter-elinux-plugins

Flutter plugins for embedded Linux (eLinux)

WebRTC

HidenoriMatsubayashi opened this issue

Hi @HidenoriMatsubayashi, I have tried some eLinux plugins. The port reuses the C++ code from the Flutter Windows implementation without any modification, but it seems that cross-thread calls inside eLinux cause the Flutter engine to crash. I don't have any ideas yet; I need to set up an eLinux debugging environment when I have time.

flutter-webrtc/flutter-webrtc#1338

@cloudwebrtc

Great! I'm looking forward to it being merged into the flutter-webrtc mainstream.

but it seems that cross-thread calls inside eLinux cause the Flutter engine to crash. I don't have any ideas yet; I need to set up an eLinux debugging environment when I have time.

Could you try again in release mode?

The same error still occurs in release mode. In the logs below, exceptions are triggered once each by WebRTC and by the Flutter engine. The cause is that the object is created on one thread and its methods are later called from a different thread (see the minimal sketch after the logs).

```
(rtc_video_sink_adapter.cc:13): VideoSinkAdapter: ctor 1273d70
(rtc_video_track_impl.cc:12): VideoTrackImpl: ctor 
(rtc_video_source_impl.cc:16): ~RTCVideoSourceImpl: dtor 
(audio_device_impl.cc:730): RecordingDevices
(audio_device_impl.cc:733): output: 2
(audio_device_impl.cc:712): RecordingDeviceName(0, ...)

#
# Fatal error in: ../../modules/audio_device/linux/audio_device_pulse_linux.cc, line 729
# last system error: 2
# Check failed: thread_checker_.IsCurrent()
# 
```

```
💪 Running with sound null safety 💪

An Observatory debugger and profiler on eLinux is available at: http://127.0.0.1:41719/g7IQKGABFQQ=/
(rtc_peerconnection_impl.cc:180): RTCPeerConnectionImpl: ctor
(peer_connection_factory.cc:331): Using default network controller factory
(bitrate_prober.cc:72): Bandwidth probing enabled, set to inactive
(cpu_info.cc:53): Available number of cores: 6
(aimd_rate_control.cc:113): Using aimd rate control with back off factor 0.85
(remote_bitrate_estimator_single_stream.cc:72): RemoteBitrateEstimatorSingleStream: Instantiating.
(remote_estimator_proxy.cc:47): Maximum interval between transport feedback RTCP messages (ms): 250
(peer_connection.cc:1998): TCP candidates are disabled.
(openssl_key_pair.cc:38): Making key pair
(openssl_key_pair.cc:91): Returning key pair
(boringssl_certificate.cc:187): Making certificate for WebRTC
(boringssl_certificate.cc:243): Returning certificate
(sctp_data_channel.cc:87): Accepting maxRetransmits < 0 for backwards compatibility
(sctp_data_channel.cc:97): Accepting maxRetransmitTime < 0 for backwards compatibility
[FATAL:flutter/fml/memory/weak_ptr.h(122)] Check failed: (checker_.checker).IsCreationThreadCurrent(). 
Lost connection to device.
```
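
For illustration, here is a minimal stand-alone sketch of the failing pattern (`ThreadChecker` and `AudioDevice` are made-up stand-ins, not the real WebRTC code): the checker records the constructing thread, and a later call from a different thread trips the assert, just like `thread_checker_.IsCurrent()` above.

```cpp
#include <cassert>
#include <thread>

// Stand-in for WebRTC's SequenceChecker / fml's ThreadChecker: remembers
// the constructing thread and verifies later callers against it.
class ThreadChecker {
 public:
  ThreadChecker() : owner_(std::this_thread::get_id()) {}
  bool IsCurrent() const { return std::this_thread::get_id() == owner_; }

 private:
  std::thread::id owner_;
};

class AudioDevice {
 public:
  void RecordingDeviceName() {
    // Fails (in debug builds) when called off the owning thread.
    assert(checker_.IsCurrent() && "called off the owning thread");
  }

 private:
  ThreadChecker checker_;
};

int main() {
  AudioDevice device;  // checker_ binds to the main thread here
  std::thread worker([&] { device.RecordingDeviceName(); });  // asserts
  worker.join();
  return 0;
}
```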

Update:
The crash only happens in debug mode; everything works fine in release/profile mode.

In debug mode, the problem appears to come from asserts triggered by cross-thread calls.

[FATAL:flutter/fml/memory/weak_ptr.h(111)] Check failed: (checker_.checker).IsCreationThreadCurrent().


@HidenoriMatsubayashi I tested it on Ubuntu 22.04; it runs well on eLinux Wayland/X11 (profile/release). I tested the audio/video loopback sample and the data-channel echo test.

@cloudwebrtc Great!

[FATAL:flutter/fml/memory/weak_ptr.h(111)] Check failed: (checker_.checker).IsCreationThreadCurrent().

In debug mode, FML_DCHECK_CREATION_THREAD_IS_CURRENT is enabled in flutter/engine.
See: https://github.com/flutter/engine/blob/main/fml/memory/thread_checker.h

We should only send events, e.g. via an EventChannel, from the main thread, I think.
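
To make that concrete, here is a hypothetical plugin fragment (`TrackObserver` is invented; the `flutter::EventSink` types are from the real C++ client wrapper) showing the call pattern that trips the check:

```cpp
#include <flutter/encodable_value.h>
#include <flutter/event_sink.h>

#include <memory>

// Hypothetical observer: the EventSink was handed out on the platform
// (main) thread when the Dart side started listening on the EventChannel.
class TrackObserver {
 public:
  explicit TrackObserver(
      std::unique_ptr<flutter::EventSink<flutter::EncodableValue>> sink)
      : sink_(std::move(sink)) {}

  // WebRTC invokes this on one of its signaling/worker threads, not the
  // platform thread, so in debug builds the engine-side weak-pointer
  // check (IsCreationThreadCurrent) fires and aborts the process.
  void OnTrack() {
    sink_->Success(flutter::EncodableValue("onTrack"));
  }

 private:
  std::unique_ptr<flutter::EventSink<flutter::EncodableValue>> sink_;
};
```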

Hey @HidenoriMatsubayashi, is there any way to get the main thread from a plugin? Or is there any method to post a callback to the Flutter engine's task queue?

Flutter's Linux GTK embedder works correctly in debug mode, but I don't know how it differs from eLinux in message scheduling.
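
For reference, a common workaround in the absence of a public post-to-platform-thread API is to queue callbacks from worker threads and drain the queue on the main thread. A minimal sketch, assuming something on the platform thread (the embedder's main loop or a periodic timer) can call `Drain()`; `MainThreadDispatcher` is a made-up name, not a flutter-elinux API:

```cpp
#include <functional>
#include <mutex>
#include <queue>
#include <utility>

// Hypothetical dispatcher: marshals work from WebRTC threads onto the
// platform (main) thread so engine-facing calls always happen on the
// thread that created the engine-side objects.
class MainThreadDispatcher {
 public:
  // Any thread: enqueue work destined for the platform thread.
  void Post(std::function<void()> task) {
    std::lock_guard<std::mutex> lock(mutex_);
    tasks_.push(std::move(task));
  }

  // Platform thread only: run everything queued so far. Where exactly to
  // hook this into the eLinux embedder's loop is the open question here.
  void Drain() {
    std::queue<std::function<void()>> pending;
    {
      std::lock_guard<std::mutex> lock(mutex_);
      std::swap(pending, tasks_);
    }
    while (!pending.empty()) {
      pending.front()();
      pending.pop();
    }
  }

 private:
  std::mutex mutex_;
  std::queue<std::function<void()>> tasks_;
};
```

With something like this in place, the observer above would call `dispatcher.Post(...)` with a lambda that invokes `sink_->Success(...)` instead of calling the sink directly, so every engine-facing call happens on the sink's creation thread.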

Let me investigate.

Hey there, thanks for the progress so far on this. I'm a newbie trying to understand eLinux.

  1. Would it be a big lift to use Flutter FFI calls into Rust, which then uses a GStreamer pipeline with an appsrc element to get frames into Dart? This leaves WebRTC out of it, as I just need the frames so that the Dart code can send them over the network or do something else with them.

  2. What would be required to play those frames in Flutter, or frames incoming from a UDP connection? I'd like to believe I can make a UDP connection at the Dart layer, pass those frames to Rust via FFI, and finally have Rust use GStreamer to send the frames to... what? Skia?

  3. Would you instead recommend Qt (since Qt excels at embedded and multimedia apps) over dealing with Flutter, given that Flutter is still early in the embedded scene?

I would really appreciate a basic beginner-level explanation of all this, or some resources! Thank you so much!

I found out more about the above after a load of research. As I understand it, the best (and pretty much only) way to render frames from incoming bytes is to do it at the native level, which for Flutter apps on Linux means C++.

You create textures from the decoded frames and then register them with the texture registrar, which Flutter provides; it seems straightforward.
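
For anyone following along, here is a minimal sketch of that approach, assuming RGBA frames and the desktop C++ wrapper's `flutter::TextureRegistrar` / `flutter::PixelBufferTexture` API (which, as far as I can tell, flutter-elinux provides Windows-compatibly); `FrameTexture` is a made-up name:

```cpp
#include <flutter/plugin_registrar.h>
#include <flutter/texture_registrar.h>

#include <cstdint>
#include <mutex>
#include <vector>

// Hypothetical wrapper: exposes decoded RGBA frames to Dart as a texture.
class FrameTexture {
 public:
  explicit FrameTexture(flutter::TextureRegistrar* registrar)
      : registrar_(registrar),
        texture_(flutter::PixelBufferTexture(
            [this](size_t, size_t) { return this->CopyBuffer(); })) {
    texture_id_ = registrar_->RegisterTexture(&texture_);
  }

  // Decoder thread: stash the newest frame and ask the engine to repaint.
  void OnFrame(const uint8_t* rgba, size_t width, size_t height) {
    {
      std::lock_guard<std::mutex> lock(mutex_);
      latest_.assign(rgba, rgba + width * height * 4);
      width_ = width;
      height_ = height;
    }
    registrar_->MarkTextureFrameAvailable(texture_id_);
  }

  int64_t texture_id() const { return texture_id_; }

 private:
  // Raster thread: copy the shared frame into a buffer that only this
  // thread reads, so the returned pointer stays valid after unlocking.
  const FlutterDesktopPixelBuffer* CopyBuffer() {
    std::lock_guard<std::mutex> lock(mutex_);
    render_ = latest_;
    buffer_.buffer = render_.data();
    buffer_.width = width_;
    buffer_.height = height_;
    return &buffer_;
  }

  flutter::TextureRegistrar* registrar_;
  flutter::TextureVariant texture_;
  int64_t texture_id_ = -1;
  std::mutex mutex_;
  std::vector<uint8_t> latest_, render_;
  size_t width_ = 0, height_ = 0;
  FlutterDesktopPixelBuffer buffer_ = {};
};
```

The texture id goes back to Dart over a MethodChannel and is rendered with `Texture(textureId: id)`; a real implementation would also unregister the texture on dispose and avoid the extra copy.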

I will attempt a custom plugin with basic streaming abilities. Our use case requires something more custom and lightweight than WebRTC, so I will keep you all updated. Thanks for the code sample in your current open branch, @cloudwebrtc!