Official Android Client SDK for LiveKit. Easily add video & audio capabilities to your Android apps.
Docs and guides are available at https://docs.livekit.io.
The API reference can be found at https://docs.livekit.io/client-sdk-android/index.html.
LiveKit for Android is available as a Maven package.

```groovy
...
dependencies {
    implementation "io.livekit:livekit-android:1.1.1"

    // Snapshots of the latest development version are available at:
    // implementation "io.livekit:livekit-android:1.1.2-SNAPSHOT"
}
```
You'll also need JitPack as one of your repositories.
```groovy
subprojects {
    repositories {
        google()
        mavenCentral()
        // ...
        maven { url 'https://jitpack.io' }

        // For SNAPSHOT access
        // maven { url 'https://s01.oss.sonatype.org/content/repositories/snapshots/' }
    }
}
```
LiveKit relies on the RECORD_AUDIO and CAMERA permissions to use the microphone and camera.
These permissions must be requested at runtime. Reference the sample app for an example.
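As a rough sketch (the class and method names here are placeholders, not part of the SDK), the standard AndroidX Activity Result API can be used to request both permissions before enabling capture:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class PermissionsActivity : AppCompatActivity() {
    // Register before the Activity is started; results arrive as a
    // map of permission -> granted.
    private val permissionLauncher = registerForActivityResult(
        ActivityResultContracts.RequestMultiplePermissions()
    ) { grants ->
        if (grants.values.all { it }) {
            // Safe to enable the camera and microphone here.
        }
    }

    private fun requestMediaPermissions() {
        // Only launch the dialog for permissions not yet granted.
        val needed = listOf(Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA)
            .filter {
                ContextCompat.checkSelfPermission(this, it) != PackageManager.PERMISSION_GRANTED
            }
        if (needed.isNotEmpty()) {
            permissionLauncher.launch(needed.toTypedArray())
        }
    }
}
```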
To publish your camera and microphone:

```kotlin
room.localParticipant.setCameraEnabled(true)
room.localParticipant.setMicrophoneEnabled(true)
```
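Note that these enable methods are suspending functions (the fuller example below also calls them from a coroutine), so from an Activity you would typically wrap them in a coroutine scope, e.g.:

```kotlin
lifecycleScope.launch {
    room.localParticipant.setCameraEnabled(true)
    room.localParticipant.setMicrophoneEnabled(true)
}
```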
```kotlin
// Create an intent launcher for screen capture.
// This *must* be registered prior to onCreate(), ideally as an instance val.
val screenCaptureIntentLauncher = registerForActivityResult(
    ActivityResultContracts.StartActivityForResult()
) { result ->
    val resultCode = result.resultCode
    val data = result.data
    if (resultCode != Activity.RESULT_OK || data == null) {
        return@registerForActivityResult
    }
    lifecycleScope.launch {
        room.localParticipant.setScreenShareEnabled(true, data)
    }
}

// When it's time to enable the screen share, perform the following:
val mediaProjectionManager =
    getSystemService(MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
screenCaptureIntentLauncher.launch(mediaProjectionManager.createScreenCaptureIntent())
```
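To stop sharing, the same method can be called with `false` (a sketch; consult the API reference for the exact signature and defaults):

```kotlin
lifecycleScope.launch {
    // No MediaProjection data is needed when disabling.
    room.localParticipant.setScreenShareEnabled(false)
}
```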
LiveKit uses SurfaceViewRenderer to render video tracks. A TextureView implementation is also provided through TextureViewRenderer. Subscribed audio tracks are automatically played.
```kotlin
class MainActivity : AppCompatActivity() {

    lateinit var room: Room

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        // Create Room object.
        room = LiveKit.create(applicationContext)

        // Setup the video renderer.
        room.initVideoRenderer(findViewById<SurfaceViewRenderer>(R.id.renderer))

        connectToRoom()
    }

    private fun connectToRoom() {
        val url = "wss://your_host"
        val token = "your_token"

        lifecycleScope.launch {
            // Setup event handling.
            launch {
                room.events.collect { event ->
                    when (event) {
                        is RoomEvent.TrackSubscribed -> onTrackSubscribed(event)
                        else -> {}
                    }
                }
            }

            // Connect to server.
            room.connect(
                url,
                token,
            )

            // Turn on audio/video recording.
            val localParticipant = room.localParticipant
            localParticipant.setMicrophoneEnabled(true)
            localParticipant.setCameraEnabled(true)
        }
    }

    private fun onTrackSubscribed(event: RoomEvent.TrackSubscribed) {
        val track = event.track
        if (track is VideoTrack) {
            attachVideo(track)
        }
    }

    private fun attachVideo(videoTrack: VideoTrack) {
        videoTrack.addRenderer(findViewById<SurfaceViewRenderer>(R.id.renderer))
        findViewById<View>(R.id.progress).visibility = View.GONE
    }
}
```
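When the Activity goes away, the room connection should be released. A sketch of the teardown, assuming the `Room.disconnect()` method:

```kotlin
override fun onDestroy() {
    // Disconnect from the room to release camera, microphone,
    // and network resources.
    room.disconnect()
    super.onDestroy()
}
```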
See the basic sample app for the full implementation.
Properties marked with @FlowObservable can be accessed as a Kotlin Flow to observe changes directly:

```kotlin
coroutineScope.launch {
    room::activeSpeakers.flow.collectLatest { speakersList ->
        /*...*/
    }
}
```
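These flows compose with the standard kotlinx.coroutines operators. For example (a sketch, assuming you are inside a ViewModel with a `viewModelScope`), the active speaker list could be exposed as a `StateFlow` for UI consumption:

```kotlin
// Convert the observable property into hot state for the UI.
// Collection stops when there are no subscribers.
val activeSpeakers: StateFlow<List<Participant>> =
    room::activeSpeakers.flow
        .stateIn(viewModelScope, SharingStarted.WhileSubscribed(), emptyList())
```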
We have a basic quickstart sample app here, showing how to connect to a room, publish your device's audio/video, and display the video of one remote participant.
There are two more full-featured video conferencing sample apps:

They both use the CallViewModel, which handles the Room connection and exposes the data needed for a basic video conferencing app. The respective ParticipantItem class in each app is responsible for displaying each participant's UI.
To develop the Android SDK or run the sample app directly from this repo, you'll need to:

- Ensure the protocol submodule repo is initialized and updated with `git submodule update --init`. For those developing on Apple M1 Macs, also add `protoc_platform=osx-x86_64` to `$HOME/.gradle/gradle.properties`.
- Download the webrtc sources from https://webrtc.googlesource.com/src
- Add the sources to Android Studio by pointing at the `webrtc/sdk/android` folder.