Live Transcription for Conferencing (Closed Captions - Beta)
100ms' real-time transcription engine generates a live transcript (closed captions) during a conferencing session. The SDK delivers a callback with the transcript for each peer as they speak.
Minimum Requirements
- Minimum 100ms SDK version required: 2.9.54
Checking if captions are enabled in a room
To check whether WebRTC (not HLS) captions are enabled in a room, look for any transcription in the STARTED state in the room data.
```kotlin
val captionsEnabled = hmsSDK.getRoom()?.transcriptions?.find { it.state == TranscriptionState.STARTED } != null
```
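For instance, this check can be wrapped in a small helper and consulted before rendering any caption UI. The helper and the view-toggling callback below are illustrative, not part of the SDK; they use only the classes referenced in the snippet above (imports omitted).

```kotlin
// Hypothetical helper: returns true if any transcription in the room is running.
// Uses the same hmsSDK instance and TranscriptionState enum as the check above.
fun isCaptionsEnabled(hmsSDK: HMSSDK): Boolean =
    hmsSDK.getRoom()?.transcriptions
        ?.any { it.state == TranscriptionState.STARTED } == true

// Example: only show the captions view when the room has captions turned on.
fun bindCaptionsUi(hmsSDK: HMSSDK, showCaptionsView: (Boolean) -> Unit) {
    showCaptionsView(isCaptionsEnabled(hmsSDK))
}
```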
How to implement captions?
Implement fun onTranscripts(transcripts: HmsTranscripts) in the HMSUpdateListener callback.
For an example implementation, look at TranscriptionUseCase.kt in the 100ms-android sample app repository.
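As a rough sketch of what that callback can look like (the property names on HmsTranscripts/HmsTranscript and the updateCaptionForPeer helper below are assumptions, not confirmed SDK API; verify them against the SDK and the sample app), the incoming transcripts can be pushed into whatever state drives your caption view:

```kotlin
// Inside your existing HMSUpdateListener implementation (other overrides omitted).
// Assumed shape: HmsTranscripts exposes a list of per-peer HmsTranscript entries,
// each carrying the spoken text and the peer it belongs to -- confirm the exact
// property names in the SDK before relying on this sketch.
override fun onTranscripts(transcripts: HmsTranscripts) {
    transcripts.transcripts.forEach { transcript ->
        updateCaptionForPeer(transcript)
    }
}

// Hypothetical helper of your own: pushes one caption line into your UI state,
// e.g. replacing the previous partial line for that peer until the entry is final.
private fun updateCaptionForPeer(transcript: HmsTranscript) {
    // update your caption list / ViewModel state here
}
```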
Toggling Live Transcripts
To save on cost, live transcriptions can be disabled for everyone at runtime and toggled on again when required.
```kotlin
// Start Real Time Transcription
hmsSDK.startRealTimeTranscription(
    TranscriptionsMode.CAPTION,
    object : HMSActionResultListener {
        override fun onError(error: HMSException) {}
        override fun onSuccess() {}
    }
)
```
```kotlin
// Stop Real Time Transcription
hmsSDK.stopRealTimeTranscription(
    TranscriptionsMode.CAPTION,
    object : HMSActionResultListener {
        override fun onError(error: HMSException) {}
        override fun onSuccess() {}
    }
)
```
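Putting the pieces together, a simple toggle might check the current transcription state and call the matching API. The toggleCaptions function below is illustrative, not part of the SDK; it only combines the calls shown above.

```kotlin
// Hypothetical toggle: starts captions if no transcription is running, stops them otherwise.
fun toggleCaptions(hmsSDK: HMSSDK) {
    val running = hmsSDK.getRoom()?.transcriptions
        ?.any { it.state == TranscriptionState.STARTED } == true

    val listener = object : HMSActionResultListener {
        override fun onError(error: HMSException) { /* surface the error to the user */ }
        override fun onSuccess() { /* optionally refresh the captions UI */ }
    }

    if (running) {
        hmsSDK.stopRealTimeTranscription(TranscriptionsMode.CAPTION, listener)
    } else {
        hmsSDK.startRealTimeTranscription(TranscriptionsMode.CAPTION, listener)
    }
}
```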