Pre-Call Diagnostics

Pre-Call Diagnostic APIs

The Pre-Call Diagnostic APIs provide two types of functionality:

  1. APIs to test the connected I/O devices, i.e., camera, mic, and speaker functionality
  2. APIs to test connectivity to 100ms servers - these can identify signalling and media connectivity issues and provide a report at the end of the test

You can use these APIs in your application to detect issues before a participant joins a video room, as part of a troubleshooting page while already joined in the room, or at a later stage to debug issues on the end user's device or network.

How to use the Pre-Call Diagnostic APIs

To use the diagnostic APIs, first get an instance of HMSDiagnostics from the HMSSDK instance.

```kotlin
// get the HMSSDK instance
val hmsSDK = HMSSDK.Builder(application).build()

// user id can be null as well
val diagnosticsSDK = hmsSDK.getDiagnosticsSDK("user-id")
```

If you plan to use the connectivity checks while you have already joined a room, create a new instance of HMSSDK to get the HMSDiagnostics instance.
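As a minimal sketch of this pattern (the variable names are illustrative), reusing the same builder shown above:

```kotlin
// hmsSDK is the instance already joined to the room; build a separate one for diagnostics
val diagnosticsOnlySdk = HMSSDK.Builder(application).build()
val diagnosticsSDK = diagnosticsOnlySdk.getDiagnosticsSDK("user-id")
```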

Test microphone

To test if the microphone is working correctly, call the startMicCheck API. The Diagnostics SDK opens the microphone in recording mode and records the captured stream into a file in cache memory.

- onSuccess is called if the Diagnostics SDK is able to open the microphone and start recording.
- onAudioLevelChanged is called every second with the decibel level detected by the microphone. This can be used in the UI to indicate that the mic is picking up the user's audio.
- onError is called if there was an issue while starting the microphone or if the permission was not granted.

Ensure the microphone permission is granted before calling this API.
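As a sketch, a standard Android runtime-permission check for the microphone (this is plain Android framework/AndroidX code, not part of the Diagnostics SDK; the request code is arbitrary) could look like:

```kotlin
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Returns true if the RECORD_AUDIO permission is already granted,
// otherwise requests it and returns false
fun ensureMicPermission(activity: Activity, requestCode: Int = 1001): Boolean {
    val granted = ContextCompat.checkSelfPermission(
        activity, Manifest.permission.RECORD_AUDIO
    ) == PackageManager.PERMISSION_GRANTED
    if (!granted) {
        ActivityCompat.requestPermissions(
            activity, arrayOf(Manifest.permission.RECORD_AUDIO), requestCode
        )
    }
    return granted
}
```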

```kotlin
// get the diagnostic instance
val diagnosticsSDK = hmsSDK.getDiagnosticsSDK("user-id")

diagnosticsSDK.startMicCheck(getApplication<Application>(), object : HMSAudioDeviceCheckListener {
    override fun onError(error: HMSException) {
        // This will be called if there is any error while recording
    }

    override fun onSuccess() {
    }

    override fun onAudioLevelChanged(decibel: Int) {
        // Show a UI to indicate the microphone is listening to the user's audio
    }
})
```

The recording continues until stopMicCheck is called. Ensure you call this method, otherwise the microphone will not be released by the Diagnostics SDK.

```kotlin
// get the diagnostic instance
val diagnosticsSDK = hmsSDK.getDiagnosticsSDK("user-id")

diagnosticsSDK.stopMicCheck()
```

Test speaker

To test the device's speaker, call the startSpeakerCheck API. This plays back the file that was recorded during the mic test. If no recorded file is present in the cache, either due to microphone issues or because startMicCheck was not called, the speaker check plays a pre-recorded file.

The speaker is working correctly if you can clearly hear the recording from the startMicCheck API or the pre-recorded file.

```kotlin
// get the diagnostic instance
val diagnosticsSDK = hmsSDK.getDiagnosticsSDK("user-id")

diagnosticsSDK.startSpeakerCheck()
```

To stop the speaker check, call the stopSpeakerCheck API. This releases any resources that the Diagnostics SDK might be holding for playback.

```kotlin
// get the diagnostic instance
val diagnosticsSDK = hmsSDK.getDiagnosticsSDK("user-id")

diagnosticsSDK.stopSpeakerCheck()
```

Test Camera

To check if the camera on the device is working correctly, call the startCameraCheck API. This API creates a local video track and returns it. If there is any issue with the camera module, it is reported via the onError callback of the HMSCameraCheckListener. Applications can use HMSVideoView to show the returned video track to users, the same way any other HMSVideoTrack is shown (see the rendering sketch after the snippet below). This API takes CameraFacing as a parameter, where you can specify either the FRONT or REAR camera; the FRONT camera is chosen by default if nothing is specified.

```kotlin
// get the diagnostic instance
val diagnosticsSDK = hmsSDK.getDiagnosticsSDK("user-id")

diagnosticsSDK.startCameraCheck(HMSVideoTrackSettings.CameraFacing.FRONT, object : HMSCameraCheckListener {
    override fun onError(error: HMSException) {
        // Error in opening or getting frames from camera
    }

    override fun onVideoTrack(localVideoTrack: HMSVideoTrack) {
        // Show this track on a HMSVideoView
    }
})
```
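As a sketch of rendering the returned track, assuming an HMSVideoView named cameraPreview is present in your layout (addTrack/removeTrack are HMSVideoView methods; confirm the exact names against the SDK version you are using):

```kotlin
// Inside onVideoTrack of the HMSCameraCheckListener, bind the preview track to the view
cameraPreview.addTrack(localVideoTrack)

// Before calling stopCameraCheck, detach the track from the view
cameraPreview.removeTrack()
```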

To stop the camera check and release all resources, call the stopCameraCheck API.

```kotlin
// get the diagnostic instance
val diagnosticsSDK = hmsSDK.getDiagnosticsSDK("user-id")

diagnosticsSDK.stopCameraCheck()
```

Test Connectivity to 100ms Servers

To test connectivity to the 100ms signalling and media servers, call the startConnectivityCheck API. This API establishes connections to both the signalling and media servers, publishes audio and video data to determine network quality, and generates a report of the connection. It uses the local device's mic and camera to send audio-video data and receives the same data back to calculate network stats for both the upload and download links of the user's network. Note that camera and microphone permissions must be granted by the app before calling this API. The test runs for 20 seconds; it may finish earlier if a connection cannot be established.

While the connectivity test is being performed, regular updates are sent by the SDK via the onConnectivityStateChanged callback. The application can listen to these updates and notify the UI to show progress of the test. When the test is completed, the onCompleted callback is invoked with the complete ConnectivityCheckResult, which has details about all the reports generated.

```kotlin
// get the diagnostic instance
val diagnosticsSDK = hmsSDK.getDiagnosticsSDK("user-id")

diagnosticsSDK.startConnectivityCheck(regionCode, object : live.hms.video.diagnostics.ConnectivityCheckListener {
    override fun onCompleted(result: ConnectivityCheckResult) {
        // Called after the connectivity test completes
    }

    override fun onConnectivityStateChanged(state: ConnectivityState) {
        // Updates about the test being performed
    }
})
```

Understanding ConnectivityState updates

The startConnectivityCheck API sends regular updates while the test is going on. ConnectivityState is an enum which has the following states:

```kotlin
enum class ConnectivityState {
    STARTING,
    INIT_FETCHED,
    SIGNAL_CONNECTED,
    ICE_ESTABLISHED,
    MEDIA_CAPTURED,
    MEDIA_PUBLISHED,
    COMPLETED
}
```
| Name | Description |
|------|-------------|
| STARTING | Initial state of the connectivity test |
| INIT_FETCHED | Init API is fetched successfully |
| SIGNAL_CONNECTED | Connected to the signalling server successfully |
| ICE_ESTABLISHED | Connected to the media server successfully |
| MEDIA_CAPTURED | Local media captured successfully |
| MEDIA_PUBLISHED | Local media started uploading successfully |
| COMPLETED | Connectivity test has completed |

The above states can be used to show the progress of the test in the UI, as shown in the sketch below.
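For example, a minimal sketch that maps each state to a progress label for the UI (the label strings are illustrative, not part of the SDK):

```kotlin
// Map each ConnectivityState to a human-readable progress label
fun progressLabel(state: ConnectivityState): String = when (state) {
    ConnectivityState.STARTING -> "Starting connectivity test"
    ConnectivityState.INIT_FETCHED -> "Fetched init configuration"
    ConnectivityState.SIGNAL_CONNECTED -> "Connected to signalling server"
    ConnectivityState.ICE_ESTABLISHED -> "Connected to media server"
    ConnectivityState.MEDIA_CAPTURED -> "Captured local media"
    ConnectivityState.MEDIA_PUBLISHED -> "Publishing local media"
    ConnectivityState.COMPLETED -> "Connectivity test completed"
}
```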

When the test completes, either successfully or unsuccessfully, the onCompleted callback is called with the test result in ConnectivityCheckResult, which has the following fields:

| Name | Description |
|------|-------------|
| testTimestamp | Timestamp in milliseconds when the test was conducted |
| connectivityState | The final ConnectivityState after the test was completed |
| signallingReport | Report describing the signalling server connection |
| mediaServerReport | Report describing the media server connection |
| deviceTestReport | Report describing the camera, speaker and mic test results |
| errors | List of HMSException that were encountered while performing the test, if any |
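As a sketch of consuming these fields inside onCompleted (the log tag is illustrative; errors is treated here as a possibly empty list):

```kotlin
override fun onCompleted(result: ConnectivityCheckResult) {
    // Top-level summary of the test
    Log.d("PreCallDiagnostics",
        "Test at ${result.testTimestamp} ms finished in state ${result.connectivityState}")

    // Any errors encountered while performing the test
    result.errors.forEach { error ->
        Log.w("PreCallDiagnostics", "Connectivity test error: ${error.message}")
    }
}
```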

SignallingReport comprises the following fields:

| Name | Description |
|------|-------------|
| isConnected | Whether the signalling server was successfully connected or not |
| isInitConnected | Whether the test was able to reach the INIT server |
| websocketUrl | The websocket URL that was used for the connection to the signalling server |

MediaServerReport comprises the following fields:

| Name | Description |
|------|-------------|
| isPublishICEConnected | Whether the publish connection to the media server was successful |
| isSubcribeICEConnected | Whether the subscribe connection to the media server was successful |
| connectionQualityScore | A floating-point value denoting the overall network quality of the user. It ranges from 0 to 5, with 5 being the best score |
| stats | The overall audio and video stats that were uploaded and downloaded |
| publishIceCandidatesGathered | A list of all ICE candidates that were gathered for the publish connection |
| subscribeIceCandidatesGathered | A list of all ICE candidates that were gathered for the subscribe connection |
| publishICECandidatePairSelected | The candidate pair that was selected for the publish connection |
| subscribeICECandidatePairSelected | The candidate pair that was selected for the subscribe connection |
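A sketch of summarizing the two reports using the fields above (safe calls are used since either report may be absent if that part of the test did not run; verify nullability against your SDK version):

```kotlin
fun summarizeReports(result: ConnectivityCheckResult): String {
    val signalling = result.signallingReport
    val media = result.mediaServerReport
    return buildString {
        appendLine("Signalling connected: ${signalling?.isConnected}")
        appendLine("Init reachable: ${signalling?.isInitConnected}")
        appendLine("Websocket URL: ${signalling?.websocketUrl}")
        appendLine("Publish ICE connected: ${media?.isPublishICEConnected}")
        appendLine("Subscribe ICE connected: ${media?.isSubcribeICEConnected}")
        appendLine("Connection quality score (0-5): ${media?.connectionQualityScore}")
    }
}
```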

In the stats field of MediaServerReport, the SDK reports the following stats separately for both audio and video:

| Name | Description |
|------|-------------|
| bytesSent | Number of audio/video bytes that were uploaded from the client to the SFU |
| bytesReceived | Number of audio/video bytes that were downloaded from the SFU |
| packetsReceived | Number of audio/video packets that were received from the SFU |
| packetsLost | Number of audio/video packets that were lost while sending to the SFU |
| bitrateSent | The upload audio/video bitrate in kbps |
| bitrateReceived | The download (received) bitrate of the remote audio/video track in kbps |
| roundTripTime | The publish-side round trip time from the device to the 100ms media server. This is the same for both audio and video tracks |
