Streaming from Cameras

The Stream library provides a smooth, high-quality streaming experience between an application and a camera. You can stream from your cameras anywhere over 3G/4G/WiFi. The Stream library manages the streaming logic and underlying operations and can handle concurrent streams with multiple endpoints. It can notify you of network connectivity changes during a stream, record the audio and video of a stream (including recordings triggered by motion sensor events), and provide AI data about detected objects for vision cameras.
Getting Ready

Build and start the Stream library to get started with streaming from an endpoint. See this section to learn how to build and start the Stream library.
Registering callback functions to the Stream library

After starting the Stream library, register callbacks with the library before starting any camera stream. The application is notified of the different messages and actions of an ongoing stream through these callback functions. Callbacks need to be registered only once, provided they are not unregistered during the life cycle of the application.
Android
1. Register StreamStartListener

```java
StreamStartListener streamStartListener = new StreamStartListener() {
    @Override
    public void onStreamStart(String withEndpointId, String connectionType) { }

    @Override
    public void onStreamStartResponse(String toEndpointId, Stream.StartStreamFailReason startStreamFailReason, boolean isSuccessful) { }
};
stream.registerStreamStartListener(streamStartListener);
```

2. Register StreamStopListener

```java
StreamStopListener streamStopListener = new StreamStopListener() {
    @Override
    public void onStreamStopResponse(String withEndpointId, Stream.StreamStopReason reason, boolean status) { }

    @Override
    public void onStreamStop(String withEndpointId, Stream.StreamStopReason reason, boolean status) { }
};
stream.registerStreamStopListener(streamStopListener);
```

3. Register StreamLibraryEventListener

```java
StreamLibraryEventListener streamLibraryEventListener = new StreamLibraryEventListener() {
    @Override
    public void onStart(int status) { }

    @Override
    public void onStop() { }
};
stream.registerStreamLibraryEventListener(streamLibraryEventListener);
```

4. Register StreamPauseListener

```java
StreamPauseListener streamPauseListener = new StreamPauseListener() {
    @Override
    public void onStreamPause(String fromEndPoint, Stream.MediaType[] mediaTypeList, int status) { }

    @Override
    public void onStreamPauseResponse(String fromEndPoint, Stream.MediaType[] mediaTypeList, int status) { }
};
stream.registerStreamPauseListener(streamPauseListener);
```

5. Register StreamResumeListener

```java
StreamResumeListener streamResumeListener = new StreamResumeListener() {
    @Override
    public void onStreamResume(String fromEndPoint, Stream.MediaType[] mediaTypeList, int status) { }

    @Override
    public void onStreamResumeResponse(String fromEndPoint, Stream.MediaType[] mediaTypeList, int status) { }
};
stream.registerStreamResumeListener(streamResumeListener);
```

6. Register StreamRecordingListener

```java
StreamRecordingListener streamRecordingListener = new StreamRecordingListener() {
    @Override
    public void onStartRecordingResponse(String toEndpointId, boolean status) { }

    @Override
    public void onStopRecordingResponse(String toEndpointId, boolean status) { }

    @Override
    public void onReceivedRecordingListResponse(String fromEndPoint, HashMap<Stream.MediaType, String[]> recordingMediaTypeMap) { }
};
stream.registerStreamRecordingListener(streamRecordingListener);
```

7. Register StreamInputSourceRemoveListener

```java
StreamInputSourceRemoveListener streamInputSourceRemoveListener = new StreamInputSourceRemoveListener() {
    @Override
    public void onRemoveInputSource(String fromEndPoint, int SSRC) { }
};
stream.registerStreamInputSourceRemoveListener(streamInputSourceRemoveListener);
```

8. Register StreamInputSourceInfoChangeListener

```java
StreamInputSourceInfoChangeListener streamInputSourceInfoChangeListener = new StreamInputSourceInfoChangeListener() {
    @Override
    public void onInputSourceInfoChanged(String fromEndPoint, Stream.InputSourceInformation[] sourceInformation) { }
};
stream.registerStreamInputSourceInfoChangeListener(streamInputSourceInfoChangeListener);
```

9. Register StreamConnectivityChangeListener

```java
StreamConnectivityChangeListener streamConnectivityChangeListener = new StreamConnectivityChangeListener() {
    @Override
    public void onStreamConnectivityChange(String fromEndpointId, Stream.StreamTransitionState streamTransitionState, String connectionType) { }
};
stream.registerStreamConnectivityChangeListener(streamConnectivityChangeListener);
```

10. Register StreamDataReceiveListener

```java
StreamDataReceiveListener streamDataReceiveListener = new StreamDataReceiveListener() {
    @Override
    public void onStreamReceive(String fromEndpointId, Stream.MediaType[] requestedMediaTypes) { }

    @Override
    public void onReceiveFrame(String fromEndpoint, Stream.MediaType mediaType, byte[] frameData, long frameLength, Stream.FrameProperties frameProperties, long timeStamp) { }

    @Override
    public void onReceivedVideoSpecificData(String fromEndpoint, int SSRC, long timeStamp, String data) { }
};
stream.registerStreamDataReceiveListener(streamDataReceiveListener);
```
iOS
1. Register onStreamStart callback

- Call the `onStreamStart(onStreamStartHandler handler)` method of the `Stream` class to register the stream start callback function. You will be notified through this callback when a stream is started after `startStream()` is called for an endpoint. The declaration of the `onStreamStartHandler` callback function is:

```cpp
typedef std::function<bool(const EndPointUUID &fromEndPoint, const std::string& connectionType)> onStreamStartHandler;
```

- Once the stream start callback function is called, call the `attachVideoWindow(const EndPointUUID &fromEndPoint, long long viewId)` method of `Stream.h` to set the video rendering view. The rendering view must be an instance of `UIImageView`.

```cpp
stream->onStreamStart([self](const EndPointUUID &fromEndPoint, const std::string& connectionType) -> bool {
    // fromEndPoint, stream source endpoint UUID
    // connectionType, can be P2P or NAPT, as determined by the Connect library
    void* renderingViewRef = (__bridge void*) videoView; // videoView is of type UIImageView
    stream->attachVideoWindow(fromEndPoint, (long long) renderingViewRef);
    return true;
});
```
2. Register onStreamStop callback

- Call the `onStreamStop(onStreamStopHandler handler)` method of the `Stream` class to register the stream stop callback function. You will be notified through this callback when a stream is stopped by the `Stream` library for any reason. The declaration of the `onStreamStopHandler` callback function is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndPoint, const int &status)> onStreamStopHandler;
```

```cpp
stream->onStreamStop([self](const EndPointUUID &fromEndPoint, const int &status) {
    // fromEndPoint, stream source endpoint UUID
    // status, stream stop status, either 1 or 0
});
```
3. Register onStreamReceive callback

- Call the `onStreamReceive(onStreamReceiveHandler handler)` method of the `Stream` class to register the stream receive callback function. You will be notified through this callback when stream receiving has started for an endpoint. The declaration of the `onStreamReceiveHandler` callback function is:

```cpp
typedef std::function<bool(const EndPointUUID &toEndPoint, MediaType mediaType)> onStreamReceiveHandler;
```

and the definition of `MediaType` is:

```cpp
enum class MediaType { AUDIO, VIDEO, IMAGE };
```

```cpp
stream->onStreamReceive([self](const EndPointUUID &toEndPoint, const MediaType mediaType) -> bool {
    // toEndPoint, stream destination endpoint UUID
    // mediaType, either AUDIO, VIDEO or IMAGE
    return true;
});
```
4. Register onStreamPause callback

- Call the `onStreamPause(onStreamPauseHandler handler)` method of the `Stream` class to register the stream pause callback function. You will be notified through this callback when a stream is paused after `pauseStream()` is called with a list of `MediaType` for an endpoint. The declaration of the `onStreamPauseHandler` callback function is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, const int &status)> onStreamPauseHandler;
```

```cpp
stream->onStreamPause([self](const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, const int &status) {
    // fromEndPoint, stream source endpoint UUID
    // mediaTypeList, paused media types
    // status, stream pause status, either 1 (successful) or 0 (failed)
});
```
5. Register onStreamResume callback

- Call the `onStreamResume(onStreamResumeHandler handler)` method of the `Stream` class to register the callback function. You will be notified through this callback when a stream is resumed after `resumeStream()` is called with a list of `MediaType` for an endpoint. The declaration of the `onStreamResumeHandler` callback function is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, const int &status)> onStreamResumeHandler;
```

```cpp
stream->onStreamResume([self](const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, const int &status) {
    // fromEndPoint, stream source endpoint UUID
    // mediaTypeList, resumed media types
    // status, stream resume status, either 1 (successful) or 0 (failed)
});
```
6. Register onStreamConnectivityChange callback

- Call the `onStreamConnectivityChange(onStreamConnectivityChangeHandler handler)` method of the `Stream` class to register the callback function. You will be notified through this callback when the stream's connectivity state has changed. The declaration of the `onStreamConnectivityChangeHandler` callback function is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndpoint, const StreamTransitionState streamTransitionState, const std::string &connectionType)> onStreamConnectivityChangeHandler;
```

```cpp
enum class StreamTransitionState { STOPPED, RECOVERING, RESTARTED, RESTART_FAILED, RETRYING, RUNNING };
```

```cpp
stream->onStreamConnectivityChange([self] (const EndPointUUID &fromEndpoint, const StreamTransitionState streamTransitionState, const std::string &connectionType) {
    // fromEndpoint, stream source endpoint UUID
    // streamTransitionState, newly changed state
    // connectionType, can be P2P or NAPT, as determined by the Connect library
});
```
7. Register onReceivedVideoResolutionChange callback

- Call the `onReceivedVideoResolutionChange(onReceivedVideoResolutionChangeHandler handler)` method of the `Stream` class to register the callback function. You will be notified through this callback when the stream's video resolution has changed. The declaration of the `onReceivedVideoResolutionChangeHandler` callback function is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndpoint, VideoResolution &updatedVideoResolution)> onReceivedVideoResolutionChangeHandler;
```

```cpp
enum class VideoResolution { WH_176x144, WH_192x144, WH_320x240, WH_352x288, WH_640x400, WH_640x480, WH_840x480, WH_848x480, WH_1280x720, WH_1920x1080, WH_3840x2160, WH_7680x4320, WH_15360x8640 };
```

```cpp
stream->onReceivedVideoResolutionChange([self] (const EndPointUUID &fromEndpoint, VideoResolution &videoResolution) {
    // fromEndpoint, stream source endpoint UUID
    // videoResolution, changed video resolution of the stream
});
```
8. Register onReceivedVideoSpecificData callback

If the AI sensor is enabled on the camera device, you can get AI-related information (face detections, car detections, etc.) from a streaming session by registering the following callback function.

- Call the `onReceivedVideoSpecificData(onReceivedVideoSpecificDataHandler handler)` method of the `Stream` class to register the callback function. Bounding box information for AI model objects is received through this callback. The declaration of the `onReceivedVideoSpecificDataHandler` callback function is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndpoint, uint32_t &ssrc, time_t timestamp, const std::string &data)> onReceivedVideoSpecificDataHandler;
```

```cpp
stream->onReceivedVideoSpecificData([self] (const EndPointUUID &fromEndpoint, uint32_t &ssrc, time_t timestamp, const std::string &data) {
    // fromEndpoint, stream source endpoint UUID
    // ssrc, random fixed integer value that identifies the camera
    // timestamp, time stamp of the video frame
    // data, video-specific data
});
```
Here, `data` is a JSON array formatted as a string, representing the AI data of detected objects. Below is a sample `data` value:

```json
[
  {
    "eventtype": "com.anyconnect.vision",
    "label": "Person-1",
    "tag": "person",
    "confidence": 27,
    "minx": 620.312500,
    "miny": 305.156250,
    "maxx": 639.687500,
    "maxy": 341.718750,
    "timestamp": 1586238133
  }
]
```
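On either platform, once you have parsed this array with your JSON library of choice, a common first step is to drop low-confidence detections before drawing overlays. A minimal sketch in Java (the `Detection` class and the threshold value are illustrative, not part of the library):

```java
import java.util.ArrayList;
import java.util.List;

public class DetectionFilter {
    // Mirrors the fields of one element of the AI data JSON array.
    public static class Detection {
        public final String tag;        // class of the detected object, e.g. "person"
        public final String label;      // given name, e.g. "Person-1"
        public final int confidence;    // confidence level reported by the AI model

        public Detection(String tag, String label, int confidence) {
            this.tag = tag;
            this.label = label;
            this.confidence = confidence;
        }
    }

    // Keeps only detections at or above the given confidence threshold.
    public static List<Detection> filter(List<Detection> detections, int minConfidence) {
        List<Detection> kept = new ArrayList<>();
        for (Detection d : detections) {
            if (d.confidence >= minConfidence) {
                kept.add(d);
            }
        }
        return kept;
    }
}
```

The sample detection above has a confidence of 27, so with a threshold of 50 it would be filtered out rather than drawn.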
9. Register onRecordingCompleted callback

- Call the `onRecordingCompleted(onStopRecordingHandler handler)` method of the `Stream` class to register the callback function. The `handler` function is called when the recording is completed. The declaration of the `onStopRecordingHandler` callback function is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndPoint, const int &status)> onStopRecordingHandler;
```

```cpp
stream->onRecordingCompleted([self](const EndPointUUID &fromEndPoint, const int &status) {
    // fromEndPoint, stream source endpoint UUID
    // status, stream recording status, either 1 (successful) or 0 (failed)
});
```
Set Input Sources and Capabilities

Android

This is necessary when you want to stream from the Android device to other endpoints. A viewer-only application does not need to set input sources.

```java
String[] audioInputSources = new String[] {"pcm"};
AnyConnectApi.get().getStream().setInputSources(Stream.MediaType.AUDIO, audioInputSources);

String[] videoInputSources = new String[] {"0"};
AnyConnectApi.get().getStream().setInputSources(Stream.MediaType.VIDEO, videoInputSources);
```
iOS

To initiate streaming from your device to another endpoint, set your device's video input sources in the `Stream` library. You also need to set the capabilities of the corresponding input sources. Setting input sources is optional; however, with the current iOS configuration, you need to set capabilities for video input sources. The default video input source is "0", which means the front camera of your device. Video capabilities include the video codec, video resolution, fps, bit rate, and recording triggers.

The function declarations are:

```cpp
StreamRet setInputSources(MediaType mediaType, const std::vector<std::string> &sourceList, onSetInputSourcesHandler handler);
// mediaType, selected media type: VIDEO or AUDIO
// sourceList, list of input sources to be used by the Stream library in subsequent streaming
// handler, function called on setting input sources

typedef std::function<void(const std::vector<std::string> &inputSourceList, const std::vector<InputSourceStatus> &status)> onSetInputSourcesHandler;

StreamRet setVideoCapabilities(const std::string &inputSource, const VideoCapabilities &videoCapabilities);
```

Code example of setting the video input source and capabilities:

```cpp
std::vector<std::string> videoInputSources;
videoInputSources.push_back("0");
streamRet = stream->setInputSources(MediaType::VIDEO, videoInputSources,
    [self] (const std::vector<std::string> &inputSourceList, const std::vector<InputSourceStatus> &status) {
        // status, list of setting status of input sources
    });
stream->setVideoCapabilities("0", {VideoCodec::H264, VideoResolution::WH_1280x720,
    DEFAULT_VIDEO_FPS, DEFAULT_VIDEO_BITRATE, {RecordingTriggers::SOUND}});
```
Starting a Stream

Android

- To start a streaming session with a device, call the `startStream(String fromEndpoint, MediaType[] mediaPropertyList)` method with proper parameters. An immediate response with the status of this method call is returned. If the status is `OK`, the stream start call succeeded; it fails if wrong parameters are passed.

```java
Stream.StreamRet result = AnyConnectApi.get().getStream().startStream(fromEndpoint, mediaPropertyList);
// fromEndpoint, endpoint id to which you want to start streaming
// mediaPropertyList, list of desired media types in the stream. This is an enum defined in Stream.java; values can be AUDIO, VIDEO and IMAGE
if (result == Stream.StreamRet.OK) {
    // stream start is successful
} else {
    // stream failed to start
}
```

- If the streaming session is established successfully with the Smarter AI Camera device, the `onStreamStart` and `onStreamStartResponse` methods of your registered `StreamStartListener` callback are invoked.

```java
StreamStartListener streamStartListener = new StreamStartListener() {
    @Override
    public void onStreamStart(String toEndpointId, String connectionType) {
        // toEndpointId, endpoint id of the streaming camera
        // connectionType, can be P2P or NAPT, as determined by the Connect library
    }

    @Override
    public void onStreamStartResponse(String withEndpointId, StartStreamFailReason startStreamFailReason, boolean isSuccessful) {
        // withEndpointId, endpointId of the streaming camera
        // startStreamFailReason, in case the stream fails to start, the reason for the failure
        // isSuccessful, indicates whether the stream session was established successfully
    }
};
```
iOS

- To start the stream of a camera with an endpoint, call the `startStream(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, onStreamRequestHandler handler)` method of the `Stream` class. An immediate response with the status of this method call is delivered through the `handler` function. The declaration of `onStreamRequestHandler` is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndPoint, const int &status)> onStreamRequestHandler;
```

- If the `status` is `OK`, call the `attachVideoWindow(const EndPointUUID &fromEndPoint, long long viewId)` method of the `Stream` class to set the video rendering view. The rendering view must be an instance of `UIImageView`.

```cpp
std::vector<MediaType> mediaTypeList;
mediaTypeList.push_back(MediaType::AUDIO);
mediaTypeList.push_back(MediaType::VIDEO);
stream->startStream(endpointID, mediaTypeList, [=](const EndPointUUID &fromEndPoint, const int &status) {
    if (status) {
        void* renderingViewRef = (__bridge void*) videoView; // videoView is of type UIImageView
        stream->attachVideoWindow(fromEndPoint, (long long) renderingViewRef);
    } else {
        // start stream failed
    }
});
```

- After the stream connection has been established, the `onStreamStart` callback function is called to notify you that the stream has started.
Receiving and Processing Frames

Android

After calling `startStream(String fromEndpoint, MediaType[] mediaPropertyList)`, the library calls `onStreamStart(String withEndpointId, String connectionType)` when the connection between the camera and the application is established. From this point on, the library calls the following method in `SipJniWrapper` for every new frame:

```java
static void JHandleEncodedVideoData(byte[] encodedData, int length, boolean bKeyFrame, int iWidth, int iHeight, int SSRC, long endpointId) {
    // encodedData, data that needs to be decoded for rendering
    // length, size of the encoded data array
    // bKeyFrame, indicates whether this is a P-frame or an I-frame
    // iWidth, frame width of the streaming camera
    // iHeight, frame height of the streaming camera
    // SSRC, an integer that represents a streaming source
    // endpointId, endpointId of the streaming camera
}
```

After receiving the data, decode and render it to a `TextureView` or `SurfaceView` as your requirements dictate. To accomplish this, you need a `MediaCodec` to decode and render the frames. A rough implementation of the procedure can resemble the following:

```java
static void JHandleEncodedVideoData(byte[] encodedData, int length, boolean bKeyFrame, int iWidth, int iHeight, int SSRC, long endpointId) {
    // ..........
    decode(encodedData, length);
    render();
}
```

The decoding section is given below:

```java
void decode(byte[] encodedData, int length) {
    int inIndex = mMediaCodec.dequeueInputBuffer(DEQUEUE_INPUT_TIME_OUT_USEC);
    if (inIndex >= 0) {
        ByteBuffer inputBuffer;
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.LOLLIPOP) {
            inputBuffer = mMediaCodec.getInputBuffers()[inIndex];
            inputBuffer.clear();
        } else {
            inputBuffer = mMediaCodec.getInputBuffer(inIndex);
        }
        if (inputBuffer != null) {
            inputBuffer.put(encodedData, 0, length);
            long presentationTimeUs = System.nanoTime() / 1000L;
            mMediaCodec.queueInputBuffer(inIndex, 0, length, presentationTimeUs, 0);
        }
    }
}
```

A sample implementation of rendering is given below:

```java
void render() {
    int outIndex = mMediaCodec.dequeueOutputBuffer(mInfo, DEQUEUE_OUTPUT_TIME_OUT_USEC);
    switch (outIndex) {
        case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
            // .........................................
            break;
        case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
            // .........................................
            break;
        case MediaCodec.INFO_TRY_AGAIN_LATER:
            // .........................................
            break;
        default:
            mMediaCodec.releaseOutputBuffer(outIndex, true);
            break;
    }
}
```

Above, decoding and rendering are performed synchronously. See the [MediaCodec documentation](https://developer.android.com/reference/android/media/MediaCodec#configure(android.media.MediaFormat,%20android.view.Surface,%20android.media.MediaCrypto,%20int)) for details on configuring the codec.
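The `bKeyFrame` flag matters when (re)initializing the decoder: feeding `MediaCodec` P-frames before the first keyframe typically produces artifacts or decode errors, so a common pattern is to discard frames until the first I-frame arrives. If you ever need to derive this flag yourself, and assuming the stream carries H.264 in Annex B format (an assumption, not stated by the library), a minimal sketch:

```java
public class KeyFrameDetector {
    // Scans an H.264 Annex B buffer for an IDR NAL unit (type 5),
    // which marks a keyframe that the decoder can start from.
    public static boolean isKeyFrame(byte[] data) {
        for (int i = 0; i + 3 < data.length; i++) {
            // Match a 3-byte (00 00 01) or 4-byte (00 00 00 01) start code.
            boolean threeByte = data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1;
            boolean fourByte = data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 0
                    && i + 4 < data.length && data[i + 3] == 1;
            if (threeByte || fourByte) {
                int nalStart = threeByte ? i + 3 : i + 4;
                int nalType = data[nalStart] & 0x1F; // low 5 bits of the NAL header
                if (nalType == 5) {
                    return true;
                }
            }
        }
        return false;
    }
}
```

A typical use is to skip calling `decode(...)` until `isKeyFrame(encodedData)` returns true after creating or resetting the codec.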
iOS

The Stream library handles rendering the frames in the view on iOS. This part has been described here.
Pausing a Stream

Android

- To pause a stream, call the `pauseStream(String fromEndpoint, MediaType[] mediaPropertyList)` method with proper parameters.

```java
stream.pauseStream(fromEndpoint, mediaPropertyList);
// fromEndpoint, endpoint id whose streaming you want to pause
// mediaPropertyList, elements of the list can be VIDEO and AUDIO
```

- The response of the `pauseStream()` method is provided through the `onStreamPauseResponse` and `onStreamPause` methods of the registered `StreamPauseListener` callback.

```java
StreamPauseListener streamPauseListener = new StreamPauseListener() {
    @Override
    public void onStreamPauseResponse(String fromEndPoint, MediaType[] mediaTypeList, int status) {
        // invoked immediately after calling the pauseStream method
        // fromEndPoint, endpoint id of the device whose stream is being paused
        // mediaTypeList, elements of the list can be VIDEO and AUDIO
        // status, 1 or 0 for a successful or failed pause, as determined by the Connect library
    }

    @Override
    public void onStreamPause(String fromEndPoint, MediaType[] mediaTypeList, int status) {
        // invoked when the Connect library pauses the stream
        // fromEndPoint, endpointId of the device whose stream got paused
        // mediaTypeList, elements of the list can be VIDEO and AUDIO
        // status, 1 or 0 for a successful or failed pause, as determined by the Connect library
    }
};
```
iOS

- To pause a stream, call the `pauseStream(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, onStreamPauseHandler handler)` method of the `Stream` class. An immediate response with the status of this method call is delivered through the `handler` function. The declaration of `onStreamPauseHandler` is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, const int &status)> onStreamPauseHandler;
```

Example of pausing a stream:

```cpp
std::vector<MediaType> mediaTypeList;
mediaTypeList.push_back(MediaType::VIDEO);
stream->pauseStream(endpointID, mediaTypeList, [=](const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, const int &status) {
    if (status) {
        // call to pause the stream was successful
    }
});
```

- After streaming has been paused, the `onStreamPause` callback function is called to notify you that the stream has been paused.
Resuming a Stream

Android

- To resume a paused stream, call the `resumeStream(String fromEndpoint, MediaType[] mediaPropertyList)` method with proper parameters.

```java
stream.resumeStream(fromEndpoint, mediaPropertyList);
// fromEndpoint, endpoint id whose streaming you want to resume
// mediaPropertyList, elements of the list can be VIDEO and AUDIO
```

- The response of the `resumeStream()` method is provided through the `onStreamResumeResponse` and `onStreamResume` methods of the registered `StreamResumeListener` callback.

```java
StreamResumeListener streamResumeListener = new StreamResumeListener() {
    @Override
    public void onStreamResumeResponse(String fromEndPoint, MediaType[] mediaTypeList, int status) {
        // invoked immediately after calling the resumeStream method
        // fromEndPoint, endpointId of the streaming camera
        // mediaTypeList, elements can be AUDIO and VIDEO
        // status, 1 or 0 indicates success or failure accordingly
    }

    @Override
    public void onStreamResume(String fromEndPoint, MediaType[] mediaTypeList, int status) {
        // invoked when the Connect library resumes the stream
        // fromEndPoint, endpointId of the streaming camera
        // mediaTypeList, elements can be AUDIO and VIDEO
        // status, 1 or 0 indicates success or failure accordingly
    }
};
```
iOS

- To resume a paused stream, call the `resumeStream(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, onStreamResumeHandler handler)` method of the `Stream` class. An immediate response with the status of this method call is delivered through the `handler` function. The declaration of `onStreamResumeHandler` is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, const int &status)> onStreamResumeHandler;
```

Example of resuming a stream:

```cpp
std::vector<MediaType> mediaTypeList;
mediaTypeList.push_back(MediaType::VIDEO);
stream->resumeStream(endpointID, mediaTypeList, [=](const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, const int &status) {
    if (status) {
        // call to resume the stream was successful
    }
});
```

- After streaming has been resumed, the `onStreamResume` callback function is called to notify you that the stream has been resumed.
Stopping a Stream

Android

- To stop a stream, call the `stopStream(String fromEndpoint)` method. This method returns `OK` if all the parameters are valid; otherwise, it responds with `FAIL`.

```java
stream.stopStream(fromEndpoint);
// fromEndpoint, endpoint id of the stream session you want to stop
```

- The response of the `stopStream` method is provided through the `onStreamStopResponse` and `onStreamStop` methods of the registered `StreamStopListener` callback.

```java
StreamStopListener streamStopListener = new StreamStopListener() {
    // ......................................

    @Override
    public void onStreamStopResponse(String withEndpointId, Stream.StreamStopReason reason, boolean status) {
        // invoked immediately after calling the stopStream method
        // withEndpointId, endpoint id of the streaming camera
        // reason, the exact cause of the stream stop
        // status, immediate success or fail status of the call
    }

    @Override
    public void onStreamStop(String withEndpointId, Stream.StreamStopReason reason, boolean status) {
        // invoked when the Connect library stops the stream
        // withEndpointId, endpointId of the streaming camera
        // reason, the exact cause of the stream stop
        // status, immediate success or fail status of the call
    }
};
```

- `onStreamStop()` may also be invoked for many other reasons, such as access being revoked or the internet connection dropping.
iOS

- To stop a stream for an endpoint, call the `stopStream(const EndPointUUID &fromEndPoint, onStreamStopHandler handler)` method of the `Stream` class. An immediate response with the status of this method call is delivered through the `handler` function. The declaration of `onStreamStopHandler` is:

```cpp
typedef std::function<void(const EndPointUUID &fromEndPoint, const int &status)> onStreamStopHandler;
```

Example of stopping a stream:

```cpp
stream->stopStream(endpointID, [](const EndPointUUID &fromEndPoint, const int &status) {
    if (status) {
        // call to stop the stream was successful
    }
});
```

- After the streaming session has been closed for the endpoint, the `onStreamStop` callback function is called to notify you that the stream has been stopped.
Connectivity Change while Streaming

Android

When the Connect library tries to switch networks based on the current network situation (e.g. a connection going up or down, manually switching networks), the connectivity state is reported through the `onStreamConnectivityChange(String fromEndpointId, Stream.StreamTransitionState streamTransitionState, String connectionType)` method of the registered `StreamConnectivityChangeListener` callback.

```java
StreamConnectivityChangeListener streamConnectivityChangeListener = new StreamConnectivityChangeListener() {
    @Override
    public void onStreamConnectivityChange(String fromEndpointId, Stream.StreamTransitionState streamTransitionState, String connectionType) {
        // fromEndpointId, endpointId of the streaming camera
        // streamTransitionState, can be STOPPED, RECOVERING, RESTARTED, RESTART_FAILED, RETRYING or RUNNING
        // connectionType, can be P2P or NAPT, as determined by the Connect library
    }
};
```
iOS
This part is described earlier here
Sending and Receiving Stream Statistics

During streaming, you can display statistics of the ongoing stream. Sending statistics include information such as the connection type, stream setup time, and latency. Receiving statistics include streaming information such as the transfer rate, frame rate, received packet counts, video resolution, video quality level, and AI container name.
Sending Statistics

Android

To get the sending statistics of a running stream, call the `getSendStatistics(String endpointId)` method with the proper parameter.

```java
Stream.StreamStatistics sendStatistics = stream.getSendStatistics(endpointId);
```

On the sendStatistics object, you will get:

```java
// senderQualityFactor, video sending quality
// sampleRate, audio sample rate
```
iOS

To get the sending statistics of a running stream, call the `getSendStatistics(const EndPointUUID &toEndpoint, MediaType mediaType, StreamStatistics::Ptr streamStatistics)` method of the `Stream` class. The stream statistics will be passed through the `streamStatistics` object. Example of fetching the sending statistics of a VIDEO stream:

```cpp
com::anyconnect::stream::StreamStatistics::Ptr streamStatistics(new com::anyconnect::stream::StreamStatistics);
stream->getSendStatistics(endpointID, MediaType::VIDEO, streamStatistics);
```
Receiving Statistics

Android

To get the receiving statistics of a running stream, call the `getReceiveStatistics(String endpointId)` method with the proper parameter.

```java
Stream.StreamStatistics receiveStatistics = stream.getReceiveStatistics(endpointId);
```

On the receiveStatistics object, you will get:

```java
// framesReceived, the number of received frames
// frameRate, received frame rate
// packetsReceived, the number of received packets
// packetsLost, the number of lost packets
// bitRateVideoData, bitrate of the video payload
// bitRateOverHead, bitrate of the stream headers; includes the TCP/IP header, RTP header, video header and header extension
// packetLossRate, packet loss rate
// averageDelay, average delay in milliseconds over the preceding second
// sampleRate, audio sample rate
```
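Some derived values are handy for on-screen diagnostics. For example, a packet loss percentage can be recomputed from the raw counters above, and the header overhead can be expressed as a payload share of the total bitrate. A small helper sketch (the field names mirror the statistics comments above; the exact `StreamStatistics` accessors depend on your library version):

```java
public class StreamStatsMath {
    // Packet loss as a percentage of all packets sent over the interval,
    // guarding against an empty interval.
    public static double packetLossRate(long packetsReceived, long packetsLost) {
        long total = packetsReceived + packetsLost;
        return total == 0 ? 0.0 : 100.0 * packetsLost / total;
    }

    // Fraction of the total bitrate that carries video payload
    // (the rest is TCP/IP, RTP and video header overhead).
    public static double payloadShare(long bitRateVideoData, long bitRateOverHead) {
        long total = bitRateVideoData + bitRateOverHead;
        return total == 0 ? 0.0 : (double) bitRateVideoData / total;
    }
}
```

For instance, 950 received and 50 lost packets yield a 5% loss rate, a level at which you might warn the user about connection quality.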
iOS

To get the receiving statistics of a running stream, call the `getReceiveStatistics(const EndPointUUID &fromEndpoint, MediaType mediaType, StreamStatistics::Ptr streamStatistics)` method of the `Stream` class. The stream statistics will be passed through the `streamStatistics` object. Example of fetching the receiving statistics of a VIDEO stream:

```cpp
com::anyconnect::stream::StreamStatistics::Ptr streamStatistics(new com::anyconnect::stream::StreamStatistics);
stream->getReceiveStatistics(endpointID, MediaType::VIDEO, streamStatistics);
```
Overview
Android
If an AI sensor is enabled on the camera device, you will get AI-related information. The Ai related events are posted to the
onReceivedVideoSpecificData
method of the registeredStreamDataReceiveListener
callback.@Override public void onReceivedVideoSpecificData(String fromEndpoint, int SSRC, long timeStamp, String data) { // fromEndpoint, endpointId of the streaming camera // SSRC, random fixed integer value to specify camera ( endpoint device can have multiple camera ) // timeStamp, time of receiving any frame....0 always... // data, contains AI information in json format ( E.g. co-ordinates of bounding box etc. )... it contains timestamp also }
Here, data is a JSON Array formatted as string representing AI data of multiple detected objects. Below is a sample response of
data
:

[
  {
    "eventtype": "com.anyconnect.vision",
    "label": "Person-1",
    "tag": "person",
    "confidence": 27,
    "minx": 620.312500,
    "miny": 305.156250,
    "maxx": 639.687500,
    "maxy": 341.718750,
    "timestamp": 1586238133
  }
]
From the response, you will get:
tag - Class of the detected object, e.g. person, car.
label - Given name of the detected object.
confidence - Confidence level of the detection by the AI model.
minx/maxx/miny/maxy - Coordinates of the bounding box, relative to a 640 * 480 resolution frame.
timestamp - Time when the data was captured.
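Because the coordinates are reported against a 640 * 480 reference frame, they must be scaled before drawing the box on a view of a different resolution. A minimal sketch (our own helper, not a library API):

```java
import java.util.Arrays;

// Hypothetical helper: scales AI bounding-box coordinates, which the library
// reports against a 640 * 480 reference frame, to the resolution of the view
// where the box is drawn.
public class BoundingBoxScaler {
    static final double REF_WIDTH = 640.0;
    static final double REF_HEIGHT = 480.0;

    static double[] scaleToView(double minx, double miny, double maxx, double maxy,
                                double viewWidth, double viewHeight) {
        double sx = viewWidth / REF_WIDTH;
        double sy = viewHeight / REF_HEIGHT;
        return new double[]{minx * sx, miny * sy, maxx * sx, maxy * sy};
    }

    public static void main(String[] args) {
        // Values from the sample response above, drawn on a 1280 x 960 view.
        double[] box = scaleToView(620.3125, 305.15625, 639.6875, 341.71875, 1280, 960);
        System.out.println(Arrays.toString(box));
    }
}
```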
SSRC
SSRC means streaming source id. When an endpoint has a single streaming source, a fixed default value is returned as the SSRC. When an endpoint has multiple streaming sources, a distinct SSRC is returned for each camera.
Pause/Resume
- When a stream is in the paused state, you will not get any call invocation from the
onReceivedVideoSpecificData(String fromEndpoint, int SSRC, long timeStamp, String data)
callback.
- When a stream is resumed from the paused state, you will get call invocations from the
onReceivedVideoSpecificData(String fromEndpoint, int SSRC, long timeStamp, String data)
callback again, provided the AI sensor is enabled on the camera.
iOS
This part is described earlier here.
Recording a Stream
Start Recording
Android
- To start a recording on a running stream session, call the
startRecording(String fromEndpoint, long referenceTime, long actionTime, long duration, MediaType[] mediaPropertyList, String[] sourceListArray, boolean eventTrigger)
method with proper parameters.

public native StreamRet startRecording(String fromEndpoint, long referenceTime, long actionTime, long duration, MediaType[] mediaPropertyList, String[] sourceListArray, boolean eventTrigger);
// fromEndpoint, endpointId of the streaming camera
// referenceTime, current UTC time in seconds
// actionTime, the starting time of the recording
// duration, the intended duration of the recording; setting it to 0 will cause the recording to continue indefinitely
// mediaPropertyList, elements of the list can be VIDEO and AUDIO
// sourceListArray, endpoint Ids of the cameras for which recording has to be started
// eventTrigger, a boolean denoting whether the recording was event triggered or not
- The response of
startRecording()
will be provided on the onStartRecordingResponse
method of the registered StreamRecordingListener
callback.

@Override
public void onStartRecordingResponse(String toEndpointId, boolean status) {
    // toEndpointId, endpointId of the streaming camera
    // status, immediate success or failure status of the call invocation
}
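An illustrative sketch of choosing the time parameters for startRecording(). Per the parameter descriptions above, referenceTime is the current UTC time in seconds, actionTime is when recording should start, and a duration of 0 records indefinitely; the helper name and the delay value are our own:

```java
// Hedged sketch, not a library API: computes (referenceTime, actionTime,
// duration) for a recording that starts after a fixed delay.
public class RecordingTimes {
    static long[] scheduleRecording(long nowUtcSeconds, long startDelaySeconds, long durationSeconds) {
        long referenceTime = nowUtcSeconds;                  // current UTC time in seconds
        long actionTime = nowUtcSeconds + startDelaySeconds; // when recording should begin
        return new long[]{referenceTime, actionTime, durationSeconds};
    }

    public static void main(String[] args) {
        // Fixed "now" for reproducibility; start in 5 seconds, record for one minute.
        long[] t = scheduleRecording(1586238133L, 5L, 60L);
        System.out.println(t[1] - t[0]); // delay between reference and action time
        System.out.println(t[2]);        // requested duration
    }
}
```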
iOS
- To start recording a stream of a camera, call
startRecording
method of the Stream
class. An immediate response with the status of this function call will be notified through the handler
function. Declaration of
startRecording
is:

bool startRecording(const EndPointUUID &fromEndPoint, time_t referenceTime, time_t actionTime, time_t duration, const std::vector<MediaType> &mediaTypeList, const std::vector<std::string> &sourceList, bool eventTrigger, onStartRecordingHandler handler);
// fromEndPoint, endpointId of the recording camera
// referenceTime, current UTC time in seconds
// actionTime, the starting time of the recording
// duration, the intended duration of the recording; setting it to 0 will cause the recording to continue indefinitely
// mediaTypeList, a list that can contain VIDEO and/or AUDIO
// sourceList, list of source Ids of the endpoint for which recording has to be started
// eventTrigger, a boolean denoting whether the recording was event triggered or not
and declaration of
onStartRecordingHandler
is:

typedef std::function<bool(const EndPointUUID &fromEndPoint, const int &status)> onStartRecordingHandler;
- After the stream recording has been completed, the onRecordingCompleted callback function will be called to notify that the recording is complete.
Stop Recording
Android
- To stop an ongoing recording event, call
stopRecording(String fromEndpoint, MediaType[] mediaPropertyList)
method with proper parameters. This method returns OK immediately if all the parameters are valid to process; otherwise it returns FAIL.

public native StreamRet stopRecording(String fromEndpoint, MediaType[] mediaPropertyList);
// fromEndpoint, endpoint id of the stream session whose recording you want to stop
// mediaPropertyList, elements of the list can be VIDEO and AUDIO
- The response of
stopRecording()
will be provided on the onStopRecordingResponse
method of the registered StreamRecordingListener
callback.

@Override
public void onStopRecordingResponse(String toEndpointId, boolean status) {
    // toEndpointId, endpointId of the streaming camera
    // status, immediate success or failure status of the call invocation
}
iOS
- To stop an ongoing recording, call the
stopRecording(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, onStopRecordingHandler handler)
method of the Stream
class. An immediate response with the status of this function call will be notified through the handler
function. Declaration of
onStopRecordingHandler
is:

typedef std::function<void(const EndPointUUID &fromEndPoint, const int &status)> onStopRecordingHandler;

std::vector<MediaType> mediaTypeList;
mediaTypeList.push_back(MediaType::VIDEO);
stream->stopRecording(endpointID, mediaTypeList, [](const EndPointUUID &fromEndPoint, const int &status) {
    if (status) {
        // call for stop recording is successful
    }
});
- After a successful stop-recording call, the onRecordingCompleted callback function will be called to notify that the recording is complete.
Set Recording Buffer
Android
Buffering time can be set using the following method.
public native StreamRet setRecordingBuffer(int bufferTime, int duration);
// bufferTime, the time of the recording buffer in seconds
// duration, the duration of the recording in seconds
iOS
Buffering time can be set using the following method.
void setRecordingBuffer(int bufferTime, int duration);
// bufferTime, the time of the recording buffer in seconds
// duration, the duration of the recording in seconds
Get Recording List
Android
- To get a recording list, call
getRecordingList(String fromEndpoint, long startTime, long endTime)
method with proper parameters.

public native StreamRet getRecordingList(String fromEndpoint, long startTime, long endTime);
// fromEndpoint, endpointId of the streaming camera
// startTime, initiation time of the earliest recording to be received
// endTime, initiation time of the latest recording to be received
- The recording list will be provided on the
onReceivedRecordingListResponse()
method of the registered StreamRecordingListener
callback.

@Override
public void onReceivedRecordingListResponse(String fromEndPoint, HashMap<MediaType, String[]> recordingMediaTypeMap) {
    // fromEndPoint, endpointId of the streaming camera
    // recordingMediaTypeMap, keys can be AUDIO or VIDEO; the value for each key is the list of URL links of the corresponding recorded AUDIO or VIDEO
}
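A sketch of consuming the recordingMediaTypeMap delivered to this callback. The MediaType enum stub and the sample URL below are our own, for illustration only:

```java
import java.util.HashMap;

// Hedged sketch: pick the first recorded-video URL out of the map the
// library passes to onReceivedRecordingListResponse().
public class RecordingListDemo {
    enum MediaType { AUDIO, VIDEO } // stand-in for the library's MediaType

    static String firstVideoUrl(HashMap<MediaType, String[]> recordingMediaTypeMap) {
        String[] videoUrls = recordingMediaTypeMap.get(MediaType.VIDEO);
        return (videoUrls != null && videoUrls.length > 0) ? videoUrls[0] : null;
    }

    public static void main(String[] args) {
        HashMap<MediaType, String[]> map = new HashMap<>();
        map.put(MediaType.VIDEO, new String[]{"https://example.com/recordings/cam1.mp4"});
        System.out.println(firstVideoUrl(map));
    }
}
```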
iOS
- To get recording list, call the
getRecordingList
method of the Stream
class. The recording list will be provided through the handler
function. Declaration of
getRecordingList
is:

void getRecordingList(const EndPointUUID &fromEndPoint, std::time_t startTime, std::time_t endTime, onRecordingListHandler handler);
// fromEndPoint, stream source endpoint UUID
// startTime, initiation time of the earliest recording to be received
// endTime, initiation time of the latest recording to be received
// handler, function called when the recording list has been retrieved
and declaration of
onRecordingListHandler
is:

typedef std::function<void(const EndPointUUID &fromEndPoint, const std::map<MediaType, const std::vector<RecordingURL>> &recordingList)> onRecordingListHandler;
// fromEndPoint, stream source endpoint UUID
// recordingList, a map of recording URLs per media type (VIDEO, AUDIO)
Multiple Streaming
You can stream multiple cameras at the same time using the Stream
library. To start and control multiple streams, call the stream-related functions described above individually with the endpointID
of each camera you want to stream.
Starting Multiple Streaming
Android
To start multiple streaming, call
startStream(String fromEndpoint, MediaType[] mediaPropertyList)
with all the endpointIds
of the cameras that you want to stream.

for (String id : endpointIds) {
    stream.startStream(id, mediaPropertyList);
}
You will receive the same responses as for single-camera streaming, one for each
endpointId
for which the call succeeded.
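When streaming several cameras, it helps to track which start calls actually succeeded. The class below is hypothetical bookkeeping (not a library class) mirroring the isSuccessful flag that onStreamStartResponse() reports per endpoint:

```java
import java.util.HashSet;
import java.util.Set;

// Hedged sketch: remember the endpointIds whose start-stream calls succeeded,
// so later pause/resume/stop loops can iterate only over active streams.
public class ActiveStreamTracker {
    private final Set<String> activeEndpoints = new HashSet<>();

    // Would be called from the registered StreamStartListener's response callback.
    void onStartResponse(String toEndpointId, boolean isSuccessful) {
        if (isSuccessful) {
            activeEndpoints.add(toEndpointId);
        }
    }

    public static void main(String[] args) {
        ActiveStreamTracker tracker = new ActiveStreamTracker();
        tracker.onStartResponse("camera-1", true);
        tracker.onStartResponse("camera-2", false); // failed start: not tracked
        System.out.println(tracker.activeEndpoints.size());
    }
}
```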
iOS
To start multiple streams, call
startStream(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, onStreamRequestHandler handler)
method of the Stream
class individually, as described in this section, with all the endpointIDs
of those cameras you want to stream. The onStreamStart callback function will be called for each endpoint.
Pausing Multiple Streaming
Android
To pause multiple streaming, call
pauseStream(String fromEndpoint, MediaType[] mediaPropertyList)
with all the endpointIds
of the cameras that you want to pause.

for (String id : endpointIds) {
    stream.pauseStream(id, mediaPropertyList);
}
You will receive the same responses as for single-camera pausing, one for each
endpointId
for which the call succeeded.
iOS
To pause multiple streams, call
pauseStream(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, onStreamPauseHandler handler)
method of the Stream
class individually, as described in this section, with all the endpointIDs
of the started streams. The onStreamPause callback function will be called for each endpoint.
Resuming Multiple Streaming
Android
You can resume multiple camera streaming at the same time with Smarter AI
Stream
library. To resume multiple streams, call resumeStream(String fromEndpoint, MediaType[] mediaPropertyList)
with all the endpointIds
of the cameras that you want to resume.

for (String id : endpointIds) {
    stream.resumeStream(id, mediaPropertyList);
}
You will receive the same responses as for single-camera resuming, one for each
endpointId
for which the call succeeded.
iOS
To resume multiple streams, call
resumeStream(const EndPointUUID &fromEndPoint, const std::vector<MediaType> &mediaTypeList, onStreamResumeHandler handler)
method of the Stream
class individually, as described in this section, with all the endpointIDs
of the paused streams. The onStreamResume callback function will be called for each endpoint.
Stopping Multiple Streaming
Android
You can stop multiple camera streaming at the same time with Smarter AI
Stream
library. To stop multiple streams, call the stopStream(String fromEndpoint)
method of the Stream
class with all the endpointIds
of the started streams.

for (String id : endpointIds) {
    stream.stopStream(id);
}
You will receive the same responses as for single-camera stopping, one for each
endpointId
for which the call succeeded.
iOS
To stop multiple streams, call
stopStream(const EndPointUUID &fromEndPoint, onStreamStopHandler handler)
method of the Stream
class individually, as described in this section, with all the endpointIDs
of the started streams. The onStreamStop callback function will be called for each endpoint.
Receiving and Processing Frames (Android)
The methods of receiving and processing frames are discussed here. To decode and render data from multiple cameras, the following implementation can be used.
static final HashMap<Long, StreamDataCommand> streamDataCommandMap = new HashMap<>();

static void JHandleEncodedVideoData(byte[] encodedData, int length, boolean bKeyFrame, int iWidth, int iHeight, int SSRC, long endpointId) {
    streamDataCommandMap.get(endpointId).decode(encodedData, length);
    streamDataCommandMap.get(endpointId).render();
}

class StreamDataCommand {
    private static final String MIMETYPE_VIDEO_AVC = "video/avc";
    private static final long DEQUEUE_INPUT_TIME_OUT_USEC = 10000;
    private static final long DEQUEUE_OUTPUT_TIME_OUT_USEC = 10000;
    private MediaCodec mMediaCodec;
    private MediaFormat mMediaFormat;
    private final BufferInfo mInfo = new BufferInfo();

    public void configure(Surface mSurface, int iWidth, int iHeight) throws IOException {
        mMediaFormat = MediaFormat.createVideoFormat(MIMETYPE_VIDEO_AVC, iWidth, iHeight);
        mMediaCodec = MediaCodec.createDecoderByType(MIMETYPE_VIDEO_AVC);
        mMediaCodec.configure(mMediaFormat, mSurface, null, 0);
        mMediaCodec.start();
    }

    public void decode(byte[] data, int length) {
        int inIndex = mMediaCodec.dequeueInputBuffer(DEQUEUE_INPUT_TIME_OUT_USEC);
        if (inIndex >= 0) {
            ByteBuffer inputBuffer;
            if (Build.VERSION.SDK_INT < Build.VERSION_CODES.LOLLIPOP) {
                inputBuffer = mMediaCodec.getInputBuffers()[inIndex];
                inputBuffer.clear();
            } else {
                inputBuffer = mMediaCodec.getInputBuffer(inIndex);
            }
            if (inputBuffer != null) {
                inputBuffer.put(data, 0, length);
                long presentationTimeUs = System.nanoTime() / 1000L;
                mMediaCodec.queueInputBuffer(inIndex, 0, length, presentationTimeUs, 0);
            }
        }
    }

    public void render() {
        int outIndex = mMediaCodec.dequeueOutputBuffer(mInfo, DEQUEUE_OUTPUT_TIME_OUT_USEC);
        switch (outIndex) {
            case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                // ........................................
                break;
            case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                // .........................................
                break;
            case MediaCodec.INFO_TRY_AGAIN_LATER:
                // .........................................
                break;
            default:
                // A decoded frame is ready; render it to the configured Surface.
                mMediaCodec.releaseOutputBuffer(outIndex, true);
                break;
        }
    }
}
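The per-endpoint dispatch above can be exercised off-device with a stripped-down, runnable sketch. FrameSink stands in for StreamDataCommand (which requires a real MediaCodec), and the endpointIds and frame bytes are our own illustrative values:

```java
import java.util.HashMap;

// Minimal sketch of the dispatch pattern: one decoder object per camera
// endpointId, looked up for every incoming encoded frame.
public class DispatchDemo {
    interface FrameSink { void onFrame(byte[] data, int length); }

    static final HashMap<Long, FrameSink> sinks = new HashMap<>();
    static int framesHandled = 0;

    static void handleEncodedVideoData(byte[] encodedData, int length, long endpointId) {
        FrameSink sink = sinks.get(endpointId);
        if (sink != null) {
            sink.onFrame(encodedData, length);
        }
    }

    public static void main(String[] args) {
        sinks.put(42L, (data, len) -> framesHandled++);
        byte[] frame = {0x00, 0x00, 0x00, 0x01}; // dummy NAL start code
        handleEncodedVideoData(frame, frame.length, 42L);
        handleEncodedVideoData(frame, frame.length, 99L); // unknown endpoint: ignored
        System.out.println(framesHandled);
    }
}
```

The null check on the map lookup avoids a NullPointerException for frames from endpoints that were never configured, which the original sketch would throw.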