Audio and Video

Last Updated on: 2024-04-15 04:33:48

The IPC SDK provides a range of audio and video capabilities in addition to live streaming and playback from the SD card.

Local recording

During live streaming or record playback, the ongoing video can be recorded on the mobile phone.

API description

Record videos and save them to the album of the mobile phone.

- (void)startRecord;

API description

Record videos and save them to the specified location.

- (void)startRecordWithFilePath:(NSString *)filePath;

Parameters

Parameter Description
filePath The file path where the recording is saved as an MP4 file suffixed with .mp4.
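
For the custom-path variant, the following is a minimal Swift sketch, assuming self.camera is a connected ThingSmartCameraType object, that startRecordWithFilePath: bridges to the Swift name shown, and a hypothetical file name:

func startRecordToCustomPath() {
    // Recording can start only during live streaming or playback.
    guard self.isPreviewing || self.isPlaybacking, !self.isRecording else {
        return
    }
    // Build an .mp4 path in the app sandbox (hypothetical file name).
    let documents = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first!
    let filePath = (documents as NSString).appendingPathComponent("ipc_record.mp4")
    self.camera.startRecord(withFilePath: filePath)
    self.isRecording = true
}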

API description

Stop recording and save the records.

- (void)stopRecord;

API description

The delegate callback to be invoked when the video recording is started.

- (void)cameraDidStartRecord:(id<ThingSmartCameraType>)camera;

API description

The delegate callback to be invoked when video recording is stopped and records are saved.

- (void)cameraDidStopRecord:(id<ThingSmartCameraType>)camera;

The call to start or stop recording might fail. In this case, the delegate method - (void)camera:didOccurredErrorAtStep:specificErrorCode:extErrorCodeInfo: returns the error message.

Example

Objective-C:

- (void)startRecord {
    if (self.isRecording) {
        return;
    }
    // Video recording can be enabled only during video streaming.
    if (self.previewing || self.playbacking) {
        [self.camera startRecord];
        self.recording = YES;
    }
}

- (void)stopRecord {
    if (self.isRecording) {
        [self.camera stopRecord];
        self.recording = NO;
    }
}

- (void)cameraDidStartRecord:(id<ThingSmartCameraType>)camera {
    // Video recording has started. Update the UI.
}

- (void)cameraDidStopRecord:(id<ThingSmartCameraType>)camera {
    // Video recording has stopped and the records are saved. Update the UI.
}

- (void)camera:(id<ThingSmartCameraType>)camera didOccurredErrorAtStep:(ThingCameraErrorCode)errStepCode specificErrorCode:(NSInteger)errorCode {
    // Failed to start or stop video recording.
    if (errStepCode == Thing_ERROR_RECORD_FAILED) {
        self.recording = NO;
    }
}

- (void)camera:(id<ThingSmartCameraType>)camera didOccurredErrorAtStep:(ThingCameraErrorCode)errStepCode specificErrorCode:(NSInteger)errorCode extErrorCodeInfo:(id<ThingSmartCameraExtErrorCodeInfo>)extErrorCodeInfo {
    // Failed to start or stop video recording.
    if (errStepCode == Thing_ERROR_RECORD_FAILED) {
        self.recording = NO;
    }
}

Swift:

func startRecord() {
    if self.isRecording {
        return
    }
    // Video recording can be enabled only during video streaming.
    guard self.isPreviewing || self.isPlaybacking else {
        return
    }
    self.camera.startRecord()
    self.isRecording = true
}

func stopRecord() {
    guard self.isRecording else {
        return
    }
    self.camera.stopRecord()
    self.isRecording = false
}

func cameraDidStartRecord(_ camera: ThingSmartCameraType!) {
    // Video recording has started. Update the UI.
}

func cameraDidStopRecord(_ camera: ThingSmartCameraType!) {
    // Video recording has stopped and the records are saved. Update the UI.
}

func camera(_ camera: ThingSmartCameraType!, didOccurredErrorAtStep errStepCode: ThingCameraErrorCode, specificErrorCode errorCode: Int) {
    // Failed to start or stop video recording.
    if errStepCode == Thing_ERROR_RECORD_FAILED {
        self.isRecording = false
    }
}

func camera(_ camera: ThingSmartCameraType!, didOccurredErrorAtStep errStepCode: ThingCameraErrorCode, specificErrorCode errorCode: Int, extErrorCodeInfo: ThingSmartCameraExtErrorCodeInfo!) {
    // Failed to start or stop video recording.
    if errStepCode == Thing_ERROR_RECORD_FAILED {
        self.isRecording = false
    }
}

  • During video recording, do not switch between video definition modes, toggle the audio channel, or start or stop live video talk. Otherwise, the recording might fail. We recommend disabling the controls for video definition, audio, and live video talk while recording is in progress.
  • The saved video might be shorter than the recording duration in certain conditions, for example, if the video keeps freezing or because frames are saved starting from the key frame after the start point. The recording duration is provided for reference only.

Video screenshot

During live streaming or record playback, users can take a screenshot from the ongoing video. Three methods are available: two are provided by the ThingSmartCameraType object, and one by the video rendering view ThingSmartVideoType.

API description

Take a screenshot from the video and save it to the album of the mobile phone.

- (UIImage *)snapShoot;

Return value

Type Description
UIImage The UIImage object of the video screenshot. If nil is returned, the system fails to save the screenshot.

API description

Take a screenshot from the video and save it to the specified location. This API method is not supported by P2P 1.0 devices.

- (UIImage *)snapShootSavedAtPath:(NSString *)filePath thumbnilPath:(NSString *)thumbnilPath;

Parameters

Parameter Description
filePath The path in which the screenshot is stored.
thumbnilPath The path to save the thumbnail. Set the value to nil if this parameter is not required.

Return value

Type Description
UIImage The UIImage object of the video screenshot. If nil is returned, the system fails to save the screenshot.

API description

The screenshot method of the video rendering view ThingSmartVideoType only returns the UIImage object without automatically saving the image.

- (UIImage *)screenshot;

Return value

Type Description
UIImage The UIImage object of the video screenshot. If nil is returned, the screenshot failed.

Example

Objective-C:

- (void)snapShoot {
    // Screenshots can be captured only during video playing.
    if (self.previewing || self.playbacking) {
        if ([self.camera snapShoot]) {
            // Saves screenshots to the album of a mobile phone.
        }
    }
}

Swift:

func snapShoot() {
    guard self.isPreviewing || self.isPlaybacking else {
        return
    }
    if let _ = self.camera.snapShoot() {
        // Saves screenshots to the album of a mobile phone.
    }
}

Before taking a screenshot, ensure the app is allowed to access the mobile phone’s album. Failure to do so may result in the app crashing.
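
To save a screenshot to a custom location instead of the album, the following is a minimal Swift sketch, assuming snapShootSavedAtPath:thumbnilPath: bridges to the Swift name shown and using a hypothetical file name:

func snapShootToCustomPath() {
    // Screenshots can be captured only during video playing.
    guard self.isPreviewing || self.isPlaybacking else {
        return
    }
    let documents = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first!
    let imagePath = (documents as NSString).appendingPathComponent("ipc_snapshot.jpg")
    // Pass nil for the thumbnail path if no thumbnail is needed.
    if let _ = self.camera.snapShootSaved(atPath: imagePath, thumbnilPath: nil) {
        // The screenshot was saved to imagePath.
    }
}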

Video sound

During live streaming or record playback, users can mute or unmute the video sound. The video is muted by default.

API description

Mute or unmute the video sound.

- (void)enableMute:(BOOL)mute forPlayMode:(ThingSmartCameraPlayMode)playMode;

Parameters

Parameter Description
mute Specify whether to mute the video.
  • YES: Mute
  • NO: Unmute
playMode The current playing mode.

API description

The delegate callback for muting or unmuting the video sound.

- (void)camera:(id<ThingSmartCameraType>)camera didReceiveMuteState:(BOOL)isMute playMode:(ThingSmartCameraPlayMode)playMode;

Parameters

Parameter Description
camera The Camera object that mutes or unmutes the video sound.
isMute The current mute state.
playMode The current playing mode.

API description

Switch between the speaker and earpiece modes. If the return value is not 0, the switching operation fails. This API method is not supported by P2P 1.0 devices.

- (int)enableSpeaker:(BOOL)enabled;

Parameters

Parameter Description
enabled Specifies the sound output mode.
  • YES: Switch to the speaker mode.
  • NO: Switch to the earpiece mode.

API description

Get the current sound output mode: YES for the speaker and NO for the earpiece. This API method is not supported by P2P 1.0 devices.

- (BOOL)speakerEnabled;

When switching between live streaming and video playback, the IPC SDK does not maintain a separate mute state for each playing mode; the current mute state simply carries over.

  • For example, if you switch from live streaming with the sound on to video playback, the sound is still on.
  • If you then mute the sound and switch back to live streaming, the sound remains muted.

Therefore, adjust the sound settings as needed after switching between playing modes.
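
The following is a minimal Swift sketch of adjusting the sound, assuming self.camera is a ThingSmartCameraType object in live-streaming mode and that enableMute:forPlayMode:, enableSpeaker:, and speakerEnabled bridge to the Swift names shown:

func toggleSound() {
    // Request the opposite of the current mute state. The result is reported
    // through camera(_:didReceiveMuteState:playMode:).
    self.camera.enableMute(!self.isMuted, for: .preview)
}

func switchOutput(toSpeaker useSpeaker: Bool) {
    // A non-zero return value means the switch failed.
    // Not supported by P2P 1.0 devices.
    if self.camera.enableSpeaker(useSpeaker) == 0 {
        let usingSpeaker = self.camera.speakerEnabled()
        print("Sound output: \(usingSpeaker ? "speaker" : "earpiece")")
    }
}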

Video bitrate

Get the live video bitrate after streaming starts.

API description

Get the live video bitrate.

- (double)getVideoBitRateKBPS;
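
The following is a minimal Swift sketch that polls the bitrate once per second after live streaming has started, assuming self.camera is a ThingSmartCameraType object and that getVideoBitRateKBPS bridges to the Swift name shown:

func startBitrateMonitor() {
    Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] timer in
        guard let self = self, self.isPreviewing else {
            timer.invalidate()
            return
        }
        // The bitrate of the live video, in KBPS.
        let kbps = self.camera.getVideoBitRateKBPS()
        print(String(format: "Live video bitrate: %.1f KBPS", kbps))
    }
}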

Live talk

After a P2P connection is created, users can initiate a talk to the visitor using the app. Before the talk, the app must be allowed to access the microphone of the mobile phone.

Check support for live talk

If the device is equipped with a speaker, it supports one-way talk. If the device is equipped with both a speaker and a pickup, it supports two-way talk. You can query device capabilities to check whether the device supports a speaker or pickup.
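
A minimal Swift sketch of this check, assuming deviceModel is the ThingSmartDeviceModel of the target IPC and that cameraAbilityWithDeviceModel: (see Device capabilities below) bridges to the initializer shown:

// Create the capability object after the P2P connection is created.
let ability = ThingSmartCameraAbility(deviceModel: deviceModel)
if ability.isSupportSpeaker && ability.isSupportPickup {
    // Speaker and pickup: the device supports two-way talk.
} else if ability.isSupportSpeaker {
    // Speaker only: the device supports one-way talk.
} else {
    // No speaker: live talk is not supported.
}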

Two-way talk

During live streaming, the audio of the video consists of the human voices and ambient sounds collected by the IPC in real time. Turning on the audio channel between the app and the IPC enables a two-way talk.

IPCs without speakers or pickups do not support two-way talk.

One-way talk

The control of one-way talk is subject to your implementation. Mute the video when the one-way talk starts, and unmute it when the talk ends.

API description

Turn on the audio channel from the app to the IPC.

- (void)startTalk;

API description

Turn off the audio channel from the app to the IPC.

- (void)stopTalk;

API description

The delegate callback to be invoked when the audio channel from the app to the IPC is turned on.

- (void)cameraDidBeginTalk:(id<ThingSmartCameraType>)camera;

API description

The delegate callback to be invoked when the audio channel from the app to the IPC is turned off.

- (void)cameraDidStopTalk:(id<ThingSmartCameraType>)camera;

Example

The following example of one-way talk shows the API calls for the audio switch and live talk.

Objective-C:

- (void)startTalk {
    [self.camera startTalk];
    // For one-way talk: mutes the video sound if it is not already muted.
    if (!self.isMuted) {
        [self.camera enableMute:YES forPlayMode:ThingSmartCameraPlayModePreview];
    }
}

- (void)stopTalk {
    [self.camera stopTalk];
}

- (void)cameraDidBeginTalk:(id<ThingSmartCameraType>)camera {
    // The live talk has started.
}

- (void)cameraDidStopTalk:(id<ThingSmartCameraType>)camera {
    // The live talk has stopped.
    // Unmutes the video sound if it was muted for the talk.
    if (self.isMuted) {
        [self.camera enableMute:NO forPlayMode:ThingSmartCameraPlayModePreview];
    }
}

- (void)camera:(id<ThingSmartCameraType>)camera didReceiveMuteState:(BOOL)isMute playMode:(ThingSmartCameraPlayMode)playMode {
        // Receives the audio status changes and updates the UI.
        self.isMuted = isMute;
}

- (void)camera:(id<ThingSmartCameraType>)camera didOccurredErrorAtStep:(ThingCameraErrorCode)errStepCode specificErrorCode:(NSInteger)errorCode {
    if (errStepCode == Thing_ERROR_START_TALK_FAILED) {
            // Failed to start video talk. Enable the audio channel again.
                if (self.isMuted) {
                [self.camera enableMute:NO forPlayMode:ThingSmartCameraPlayModePreview];
            }
    }
    else if (errStepCode == Thing_ERROR_ENABLE_MUTE_FAILED) {
                // Failed to enable the muted state.
    }
}

- (void)camera:(id<ThingSmartCameraType>)camera didOccurredErrorAtStep:(ThingCameraErrorCode)errStepCode specificErrorCode:(NSInteger)errorCode extErrorCodeInfo:(id<ThingSmartCameraExtErrorCodeInfo>)extErrorCodeInfo{
    if (errStepCode == Thing_ERROR_START_TALK_FAILED) {
            // Failed to start video talk. Enable the audio channel again.
                if (self.isMuted) {
                [self.camera enableMute:NO forPlayMode:ThingSmartCameraPlayModePreview];
            }
    }
    else if (errStepCode == Thing_ERROR_ENABLE_MUTE_FAILED) {
                // Failed to enable the muted state.
    }
}

Swift:

// One-way talk during live streaming.
func startTalk() {
    self.camera.startTalk()
    // Mutes the video sound if it is not already muted.
    if !self.isMuted {
        self.camera.enableMute(true, for: .preview)
    }
}

func stopTalk() {
    self.camera.stopTalk()
}

func cameraDidBeginTalk(_ camera: ThingSmartCameraType!) {
    // The live talk has started.
}

func cameraDidStopTalk(_ camera: ThingSmartCameraType!) {
    // The live talk has stopped.
    // Unmutes the video sound if it was muted for the talk.
    if self.isMuted {
        self.camera.enableMute(false, for: .preview)
    }
}

func camera(_ camera: ThingSmartCameraType!, didReceiveMuteState isMute: Bool, playMode: ThingSmartCameraPlayMode) {
    self.isMuted = isMute
    // Receives the audio status changes and updates the UI.
}

func camera(_ camera: ThingSmartCameraType!, didOccurredErrorAtStep errStepCode: ThingCameraErrorCode, specificErrorCode errorCode: Int) {
    if errStepCode == Thing_ERROR_START_TALK_FAILED {
        // Failed to start video talk. Enable the audio channel again.
        self.camera.enableMute(false, for: .preview)
    } else if errStepCode == Thing_ERROR_ENABLE_MUTE_FAILED {
        // Failed to enable the muted state.
    }
}

func camera(_ camera: ThingSmartCameraType!, didOccurredErrorAtStep errStepCode: ThingCameraErrorCode, specificErrorCode errorCode: Int, extErrorCodeInfo: ThingSmartCameraExtErrorCodeInfo!) {
    if errStepCode == Thing_ERROR_START_TALK_FAILED {
        // Failed to start video talk. Enable the audio channel again.
        self.camera.enableMute(false, for: .preview)
    } else if errStepCode == Thing_ERROR_ENABLE_MUTE_FAILED {
        // Failed to enable the muted state.
    }
}

Switch between video definitions

Users can switch between video definitions during live streaming. Currently, only high definition (HD) and standard definition (SD) are available, and switching is supported only for live streaming. Some IPCs support only one definition, and audio-only mode requires device support. Footage stored on the SD card uses only one video definition.

API description

Get the definition of the current video. The result is returned through the delegate callback.

- (void)getHD;

API description

Switch between video definitions, YES for HD and NO for SD.

- (void)enableHD:(BOOL)hd;

Parameters

Parameter Description
hd Specifies whether the video is in HD.
  • YES: HD
  • NO: SD

API description

The delegate callback to be invoked when the video definition is changed.

- (void)camera:(id<ThingSmartCameraType>)camera didReceiveDefinitionState:(BOOL)isHd;

Parameters

Parameter Description
camera The Camera object that switches between video definitions.
isHd The current definition:
  • YES: HD
  • NO: SD

The preceding API methods are supported by P2P 1.0 devices. However, in v3.20.0, these API methods are deprecated and replaced with the following three API methods that are not supported by P2P 1.0 devices.

API description

Get the video definition.

- (void)getDefinition;

API description

Set the video definition.

- (void)setDefinition:(ThingSmartCameraDefinition)definition;

API description

The delegate callback to be invoked when the video definition is changed.

- (void)camera:(id<ThingSmartCameraType>)camera definitionChanged:(ThingSmartCameraDefinition)definition;

Enum values of ThingSmartCameraDefinition

Value Description
ThingSmartCameraDefinitionProflow Bandwidth efficient streaming
ThingSmartCameraDefinitionStandard Standard definition (SD)
ThingSmartCameraDefinitionHigh High definition (HD)
ThingSmartCameraDefinitionSuper Ultra HD
ThingSmartCameraDefinitionSSuper Super ultra HD
ThingSmartCameraDefinitionAudioOnly Audio-only mode

The supported definitions vary depending on the device. Ordinary devices currently only support SD and HD.

Example

Objective-C:

- (void)changeHD {
        ThingSmartCameraDefinition definition = self.HD ? ThingSmartCameraDefinitionStandard : ThingSmartCameraDefinitionHigh;
        [self.camera setDefinition:definition];
}

// The delegate callback to invoke after the video resolution is changed. It is also executed when live video streaming or video playback is started.
- (void)camera:(id<ThingSmartCameraType>)camera resolutionDidChangeWidth:(NSInteger)width height:(NSInteger)height {
        // Returns the current definition mode.
        [self.camera getDefinition];
}

// The delegate callback to invoke when the video definition mode is changed.
- (void)camera:(id<ThingSmartCameraType>)camera definitionChanged:(ThingSmartCameraDefinition)definition {
    self.HD = definition >= ThingSmartCameraDefinitionHigh;
}

- (void)camera:(id<ThingSmartCameraType>)camera didOccurredErrorAtStep:(ThingCameraErrorCode)errStepCode specificErrorCode:(NSInteger)errorCode {
    if (errStepCode == Thing_ERROR_ENABLE_HD_FAILED) {
                // Failed to switch between the video definition modes.
    }
}

- (void)camera:(id<ThingSmartCameraType>)camera didOccurredErrorAtStep:(ThingCameraErrorCode)errStepCode specificErrorCode:(NSInteger)errorCode extErrorCodeInfo:(id<ThingSmartCameraExtErrorCodeInfo>)extErrorCodeInfo {
    if (errStepCode == Thing_ERROR_ENABLE_HD_FAILED) {
                // Failed to switch between the video definition modes.
    }
}

Swift:

func changeHD() {
    let definition = self.isHD ? ThingSmartCameraDefinition.standard : ThingSmartCameraDefinition.high
    self.camera.setDefinition(definition)
}

// The delegate callback to invoke after the video resolution is changed. It is also executed when live video streaming or video playback is started.
func camera(_ camera: ThingSmartCameraType!, resolutionDidChangeWidth width: Int, height: Int) {
    // Returns the current definition mode.
    self.camera.getDefinition()
}

func camera(_ camera: ThingSmartCameraType!, definitionChanged definition: ThingSmartCameraDefinition) {
        self.isHD = definition.rawValue >= ThingSmartCameraDefinition.high.rawValue
}

func camera(_ camera: ThingSmartCameraType!, didOccurredErrorAtStep errStepCode: ThingCameraErrorCode, specificErrorCode errorCode: Int) {
    if errStepCode == Thing_ERROR_ENABLE_HD_FAILED {
        // Failed to switch between the video definition modes.
    }
}

func camera(_ camera: ThingSmartCameraType!, didOccurredErrorAtStep errStepCode: ThingCameraErrorCode, specificErrorCode errorCode: Int, extErrorCodeInfo: ThingSmartCameraExtErrorCodeInfo!) {
    if errStepCode == Thing_ERROR_ENABLE_HD_FAILED {
        // Failed to switch between the video definition modes.
    }
}

Raw stream data

The IPC SDK provides a callback that returns raw stream data, including the YUV data of video frames. YUV 420SP is used as the color encoding format, which corresponds to kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange on iOS.

API description

The delegate callback for the raw stream data.

- (void)camera:(id<ThingSmartCameraType>)camera thing_didReceiveVideoFrame:(CMSampleBufferRef)sampleBuffer frameInfo:(ThingSmartVideoFrameInfo)frameInfo;

Parameters

Parameter Description
camera The Camera object that receives video data.
sampleBuffer The YUV data of video frames.
frameInfo The information about video frames.

ThingSmartVideoFrameInfo struct

Field Type Description
nWidth int The width of video images.
nHeight int The height of video images.
nFrameRate int The frame rate of the video.
nTimeStamp unsigned long long The timestamp of a video frame.
nDuration unsigned long long The total duration of the video attached to an alert, in milliseconds.
nProgress unsigned long long The time point of a video frame in the video attached to an alert, in milliseconds.

You can render video images with your own method, or further process video images. For this purpose, set the autoRender property of the ThingSmartCameraType object to NO, and implement this delegate method. This way, the IPC SDK does not automatically render video images.

You can cast sampleBuffer to CVPixelBufferRef. To process the video frame data asynchronously, remember to retain the pixel buffer. Otherwise, the frame data is released after the delegate method returns, which causes a dangling-pointer crash during asynchronous processing.

Example

Objective-C:

- (void)camera:(id<ThingSmartCameraType>)camera thing_didReceiveVideoFrame:(CMSampleBufferRef)sampleBuffer frameInfo:(ThingSmartVideoFrameInfo)frameInfo {
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)sampleBuffer;
    // Retains pixelBuffer to avoid unexpected release.
    CVPixelBufferRetain(pixelBuffer);
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Processes and renders pixelBuffer.
        // ...
        // Releases pixelBuffer.
        CVPixelBufferRelease(pixelBuffer);
    });
}

Swift:

func camera(_ camera: ThingSmartCameraType!, thing_didReceiveVideoFrame sampleBuffer: CMSampleBuffer!, frameInfo: ThingSmartVideoFrameInfo) {
        // Processes and renders video data.
}

Intelligent video analytics

If the IPC detects an object in motion during live streaming, the intelligent video analytics (IVA) feature allows the object to be framed automatically in white on the view.

To achieve this purpose, the IVA feature must be enabled for the IPC first. The device will then report the coordinates of the object along with the video frames. You can use the data point (DP) ID 198 (ipc_object_outline) to enable this feature. For more information about device control API methods, see Device Control.

After IVA is enabled for the IPC, this feature must also be enabled for the IPC SDK during live streaming. This allows the SDK to frame the object in white on the view based on the received coordinates of the object.

The details of the IVA DPs are beyond the scope of this topic. Follow the DP conventions of the specific device.

Enable IVA

API description

Enable or disable the IVA feature. Call this method after startPreview starts live streaming or after setDefinition: changes the definition.

- (void)setOutLineEnable:(BOOL)enable;

Parameters

Parameter Description
enable Specifies whether to enable IVA.
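
The following is a minimal Swift sketch, assuming the IVA DP has already been enabled on the device and that setOutLineEnable: bridges to the Swift name shown:

// Call this after startPreview has started live streaming, or after
// setDefinition(_:) has changed the definition.
func enableObjectOutline(_ enable: Bool) {
    self.camera.setOutLineEnable(enable)
}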

Configure IVA properties

You can configure the properties of IVA to control its style, such as the frame color, brush width, and flash frequency.

API description

Define the properties of IVA as a JSON string in the specified format, matching the Supplemental Enhancement Information (SEI) reported by the device.

- (int)setSmartRectFeatures:(NSString *)features;

Parameters

The parameter features is a JSON string in the following format:

{
    "SmartRectFeature":[
        {
            "type":0,
            "index":0,
            "brushWidth":1,
            "flashFps":{
                "drawKeepFrames":2,
                "stopKeepFrames":2
            },
            "rgb":0xFF0000,
            "shape":0
        },
        {
            "type":0,
            "index":1,
            "brushWidth":2,
            "flashFps":{
                "drawKeepFrames":3,
                "stopKeepFrames":2
            },
            "rgb":0x00FF00,
            "shape":1
        }
    ]
}
Parameter Type Description
SmartRectFeature Array (Required) The fixed identifier. An array of one or more frame settings.
type Int The type of frame. Valid values:
  • 0: Smart bounding box (default)
  • 1: Line crossing detection
index Int The index of the frame, corresponding to each ID in od of SEI.
shape Int The shape of the rectangular frame. Valid values:
  • 0: Closed rectangle
  • 1: Four corners only
rgb Int The color of the rectangular frame, represented by the red-green-blue (RGB) color model. Value range: 0x000000 to 0xFFFFFF. Default value: 0xFC4747.
brushWidth Int The brush width (stroke thickness) of the rectangular frame. Valid values:
  • 0: Thin
  • 1: Medium
  • 2: Thick
flashFps Object The flash frequency of the rectangular frame. Fields:
  • drawKeepFrames: The number of frames during which drawing stays on.
  • stopKeepFrames: The number of frames during which drawing stays off.
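
The following is a minimal Swift sketch that serializes feature settings like the sample above and passes them to the SDK, assuming setSmartRectFeatures: bridges to the Swift name shown:

func configureOutlineStyle() {
    // One thin, red, closed rectangle for the object with index 0,
    // mirroring the first entry of the sample JSON above.
    let rectFeature: [String: Any] = [
        "type": 0,
        "index": 0,
        "brushWidth": 1,
        "flashFps": ["drawKeepFrames": 2, "stopKeepFrames": 2],
        "rgb": 0xFF0000,
        "shape": 0
    ]
    let features: [String: Any] = ["SmartRectFeature": [rectFeature]]
    guard let data = try? JSONSerialization.data(withJSONObject: features),
          let json = String(data: data, encoding: .utf8) else {
        return
    }
    // A non-zero return value means the settings were not applied.
    let result = self.camera.setSmartRectFeatures(json)
    print("setSmartRectFeatures result: \(result)")
}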

Define SEI protocol

This protocol governs communication with the device. The IPC SDK parses the SEI data and draws the IVA frames at the reported positions with the configured properties.

{
    "AGTX":{
        "time":6885,
        "chn":0,
        "key":1,
        "iva":{
            "od":[
                {
                    "obj":{
                        "id":0,
                        "type":1,
                        "twinkle":1,
                        "rect":[
                            0,0,
                            25,25,
                            50,50,
                            80, 80,
                            100,100
                        ],
                        "vel":[0,10],
                        "cat":"PEDESTRIAN"
                    }
                },
                {
                    "obj":{
                        "id":1,
                        "type":1,
                        "twinkle":1,
                        "rect":[
                            0,0,
                            100,100
                        ],
                        "vel":[0,10],
                        "cat":"PEDESTRIAN"
                    }
                }
            ]
        }
    }
}

The following table describes the parsed parameters in iva reported by the device.

Parameter Description
id The index of each frame.
type The type of frame. Valid values:
  • 0: Smart bounding box
  • 1: Four corners only
twinkle The flash setting. Valid values:
  • 0: The frame is displayed, but flash is disabled.
  • 1: The frame flashes at the frequency passed in.
rect Each pair of numbers represents the coordinates of a point. All points are arranged in clockwise order. The SDK draws these points into a closed polygon. Two points are used to represent a rectangular box.

The coordinates of the rect points in the SEI protocol are typically passed from the app to the device through a device control data point (DP). The coordinate data follows these rules (see the sketch after this list):

  • A value at an even-numbered position is a percentage value (0 to 100) along the horizontal (x) axis.

  • A value at an odd-numbered position is a percentage value (0 to 100) along the vertical (y) axis.

    The maximum value is 100, and even and odd positions always appear in pairs.
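
The following Swift sketch shows one possible mapping of a reported rect array to points in a rendering view under these rules; the array values and view size are hypothetical:

// Two points describing a rectangular box; each value is a percentage (0-100),
// with x values at even positions and y values at odd positions.
let rect: [CGFloat] = [0, 0, 100, 100]
let viewSize = CGSize(width: 375, height: 211)

// Pair (even, odd) positions into points scaled to the view, in clockwise order.
var points: [CGPoint] = []
for i in stride(from: 0, to: rect.count - 1, by: 2) {
    let x = rect[i] / 100.0 * viewSize.width
    let y = rect[i + 1] / 100.0 * viewSize.height
    points.append(CGPoint(x: x, y: y))
}
// points now holds the corners (or polygon vertices) to draw.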

Video rendering

After video decoding, the rendered images can be further processed through settings. The following features are supported: stretching or scaling, horizontal or vertical mirroring, and rotation by 90, 180, or 270 degrees.

API description

- (int)setDeviceFeatures:(NSDictionary *)features;

Parameters

The parameter features is dictionary-type data in the following format:

{
    "DecPostProcess":{
        "video":[
            {
                "restype":"4",
                "oldres":"944*1080",
                "newres":"1920*1080"
            },
            {
                "restype":"2",
                "oldres":"944*1080",
                "newres":"1920*1080"
            }
        ],
        "mirror":0,
        "rotation":2
    }
}
Parameter Description
DecPostProcess (Required) The fixed identifier.
video The array of video resolution settings.
  • restype: The type of video resolution.
  • oldres: The size of the original video reported by the device.
  • newres: The expected size of the video after stretching or scaling.
mirror The mirroring setting. Valid values:
  • 0: Unchanged.
  • 1: Horizontal mirroring.
  • 2: Vertical mirroring.
rotation The angle of rotation. Valid values:
  • 0: No rotation.
  • 1: Rotate 90 degrees.
  • 2: Rotate 180 degrees.
  • 3: Rotate 270 degrees.
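
The following is a minimal Swift sketch that builds the dictionary shown above and applies it, assuming setDeviceFeatures: bridges to the Swift name shown:

func configureRendering() {
    // Stretch 944*1080 frames to 1920*1080, keep mirroring unchanged,
    // and rotate the image by 180 degrees, as in the sample above.
    let video: [[String: Any]] = [
        ["restype": "4", "oldres": "944*1080", "newres": "1920*1080"],
        ["restype": "2", "oldres": "944*1080", "newres": "1920*1080"]
    ]
    let features: [String: Any] = [
        "DecPostProcess": [
            "video": video,
            "mirror": 0,
            "rotation": 2
        ] as [String: Any]
    ]
    // A non-zero return value means the settings were not applied.
    let result = self.camera.setDeviceFeatures(features)
    print("setDeviceFeatures result: \(result)")
}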

Local video

Use the phone’s camera to perform tasks such as starting or stopping video capture, starting, stopping, pausing, or resuming the sending of data to the device, and switching between the front and rear cameras.

Ensure the app is allowed to access the mobile phone’s camera. Failure to do so may result in the app crashing.

Turn on camera

Initialize the phone’s camera and start capturing video.

API description

Turn on the phone’s camera and start capturing video. The returned int indicates the result of the operation. The captured frames are returned through the callback - (void)camera:(id<ThingSmartCameraType>)camera didReceiveLocalVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer localVideoInfo:(id<ThingSmartLocalVideoInfoType>)localVideoInfo;.

Set videoInfo to nil. A custom value is not recommended as it may cause video capturing or playback to fail.

-(int)startLocalVideoCaptureWithVideoInfo:(nullable id<ThingSmartLocalVideoInfoType>)videoInfo;

ThingSmartLocalVideoInfoType

Parameter Description
width The width of the video.
height The height of the video.
frameRate The frame rate of the video.

Turn off camera

API description

Turn off the phone’s camera.

-(int)stopLocalVideoCapture;

Switch between cameras

API description

Switch between the front and rear cameras. The returned int indicates the result of the operation.

-(int)switchLocalCameraPosition;
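
The following is a minimal Swift sketch of controlling the local camera, assuming the camera permission has been granted and that the methods above bridge to startLocalVideoCapture(withVideoInfo:), switchLocalCameraPosition(), and stopLocalVideoCapture():

func startLocalCamera() {
    // Pass nil so the SDK uses its default capture parameters.
    if self.camera.startLocalVideoCapture(withVideoInfo: nil) != 0 {
        // Failed to turn on the phone's camera.
    }
}

func switchCamera() {
    // Toggles between the front and rear cameras.
    if self.camera.switchLocalCameraPosition() != 0 {
        // Failed to switch cameras.
    }
}

func stopLocalCamera() {
    _ = self.camera.stopLocalVideoCapture()
}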

Local video stream

See Video Rendering to customize the player properties.

Local video stream callback
- (void)camera:(id<ThingSmartCameraType>)camera didReceiveLocalVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer localVideoInfo:(id<ThingSmartLocalVideoInfoType>)localVideoInfo;

Bind/unbind player

The local camera player inherits from the YUV player ThingSmartMediaVideoView.

- (void)bindLocalVideoView:(UIView<ThingSmartVideoViewType> *)videoView;
- (void)unbindLocalVideoView:(UIView<ThingSmartVideoViewType> *)videoView;

Start video talk

After a P2P connection is created, start a video talk to send the local video to the device.

API description

- (int)startVideoTalk;

Callback description

- (void)cameraDidStartVideoTalk:(id<ThingSmartCameraType>)camera;

Stop video talk

Stop the video talk to disconnect the channel that transmits local video to the device. This action does not terminate the P2P connection.

API description

-(int)stopVideoTalk;

Callback description

- (void)cameraDidStopVideoTalk:(id<ThingSmartCameraType>)camera;

Pause video talk

Pause a video talk.

API description

- (int)pauseVideoTalk;

Callback description

- (void)cameraDidPauseVideoTalk:(id<ThingSmartCameraType>)camera;

Resume video talk

Resume a video talk.

API description

- (int)resumeVideoTalk;

Callback description

- (void)cameraDidResumeVideoTalk:(id<ThingSmartCameraType>)camera;

Start capturing audio

Start capturing audio from the phone’s microphone.

API description

Set audioInfo to nil. A custom value is not recommended as it may cause capturing or playback to fail.

-(int)startAudioRecordWithAudioInfo:(nullable id<ThingSmartLocalAudioInfoType>)audioInfo;

Parameters

Parameter Description
sampleRate The sampling rate.
channel The sound channel.

Stop capturing audio

Stop capturing audio from the phone’s microphone.

API description

-(int)stopAudioRecord;
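
The following Swift sketch combines the local video APIs above into a complete video talk flow; localVideoView is a hypothetical UIView conforming to ThingSmartVideoViewType, and the bridged Swift names are assumptions:

func beginVideoTalk() {
    // 1. Turn on the phone's camera with the default parameters.
    guard self.camera.startLocalVideoCapture(withVideoInfo: nil) == 0 else {
        return
    }
    // 2. Bind the player that renders the local video stream.
    self.camera.bindLocalVideoView(self.localVideoView)
    // 3. Send the local video to the device over the existing P2P connection.
    _ = self.camera.startVideoTalk()
    // 4. Capture audio with the default parameters.
    _ = self.camera.startAudioRecord(withAudioInfo: nil)
}

func endVideoTalk() {
    _ = self.camera.stopAudioRecord()
    _ = self.camera.stopVideoTalk()
    self.camera.unbindLocalVideoView(self.localVideoView)
    _ = self.camera.stopLocalVideoCapture()
}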

Device capabilities

The ThingSmartCameraAbility class can be used to parse device configurations and get basic device capabilities.

Property Description
defaultDefinition The default definition of live streaming.
videoNum The number of streams supported by the device.
  • If the value is 1, the device only supports one definition mode that can be obtained through defaultDefinition. Switching between definition modes is not supported.
  • If the value is 2, the device supports SD and HD.
isSupportSpeaker Indicates whether the device is equipped with a speaker. If so, the device supports video talk.
isSupportPickup Indicates whether the device is equipped with a pickup. If so, the audio channel can be enabled when video streams are previewed on the app.
rowData The raw data of P2P configurations.

API description

Create the device capabilities object based on the device data model. Call this API method after a P2P connection is created. After the initial P2P connection, the raw data of P2P configurations is cached in the local sandbox.

+ (instancetype)cameraAbilityWithDeviceModel:(ThingSmartDeviceModel *)deviceModel;
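
The following is a minimal Swift sketch of reading these capabilities, assuming deviceModel is the ThingSmartDeviceModel of the target IPC and that cameraAbilityWithDeviceModel: bridges to the initializer shown:

// Create the capability object after the P2P connection is created.
let ability = ThingSmartCameraAbility(deviceModel: deviceModel)
if ability.videoNum == 1 {
    // Only one definition is supported; hide the definition switch in the UI.
    print("Single-stream device, default definition: \(ability.defaultDefinition.rawValue)")
} else {
    // The device supports switching between SD and HD.
}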