Last Updated on: 2024-09-14 06:08:14
In addition to live streaming and playing back footage stored on an SD card, the IPC SDK provides a range of audio and video capabilities, such as recording video on a mobile phone, capturing video screenshots, talking to IPCs in real time, and switching between definition modes.
During live streaming or playback of recorded footage, the ongoing video can be recorded on a mobile phone.
To save recordings to a system album, the `MediaStore` API methods must be used. Write permission to the SD card is required to record videos.
API description
int startRecordLocalMp4(String folderPath, Context context, OperationDelegateCallBack callBack);
Parameters
Parameter | Description |
---|---|
folderPath | The path in which videos are stored. |
context | The context. |
callBack | The callback. |
Example
private IThingSmartCameraP2P mCameraP2P;
if (Constants.hasStoragePermission()) {
String picPath = Environment.getExternalStorageDirectory().getAbsolutePath() + "/Camera/";
File file = new File(picPath);
if (!file.exists()) {
file.mkdirs();
}
mCameraP2P.startRecordLocalMp4(picPath, CameraPanelActivity.this, new OperationDelegateCallBack() {
@Override
public void onSuccess(int sessionId, int requestId, String data) {
isRecording = true;
mHandler.sendEmptyMessage(MSG_VIDEO_RECORD_BEGIN);
}
@Override
public void onFailure(int sessionId, int requestId, int errCode) {
mHandler.sendEmptyMessage(MSG_VIDEO_RECORD_FAIL);
}
});
recordStatue(true);
} else {
Constants.requestPermission(CameraPanelActivity.this, Manifest.permission.WRITE_EXTERNAL_STORAGE, Constants.EXTERNAL_STORAGE_REQ_CODE, "open_storage");
}
API description
int stopRecordLocalMp4(OperationDelegateCallBack callBack);
Parameters
Parameter | Description |
---|---|
callBack | The callback. |
Example
mCameraP2P.stopRecordLocalMp4(new OperationDelegateCallBack() {
@Override
public void onSuccess(int sessionId, int requestId, String data) {
// The success callback.
}
@Override
public void onFailure(int sessionId, int requestId, int errCode) {
// The failure callback.
}
});
Capture screenshots of live video images and store them on the SD card of a mobile phone.
To save screenshots to a system album, you must implement this feature on your own. Scoped storage was introduced in Android 10; apps can opt out of it on that version, but it is enforced starting from Android 11. To store media files in a system album, the `MediaStore` API methods must be used.
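A minimal sketch of saving a captured file to the system album through MediaStore on Android 10 and later; `snapshotFile` is a placeholder for the file returned by the snapshot or recording API, and the code assumes it runs inside an Activity.
// RELATIVE_PATH requires API level 29 (Android 10) or later.
ContentValues values = new ContentValues();
values.put(MediaStore.Images.Media.DISPLAY_NAME, snapshotFile.getName());
values.put(MediaStore.Images.Media.MIME_TYPE, "image/jpeg");
values.put(MediaStore.Images.Media.RELATIVE_PATH, Environment.DIRECTORY_PICTURES + "/Camera");
Uri uri = getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
if (uri != null) {
    try (OutputStream out = getContentResolver().openOutputStream(uri);
         InputStream in = new FileInputStream(snapshotFile)) {
        // Copy the captured file into the MediaStore entry.
        byte[] buffer = new byte[8192];
        int len;
        while ((len = in.read(buffer)) > 0) {
            out.write(buffer, 0, len);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}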
API description
int snapshot(String absoluteFilePath, Context context, OperationDelegateCallBack callBack);
Parameters
Parameter | Description |
---|---|
absoluteFilePath | The path in which the screenshots are stored. |
context | The context. |
callBack | The callback. |
Example
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
String path = Environment.getExternalStorageDirectory().getAbsolutePath() + "/Camera/";
File file = new File(path);
if (!file.exists()) {
file.mkdirs();
}
picPath = path;
}
mCameraP2P.snapshot(picPath, CameraPanelActivity.this, new OperationDelegateCallBack() {
@Override
public void onSuccess(int sessionId, int requestId, String data) {
// The file path is returned by `data`.
mHandler.sendMessage(MessageUtil.getMessage(MSG_SCREENSHOT, ARG1_OPERATE_SUCCESS, data));
}
@Override
public void onFailure(int sessionId, int requestId, int errCode) {
mHandler.sendMessage(MessageUtil.getMessage(MSG_SCREENSHOT, ARG1_OPERATE_FAIL));
}
});
During live streaming or video playback, the audio channel can be enabled or disabled. By default, it is disabled.
API description
Mute or unmute the video sound.
void setMute(int mute, OperationDelegateCallBack callBack);
Parameters
Parameter | Description |
---|---|
mute | The audio mode. Valid values: `1`: mute the audio, `0`: unmute the audio. |
Example
mCameraP2P.setMute(1, new OperationDelegateCallBack() {
@Override
public void onSuccess(int sessionId, int requestId, String data) {
// The operation result is returned by `data`.
previewMute = Integer.valueOf(data);
}
@Override
public void onFailure(int sessionId, int requestId, int errCode) {
}
});
API description
Switch between the speaker and earpiece modes. This API method is not supported by P2P 1.0 devices.
void setLoudSpeakerStatus(boolean enable);
Parameters
Parameter | Description |
---|---|
enable | Specifies the audio output. Valid values: `true`: speaker, `false`: earpiece. |
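Example
A minimal sketch: route the talk audio to the speaker, assuming `true` selects the speaker and `false` selects the earpiece as described above.
// Switch the audio output to the speaker; pass false to use the earpiece instead.
mCameraP2P.setLoudSpeakerStatus(true);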
API description
Set up a custom audio effect.
void setAudioEffect(@AudioEffect int type);
Parameters
Parameter | Description |
---|---|
type | The audio effect type. |
AudioEffect
Enum value | Description |
---|---|
NONE | The original voice. |
BOY | The voice of a boy. |
UNCLE | The voice of an uncle. |
ROBOT | The voice of a robot. |
GIRL | The voice of a girl. |
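Example
A minimal sketch, assuming the enum values listed above are exposed as constants on the `AudioEffect` annotation class.
// Apply the robot voice effect; pass AudioEffect.NONE to restore the original voice.
mCameraP2P.setAudioEffect(AudioEffect.ROBOT);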
After a P2P connection is created, the live video talk feature can be enabled to talk to an IP camera (IPC). Before the talk, the app must be granted access to the microphone of the mobile phone.
Transmit audio data from the mobile phone to the IPC.
Example
if (Constants.hasRecordPermission()) {
mCameraP2P.startAudioTalk(new OperationDelegateCallBack() {
@Override
public void onSuccess(int sessionId, int requestId, String data) {
isSpeaking = true;
ToastUtil.shortToast(CameraPanelActivity.this, "start talk success");
}
@Override
public void onFailure(int sessionId, int requestId, int errCode) {
isSpeaking = false;
ToastUtil.shortToast(CameraPanelActivity.this, "operation fail");
}
});
} else {
Constants.requestPermission(CameraPanelActivity.this, Manifest.permission.RECORD_AUDIO, Constants.EXTERNAL_AUDIO_REQ_CODE, "open_recording");
}
Stop transmitting audio data from the mobile phone to the IPC.
Example
mCameraP2P.stopAudioTalk(new OperationDelegateCallBack() {
@Override
public void onSuccess(int sessionId, int requestId, String data) {
isSpeaking = false;
}
@Override
public void onFailure(int sessionId, int requestId, int errCode) {
isSpeaking = false;
}
});
The video talk and video recording features are mutually exclusive, and video talk can only be enabled during video preview.
During live streaming, the audio in the video consists of the human voices and ambient sounds collected by the IPC in real time. Turning on the audio channel between the app and the IPC enables a two-way talk.
IPCs without speakers or pickups do not support two-way talk.
The control of one-way talk is subject to your implementation.
Users can switch between definition modes during live streaming. Currently, only high definition (HD) and standard definition (SD) modes are supported. Some IPCs support only one of these modes.
This feature is only available during live streaming. Only one video definition mode is supported for storing footage on the SD card.
Audio-only mode requires device support.
Parameters
Parameter | Description |
---|---|
clarity | The video definition mode. Valid values: `4`: high definition (HD), `2`: standard definition (SD). |
Request the definition mode of the videos sent from the IPC.
Example
mCameraP2P.getVideoClarity(new OperationDelegateCallBack() {
@Override
public void onSuccess(int sessionId, int requestId, String data) {
}
@Override
public void onFailure(int sessionId, int requestId, int errCode) {
}
});
Invoke this method after `startPreview` is called and preview images are being generated.
Set the definition mode of the videos sent from the IPC.
Example
mCameraP2P.setVideoClarity(2, new OperationDelegateCallBack() {
@Override
public void onSuccess(int sessionId, int requestId, String data) {
videoClarity = Integer.valueOf(data);
}
@Override
public void onFailure(int sessionId, int requestId, int errCode) {
}
});
The IPC SDK provides the callback that returns raw stream data, including the YUV data of video frames. YUV 420SP is used as the color encoding format.
API description
To enable the callback for video frames, register an `AbsP2pCameraListener` with `IThingSmartCameraP2P` and override the callbacks you need.
void registerP2PCameraListener(AbsP2pCameraListener listener);
API description
public void onReceiveFrameYUVData(int sessionId, ByteBuffer y, ByteBuffer u, ByteBuffer v, int width, int height, int nFrameRate, int nIsKeyFrame, long timestamp, long nProgress, long nDuration, Object camera)
Parameters
Parameter | Description |
---|---|
y | The luma (Y) channel information of video streams. |
u | The chroma (U) channel information of video streams. |
v | The chroma (V) channel information of video streams. |
width | The width of video images. |
height | The height of video images. |
nFrameRate | The frame rate of the video. |
nIsKeyFrame | Indicates whether the frame is a keyframe (I-frame). |
timestamp | The timestamp. |
nProgress | The video’s time progress displayed in the Message Center module. |
nDuration | The video duration displayed in the Message Center module. |
API description
public void onSessionStatusChanged(Object camera, int sessionId, int sessionStatus)
Parameters
Parameter | Description |
---|---|
sessionId | The ID of the P2P connection. |
sessionStatus | The status of the P2P connection. |
Example
private IThingSmartCameraP2P mCameraP2P;
@Override
protected void onResume() {
super.onResume();
if (null != mCameraP2P) {
// Registers a P2P listener.
mCameraP2P.registerP2PCameraListener(p2pCameraListener);
}
}
@Override
protected void onPause() {
super.onPause();
if (null != mCameraP2P) {
// Unregisters a P2P listener.
mCameraP2P.removeOnP2PCameraListener(p2pCameraListener);
}
}
private AbsP2pCameraListener p2pCameraListener = new AbsP2pCameraListener() {
@Override
public void onSessionStatusChanged(Object o, int i, int i1) {
super.onSessionStatusChanged(o, i, i1);
// The callback for connection status changes.
}
};
Do not call other API methods in the callback thread of `onSessionStatusChanged`. Otherwise, a deadlock might occur.
Audio data collected by the app can be further processed, for example to change the voice. To support this, the IPC SDK provides a callback that returns the audio data collected by the app. The collected audio data has already been processed with echo cancellation. To enable the callback for audio data, you must register a listener with `IThingSmartCameraP2P`.
API description
Register the listener.
void registerSpeakerEchoProcessor(ISpeakerEchoProcessor processor);
API description
Unregister the listener.
void unregisterSpeakerEchoProcessor();
API description
Transmit the processed audio data to the IPC for playback.
void sendAudioTalkData(byte[] outbuf, int length);
Parameters
Parameter | Description |
---|---|
outbuf | The byte array of audio data. |
length | The data length. |
Example
mCameraP2P.registerSpeakerEchoProcessor(new ISpeakerEchoProcessor() {
@Override
public void receiveSpeakerEchoData(ByteBuffer pcm, int sampleRate) {
// `pcm` holds the pulse-code modulation (PCM) audio data, and `sampleRate` is its sampling rate.
}
});
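The sketch below extends the example above: it copies the PCM buffer, runs it through a hypothetical `changeVoice` helper that stands in for your own audio processing, and sends the result back to the IPC with `sendAudioTalkData`.
mCameraP2P.registerSpeakerEchoProcessor(new ISpeakerEchoProcessor() {
    @Override
    public void receiveSpeakerEchoData(ByteBuffer pcm, int sampleRate) {
        // Copy the PCM audio data out of the buffer.
        byte[] raw = new byte[pcm.remaining()];
        pcm.get(raw);
        // `changeVoice` is a hypothetical helper for your own processing, such as voice changing.
        byte[] processed = changeVoice(raw, sampleRate);
        // Send the processed audio to the IPC for playback.
        mCameraP2P.sendAudioTalkData(processed, processed.length);
    }
});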
If the IPC detects an object in motion during live streaming, the intelligent video analytics (IVA) feature allows the object to be framed automatically in white on the view.
To achieve this purpose, the IVA feature must be enabled for the IPC first. The device will then report the coordinates of the object along with the video frames. You can use the data point (DP) with ID `198` (`ipc_object_outline`) to enable this feature. For more information about device control API methods, see Device Control.
After IVA is enabled for the IPC, this feature must also be enabled for the IPC SDK during live streaming. This allows the SDK to frame the object in white on the view based on the received coordinates of the object.
API description
Enable or disable the IVA feature. This API method is usually called after `startPreview` (for video previewing) or `setVideoClarity` (for definition settings).
void setEnableIVA(boolean enableIVA);
Parameters
Parameter | Description |
---|---|
enableIVA | Specifies whether to enable IVA. |
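Example
A minimal sketch: enable IVA after the preview has started so that the SDK can draw the outline on the view.
// Call after startPreview (or after setVideoClarity when switching definitions).
mCameraP2P.setEnableIVA(true);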
You can configure the properties of IVA to control its style, such as the frame color, brush width, and flash frequency.
API description
Define the properties of IVA in the specified format of JSON strings based on the Supplemental Enhancement Information (SEI) reported by the device.
void setSmartRectFeatures(String rectFeatures);
Parameters
`rectFeatures` is a JSON string in the following format:
{
"SmartRectFeature":[
{
"type":0,
"index":0,
"brushWidth":1,
"flashFps":{
"drawKeepFrames":2,
"stopKeepFrames":2
},
"rgb":0xFF0000,
"shape":0
},
{
"type":0,
"index":1,
"brushWidth":2,
"flashFps":{
"drawKeepFrames":3,
"stopKeepFrames":2
},
"rgb":0x00FF00,
"shape":1
}
]
}
Parameter | Type | Description |
---|---|---|
SmartRectFeature | Array | (Required) The identifier in the fixed format of arrays to represent multiple frame settings. |
type | Int | The type of frame. |
index | Int | The index of the frame, corresponding to the `id` of each object in the `od` array of the SEI data. |
shape | Int | The shape of the rectangular frame. |
rgb | Int | The color of the rectangular frame, represented by the red-green-blue (RGB) color model. Value range: `0x000000` to `0xFFFFFF`. Default value: `0xFC4747`. |
brushWidth | Int | The brush width of the rectangular frame. |
flashFps | String | The flash frequency of the rectangular frame, set by `drawKeepFrames` (the number of frames the outline stays drawn) and `stopKeepFrames` (the number of frames it stays hidden). |
Define SEI protocol
The protocol that governs communication with the device. The IPC SDK parses the data over SEI and implements IVA at the positions with the specified properties.
{
"AGTX":{
"time":6885,
"chn":0,
"key":1,
"iva":{
"od":[
{
"obj":{
"id":0,
"type":1,
"twinkle":1,
"rect":[
0,0,
25,25,
50,50,
80, 80,
100,100
],
"vel":[0,10],
"cat":"PEDESTRIAN"
}
},
{
"obj":{
"id":1,
"type":1,
"twinkle":1,
"rect":[
0,0,
100,100
],
"vel":[0,10],
"cat":"PEDESTRIAN"
}
}
]
}
}
}
The following table describes the parsed parameters in `iva` reported by the device.
Parameter | Description |
---|---|
id | The index of each frame. |
type | The type of frame. |
twinkle | The flash setting. |
rect | Each pair of numbers represents the coordinates of a point. All points are arranged in clockwise order. The SDK draws these points into a closed polygon. Two points are used to represent a rectangular box. |
The coordinates of the `rect` points in the SEI protocol are typically passed from the app to the device in the format of the device control data point (DP). Coordinate anchor data follows these rules, as illustrated in the sketch after this list:
An even-numbered position represents the numerator of a percentage value on the horizontal (x) axis.
An odd-numbered position represents the numerator of a percentage value on the vertical (y) axis.
The maximum value is 100, and even and odd positions appear in pairs.
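A sketch of the coordinate rules above: the helper below is illustrative (not part of the SDK) and converts a `rect` array of percentage anchors into pixel points for a given view size.
// For rect = [0, 0, 100, 100] on a 1920 x 1080 view, the result is (0, 0) and (1920, 1080).
Point[] toPixelPoints(int[] rect, int viewWidth, int viewHeight) {
    Point[] points = new Point[rect.length / 2];
    for (int i = 0; i < rect.length; i += 2) {
        int x = rect[i] * viewWidth / 100;      // even position: percentage on the x axis
        int y = rect[i + 1] * viewHeight / 100; // odd position: percentage on the y axis
        points[i / 2] = new Point(x, y);
    }
    return points;
}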
After video decoding, rendered images can be further processed by settings. For example, the following features are supported: stretching or scaling, horizontal or vertical mirroring, and rotation by 90, 180, or 270 degrees.
API description
void setDeviceFeatures(String renderFeatures);
Parameters
`renderFeatures` is a JSON string in the following format:
{
"DecPostProcess":{
"video":[
{
"restype":"4",
"oldres":"944*1080",
"newres":"1920*1080"
},
{
"restype":"2",
"oldres":"944*1080",
"newres":"1920*1080"
}
],
"mirror":0,
"rotation":2
}
}
Parameter | Description |
---|---|
DecPostProcess | (Required) The fixed identifier. |
video | The array of video resolution settings. Each entry contains `restype` (the processing type), `oldres` (the original resolution), and `newres` (the target resolution). |
mirror | The mirroring setting. |
rotation | The angle of rotation. |
Use the phone’s camera to perform tasks such as starting or stopping video capture, starting, stopping, pausing, or resuming the transmission of local video to the device, and switching between the front and rear cameras.
Initialize the phone’s camera and start capturing video.
You must invoke this method on the main thread, and the app must be granted `Manifest.permission.CAMERA`.
API description
Turn on the phone’s camera and return a Boolean value to indicate the operation result. Captured frames are delivered through the `IRegistorIOTCListener#receiveLocalVideoFrame(int sessionId, ByteBuffer y, ByteBuffer u, ByteBuffer v, int width, int height)` callback.
@MainThread
@RequiresPermission(Manifest.permission.CAMERA)
boolean startVideoCapture();
@MainThread
@RequiresPermission(Manifest.permission.CAMERA)
boolean startVideoCapture(int width, int height, int frameRate);
Parameters
Parameter | Description |
---|---|
width | The width of the video. |
height | The height of the video. |
frameRate | The frame rate of the video. |
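Example
A minimal sketch: check the camera permission, then start capturing on the main thread. It assumes `startVideoCapture` is called on the `IThingSmartCameraP2P` instance used in the other examples, and `CAMERA_REQ_CODE` is a hypothetical request code constant.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
    // Start capturing at 1280 x 720 and 30 fps; the no-argument overload uses default settings.
    boolean started = mCameraP2P.startVideoCapture(1280, 720, 30);
    if (!started) {
        // Capture could not be started, for example because the camera is in use.
    }
} else {
    // CAMERA_REQ_CODE is a hypothetical request code defined by your app.
    ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, CAMERA_REQ_CODE);
}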
API description
Turn off the phone’s camera.
void stopVideoCapture();
Switch to the front or rear camera on the phone.
This method should be called on the main thread.
API description
Switch between the front and rear cameras on the phone and return a Boolean value to indicate the operation result.
boolean switchCamera();
Based on Live Streaming, create a player, implement the `IRegistorIOTCListener#receiveLocalVideoFrame(int sessionId, ByteBuffer y, ByteBuffer u, ByteBuffer v, int width, int height)` method, and bind it to the P2P object to render local video.
Example
The local camera player inherits the YUV player.
public class LocalCameraMonitor extends YUVMonitorTextureView implements IRegistorIOTCListener {
public LocalCameraMonitor(Context context) {
this(context, null);
}
public LocalCameraMonitor(Context context, AttributeSet attrs) {
super(context, attrs);
}
@Override
public void receiveFrameYUVData(int sessionId, ByteBuffer y, ByteBuffer u, ByteBuffer v, ThingVideoFrameInfo videoFrameInfo, Object camera) {
}
@Override
public void receiveLocalVideoFrame(int sessionId, ByteBuffer y, ByteBuffer u, ByteBuffer v, int width, int height) {
updateFrameYUVData(y, u, v, width, height);
}
@Override
public void receivePCMData(int sessionId, ByteBuffer pcm, ThingAudioFrameInfo audioFrameInfo, Object camera) {
}
@Override
public void onSessionStatusChanged(Object camera, int sessionId, int sessionStatus) {
}
}
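A sketch of binding the monitor so that local frames reach `receiveLocalVideoFrame`. The `generateCameraView` call is an assumption based on how the remote video view is bound to the P2P object in the live streaming topic; the view ID is hypothetical.
LocalCameraMonitor localMonitor = findViewById(R.id.local_camera_monitor); // hypothetical view ID
// Assumption: the listener is bound to the P2P object the same way as the remote video view.
mCameraP2P.generateCameraView(localMonitor);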
After a P2P connection is created, start a video talk to send the local video to the device.
API description
void startVideoTalk(OperationDelegateCallBack callBack);
Stop a video talk to disconnect the channel that transmits local video to the device. This action does not terminate the P2P connection.
API description
void stopVideoTalk(OperationDelegateCallBack callBack);
Pause a video talk.
API description
void pauseVideoTalk(OperationDelegateCallBack callback);
Resume a video talk.
API description
void resumeVideoTalk(OperationDelegateCallBack callback);
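Example
A minimal sketch of starting a video talk with the callback pattern used throughout this topic; call `stopVideoTalk`, `pauseVideoTalk`, or `resumeVideoTalk` in the same way.
mCameraP2P.startVideoTalk(new OperationDelegateCallBack() {
    @Override
    public void onSuccess(int sessionId, int requestId, String data) {
        // The local video is now being sent to the device.
    }
    @Override
    public void onFailure(int sessionId, int requestId, int errCode) {
        // Failed to start the video talk.
    }
});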
API description
Request the bitrate of the current video stream, in Kbps.
double getVideoBitRateKbps();
Example
double rate = mCameraP2P.getVideoBitRateKbps();
API description
By default, users can zoom in on the video view by double-tapping. You can disable this feature.
public void setCameraViewDoubleClickEnable(boolean enable);
Example
ThingCameraView mVideoView = findViewById(R.id.camera_video_view);
mVideoView.setCameraViewDoubleClickEnable(false);
API description
`ICameraConfigInfo` is the device capability class that encapsulates multiple API methods for querying device features.
ICameraConfigInfo getCameraConfig(String devId);
Example
IThingIPCCore cameraInstance = ThingIPCSdk.getCameraInstance();
if (cameraInstance != null) {
ICameraConfigInfo cameraConfig = cameraInstance.getCameraConfig(devId);
if (cameraConfig != null) {
int videoNum = cameraConfig.getVideoNum();
}
}
API description
Request the number of streams supported by the device. If the value is `1`, the device supports only HD or SD, and its stream is obtained with the default definition mode.
int getVideoNum();
API description
Request the default definition of the device. If the device only supports one stream channel, it also supports only one definition mode.
int getDefaultDefinition();
API description
Check whether the device is equipped with a speaker. If so, the device supports video talk.
boolean isSupportSpeaker();
API description
Check whether the device is equipped with a pickup. If so, videos from the device are audible.
boolean isSupportPickup();
API description
Request the default talk mode of the device.
int getDefaultTalkBackMode();
API description
Check whether users can switch between video talk modes. If so, the device supports both one-way and two-way video talk.
boolean isSupportChangeTalkBackMode();
API description
Request the raw data of P2P configurations.
String getRawDataJsonStr();
API description
Request the playback speeds supported by the device.
List<Integer> getSupportPlaySpeedList();
Return value
The following table lists the return values, which map to constants of `ThingIPCConstant`.
Constant of ThingIPCConstant | Return value | Description |
---|---|---|
ThingIPCConstant.THING_SPEED_05TIMES | 0 | 0.5x |
ThingIPCConstant.THING_SPEED_10TIMES | 1 | 1x |
ThingIPCConstant.THING_SPEED_20TIMES | 3 | 2x |
ThingIPCConstant.THING_SPEED_40TIMES | 7 | 4x |
ThingIPCConstant.THING_SPEED_80TIMES | 8 | 8x |
ThingIPCConstant.THING_SPEED_160TIMES | 9 | 16x |
ThingIPCConstant.THING_SPEED_320TIMES | 10 | 32x |
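Example
A minimal sketch that combines the capability methods above: query the device configuration and adjust the UI accordingly.
IThingIPCCore cameraInstance = ThingIPCSdk.getCameraInstance();
if (cameraInstance != null) {
    ICameraConfigInfo config = cameraInstance.getCameraConfig(devId);
    if (config != null) {
        boolean canTalk = config.isSupportSpeaker();      // a speaker is required for talking to the device
        boolean hasAudio = config.isSupportPickup();      // a pickup is required for audible video
        int defaultDefinition = config.getDefaultDefinition();
        List<Integer> speeds = config.getSupportPlaySpeedList();
        // Hide the definition switch if the device supports only one stream.
        boolean supportsDefinitionSwitch = config.getVideoNum() > 1;
    }
}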