PoseLandmarker
class PoseLandmarker : NSObject
Performs pose landmarks detection on images.
This API expects a pre-trained pose landmarks model asset bundle.
-
The array of connections between all the landmarks in the detected pose.
Declaration
Swift
class var poseLandmarks: [Connection] { get }
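A sketch of walking the connection topology for one detected pose; the start and end index properties of Connection and the landmarks layout of PoseLandmarkerResult are assumptions:
import MediaPipeTasksVision

// Print every edge of the detected pose skeleton.
func printSkeletonEdges(of result: PoseLandmarkerResult) {
    guard let pose = result.landmarks.first else { return }
    for connection in PoseLandmarker.poseLandmarks {
        let start = pose[connection.start]  // index properties are assumptions
        let end = pose[connection.end]
        print("Edge (\(start.x), \(start.y)) -> (\(end.x), \(end.y))")
    }
}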
-
Creates a new instance of PoseLandmarker from an absolute path to a model asset bundle stored locally on the device and the default PoseLandmarkerOptions.
Declaration
Swift
convenience init(modelPath: String) throws
Parameters
modelPath
An absolute path to a model asset bundle stored locally on the device.
error
An optional error parameter populated when there is an error in initializing the pose landmarker.
Return Value
A new instance of PoseLandmarker with the given model path. nil if there is an error in initializing the pose landmarker.
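A minimal initialization sketch, assuming the model asset bundle ships with the app under the name pose_landmarker.task (the file name is an assumption):
import MediaPipeTasksVision

// Locate the model asset bundle in the app bundle (file name is an assumption).
guard let modelPath = Bundle.main.path(forResource: "pose_landmarker", ofType: "task") else {
    fatalError("pose_landmarker.task not found in the app bundle.")
}

// The initializer throws if the pose landmarker cannot be created.
let poseLandmarker = try PoseLandmarker(modelPath: modelPath)
-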
Creates a new instance of PoseLandmarker from the given PoseLandmarkerOptions.
Declaration
Swift
init(options: PoseLandmarkerOptions) throws
Parameters
options
The options of type PoseLandmarkerOptions to use for configuring the PoseLandmarker.
error
An optional error parameter populated when there is an error in initializing the pose landmarker.
Return Value
A new instance of PoseLandmarker with the given options. nil if there is an error in initializing the pose landmarker.
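A configuration sketch; the model path, the chosen running mode, and the option properties used (baseOptions.modelAssetPath, runningMode, numPoses) are assumptions based on the MediaPipe Tasks options pattern:
import MediaPipeTasksVision

let options = PoseLandmarkerOptions()
options.baseOptions.modelAssetPath = "/path/to/pose_landmarker.task"  // assumed path
options.runningMode = .image
options.numPoses = 1

// Throws if the options are invalid or the model cannot be loaded.
let poseLandmarker = try PoseLandmarker(options: options)
-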
Performs pose landmarks detection on the provided MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the PoseLandmarker is created with running mode .image.
This method supports performing pose landmarks detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.
Declaration
Swift
func detect(image: MPImage) throws -> PoseLandmarkerResult
Parameters
image
The MPImage on which pose landmarks detection is to be performed.
error
An optional error parameter populated when there is an error in performing pose landmark detection on the input image.
Return Value
A PoseLandmarkerResult object that contains the pose landmarks detection results.
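A minimal sketch for the .image running mode, assuming a UIImage named uiImage is available and that the MPImage(uiImage:) initializer and the landmarks property of PoseLandmarkerResult behave as described in their own documentation:
import MediaPipeTasksVision
import UIKit

// Wrap an existing UIImage and run single-image detection.
let mpImage = try MPImage(uiImage: uiImage)
let result = try poseLandmarker.detect(image: mpImage)

// Each element of `landmarks` is the landmark list of one detected pose.
for pose in result.landmarks {
    print("Detected a pose with \(pose.count) landmarks")
}
-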
Performs pose landmarks detection on the provided video frame of type MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the PoseLandmarker is created with running mode .video.
It’s required to provide the video frame’s timestamp (in milliseconds). The input timestamps must be monotonically increasing.
This method supports performing pose landmarks detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.
Declaration
Swift
func detect(videoFrame image: MPImage, timestampInMilliseconds: Int) throws -> PoseLandmarkerResult
Parameters
image
The MPImage on which pose landmarks detection is to be performed.
timestampInMilliseconds
The video frame’s timestamp (in milliseconds). The input timestamps must be monotonically increasing.
error
An optional error parameter populated when there is an error in performing pose landmark detection on the input video frame.
Return Value
A PoseLandmarkerResult object that contains the pose landmarks detection results.
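A sketch for the .video running mode; the frame source (videoFrames) and the fixed frame rate used to derive monotonically increasing timestamps are assumptions:
import MediaPipeTasksVision

// Process decoded frames sequentially; timestamps must be monotonically increasing.
// `poseLandmarker` is assumed to have been created with the .video running mode.
func detectInVideo(_ videoFrames: [MPImage]) throws {
    let frameRate = 30.0  // assumed constant frame rate
    for (index, frame) in videoFrames.enumerated() {
        let timestampMs = Int(Double(index) * 1000.0 / frameRate)
        let result = try poseLandmarker.detect(videoFrame: frame,
                                               timestampInMilliseconds: timestampMs)
        print("Frame \(index): \(result.landmarks.count) pose(s) detected")
    }
}
-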
Sends live stream image data of type MPImage to perform pose landmarks detection using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the PoseLandmarker is created with running mode .liveStream.
The object which needs to be continuously notified of the available results of pose landmark detection must conform to the PoseLandmarkerLiveStreamDelegate protocol and implement the poseLandmarker(_:didFinishDetectionWithResult:timestampInMilliseconds:error:) delegate method.
It’s required to provide a timestamp (in milliseconds) to indicate when the input image is sent to the pose landmarker. The input timestamps must be monotonically increasing.
This method supports performing pose landmarks detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If the input MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.
If this method is used for performing pose landmarks detection on live camera frames using AVFoundation, ensure that you request AVCaptureVideoDataOutput to output frames in kCMPixelFormat_32BGRA using its videoSettings property.
Declaration
Swift
func detectAsync(image: MPImage, timestampInMilliseconds: Int) throws
Parameters
image
Live stream image data of type MPImage on which pose landmarks detection is to be performed.
timestampInMilliseconds
The timestamp (in milliseconds) which indicates when the input image is sent to the pose landmarker. The input timestamps must be monotonically increasing.
error
An optional error parameter populated when there is an error in performing pose landmark detection on the input live stream image data.
Return Value
The method returns without a value if the image was sent to the task successfully; otherwise it throws an error.
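A live-stream sketch; the AVFoundation capture delegate context, the MPImage(sampleBuffer:) initializer, and the timestamp derivation are assumptions, and results arrive through the PoseLandmarkerLiveStreamDelegate rather than as a return value:
import MediaPipeTasksVision
import AVFoundation

// Called from an AVCaptureVideoDataOutputSampleBufferDelegate callback.
// A PoseLandmarker created with the .liveStream running mode and a delegate set
// on its options is assumed.
func processLiveFrame(_ sampleBuffer: CMSampleBuffer) {
    do {
        let mpImage = try MPImage(sampleBuffer: sampleBuffer)
        let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        let timestampMs = Int(CMTimeGetSeconds(presentationTime) * 1000)
        // The call returns immediately; results are delivered to the delegate.
        try poseLandmarker.detectAsync(image: mpImage, timestampInMilliseconds: timestampMs)
    } catch {
        print("Failed to send live stream frame: \(error)")
    }
}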