PoseLandmarker

class PoseLandmarker : NSObject

Performs pose landmarks detection on images. This API expects a pre-trained pose landmarks model asset bundle.
- poseLandmarks

  The array of connections between all the landmarks in the detected pose.

  Declaration (Swift):

    class var poseLandmarks: [Connection] { get }
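  A minimal sketch of pairing up landmarks via this connection topology, e.g. to draw a skeleton overlay. It assumes Connection exposes start/end landmark indices and that `result` is a PoseLandmarkerResult returned by one of the detect methods below:

    import MediaPipeTasksVision

    // Pairs the landmarks of the first detected pose according to the
    // connection topology. Assumes Connection exposes `start`/`end` indices.
    func connectedLandmarkPairs(in result: PoseLandmarkerResult)
        -> [(NormalizedLandmark, NormalizedLandmark)] {
      guard let landmarks = result.landmarks.first else { return [] }
      return PoseLandmarker.poseLandmarks.map { connection in
        (landmarks[Int(connection.start)], landmarks[Int(connection.end)])
      }
    }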
- init(modelPath:)

  Creates a new instance of PoseLandmarker from an absolute path to a model asset bundle stored locally on the device and the default PoseLandmarkerOptions.

  Declaration (Swift):

    convenience init(modelPath: String) throws

  Parameters:

    modelPath: An absolute path to a model asset bundle stored locally on the device.
    error: An optional error parameter populated when there is an error in initializing the pose landmarker.

  Return Value:

    A new instance of PoseLandmarker with the given model path. nil if there is an error in initializing the pose landmarker.
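  A minimal usage sketch, assuming the model is packaged in the app bundle under the hypothetical name "pose_landmarker_lite.task":

    import MediaPipeTasksVision

    // Locate the bundled model asset; the resource name and extension are
    // assumptions about how the model file is packaged.
    guard let modelPath = Bundle.main.path(forResource: "pose_landmarker_lite",
                                           ofType: "task") else {
      fatalError("Pose landmarker model asset not found in the app bundle.")
    }

    do {
      let poseLandmarker = try PoseLandmarker(modelPath: modelPath)
      // Use `poseLandmarker` with one of the detect methods below.
    } catch {
      print("Failed to initialize the pose landmarker: \(error)")
    }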
- init(options:)

  Creates a new instance of PoseLandmarker from the given PoseLandmarkerOptions.

  Declaration (Swift):

    init(options: PoseLandmarkerOptions) throws

  Parameters:

    options: The options of type PoseLandmarkerOptions to use for configuring the PoseLandmarker.
    error: An optional error parameter populated when there is an error in initializing the pose landmarker.

  Return Value:

    A new instance of PoseLandmarker with the given options. nil if there is an error in initializing the pose landmarker.
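  A minimal configuration sketch. The option names shown (baseOptions.modelAssetPath, runningMode, numPoses) follow the MediaPipe Tasks options pattern and should be verified against the PoseLandmarkerOptions reference:

    import MediaPipeTasksVision

    // Configure the landmarker for single-image detection.
    let options = PoseLandmarkerOptions()
    options.baseOptions.modelAssetPath = "/path/to/pose_landmarker.task"  // illustrative placeholder path
    options.runningMode = .image
    options.numPoses = 1  // assumed option limiting the number of detected poses

    do {
      let poseLandmarker = try PoseLandmarker(options: options)
      // Use `poseLandmarker` with detect(image:).
    } catch {
      print("Failed to initialize the pose landmarker: \(error)")
    }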
- detect(image:)

  Performs pose landmarks detection on the provided MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the PoseLandmarker is created with running mode .image.

  This method supports performing pose landmarks detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

  Declaration (Swift):

    func detect(image: MPImage) throws -> PoseLandmarkerResult

  Parameters:

    image: The MPImage on which pose landmarks detection is to be performed.
    error: An optional error parameter populated when there is an error in performing pose landmark detection on the input image.

  Return Value:

    A PoseLandmarkerResult object that contains the pose landmarks detection results.
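  A minimal usage sketch for .image running mode, assuming `poseLandmarker` was created as above and `photo` is a UIImage with an RGB color space and an alpha channel:

    import MediaPipeTasksVision
    import UIKit

    // Runs single-image pose landmark detection and returns the result,
    // or nil if detection fails.
    func landmarks(in photo: UIImage,
                   using poseLandmarker: PoseLandmarker) -> PoseLandmarkerResult? {
      do {
        let mpImage = try MPImage(uiImage: photo)
        return try poseLandmarker.detect(image: mpImage)
      } catch {
        print("Pose landmark detection failed: \(error)")
        return nil
      }
    }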
- detect(videoFrame:timestampInMilliseconds:)

  Performs pose landmarks detection on the provided video frame of type MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the PoseLandmarker is created with running mode .video. It's required to provide the video frame's timestamp (in milliseconds). The input timestamps must be monotonically increasing.

  This method supports performing pose landmarks detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

  Declaration (Swift):

    func detect(videoFrame image: MPImage, timestampInMilliseconds: Int) throws -> PoseLandmarkerResult

  Parameters:

    image: The MPImage on which pose landmarks detection is to be performed.
    timestampInMilliseconds: The video frame's timestamp (in milliseconds). The input timestamps must be monotonically increasing.
    error: An optional error parameter populated when there is an error in performing pose landmark detection on the input video frame.

  Return Value:

    A PoseLandmarkerResult object that contains the pose landmarks detection results.
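  A minimal usage sketch for .video running mode. It assumes frames arrive as CMSampleBuffers (for example from an AVAssetReader) and derives the millisecond timestamp from each buffer's presentation time so that timestamps stay monotonically increasing:

    import MediaPipeTasksVision
    import CoreMedia

    // Runs pose landmark detection on one decoded video frame.
    func landmarks(in sampleBuffer: CMSampleBuffer,
                   using poseLandmarker: PoseLandmarker) -> PoseLandmarkerResult? {
      do {
        let mpImage = try MPImage(sampleBuffer: sampleBuffer)
        let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        let timestampMs = Int(CMTimeGetSeconds(presentationTime) * 1000)
        return try poseLandmarker.detect(videoFrame: mpImage,
                                         timestampInMilliseconds: timestampMs)
      } catch {
        print("Pose landmark detection failed: \(error)")
        return nil
      }
    }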
- detectAsync(image:timestampInMilliseconds:)

  Sends live stream image data of type MPImage to perform pose landmarks detection using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the PoseLandmarker is created with running mode .liveStream.

  The object that needs to be continuously notified of the available results of pose landmark detection must conform to the PoseLandmarkerLiveStreamDelegate protocol and implement the poseLandmarker(_:didFinishDetectionWithResult:timestampInMilliseconds:error:) delegate method. It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent to the pose landmarker. The input timestamps must be monotonically increasing.

  This method supports performing pose landmarks detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If the input MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

  If this method is used for performing pose landmarks detection on live camera frames using AVFoundation, ensure that you request AVCaptureVideoDataOutput to output frames in kCMPixelFormat_32BGRA using its videoSettings property.

  Declaration (Swift):

    func detectAsync(image: MPImage, timestampInMilliseconds: Int) throws

  Parameters:

    image: A live stream image data of type MPImage on which pose landmarks detection is to be performed.
    timestampInMilliseconds: The timestamp (in milliseconds) which indicates when the input image is sent to the pose landmarker. The input timestamps must be monotonically increasing.
    error: An optional error parameter populated when there is an error in performing pose landmark detection on the input live stream image data.

  Return Value:

    YES if the image was sent to the task successfully, otherwise NO.
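  A minimal live-stream sketch. It assumes PoseLandmarkerOptions exposes a poseLandmarkerLiveStreamDelegate property and that the delegate method is imported into Swift as poseLandmarker(_:didFinishDetection:timestampInMilliseconds:error:); verify both against the PoseLandmarkerOptions and PoseLandmarkerLiveStreamDelegate references:

    import AVFoundation
    import MediaPipeTasksVision

    final class PoseDetectionService: NSObject, PoseLandmarkerLiveStreamDelegate {
      private var poseLandmarker: PoseLandmarker?

      func setUp(modelPath: String) throws {
        let options = PoseLandmarkerOptions()
        options.baseOptions.modelAssetPath = modelPath
        options.runningMode = .liveStream
        options.poseLandmarkerLiveStreamDelegate = self  // assumed property name
        poseLandmarker = try PoseLandmarker(options: options)
      }

      // Call this for each camera frame, e.g. from an
      // AVCaptureVideoDataOutputSampleBufferDelegate whose output is
      // configured to deliver kCMPixelFormat_32BGRA frames.
      func process(sampleBuffer: CMSampleBuffer, timestampInMilliseconds: Int) {
        guard let mpImage = try? MPImage(sampleBuffer: sampleBuffer) else { return }
        try? poseLandmarker?.detectAsync(image: mpImage,
                                         timestampInMilliseconds: timestampInMilliseconds)
      }

      // Asynchronous callback with the detection result for a given timestamp.
      func poseLandmarker(_ poseLandmarker: PoseLandmarker,
                          didFinishDetection result: PoseLandmarkerResult?,
                          timestampInMilliseconds: Int,
                          error: Error?) {
        guard let result = result else { return }
        print("Detected \(result.landmarks.count) pose(s) at \(timestampInMilliseconds) ms")
      }
    }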