FaceLandmarker
Class that performs face landmark detection on images.

The API expects a TFLite model with mandatory TFLite Model Metadata.

Declaration (Swift): class FaceLandmarker : NSObject
- Creates a new instance of FaceLandmarker from an absolute path to a TensorFlow Lite model file stored locally on the device and the default FaceLandmarkerOptions.

  Declaration (Swift): convenience init(modelPath: String) throws

  Parameters:
    - modelPath: An absolute path to a TensorFlow Lite model file stored locally on the device.

  Return Value: A new instance of FaceLandmarker with the given model path. Throws an error if the face landmarker fails to initialize.
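  A minimal sketch of creating the task from a bundled model file, assuming the MediaPipeTasksVision module name and a hypothetical model file name "face_landmarker.task":

  ```swift
  import Foundation
  import MediaPipeTasksVision

  // Create a FaceLandmarker from a model file shipped in the app bundle
  // (the file name is a placeholder for illustration).
  func makeFaceLandmarker() throws -> FaceLandmarker {
      guard let modelPath = Bundle.main.path(forResource: "face_landmarker",
                                             ofType: "task") else {
          fatalError("face_landmarker.task not found in the app bundle.")
      }
      // Uses the default FaceLandmarkerOptions with the given model path.
      return try FaceLandmarker(modelPath: modelPath)
  }
  ```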
- Creates a new instance of FaceLandmarker from the given FaceLandmarkerOptions.

  Declaration (Swift): init(options: FaceLandmarkerOptions) throws

  Parameters:
    - options: The options of type FaceLandmarkerOptions to use for configuring the FaceLandmarker.

  Return Value: A new instance of FaceLandmarker with the given options. Throws an error if the face landmarker fails to initialize.
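  A sketch of configuring the options before creating the task; the property names used below (baseOptions.modelAssetPath, runningMode, numFaces) follow the conventions of other MediaPipe Tasks vision APIs and should be treated as assumptions:

  ```swift
  import MediaPipeTasksVision

  // Configure options, then create the task (property names are assumptions).
  func makeConfiguredFaceLandmarker(modelPath: String) throws -> FaceLandmarker {
      let options = FaceLandmarkerOptions()
      options.baseOptions.modelAssetPath = modelPath
      options.runningMode = .image   // .image, .video, or .liveStream
      options.numFaces = 1           // maximum number of faces to detect
      return try FaceLandmarker(options: options)
  }
  ```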
- Performs face landmark detection on the provided MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the FaceLandmarker is created with the .image running mode.

  This method supports performing face landmark detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

  Declaration (Swift): func detect(image: MPImage) throws -> FaceLandmarkerResult

  Parameters:
    - image: The MPImage on which face landmark detection is to be performed.

  Return Value: A FaceLandmarkerResult that contains a list of detected face landmarks. Throws an error if face landmark detection fails.
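  A short sketch of image-mode detection on a UIImage, assuming the MPImage(uiImage:) initializer is available:

  ```swift
  import MediaPipeTasksVision
  import UIKit

  // Run image-mode detection on a still image.
  func detectLandmarks(in uiImage: UIImage,
                       using faceLandmarker: FaceLandmarker) throws -> FaceLandmarkerResult {
      // Wrap the UIImage in an MPImage (initializer assumed).
      let mpImage = try MPImage(uiImage: uiImage)
      // The whole image is used as the region of interest; rotation follows
      // the image's orientation property.
      return try faceLandmarker.detect(image: mpImage)
  }
  ```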
- Performs face landmark detection on the provided video frame of type MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the FaceLandmarker is created with the .video running mode.

  This method supports performing face landmark detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

  Declaration (Swift): func detect(videoFrame image: MPImage, timestampInMilliseconds: Int) throws -> FaceLandmarkerResult

  Parameters:
    - image: The MPImage on which face landmark detection is to be performed.
    - timestampInMilliseconds: The video frame's timestamp (in milliseconds). The input timestamps must be monotonically increasing.

  Return Value: A FaceLandmarkerResult that contains a list of detected face landmarks. Throws an error if face landmark detection fails.
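  A sketch of video-mode detection over decoded frames; deriving the timestamp from a frame index and a fixed frame rate is an illustrative assumption, and real code would use the video's presentation timestamps:

  ```swift
  import MediaPipeTasksVision

  // Run video-mode detection on an ordered list of frames.
  func detectLandmarks(inFrames frames: [MPImage],
                       frameRate: Double,
                       using faceLandmarker: FaceLandmarker) throws -> [FaceLandmarkerResult] {
      var results: [FaceLandmarkerResult] = []
      for (index, frame) in frames.enumerated() {
          // Timestamps must be monotonically increasing.
          let timestampMs = Int(Double(index) * 1000.0 / frameRate)
          let result = try faceLandmarker.detect(videoFrame: frame,
                                                 timestampInMilliseconds: timestampMs)
          results.append(result)
      }
      return results
  }
  ```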
- Sends live stream image data of type MPImage to perform face landmark detection using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the FaceLandmarker is created with the .liveStream running mode.

  The object that needs to be continuously notified of the available results of face landmark detection must conform to the FaceLandmarkerLiveStreamDelegate protocol and implement the faceLandmarker(_:didFinishDetectionWithResult:timestampInMilliseconds:error:) delegate method.

  It is required to provide a timestamp (in milliseconds) to indicate when the input image is sent to the face landmarker. The input timestamps must be monotonically increasing.

  This method supports performing face landmark detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If the input MPImage has a source type of .image, ensure that the color space is RGB with an alpha channel.

  If this method is used for processing live camera frames using AVFoundation, ensure that you request AVCaptureVideoDataOutput to output frames in kCMPixelFormat_32BGRA using its videoSettings property.

  Declaration (Swift): func detectAsync(image: MPImage, timestampInMilliseconds: Int) throws

  Parameters:
    - image: A live stream image of type MPImage on which face landmark detection is to be performed.
    - timestampInMilliseconds: The timestamp (in milliseconds) which indicates when the input image is sent to the face landmarker. The input timestamps must be monotonically increasing.

  Return Value: This method returns immediately and delivers results to the delegate. Throws an error if the image could not be sent to the task.
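  A minimal call-side sketch for live-stream mode; results arrive asynchronously through the delegate method named above, whose implementation is omitted here:

  ```swift
  import MediaPipeTasksVision

  // Enqueue a camera frame for asynchronous detection. Results are delivered
  // to the FaceLandmarkerLiveStreamDelegate configured when the task was created.
  func send(_ frame: MPImage,
            atTimestampMs timestampMs: Int,
            to faceLandmarker: FaceLandmarker) {
      do {
          // Timestamps must be monotonically increasing across calls.
          try faceLandmarker.detectAsync(image: frame,
                                         timestampInMilliseconds: timestampMs)
      } catch {
          print("Could not send frame to the face landmarker: \(error)")
      }
  }
  ```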
- Returns the connections between all the landmarks in the lips.

  Declaration (Swift): class func lipsConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks in the lips.
- Returns the connections between all the landmarks in the left eye.

  Declaration (Swift): class func leftEyeConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks in the left eye.
- Returns the connections between all the landmarks in the left eyebrow.

  Declaration (Swift): class func leftEyebrowConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks in the left eyebrow.
- Returns the connections between all the landmarks in the left iris.

  Declaration (Swift): class func leftIrisConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks in the left iris.
- Returns the connections between all the landmarks in the right eye.

  Declaration (Swift): class func rightEyeConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks in the right eye.
- Returns the connections between all the landmarks in the right eyebrow.

  Declaration (Swift): class func rightEyebrowConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks in the right eyebrow.
- Returns the connections between all the landmarks in the right iris.

  Declaration (Swift): class func rightIrisConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks in the right iris.
- Returns the connections between all the landmarks of the face oval.

  Declaration (Swift): class func faceOvalConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks of the face oval.
- Returns the connections between all the landmarks making up the contours of the face.

  Declaration (Swift): class func contoursConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks making up the contours of the face.
- Returns the connections between all the landmarks making up the tesselation of the face.

  Declaration (Swift): class func tesselationConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks making up the tesselation of the face.
- Returns the connections between all the landmarks in the face.

  Declaration (Swift): class func faceConnections() -> [Connection]

  Return Value: An array of connections between all the landmarks in the face.
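  The connection helpers above can be combined with a detection result to produce drawable line segments. A sketch, assuming Connection exposes start and end landmark indices and that FaceLandmarkerResult.faceLandmarks holds normalized landmarks per detected face:

  ```swift
  import CoreGraphics
  import MediaPipeTasksVision

  // Convert face-oval connections into pixel-space line segments for drawing.
  // Assumes landmark coordinates are normalized to [0, 1].
  func faceOvalSegments(for result: FaceLandmarkerResult,
                        imageSize: CGSize) -> [(CGPoint, CGPoint)] {
      guard let landmarks = result.faceLandmarks.first else { return [] }
      return FaceLandmarker.faceOvalConnections().map { connection in
          let start = landmarks[Int(connection.start)]
          let end = landmarks[Int(connection.end)]
          return (CGPoint(x: CGFloat(start.x) * imageSize.width,
                          y: CGFloat(start.y) * imageSize.height),
                  CGPoint(x: CGFloat(end.x) * imageSize.width,
                          y: CGFloat(end.y) * imageSize.height))
      }
  }
  ```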
- Undocumented

- Undocumented