MPPFaceLandmarker
@interface MPPFaceLandmarker : NSObject
@brief Class that performs face landmark detection on images.
The API expects a TFLite model with mandatory TFLite Model Metadata.
-
Creates a new instance of FaceLandmarker from an absolute path to a TensorFlow Lite model file stored locally on the device and the default FaceLandmarkerOptions.
Declaration
Objective-C
- (nullable instancetype)initWithModelPath:(nonnull NSString *)modelPath error:(NSError *_Nullable *_Nullable)error;
Parameters
modelPath
An absolute path to a TensorFlow Lite model file stored locally on the device.
Return Value
A new instance of FaceLandmarker with the given model path, or nil if there is an error in initializing the face landmarker.
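A minimal sketch of this initializer, assuming the MediaPipeTasksVision module is available and that a model file named face_landmarker.task is bundled with the app (both names are illustrative assumptions):
Objective-C
@import MediaPipeTasksVision;

// Assumed bundled asset name; substitute your own model file.
NSString *modelPath = [[NSBundle mainBundle] pathForResource:@"face_landmarker"
                                                      ofType:@"task"];
NSError *error = nil;
MPPFaceLandmarker *faceLandmarker =
    [[MPPFaceLandmarker alloc] initWithModelPath:modelPath error:&error];
if (!faceLandmarker) {
  NSLog(@"Failed to create FaceLandmarker: %@", error);
}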
-
Creates a new instance of FaceLandmarker from the given FaceLandmarkerOptions.
Declaration
Objective-C
- (nullable instancetype)initWithOptions:(nonnull MPPFaceLandmarkerOptions *)options error:(NSError *_Nullable *_Nullable)error;
Parameters
options
The options of type FaceLandmarkerOptions to use for configuring the FaceLandmarker.
Return Value
A new instance of FaceLandmarker with the given options, or nil if there is an error in initializing the face landmarker.
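A minimal sketch of the options-based initializer; the specific option values (numFaces, minFaceDetectionConfidence) are illustrative assumptions, and modelPath is the path from the previous example:
Objective-C
@import MediaPipeTasksVision;

MPPFaceLandmarkerOptions *options = [[MPPFaceLandmarkerOptions alloc] init];
options.baseOptions.modelAssetPath = modelPath;   // absolute path to the model file
options.runningMode = MPPRunningModeImage;        // or MPPRunningModeVideo / MPPRunningModeLiveStream
options.numFaces = 1;                             // assumed: track a single face
options.minFaceDetectionConfidence = 0.5f;        // assumed threshold

NSError *error = nil;
MPPFaceLandmarker *faceLandmarker =
    [[MPPFaceLandmarker alloc] initWithOptions:options error:&error];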
-
Performs face landmark detection on the provided MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the FaceLandmarker is created with running mode .image.
This method supports performing face landmark detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.
Declaration
Objective-C
- (nullable MPPFaceLandmarkerResult *)detectImage:(nonnull MPPImage *)image error:(NSError *_Nullable *_Nullable)error;
Parameters
image
The MPImage on which face landmark detection is to be performed.
Return Value
A FaceLandmarkerResult that contains a list of landmarks, or nil if there is an error in performing face landmark detection.
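A minimal sketch of the image running mode, assuming faceLandmarker was created with running mode .image and photo is a UIImage whose color space is RGB with an Alpha channel (both names are illustrative assumptions):
Objective-C
NSError *error = nil;
MPPImage *mpImage = [[MPPImage alloc] initWithUIImage:photo error:&error];  // source type .image
MPPFaceLandmarkerResult *result = [faceLandmarker detectImage:mpImage error:&error];
if (result) {
  // One array of normalized landmarks per detected face.
  NSLog(@"Detected %lu face(s)", (unsigned long)result.faceLandmarks.count);
} else {
  NSLog(@"Detection failed: %@", error);
}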
-
Performs face landmark detection on the provided video frame of type MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the FaceLandmarker is created with running mode .video.
This method supports performing face landmark detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If your MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.
Declaration
Objective-C
- (nullable MPPFaceLandmarkerResult *)detectVideoFrame:(nonnull MPPImage *)image timestampInMilliseconds:(NSInteger)timestampInMilliseconds error:(NSError *_Nullable *_Nullable)error;
Parameters
image
The MPImage on which face landmark detection is to be performed.
timestampInMilliseconds
The video frame’s timestamp (in milliseconds). The input timestamps must be monotonically increasing.
Return Value
A FaceLandmarkerResult that contains a list of landmarks, or nil if there is an error in performing face landmark detection.
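A minimal sketch of the video running mode, assuming faceLandmarker was created with running mode .video and that frameImage and frameTimeMs come from the caller’s frame decoding loop (both are illustrative assumptions):
Objective-C
NSError *error = nil;
MPPFaceLandmarkerResult *result =
    [faceLandmarker detectVideoFrame:frameImage
             timestampInMilliseconds:frameTimeMs   // must be monotonically increasing
                               error:&error];
if (!result) {
  NSLog(@"Face landmark detection failed: %@", error);
}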
-
Sends live stream image data of type MPImage to perform face landmark detection using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the FaceLandmarker is created with running mode .liveStream.
The object which needs to be continuously notified of the available results of face landmark detection must conform to the FaceLandmarkerLiveStreamDelegate protocol and implement the faceLandmarker(_:didFinishDetectionWithResult:timestampInMilliseconds:error:) delegate method.
It’s required to provide a timestamp (in milliseconds) to indicate when the input image is sent to the face landmarker. The input timestamps must be monotonically increasing.
This method supports performing face landmark detection on RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format. If the input MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.
If this method is used for detecting face landmarks in live camera frames using AVFoundation, ensure that you request AVCaptureVideoDataOutput to output frames in kCMPixelFormat_32BGRA using its videoSettings property.
Declaration
Objective-C
- (BOOL)detectAsyncImage:(nonnull MPPImage *)image timestampInMilliseconds:(NSInteger)timestampInMilliseconds error:(NSError *_Nullable *_Nullable)error;
Parameters
image
Live stream image data of type MPImage on which face landmark detection is to be performed.
timestampInMilliseconds
The timestamp (in milliseconds) which indicates when the input image is sent to the face landmarker. The input timestamps must be monotonically increasing.
Return Value
true if the image was sent to the task successfully, otherwise false.
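A minimal sketch of the live stream running mode, assuming a CameraViewController class that adopts MPPFaceLandmarkerLiveStreamDelegate (the Objective-C name of FaceLandmarkerLiveStreamDelegate), holds a faceLandmarker created with running mode .liveStream and with faceLandmarkerLiveStreamDelegate set to itself, and receives sample buffers from an AVCaptureVideoDataOutput configured for kCMPixelFormat_32BGRA; the class and helper method names are illustrative assumptions:
Objective-C
@import AVFoundation;
@import MediaPipeTasksVision;

@interface CameraViewController : NSObject <MPPFaceLandmarkerLiveStreamDelegate>
@property(nonatomic) MPPFaceLandmarker *faceLandmarker;  // assumed: created with .liveStream
@end

@implementation CameraViewController

// Assumed helper called from the AVCaptureVideoDataOutput sample buffer delegate.
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer {
  NSError *error = nil;
  MPPImage *image = [[MPPImage alloc] initWithSampleBuffer:sampleBuffer error:&error];
  NSInteger timestampMs =
      (NSInteger)(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000);
  [self.faceLandmarker detectAsyncImage:image
                timestampInMilliseconds:timestampMs
                                  error:&error];
}

// Delegate callback; results arrive asynchronously.
- (void)faceLandmarker:(MPPFaceLandmarker *)faceLandmarker
    didFinishDetectionWithResult:(MPPFaceLandmarkerResult *)result
         timestampInMilliseconds:(NSInteger)timestampInMilliseconds
                           error:(NSError *)error {
  if (result) {
    NSLog(@"Landmarks for %lu face(s) at %ld ms",
          (unsigned long)result.faceLandmarks.count, (long)timestampInMilliseconds);
  }
}

@end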
-
Returns the connections between all the landmarks in the lips.
Declaration
Objective-C
+ (nonnull NSArray<MPPConnection *> *)lipsConnections;
Return Value
An array of connections between all the landmarks in the lips.
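The connection arrays can be paired with the per-face landmark lists in a FaceLandmarkerResult to draw the mesh. A minimal sketch, assuming result holds at least one detected face and that MPPConnection exposes the start and end landmark indices of each segment:
Objective-C
NSArray<MPPConnection *> *lips = [MPPFaceLandmarker lipsConnections];
NSArray<MPPNormalizedLandmark *> *faceLandmarks = result.faceLandmarks.firstObject;

for (MPPConnection *connection in lips) {
  MPPNormalizedLandmark *from = faceLandmarks[connection.start];
  MPPNormalizedLandmark *to = faceLandmarks[connection.end];
  // Landmark coordinates are normalized to [0, 1]; scale by the image size before drawing.
  NSLog(@"Lip segment: (%f, %f) -> (%f, %f)", from.x, from.y, to.x, to.y);
}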
-
Returns the connections between all the landmarks in the left eye.
Declaration
Objective-C
+ (nonnull NSArray<MPPConnection *> *)leftEyeConnections;
Return Value
An array of connections between all the landmarks in the left eye.
-
Returns the connections between all the landmarks in the left eyebrow.
Declaration
Objective-C
+ (nonnull NSArray<MPPConnection *> *)leftEyebrowConnections;
Return Value
An array of connections between all the landmarks in the left eyebrow.
-
Returns the connections between all the landmarks in the left iris.
Declaration
Objective-C
+ (nonnull NSArray<MPPConnection *> *)leftIrisConnections;
Return Value
An array of connections between all the landmarks in the left iris.
-
Returns the connections between all the landmarks in the right eye.
Declaration
Objective-C
+ (nonnull NSArray<MPPConnection *> *)rightEyeConnections;