init(modelPath:)
Creates a new instance of GestureRecognizer from an absolute path to a TensorFlow Lite model
file stored locally on the device and the default GestureRecognizerOptions.
Declaration
Swift
convenience init(modelPath: String) throws
Parameters
modelPath
An absolute path to a TensorFlow Lite model file stored locally on the device.
error
An optional error parameter populated when there is an error in initializing the
gesture recognizer.
Return Value
A new instance of GestureRecognizer with the given model path. nil if there is an
error in initializing the gesture recognizer.
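As a minimal sketch, creating a recognizer from a model bundled with the app might look like the following. The file name `gesture_recognizer.task` is an assumed example, not part of this API:

```swift
import MediaPipeTasksVision

// Locate a gesture recognizer model bundled with the app.
// The file name "gesture_recognizer.task" is an assumed example.
guard let modelPath = Bundle.main.path(forResource: "gesture_recognizer",
                                       ofType: "task") else {
  fatalError("Model file not found in the app bundle.")
}

do {
  // The initializer throws if the model cannot be loaded.
  let gestureRecognizer = try GestureRecognizer(modelPath: modelPath)
  // Use `gestureRecognizer` for subsequent recognition calls.
} catch {
  print("Failed to initialize the gesture recognizer: \(error)")
}
```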
recognize(image:)
Performs gesture recognition on the provided MPImage using the whole image as the region of
interest. Rotation will be applied according to the orientation property of the provided
MPImage. Only use this method when the GestureRecognizer is created with the running mode
.image.
This method supports performing gesture recognition on RGBA images. If your MPImage has a
source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use
kCVPixelFormatType_32BGRA as its pixel format.
If your MPImage has a source type of .image, ensure that the color space is RGB with an Alpha
channel.
Declaration
Swift
func recognize(image: MPImage) throws -> GestureRecognizerResult
Parameters
image
The MPImage on which gesture recognition is to be performed.
error
An optional error parameter populated when there is an error in performing gesture
recognition on the input image.
Return Value
A GestureRecognizerResult object that contains the hand gesture recognition
results.
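A minimal sketch of calling this method, assuming `recognizer` was created with the .image running mode and `photo` is a UIImage whose color space is RGB with an alpha channel:

```swift
import MediaPipeTasksVision
import UIKit

// Assumes `recognizer` was created with the .image running mode.
func recognizeGesture(in photo: UIImage,
                      with recognizer: GestureRecognizer) {
  do {
    // Wrap the UIImage; rotation is taken from the image's orientation.
    let mpImage = try MPImage(uiImage: photo)
    let result = try recognizer.recognize(image: mpImage)
    // Each element of `gestures` holds the candidate categories
    // for one detected hand.
    for gesture in result.gestures {
      print("Top gesture: \(gesture.first?.categoryName ?? "none")")
    }
  } catch {
    print("Gesture recognition failed: \(error)")
  }
}
```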
recognize(videoFrame:timestampInMilliseconds:)
Performs gesture recognition on the provided video frame of type MPImage using the whole image
as the region of interest. Rotation will be applied according to the orientation property of the
provided MPImage. Only use this method when the GestureRecognizer is created with the running
mode .video.
It’s required to provide the video frame’s timestamp (in milliseconds). The input timestamps must
be monotonically increasing.
This method supports performing gesture recognition on RGBA images. If your MPImage has a
source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use
kCVPixelFormatType_32BGRA as its pixel format.
If your MPImage has a source type of .image, ensure that the color space is RGB with an Alpha
channel.
Declaration
Swift
func recognize(videoFrame image: MPImage, timestampInMilliseconds: Int) throws -> GestureRecognizerResult
Parameters
image
The MPImage on which gesture recognition is to be performed.
timestampInMilliseconds
The video frame’s timestamp (in milliseconds). The input timestamps must be
monotonically increasing.
error
An optional error parameter populated when there is an error in performing gesture
recognition on the input video frame.
Return Value
A GestureRecognizerResult object that contains the hand gesture recognition
results.
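A minimal sketch for the video running mode, assuming `recognizer` was created with .video and `sampleBuffer` is a kCVPixelFormatType_32BGRA frame read from a video file (for example via AVAssetReader):

```swift
import MediaPipeTasksVision
import CoreMedia

// Assumes `recognizer` was created with the .video running mode.
func recognizeGesture(in sampleBuffer: CMSampleBuffer,
                      with recognizer: GestureRecognizer) {
  do {
    let mpImage = try MPImage(sampleBuffer: sampleBuffer)
    // Derive a monotonically increasing timestamp from the frame's
    // presentation time, converted to milliseconds.
    let seconds = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
    let timestampMs = Int(seconds * 1000)
    let result = try recognizer.recognize(videoFrame: mpImage,
                                          timestampInMilliseconds: timestampMs)
    print("Detected \(result.gestures.count) hand(s)")
  } catch {
    print("Gesture recognition failed: \(error)")
  }
}
```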
recognizeAsync(image:timestampInMilliseconds:)
Sends live stream image data of type MPImage to perform gesture recognition using the whole
image as the region of interest. Rotation will be applied according to the orientation property of
the provided MPImage. Only use this method when the GestureRecognizer is created with the
running mode .liveStream.
The object that needs to be continuously notified of the available results of gesture
recognition must conform to the GestureRecognizerLiveStreamDelegate protocol and implement the
gestureRecognizer(_:didFinishRecognitionWithResult:timestampInMilliseconds:error:)
delegate method.
It’s required to provide a timestamp (in milliseconds) to indicate when the input image is sent
to the gesture recognizer. The input timestamps must be monotonically increasing.
This method supports performing gesture recognition on RGBA images. If your MPImage has a
source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use
kCVPixelFormatType_32BGRA as its pixel format.
If the input MPImage has a source type of .image, ensure that the color space is RGB with an
Alpha channel.
If this method is used for performing gesture recognition on live camera frames using
AVFoundation, ensure that you request AVCaptureVideoDataOutput to output frames in
kCMPixelFormat_32BGRA using its videoSettings property.
Declaration
Swift
func recognizeAsync(image: MPImage, timestampInMilliseconds: Int) throws
Parameters
image
Live stream image data of type MPImage on which gesture recognition is to be
performed.
timestampInMilliseconds
The timestamp (in milliseconds) which indicates when the input
image is sent to the gesture recognizer. The input timestamps must be monotonically increasing.
error
An optional error parameter populated when there is an error in performing gesture
recognition on the input live stream image data.
Return Value
YES if the image was sent to the task successfully, otherwise NO.
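A minimal sketch of feeding live camera frames to the recognizer. It assumes `recognizer` was created with the .liveStream running mode and that the delegate set in its options conforms to GestureRecognizerLiveStreamDelegate, which receives results asynchronously via the delegate method named above; the `FrameForwarder` class name is an illustrative assumption:

```swift
import AVFoundation
import MediaPipeTasksVision

// Forwards live camera frames to a GestureRecognizer created with the
// .liveStream running mode. Results arrive asynchronously on the
// GestureRecognizerLiveStreamDelegate configured in the options.
final class FrameForwarder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
  let recognizer: GestureRecognizer
  let output = AVCaptureVideoDataOutput()

  init(recognizer: GestureRecognizer) {
    self.recognizer = recognizer
    super.init()
    // Request BGRA frames, as required for .pixelBuffer / .sampleBuffer input.
    output.videoSettings =
      [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
  }

  func captureOutput(_ output: AVCaptureOutput,
                     didOutput sampleBuffer: CMSampleBuffer,
                     from connection: AVCaptureConnection) {
    do {
      let mpImage = try MPImage(sampleBuffer: sampleBuffer)
      // Timestamps must be monotonically increasing; the frame's
      // presentation time (in milliseconds) satisfies this.
      let timestampMs =
        Int(CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds * 1000)
      // Returns as soon as the frame is submitted; results are delivered
      // to the delegate, not returned here.
      try recognizer.recognizeAsync(image: mpImage,
                                    timestampInMilliseconds: timestampMs)
    } catch {
      print("Could not submit frame: \(error)")
    }
  }
}
```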
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2024-05-08 UTC."],[],[],null,["# MediaPipeTasksVision Framework Reference\n\nGestureRecognizer\n=================\n\n class GestureRecognizer : NSObject\n\n@brief Performs gesture recognition on images.\n\nThis API expects a pre-trained TFLite hand gesture recognizer model or a custom one created using\nMediaPipe Solutions Model Maker. See\n\u003chttps://developers.google.com/mediapipe/solutions/model_maker\u003e.\n- `\n ``\n ``\n `\n\n ### [init(modelPath:)](#/c:objc(cs)MPPGestureRecognizer(im)initWithModelPath:error:)\n\n `\n ` \n Creates a new instance of `GestureRecognizer` from an absolute path to a TensorFlow Lite model\n file stored locally on the device and the default [GestureRecognizerOptions](../Classes/GestureRecognizerOptions.html). \n\n #### Declaration\n\n Swift \n\n convenience init(modelPath: String) throws\n\n #### Parameters\n\n |-------------------|------------------------------------------------------------------------------------------------------|\n | ` `*modelPath*` ` | An absolute path to a TensorFlow Lite model file stored locally on the device. |\n | ` `*error*` ` | An optional error parameter populated when there is an error in initializing the gesture recognizer. |\n\n #### Return Value\n\n A new instance of `GestureRecognizer` with the given model path. 
`nil` if there is an\n error in initializing the gesture recognizer.\n- `\n ``\n ``\n `\n\n ### [init(options:)](#/c:objc(cs)MPPGestureRecognizer(im)initWithOptions:error:)\n\n `\n ` \n Creates a new instance of `GestureRecognizer` from the given [GestureRecognizerOptions](../Classes/GestureRecognizerOptions.html). \n\n #### Declaration\n\n Swift \n\n init(options: ../Classes/GestureRecognizerOptions.html) throws\n\n #### Parameters\n\n |-----------------|------------------------------------------------------------------------------------------------------------------------------------------|\n | ` `*options*` ` | The options of type [GestureRecognizerOptions](../Classes/GestureRecognizerOptions.html) to use for configuring the `GestureRecognizer`. |\n | ` `*error*` ` | An optional error parameter populated when there is an error in initializing the gesture recognizer. |\n\n #### Return Value\n\n A new instance of `GestureRecognizer` with the given options. `nil` if there is an error\n in initializing the gesture recognizer.\n- `\n ``\n ``\n `\n\n ### [recognize(image:)](#/c:objc(cs)MPPGestureRecognizer(im)recognizeImage:error:)\n\n `\n ` \n Performs gesture recognition on the provided [MPImage](../Classes/MPImage.html) using the whole image as region of\n interest. Rotation will be applied according to the `orientation` property of the provided\n [MPImage](../Classes/MPImage.html). Only use this method when the `GestureRecognizer` is created with running mode,\n [.image](../Constants.html#/c:MPPImage.h@MPPImageSourceTypeImage).\n\n This method supports performing gesture recognition on RGBA images. 
If your [MPImage](../Classes/MPImage.html) has a\n source type of [.pixelBuffer](../Constants.html#/c:MPPImage.h@MPPImageSourceTypePixelBuffer) or [.sampleBuffer](../Constants.html#/c:MPPImage.h@MPPImageSourceTypeSampleBuffer), the underlying pixel buffer must use\n `kCVPixelFormatType_32BGRA` as its pixel format.\n\n If your [MPImage](../Classes/MPImage.html) has a source type of [.image](../Constants.html#/c:MPPImage.h@MPPImageSourceTypeImage) ensure that the color space is RGB with an Alpha\n channel. \n\n #### Declaration\n\n Swift \n\n func recognize(image: ../Classes/MPImage.html) throws -\u003e ../Classes/GestureRecognizerResult.html\n\n #### Parameters\n\n |---------------|--------------------------------------------------------------------------------------------------------------------|\n | ` `*image*` ` | The [MPImage](../Classes/MPImage.html) on which gesture recognition is to be performed. |\n | ` `*error*` ` | An optional error parameter populated when there is an error in performing gesture recognition on the input image. |\n\n #### Return Value\n\n An [GestureRecognizerResult](../Classes/GestureRecognizerResult.html) object that contains the hand gesture recognition\n results.\n- `\n ``\n ``\n `\n\n ### [recognize(videoFrame:timestampInMilliseconds:)](#/c:objc(cs)MPPGestureRecognizer(im)recognizeVideoFrame:timestampInMilliseconds:error:)\n\n `\n ` \n Performs gesture recognition on the provided video frame of type [MPImage](../Classes/MPImage.html) using the whole image\n as region of interest. Rotation will be applied according to the `orientation` property of the\n provided [MPImage](../Classes/MPImage.html). Only use this method when the `GestureRecognizer` is created with running\n mode, `.video`.\n\n It's required to provide the video frame's timestamp (in milliseconds). The input timestamps must\n be monotonically increasing.\n\n This method supports performing gesture recognition on RGBA images. 
If your [MPImage](../Classes/MPImage.html) has a\n source type of [.pixelBuffer](../Constants.html#/c:MPPImage.h@MPPImageSourceTypePixelBuffer) or [.sampleBuffer](../Constants.html#/c:MPPImage.h@MPPImageSourceTypeSampleBuffer), the underlying pixel buffer must use\n `kCVPixelFormatType_32BGRA` as its pixel format.\n\n If your [MPImage](../Classes/MPImage.html) has a source type of [.image](../Constants.html#/c:MPPImage.h@MPPImageSourceTypeImage) ensure that the color space is RGB with an Alpha\n channel. \n\n #### Declaration\n\n Swift \n\n func recognize(videoFrame image: ../Classes/MPImage.html, timestampInMilliseconds: Int) throws -\u003e ../Classes/GestureRecognizerResult.html\n\n #### Parameters\n\n |---------------------------------|--------------------------------------------------------------------------------------------------------------------------|\n | ` `*image*` ` | The [MPImage](../Classes/MPImage.html) on which gesture recognition is to be performed. |\n | ` `*timestampInMilliseconds*` ` | The video frame's timestamp (in milliseconds). The input timestamps must be monotonically increasing. |\n | ` `*error*` ` | An optional error parameter populated when there is an error in performing gesture recognition on the input video frame. |\n\n #### Return Value\n\n An [GestureRecognizerResult](../Classes/GestureRecognizerResult.html) object that contains the hand gesture recognition\n results.\n- `\n ``\n ``\n `\n\n ### [recognizeAsync(image:timestampInMilliseconds:)](#/c:objc(cs)MPPGestureRecognizer(im)recognizeAsyncImage:timestampInMilliseconds:error:)\n\n `\n ` \n Sends live stream image data of type [MPImage](../Classes/MPImage.html) to perform gesture recognition using the whole\n image as region of interest. Rotation will be applied according to the `orientation` property of\n the provided [MPImage](../Classes/MPImage.html). 
Only use this method when the `GestureRecognizer` is created with running\n mode, `.liveStream`.\n\n The object which needs to be continuously notified of the available results of gesture\n recognition must confirm to [GestureRecognizerLiveStreamDelegate](../Protocols/GestureRecognizerLiveStreamDelegate.html) protocol and implement the\n `gestureRecognizer(_:didFinishRecognitionWithResult:timestampInMilliseconds:error:)`\n delegate method.\n\n It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent\n to the gesture recognizer. The input timestamps must be monotonically increasing.\n\n This method supports performing gesture recognition on RGBA images. If your [MPImage](../Classes/MPImage.html) has a\n source type of [.pixelBuffer](../Constants.html#/c:MPPImage.h@MPPImageSourceTypePixelBuffer) or [.sampleBuffer](../Constants.html#/c:MPPImage.h@MPPImageSourceTypeSampleBuffer), the underlying pixel buffer must use\n `kCVPixelFormatType_32BGRA` as its pixel format.\n\n If the input [MPImage](../Classes/MPImage.html) has a source type of [.image](../Constants.html#/c:MPPImage.h@MPPImageSourceTypeImage) ensure that the color space is RGB with an\n Alpha channel.\n\n If this method is used for performing gesture recognition on live camera frames using\n `AVFoundation`, ensure that you request `AVCaptureVideoDataOutput` to output frames in\n `kCMPixelFormat_32BGRA` using its `videoSettings` property. \n\n #### Declaration\n\n Swift \n\n func recognizeAsync(image: ../Classes/MPImage.html, timestampInMilliseconds: Int) throws\n\n #### Parameters\n\n |---------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------|\n | ` `*image*` ` | A live stream image data of type [MPImage](../Classes/MPImage.html) on which gesture recognition is to be performed. 
|\n | ` `*timestampInMilliseconds*` ` | The timestamp (in milliseconds) which indicates when the input image is sent to the gesture recognizer. The input timestamps must be monotonically increasing. |\n | ` `*error*` ` | An optional error parameter populated when there is an error in performing gesture recognition on the input live stream image data. |\n\n #### Return Value\n\n `YES` if the image was sent to the task successfully, otherwise `NO`.\n- `\n ``\n ``\n `\n\n ### [-init](#/c:objc(cs)MPPGestureRecognizer(im)init)\n\n `\n ` \n Undocumented\n- `\n ``\n ``\n `\n\n ### [+new](#/c:objc(cs)MPPGestureRecognizer(cm)new)\n\n `\n ` \n Undocumented"]]