GestureRecognizer

    class GestureRecognizer : NSObject

Performs gesture recognition on images.
This API expects a pre-trained TFLite hand gesture recognizer model or a custom one created using MediaPipe Solutions Model Maker. See https://developers.google.com/mediapipe/solutions/model_maker.
- `init(modelPath:)`

  Creates a new instance of `GestureRecognizer` from an absolute path to a TensorFlow Lite model file stored locally on the device and the default `GestureRecognizerOptions`.

  Declaration (Swift):

      convenience init(modelPath: String) throws

  Parameters:

  - `modelPath`: An absolute path to a TensorFlow Lite model file stored locally on the device.
  - `error`: An optional error parameter populated when there is an error in initializing the gesture recognizer.

  Return Value: A new instance of `GestureRecognizer` with the given model path, or `nil` if there is an error in initializing the gesture recognizer.
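  A minimal usage sketch of this initializer. The model file name `gesture_recognizer.task` is an assumption; substitute the name of the model bundled with your app:

  ```swift
  import MediaPipeTasksVision

  // Hypothetical bundled model file; use your own model's name here.
  guard let modelPath = Bundle.main.path(forResource: "gesture_recognizer",
                                         ofType: "task") else {
    fatalError("Model file not found in the app bundle.")
  }

  do {
    // Uses the default GestureRecognizerOptions (running mode .image).
    let recognizer = try GestureRecognizer(modelPath: modelPath)
    _ = recognizer
  } catch {
    print("Failed to initialize the gesture recognizer: \(error)")
  }
  ```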
- `init(options:)`

  Creates a new instance of `GestureRecognizer` from the given `GestureRecognizerOptions`.

  Declaration (Swift):

      init(options: GestureRecognizerOptions) throws

  Parameters:

  - `options`: The options of type `GestureRecognizerOptions` to use for configuring the `GestureRecognizer`.
  - `error`: An optional error parameter populated when there is an error in initializing the gesture recognizer.

  Return Value: A new instance of `GestureRecognizer` with the given options, or `nil` if there is an error in initializing the gesture recognizer.
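  A sketch of configuring options before initialization, assuming a model path of your choosing (the path below is a placeholder):

  ```swift
  import MediaPipeTasksVision

  let options = GestureRecognizerOptions()
  // Placeholder path; point this at your downloaded .task model file.
  options.baseOptions.modelAssetPath = "/path/to/gesture_recognizer.task"
  options.runningMode = .video   // .image, .video, or .liveStream
  options.numHands = 2           // detect up to two hands per frame

  do {
    let recognizer = try GestureRecognizer(options: options)
    _ = recognizer
  } catch {
    print("Failed to initialize the gesture recognizer: \(error)")
  }
  ```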
- `recognize(image:)`

  Performs gesture recognition on the provided `MPImage` using the whole image as region of interest. Rotation will be applied according to the `orientation` property of the provided `MPImage`. Only use this method when the `GestureRecognizer` is created with running mode `.image`.

  This method supports performing gesture recognition on RGBA images. If your `MPImage` has a source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use `kCVPixelFormatType_32BGRA` as its pixel format. If your `MPImage` has a source type of `.image`, ensure that the color space is RGB with an Alpha channel.

  Declaration (Swift):

      func recognize(image: MPImage) throws -> GestureRecognizerResult

  Parameters:

  - `image`: The `MPImage` on which gesture recognition is to be performed.
  - `error`: An optional error parameter populated when there is an error in performing gesture recognition on the input image.

  Return Value: A `GestureRecognizerResult` object that contains the hand gesture recognition results.
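  A sketch of single-image recognition, assuming `recognizer` was created with running mode `.image`:

  ```swift
  import MediaPipeTasksVision
  import UIKit

  // Hypothetical helper; `recognizer` must use running mode .image.
  func recognizeGestures(in uiImage: UIImage,
                         with recognizer: GestureRecognizer) {
    do {
      // MPImage picks up the UIImage's orientation automatically.
      let mpImage = try MPImage(uiImage: uiImage)
      let result = try recognizer.recognize(image: mpImage)
      // result.gestures holds one ranked category list per detected hand.
      for handGestures in result.gestures {
        print(handGestures.first?.categoryName ?? "no gesture")
      }
    } catch {
      print("Gesture recognition failed: \(error)")
    }
  }
  ```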
- `recognize(videoFrame:timestampInMilliseconds:)`

  Performs gesture recognition on the provided video frame of type `MPImage` using the whole image as region of interest. Rotation will be applied according to the `orientation` property of the provided `MPImage`. Only use this method when the `GestureRecognizer` is created with running mode `.video`.

  It's required to provide the video frame's timestamp (in milliseconds). The input timestamps must be monotonically increasing.

  This method supports performing gesture recognition on RGBA images. If your `MPImage` has a source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use `kCVPixelFormatType_32BGRA` as its pixel format. If your `MPImage` has a source type of `.image`, ensure that the color space is RGB with an Alpha channel.

  Declaration (Swift):

      func recognize(videoFrame image: MPImage, timestampInMilliseconds: Int) throws -> GestureRecognizerResult

  Parameters:

  - `image`: The `MPImage` on which gesture recognition is to be performed.
  - `timestampInMilliseconds`: The video frame's timestamp (in milliseconds). The input timestamps must be monotonically increasing.
  - `error`: An optional error parameter populated when there is an error in performing gesture recognition on the input video frame.

  Return Value: A `GestureRecognizerResult` object that contains the hand gesture recognition results.
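  A sketch of processing decoded video frames in order, assuming `recognizer` was created with running mode `.video` and that each frame is already paired with its timestamp:

  ```swift
  import MediaPipeTasksVision

  // Hypothetical helper; timestamps must increase monotonically across calls.
  func recognize(frames: [(image: MPImage, timestampMs: Int)],
                 with recognizer: GestureRecognizer) {
    for frame in frames {
      do {
        let result = try recognizer.recognize(
          videoFrame: frame.image,
          timestampInMilliseconds: frame.timestampMs)
        print("Hands at \(frame.timestampMs) ms: \(result.handedness.count)")
      } catch {
        print("Frame at \(frame.timestampMs) ms failed: \(error)")
      }
    }
  }
  ```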
- `recognizeAsync(image:timestampInMilliseconds:)`

  Sends live stream image data of type `MPImage` to perform gesture recognition using the whole image as region of interest. Rotation will be applied according to the `orientation` property of the provided `MPImage`. Only use this method when the `GestureRecognizer` is created with running mode `.liveStream`.

  The object which needs to be continuously notified of the available results of gesture recognition must conform to the `GestureRecognizerLiveStreamDelegate` protocol and implement the `gestureRecognizer(_:didFinishRecognitionWithResult:timestampInMilliseconds:error:)` delegate method.

  It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent to the gesture recognizer. The input timestamps must be monotonically increasing.

  This method supports performing gesture recognition on RGBA images. If your `MPImage` has a source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use `kCVPixelFormatType_32BGRA` as its pixel format. If the input `MPImage` has a source type of `.image`, ensure that the color space is RGB with an Alpha channel.

  If this method is used for performing gesture recognition on live camera frames using AVFoundation, ensure that you request `AVCaptureVideoDataOutput` to output frames in `kCMPixelFormat_32BGRA` using its `videoSettings` property.

  Declaration (Swift):

      func recognizeAsync(image: MPImage, timestampInMilliseconds: Int) throws

  Parameters:

  - `image`: A live stream image data of type `MPImage` on which gesture recognition is to be performed.
  - `timestampInMilliseconds`: The timestamp (in milliseconds) which indicates when the input image is sent to the gesture recognizer. The input timestamps must be monotonically increasing.
  - `error`: An optional error parameter populated when there is an error in performing gesture recognition on the input live stream image data.

  Return Value: None. The method throws if the image could not be sent to the task successfully; recognition results are delivered asynchronously to the delegate.
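  A sketch of the live stream flow with an `AVFoundation` capture pipeline. The class name, model path, and the exact Swift spelling of the delegate method (`didFinishGestureRecognition`, as used in MediaPipe's sample code) are assumptions; verify the delegate signature against the generated Swift interface for your framework version:

  ```swift
  import AVFoundation
  import MediaPipeTasksVision

  // Hypothetical handler class illustrating the .liveStream running mode.
  final class CameraGestureHandler: NSObject, GestureRecognizerLiveStreamDelegate {

    private var recognizer: GestureRecognizer?

    override init() {
      super.init()
      let options = GestureRecognizerOptions()
      // Placeholder path; point this at your .task model file.
      options.baseOptions.modelAssetPath = "/path/to/gesture_recognizer.task"
      options.runningMode = .liveStream
      options.gestureRecognizerLiveStreamDelegate = self
      recognizer = try? GestureRecognizer(options: options)
    }

    // Feed frames from AVCaptureVideoDataOutput; configure its
    // `videoSettings` to output kCMPixelFormat_32BGRA.
    func process(sampleBuffer: CMSampleBuffer, timestampMs: Int) {
      guard let image = try? MPImage(sampleBuffer: sampleBuffer) else { return }
      try? recognizer?.recognizeAsync(image: image,
                                      timestampInMilliseconds: timestampMs)
    }

    // Asynchronous results arrive on this delegate callback.
    func gestureRecognizer(_ gestureRecognizer: GestureRecognizer,
                           didFinishGestureRecognition result: GestureRecognizerResult?,
                           timestampInMilliseconds: Int,
                           error: Error?) {
      if let result = result {
        print("Result at \(timestampInMilliseconds) ms: \(result.gestures.count) hand(s)")
      }
    }
  }
  ```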
- Undocumented

- Undocumented