# MediaPipeTasksVision Framework Reference

MPPGestureRecognizerOptions
===========================

    @interface MPPGestureRecognizerOptions : MPPTaskOptions <NSCopying>

Options for setting up a `GestureRecognizer`.

### runningMode

Running mode of the gesture recognizer task. Defaults to `.video`.
`GestureRecognizer` can be created with one of the following running modes:

1. `image`: The mode for performing gesture recognition on single image inputs.
2. `video`: The mode for performing gesture recognition on the decoded frames of a video.
3. `liveStream`: The mode for performing gesture recognition on a live stream of input data, such as from the camera.

#### Declaration

Objective-C

    @property (nonatomic) MPPRunningMode runningMode;

### gestureRecognizerLiveStreamDelegate

An object that conforms to the `GestureRecognizerLiveStreamDelegate` protocol. This object must implement `gestureRecognizer(_:didFinishRecognitionWithResult:timestampInMilliseconds:error:)` to receive the results of performing asynchronous gesture recognition on images (i.e., when `runningMode` is `.liveStream`).

#### Declaration

Objective-C

    @property (nonatomic, weak, nullable) id<MPPGestureRecognizerLiveStreamDelegate> gestureRecognizerLiveStreamDelegate;

### numHands

The maximum number of hands that can be detected by the `GestureRecognizer`.

#### Declaration

Objective-C

    @property (nonatomic) NSInteger numHands;

### minHandDetectionConfidence

The minimum confidence score for hand detection to be considered successful.

#### Declaration

Objective-C

    @property (nonatomic) float minHandDetectionConfidence;

### minHandPresenceConfidence

The minimum confidence score of hand presence in the hand landmark detection.

#### Declaration

Objective-C

    @property (nonatomic) float minHandPresenceConfidence;

### minTrackingConfidence

The minimum confidence score for hand tracking to be considered successful.

#### Declaration

Objective-C

    @property (nonatomic) float minTrackingConfidence;

### cannedGesturesClassifierOptions

Optional `ClassifierOptions` controlling the canned gestures classifier, such as the score threshold and the allowlist and denylist of gestures. The categories for the canned gesture classifier are: ["None", "Closed_Fist", "Open_Palm", "Pointing_Up", "Thumb_Down", "Thumb_Up", "Victory", "ILoveYou"].

Note: this option is subject to change after the score merging calculator is implemented.

#### Declaration

Objective-C

    @property (nonatomic, copy, nullable) MPPClassifierOptions *cannedGesturesClassifierOptions;

### customGesturesClassifierOptions

Optional `ClassifierOptions` controlling the custom gestures classifier, such as the score threshold and the allowlist and denylist of gestures.

Note: this option is subject to change after the score merging calculator is implemented.

#### Declaration

Objective-C

    @property (nonatomic, copy, nullable) MPPClassifierOptions *customGesturesClassifierOptions;

Last updated 2024-05-08 UTC.
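The options described above can be wired together in Swift roughly as follows. This is a minimal sketch under stated assumptions, not a definitive implementation: it assumes the Swift-renamed MediaPipeTasksVision types (`GestureRecognizerOptions`, `GestureRecognizer`, `ClassifierOptions`, `GestureRecognizerResult`), the model file name `gesture_recognizer.task` is a hypothetical bundle resource, and the exact delegate-method signature should be checked against the framework version in use.

```swift
import MediaPipeTasksVision

// Sketch: configure a GestureRecognizer for live-stream input, restrict the
// canned classifier to a subset of its categories, and receive results
// asynchronously via the delegate. Paths and thresholds are illustrative.
final class RecognizerSetup: NSObject, GestureRecognizerLiveStreamDelegate {

  func makeRecognizer() throws -> GestureRecognizer {
    let options = GestureRecognizerOptions()
    options.baseOptions.modelAssetPath = "gesture_recognizer.task"  // hypothetical path
    options.runningMode = .liveStream
    options.gestureRecognizerLiveStreamDelegate = self

    // Hand detection / tracking parameters from this reference.
    options.numHands = 2
    options.minHandDetectionConfidence = 0.5
    options.minHandPresenceConfidence = 0.5
    options.minTrackingConfidence = 0.5

    // Keep only a few of the canned categories listed above.
    let canned = ClassifierOptions()
    canned.categoryAllowlist = ["Thumb_Up", "Thumb_Down", "Victory"]
    canned.scoreThreshold = 0.6
    options.cannedGesturesClassifierOptions = canned

    return try GestureRecognizer(options: options)
  }

  // Delegate callback delivering asynchronous live-stream results.
  func gestureRecognizer(_ gestureRecognizer: GestureRecognizer,
                         didFinishRecognitionWithResult result: GestureRecognizerResult?,
                         timestampInMilliseconds: Int,
                         error: Error?) {
    guard error == nil, let result = result else { return }
    print(result.gestures)
  }
}
```

In `.liveStream` mode the recognizer returns immediately and delivers each frame's result through the delegate, which is why the delegate property is required for that running mode but irrelevant for `.image` and `.video`.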