[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2024-05-08 UTC."],[],[],null,["# MediaPipeTasksVision Framework Reference\n\nHandLandmarkerOptions\n=====================\n\n class HandLandmarkerOptions : ../Classes/TaskOptions.html, NSCopying\n\nOptions for setting up a [HandLandmarker](../Classes/HandLandmarker.html).\n- `\n ``\n ``\n `\n\n ### [runningMode](#/c:objc(cs)MPPHandLandmarkerOptions(py)runningMode)\n\n `\n ` \n Running mode of the hand landmarker task. Defaults to [.image](../Constants.html#/c:MPPImage.h@MPPImageSourceTypeImage).\n [HandLandmarker](../Classes/HandLandmarker.html) can be created with one of the following running modes:\n 1. [image](../Constants.html#/c:MPPImage.h@MPPImageSourceTypeImage): The mode for performing hand landmark detection on single image inputs.\n 2. `video`: The mode for performing hand landmark detection on the decoded frames of a video.\n 3. `liveStream`: The mode for performing hand landmark detection on a live stream of input data, such as from the camera. \n\n #### Declaration\n\n Swift \n\n var runningMode: ../Enums/RunningMode.html { get set }\n\n- `\n ``\n ``\n `\n\n ### [handLandmarkerLiveStreamDelegate](#/c:objc(cs)MPPHandLandmarkerOptions(py)handLandmarkerLiveStreamDelegate)\n\n `\n ` \n An object that confirms to [HandLandmarkerLiveStreamDelegate](../Protocols/HandLandmarkerLiveStreamDelegate.html) protocol. This object must\n implement `handLandmarker:didFinishDetectionWithResult:timestampInMilliseconds:error:` to\n receive the results of performing asynchronous hand landmark detection on images (i.e, when\n [runningMode](../Classes/HandLandmarkerOptions.html#/c:objc(cs)MPPHandLandmarkerOptions(py)runningMode) = `.liveStream`). \n\n #### Declaration\n\n Swift \n\n weak var handLandmarkerLiveStreamDelegate: ../Protocols/HandLandmarkerLiveStreamDelegate.html? { get set }\n\n- `\n ``\n ``\n `\n\n ### [numHands](#/c:objc(cs)MPPHandLandmarkerOptions(py)numHands)\n\n `\n ` \n The maximum number of hands that can be detected by the [HandLandmarker](../Classes/HandLandmarker.html). \n\n #### Declaration\n\n Swift \n\n var numHands: Int { get set }\n\n- `\n ``\n ``\n `\n\n ### [minHandDetectionConfidence](#/c:objc(cs)MPPHandLandmarkerOptions(py)minHandDetectionConfidence)\n\n `\n ` \n The minimum confidence score for the hand detection to be considered successful. \n\n #### Declaration\n\n Swift \n\n var minHandDetectionConfidence: Float { get set }\n\n- `\n ``\n ``\n `\n\n ### [minHandPresenceConfidence](#/c:objc(cs)MPPHandLandmarkerOptions(py)minHandPresenceConfidence)\n\n `\n ` \n The minimum confidence score of hand presence score in the hand landmark detection. \n\n #### Declaration\n\n Swift \n\n var minHandPresenceConfidence: Float { get set }\n\n- `\n ``\n ``\n `\n\n ### [minTrackingConfidence](#/c:objc(cs)MPPHandLandmarkerOptions(py)minTrackingConfidence)\n\n `\n ` \n The minimum confidence score for the hand tracking to be considered successful. \n\n #### Declaration\n\n Swift \n\n var minTrackingConfidence: Float { get set }"]]