# MediaPipeTasksVision Framework Reference

Protocols
=========

The following protocols are available globally.

### [MPPFaceDetectorLiveStreamDelegate](/edge/api/mediapipe/objc/vision/Protocols/MPPFaceDetectorLiveStreamDelegate)

This protocol defines an interface for the delegates of a `FaceDetector` object to receive
the results of performing asynchronous face detection on images (i.e., when `runningMode` =
`.liveStream`).

The delegate of `FaceDetector` must adopt the `FaceDetectorLiveStreamDelegate` protocol.
The methods in this protocol are optional.

#### Declaration

Objective-C

    @protocol MPPFaceDetectorLiveStreamDelegate <NSObject>

### [MPPFaceLandmarkerLiveStreamDelegate](/edge/api/mediapipe/objc/vision/Protocols/MPPFaceLandmarkerLiveStreamDelegate)

This protocol defines an interface for the delegates of a `FaceLandmarker` object to receive
the results of performing asynchronous face landmark detection on images (i.e., when
`runningMode` = `.liveStream`).

The delegate of `FaceLandmarker` must adopt the `FaceLandmarkerLiveStreamDelegate` protocol.
The methods in this protocol are optional.

#### Declaration

Objective-C

    @protocol MPPFaceLandmarkerLiveStreamDelegate <NSObject>

### [MPPGestureRecognizerLiveStreamDelegate](/edge/api/mediapipe/objc/vision/Protocols/MPPGestureRecognizerLiveStreamDelegate)

This protocol defines an interface for the delegates of a `GestureRecognizer` object to receive
the results of performing asynchronous gesture recognition on images (i.e., when
`runningMode` = `.liveStream`).

The delegate of `GestureRecognizer` must adopt the `GestureRecognizerLiveStreamDelegate` protocol.
The methods in this protocol are optional.

#### Declaration

Objective-C

    @protocol MPPGestureRecognizerLiveStreamDelegate <NSObject>

### [MPPHandLandmarkerLiveStreamDelegate](/edge/api/mediapipe/objc/vision/Protocols/MPPHandLandmarkerLiveStreamDelegate)

This protocol defines an interface for the delegates of a `HandLandmarker` object to receive
the results of performing asynchronous hand landmark detection on images (i.e., when
`runningMode` = `.liveStream`).

The delegate of `HandLandmarker` must adopt the `HandLandmarkerLiveStreamDelegate` protocol.
The methods in this protocol are optional.

#### Declaration

Objective-C

    @protocol MPPHandLandmarkerLiveStreamDelegate <NSObject>

### [MPPImageClassifierLiveStreamDelegate](/edge/api/mediapipe/objc/vision/Protocols/MPPImageClassifierLiveStreamDelegate)

This protocol defines an interface for the delegates of an `ImageClassifier` object to receive
the results of asynchronous classification of images (i.e., when `runningMode` = `.liveStream`).

The delegate of `ImageClassifier` must adopt the `ImageClassifierLiveStreamDelegate` protocol.
The methods in this protocol are optional.

#### Declaration

Objective-C

    @protocol MPPImageClassifierLiveStreamDelegate <NSObject>

### [MPPImageEmbedderLiveStreamDelegate](/edge/api/mediapipe/objc/vision/Protocols/MPPImageEmbedderLiveStreamDelegate)

This protocol defines an interface for the delegates of an `ImageEmbedder` object to receive
the results of asynchronous embedding extraction on images (i.e., when `runningMode` = `.liveStream`).

The delegate of `ImageEmbedder` must adopt the `ImageEmbedderLiveStreamDelegate` protocol.
The methods in this protocol are optional.

#### Declaration

Objective-C

    @protocol MPPImageEmbedderLiveStreamDelegate <NSObject>

### [MPPImageSegmenterLiveStreamDelegate](/edge/api/mediapipe/objc/vision/Protocols/MPPImageSegmenterLiveStreamDelegate)

This protocol defines an interface for the delegates of an `ImageSegmenter` object to receive
the results of performing asynchronous segmentation on images (i.e., when `runningMode` =
`.liveStream`).

The delegate of `ImageSegmenter` must adopt the `ImageSegmenterLiveStreamDelegate` protocol.
The methods in this protocol are optional.

#### Declaration

Objective-C

    @protocol MPPImageSegmenterLiveStreamDelegate <NSObject>

### [MPPObjectDetectorLiveStreamDelegate](/edge/api/mediapipe/objc/vision/Protocols/MPPObjectDetectorLiveStreamDelegate)

This protocol defines an interface for the delegates of an `ObjectDetector` object to receive
the results of performing asynchronous object detection on images (i.e., when `runningMode` =
`.liveStream`).

The delegate of `ObjectDetector` must adopt the `ObjectDetectorLiveStreamDelegate` protocol.
The methods in this protocol are optional.

#### Declaration

Objective-C

    @protocol MPPObjectDetectorLiveStreamDelegate <NSObject>

### [MPPPoseLandmarkerLiveStreamDelegate](/edge/api/mediapipe/objc/vision/Protocols/MPPPoseLandmarkerLiveStreamDelegate)

This protocol defines an interface for the delegates of a `PoseLandmarker` object to receive
the results of performing asynchronous pose landmark detection on images (i.e., when
`runningMode` = `.liveStream`).

The delegate of `PoseLandmarker` must adopt the `PoseLandmarkerLiveStreamDelegate` protocol.
The methods in this protocol are optional.

#### Declaration

Objective-C

    @protocol MPPPoseLandmarkerLiveStreamDelegate <NSObject>

Last updated 2024-05-08 UTC.
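All of these live-stream delegates follow the same pattern: adopt the protocol, implement its optional result callback, and assign the handler to the task's options before creating the task with `runningMode = .liveStream`. Below is a minimal Swift sketch for `FaceDetectorLiveStreamDelegate`; the callback signature and the `faceDetectorLiveStreamDelegate` option shown here follow the MediaPipe Tasks iOS API, but treat the exact names as assumptions to be checked against the linked protocol pages. The same shape applies to the other eight delegates in this reference.

```swift
import MediaPipeTasksVision

// A handler adopting one of the live-stream delegate protocols.
// Delegate protocols require NSObject conformance (<NSObject> in the
// Objective-C declarations above).
class FaceDetectionHandler: NSObject, FaceDetectorLiveStreamDelegate {

  // Optional callback delivering the results of asynchronous face
  // detection, tagged with the timestamp of the input frame.
  func faceDetector(
    _ faceDetector: FaceDetector,
    didFinishDetection result: FaceDetectorResult?,
    timestampInMilliseconds: Int,
    error: Error?
  ) {
    guard error == nil, let result = result else { return }
    print("Detected \(result.detections.count) face(s) at \(timestampInMilliseconds) ms")
  }
}

// Wiring the delegate into the task options (names per the iOS docs;
// verify against the protocol pages linked above).
func makeDetector(modelPath: String, handler: FaceDetectionHandler) throws -> FaceDetector {
  let options = FaceDetectorOptions()
  options.baseOptions.modelAssetPath = modelPath
  options.runningMode = .liveStream
  options.faceDetectorLiveStreamDelegate = handler
  return try FaceDetector(options: options)
}
```

With this configuration, each call to the detector's asynchronous detect method returns immediately, and results arrive later on the delegate callback rather than as a return value.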