[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2024-05-08 UTC."],[],[],null,["# MediaPipeTasksVision Framework Reference\n\nMPPRunningMode\n==============\n\n enum MPPRunningMode : NSUInteger {}\n\nMediaPipe vision task running mode. A MediaPipe vision task can be run with three different\nmodes: image, video and live stream.\n- `\n ``\n ``\n `\n\n ### [MPPRunningModeImage](#/c:@E@MPPRunningMode@MPPRunningModeImage)\n\n `\n ` \n The mode for running a mediapipe vision task on single image inputs. \n\n #### Declaration\n\n Objective-C \n\n MPPRunningModeImage\n\n- `\n ``\n ``\n `\n\n ### [MPPRunningModeVideo](#/c:@E@MPPRunningMode@MPPRunningModeVideo)\n\n `\n ` \n The mode for running a mediapipe vision task on the decoded frames of a video. \n\n #### Declaration\n\n Objective-C \n\n MPPRunningModeVideo\n\n- `\n ``\n ``\n `\n\n ### [MPPRunningModeLiveStream](#/c:@E@MPPRunningMode@MPPRunningModeLiveStream)\n\n `\n ` \n The mode for running a mediapipe vision task on a live stream of input data, such as from the\n camera. \n\n #### Declaration\n\n Objective-C \n\n MPPRunningModeLiveStream"]]