MediaPipeTasksVision Framework Reference

Classes

The following classes are available globally.

  • Holds the base options used to create any type of task. It has fields with important information such as the acceleration configuration, the TFLite model source, etc.

    Declaration

    Objective-C

    
    @interface MPPBaseOptions : NSObject <NSCopying>
  • Category is a utility class that contains a label, its display name, a float value as score, and the index of the label in the corresponding label file. Typically it’s used as the result of classification tasks.

    Declaration

    Objective-C

    
    @interface MPPCategory : NSObject
  • Represents the list of classifications for a given classifier head. Typically used as a result for classification tasks.

    Declaration

    Objective-C

    
    @interface MPPClassifications : NSObject
  • Represents the classification results of a model. Typically used as a result for classification tasks.

    Declaration

    Objective-C

    
    @interface MPPClassificationResult : NSObject
  • Classifier options shared across MediaPipe iOS classification tasks.

    Declaration

    Objective-C

    
    @interface MPPClassifierOptions : NSObject <NSCopying>
  • The value class representing a landmark connection.

    Declaration

    Objective-C

    
    @interface MPPConnection : NSObject
  • A normalized keypoint represents a point in 2D space with x, y coordinates, where x and y are normalized to [0.0, 1.0] by the image width and height, respectively.

    Declaration

    Objective-C

    
    @interface MPPNormalizedKeypoint : NSObject
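    As a quick illustration of the normalization convention above, a normalized keypoint can be mapped back to pixel coordinates by scaling x by the image width and y by the image height. The `denormalize` helper below is a hypothetical sketch, not part of the framework:

    ```swift
    import Foundation

    // Hypothetical helper (not part of MediaPipeTasksVision): maps a
    // normalized keypoint's x/y (each in [0.0, 1.0]) back to pixel
    // coordinates by scaling with the image width and height.
    func denormalize(x: Float, y: Float,
                     imageWidth: Int, imageHeight: Int) -> (px: Float, py: Float) {
        (x * Float(imageWidth), y * Float(imageHeight))
    }

    // Keypoint at the horizontal center, a quarter of the way down a 640x480 image.
    let (px, py) = denormalize(x: 0.5, y: 0.25, imageWidth: 640, imageHeight: 480)
    ```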
  • Represents one detected object in the results of ObjectDetector.

    Declaration

    Objective-C

    
    @interface MPPDetection : NSObject
  • Represents the embedding for a given embedder head. Typically used in embedding tasks.

    Exactly one of the two fields ‘floatEmbedding’ and ‘quantizedEmbedding’ will contain data, based on whether or not the embedder was configured to perform scalar quantization.

    Declaration

    Objective-C

    
    @interface MPPEmbedding : NSObject
  • Represents the embedding results of a model. Typically used as a result for embedding tasks.

    Declaration

    Objective-C

    
    @interface MPPEmbeddingResult : NSObject
  • @brief Class that performs face detection on images.

    The API expects a TFLite model with mandatory TFLite Model Metadata.

    The API supports models with one image input tensor and one or more output tensors. To be more specific, here are the requirements:

    Input tensor (kTfLiteUInt8/kTfLiteFloat32)

    • image input of size [batch x height x width x channels].
    • batch inference is not supported (batch is required to be 1).
    • only RGB inputs are supported (channels is required to be 3).
    • if type is kTfLiteFloat32, NormalizationOptions are required to be attached to the metadata for input normalization.

    Output tensors must be the 4 outputs of a DetectionPostProcess op, i.e.:

    • (kTfLiteFloat32) locations tensor of size [num_results x 4], the inner array representing bounding boxes in the form [top, left, right, bottom]. BoundingBoxProperties are required to be attached to the metadata and must specify type=BOUNDARIES and coordinate_type=RATIO.
    • (kTfLiteFloat32) classes tensor of size [num_results], each value representing the integer index of a class.
    • (kTfLiteFloat32) scores tensor of size [num_results], each value representing the score of the detected face. Optional score calibration can be attached using ScoreCalibrationOptions and an AssociatedFile with type TENSOR_AXIS_SCORE_CALIBRATION. See metadata_schema.fbs [1] for more details.
    • (kTfLiteFloat32) integer num_results as a tensor of size [1].

    Declaration

    Objective-C

    
    @interface MPPFaceDetector : NSObject
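    Because the locations tensor uses coordinate_type=RATIO, each bounding box must be scaled by the input image dimensions before it can be drawn. A minimal sketch of that conversion (the `PixelBox` type and `toPixels` helper below are hypothetical, not framework API):

    ```swift
    import Foundation

    // Hypothetical pixel-space bounding box (not a framework type).
    struct PixelBox { let top, left, right, bottom: Float }

    // Scales one RATIO-coordinate box in the form [top, left, right, bottom]
    // (all values in [0.0, 1.0]) to pixel coordinates: vertical edges by the
    // image height, horizontal edges by the image width.
    func toPixels(box: [Float], imageWidth: Int, imageHeight: Int) -> PixelBox {
        PixelBox(top: box[0] * Float(imageHeight),
                 left: box[1] * Float(imageWidth),
                 right: box[2] * Float(imageWidth),
                 bottom: box[3] * Float(imageHeight))
    }

    // One detection from a 640x480 input image.
    let box = toPixels(box: [0.1, 0.2, 0.8, 0.9], imageWidth: 640, imageHeight: 480)
    ```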
  • Options for setting up a FaceDetector.

    Declaration

    Objective-C

    
    @interface MPPFaceDetectorOptions : MPPTaskOptions <NSCopying>
  • Represents the detection results generated by FaceDetector.

    Declaration

    Objective-C

    
    @interface MPPFaceDetectorResult : MPPTaskResult
  • @brief Class that performs face landmark detection on images.

    The API expects a TFLite model with mandatory TFLite Model Metadata.

    Declaration

    Objective-C

    
    @interface MPPFaceLandmarker : NSObject
  • Options for setting up a FaceLandmarker.

    Declaration

    Objective-C

    
    @interface MPPFaceLandmarkerOptions : MPPTaskOptions <NSCopying>
  • A matrix that can be used for transformations.

    Declaration

    Objective-C

    
    @interface MPPTransformMatrix : NSObject
  • Represents the detection results generated by FaceLandmarker.

    Declaration

    Objective-C

    
    @interface MPPFaceLandmarkerResult : MPPTaskResult
  • Class that performs face stylization on images.

    Declaration

    Objective-C

    
    @interface MPPFaceStylizer : NSObject
  • Options for setting up a FaceStylizer.

    Declaration

    Objective-C

    
    @interface MPPFaceStylizerOptions : MPPTaskOptions <NSCopying>
  • Represents the stylized image generated by FaceStylizer.

    Declaration

    Objective-C

    
    @interface MPPFaceStylizerResult : MPPTaskResult
  • @brief Performs gesture recognition on images.

    This API expects a pre-trained TFLite hand gesture recognizer model or a custom one created u