MPPObjectDetector
@interface MPPObjectDetector : NSObject
Class that performs object detection on images.
The API expects a TFLite model with mandatory TFLite Model Metadata.
The API supports models with one image input tensor and one or more output tensors. To be more specific, here are the requirements:
Input tensor (kTfLiteUInt8/kTfLiteFloat32)
- image input of size [batch x height x width x channels].
- batch inference is not supported (batch is required to be 1).
- only RGB inputs are supported (channels is required to be 3).
- if type is kTfLiteFloat32, NormalizationOptions are required to be attached to the metadata for input normalization.
Output tensors must be the 4 outputs of a DetectionPostProcess op, i.e.:

(kTfLiteFloat32)
- locations tensor of size [num_results x 4], the inner array representing bounding boxes in the form [top, left, right, bottom].
- BoundingBoxProperties are required to be attached to the metadata and must specify type=BOUNDARIES and coordinate_type=RATIO.

(kTfLiteFloat32)
- classes tensor of size [num_results], each value representing the integer index of a class.
- optional (but recommended) label map(s) can be attached as AssociatedFiles with type TENSOR_VALUE_LABELS, containing one label per line. The first such AssociatedFile (if any) is used to fill the class_name field of the results. The display_name field is filled from the AssociatedFile (if any) whose locale matches the display_names_locale field of the ObjectDetectorOptions used at creation time (“en” by default, i.e. English). If none of these are available, only the index field of the results will be filled.

(kTfLiteFloat32)
- scores tensor of size [num_results], each value representing the score of the detected object.
- optional score calibration can be attached using ScoreCalibrationOptions and an AssociatedFile with type TENSOR_AXIS_SCORE_CALIBRATION. See metadata_schema.fbs [1] for more details.

(kTfLiteFloat32)
- integer num_results as a tensor of size [1]
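To make that mapping concrete, here is a minimal sketch of reading those four outputs back from a detection result. The MPPObjectDetectorResult, MPPDetection, and MPPCategory property names (detections, boundingBox, categories, index, score, categoryName, displayName) are assumptions based on the MediaPipe Tasks result types rather than taken from this page.

#import <UIKit/UIKit.h>
#import <MediaPipeTasksVision/MediaPipeTasksVision.h>

// Hypothetical helper: logs how the four output tensors surface in a result.
// Property names are assumed from the MediaPipe Tasks result types.
static void LogDetections(MPPObjectDetectorResult *result) {
  for (MPPDetection *detection in result.detections) {
    CGRect box = detection.boundingBox;                 // from the locations tensor
    MPPCategory *category = detection.categories.firstObject;
    NSLog(@"index=%ld score=%.2f name=%@ display=%@ box=%@",
          (long)category.index,                         // from the classes tensor
          category.score,                               // from the scores tensor
          category.categoryName,                        // filled from class_name labels
          category.displayName,                         // filled per display_names_locale
          NSStringFromCGRect(box));
  }
}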
- Creates a new instance of ObjectDetector from an absolute path to a TensorFlow Lite model file stored locally on the device and the default ObjectDetectorOptions.

Declaration
Objective-C
- (nullable instancetype)initWithModelPath:(nonnull NSString *)modelPath error:(NSError *_Nullable *_Nullable)error;
Parameters
modelPath
An absolute path to a TensorFlow Lite model file stored locally on the device.
error
An optional error parameter populated when there is an error in initializing the object detector.
Return Value
A new instance of ObjectDetector with the given model path. nil if there is an error in initializing the object detector.
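A minimal usage sketch of this initializer; the bundled model name is hypothetical and only stands in for any .tflite file that meets the metadata requirements above.

#import <Foundation/Foundation.h>
#import <MediaPipeTasksVision/MediaPipeTasksVision.h>

// Creates a detector from a bundled model file. "efficientdet_lite0" is a
// placeholder; substitute the absolute path of your own .tflite model.
static MPPObjectDetector *MakeDetectorFromBundledModel(void) {
  NSString *modelPath = [[NSBundle mainBundle] pathForResource:@"efficientdet_lite0"
                                                        ofType:@"tflite"];
  NSError *error = nil;
  MPPObjectDetector *detector = [[MPPObjectDetector alloc] initWithModelPath:modelPath
                                                                        error:&error];
  if (!detector) {
    // The error parameter explains why initialization failed.
    NSLog(@"Failed to create ObjectDetector: %@", error);
  }
  return detector;
}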
- Creates a new instance of ObjectDetector from the given ObjectDetectorOptions.

Declaration
Objective-C
- (nullable instancetype)initWithOptions:(nonnull MPPObjectDetectorOptions *)options error:(NSError *_Nullable *_Nullable)error;
Parameters
options
The options of type ObjectDetectorOptions to use for configuring the ObjectDetector.
error
An optional error parameter populated when there is an error in initializing the object detector.
Return Value
A new instance of ObjectDetector with the given options. nil if there is an error in initializing the object detector.
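A minimal usage sketch of this initializer; the baseOptions.modelAssetPath, maxResults, and scoreThreshold properties are assumed from the common MediaPipe Tasks options surface, and the path and values are placeholders.

#import <Foundation/Foundation.h>
#import <MediaPipeTasksVision/MediaPipeTasksVision.h>

// Configures and creates a detector. Property names are assumed from the
// common MediaPipe Tasks options surface; values are placeholders.
static MPPObjectDetector *MakeDetectorWithOptions(NSString *modelPath) {
  MPPObjectDetectorOptions *options = [[MPPObjectDetectorOptions alloc] init];
  options.baseOptions.modelAssetPath = modelPath;
  options.maxResults = 5;         // keep only the top 5 detections
  options.scoreThreshold = 0.5f;  // drop low-confidence detections

  NSError *error = nil;
  MPPObjectDetector *detector = [[MPPObjectDetector alloc] initWithOptions:options
                                                                      error:&error];
  if (!detector) {
    NSLog(@"Failed to create ObjectDetector: %@", error);
  }
  return detector;
}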
- Performs object detection on the provided MPImage using the whole image as region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the ObjectDetector is created with the .image running mode.

This method supports detecting objects in RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must use kCVPixelFormatType_32BGRA as its pixel format.

If your MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.

Declaration
Objective-C
- (nullable MPPObjectDetectorResult *)detectImage:(nonnull MPPImage *)image error:(NSError *_Nullable *_Nullable)error;
Parameters
image
The MPImage on which object detection is to be performed.
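A minimal usage sketch of this method; it assumes MPPImage can be constructed from a UIImage via initWithUIImage:error: and that the detector was created with the .image running mode. The asset name is hypothetical.

#import <UIKit/UIKit.h>
#import <MediaPipeTasksVision/MediaPipeTasksVision.h>

// Runs detection on a still image. Assumes MPPImage offers initWithUIImage:error:
// and that `detector` uses the .image running mode; "sample" is a placeholder asset.
static void DetectInStillImage(MPPObjectDetector *detector) {
  UIImage *uiImage = [UIImage imageNamed:@"sample"];
  NSError *error = nil;
  MPPImage *mpImage = [[MPPImage alloc] initWithUIImage:uiImage error:&error];
  if (!mpImage) {
    NSLog(@"Failed to wrap the image: %@", error);
    return;
  }

  MPPObjectDetectorResult *result = [detector detectImage:mpImage error:&error];
  if (!result) {
    NSLog(@"Detection failed: %@", error);
    return;
  }
  NSLog(@"Detected %lu objects", (unsigned long)result.detections.count);
}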