tflite::impl::InterpreterBuilder

Summary

Constructors and Destructors

InterpreterBuilder(const FlatBufferModel & model, const OpResolver & op_resolver, const InterpreterOptions *options_experimental)
For this constructor, the ErrorReporter will be extracted from the FlatBufferModel.

InterpreterBuilder(const ::tflite::Model *model, const OpResolver & op_resolver, ErrorReporter *error_reporter, const InterpreterOptions *options_experimental, const Allocation *allocation)
Builds an interpreter given only the raw flatbuffer Model object (instead of a FlatBufferModel).

InterpreterBuilder(const InterpreterBuilder &)=delete

~InterpreterBuilder()

Public functions

void AddDelegate(TfLiteDelegate *delegate)
Any delegates added with AddDelegate will be applied to the Interpreter generated by operator(), in the order that they were added.

void AddDelegate(TfLiteOpaqueDelegateStruct *opaque_delegate)

TfLiteStatus SetNumThreads(int num_threads)
Sets the number of CPU threads to use for the interpreter.

void SetTelemetryProfiler(std::unique_ptr< telemetry::TelemetryProfiler > profiler)

TfLiteStatus operator()(std::unique_ptr< Interpreter > *interpreter)
Builds an interpreter and stores it in *interpreter.

TfLiteStatus operator()(std::unique_ptr< Interpreter > *interpreter, int num_threads)
Same as above, but also sets the number of CPU threads to use (overriding any previous call to SetNumThreads).

InterpreterBuilder & operator=(const InterpreterBuilder &)=delete

Public functions

AddDelegate

void AddDelegate(
  TfLiteDelegate *delegate
)

Any delegates added with AddDelegate will be applied to the Interpreter generated by operator(), in the order that they were added.

The delegate parameter passed to AddDelegate must be non-null; otherwise an error is reported and the call has no other effect. The lifetime of the delegate must be at least as long as the lifetime of any Interpreter generated by this InterpreterBuilder.
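
For illustration, a minimal sketch of adding a delegate before building, assuming the XNNPACK delegate is available and that a FlatBufferModel named model and an OpResolver named resolver have already been created:

#include "tensorflow/lite/delegates/xnnpack/xnnpack_delegate.h"

// Sketch only: `model` and `resolver` are assumed to exist already.
TfLiteXNNPackDelegateOptions xnnpack_options =
    TfLiteXNNPackDelegateOptionsDefault();
TfLiteDelegate* xnnpack_delegate = TfLiteXNNPackDelegateCreate(&xnnpack_options);

tflite::InterpreterBuilder builder(*model, resolver);
builder.AddDelegate(xnnpack_delegate);  // Applied when operator() is invoked.

std::unique_ptr<tflite::Interpreter> interpreter;
if (builder(&interpreter) != kTfLiteOk) {
  // Build failed; handle the error.
}

// The delegate must outlive every Interpreter produced by this builder,
// so destroy it only after the interpreter is gone.
interpreter.reset();
TfLiteXNNPackDelegateDelete(xnnpack_delegate);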

AddDelegate

void AddDelegate(
  TfLiteOpaqueDelegateStruct *opaque_delegate
)

InterpreterBuilder

 InterpreterBuilder(
  const FlatBufferModel & model,
  const OpResolver & op_resolver,
  const InterpreterOptions *options_experimental
)

For this constructor, the ErrorReporter will be extracted from the FlatBufferModel.

The options object is copied during construction, so the caller can release it after the constructor returns.
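
A minimal usage sketch for this constructor (the model file name is illustrative), using the standard builtin op resolver:

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/interpreter_builder.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model_builder.h"

std::unique_ptr<tflite::FlatBufferModel> model =
    tflite::FlatBufferModel::BuildFromFile("model.tflite");
tflite::ops::builtin::BuiltinOpResolver resolver;

// options_experimental may be omitted; the builder then uses default options.
tflite::InterpreterBuilder builder(*model, resolver);

std::unique_ptr<tflite::Interpreter> interpreter;
builder(&interpreter);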

InterpreterBuilder

 InterpreterBuilder(
  const ::tflite::Model *model,
  const OpResolver & op_resolver,
  ErrorReporter *error_reporter,
  const InterpreterOptions *options_experimental,
  const Allocation *allocation
)

Builds an interpreter given only the raw flatbuffer Model object (instead of a FlatBufferModel).

This constructor is mostly used for testing. If error_reporter is null, DefaultErrorReporter() is used. The options object is copied during construction, so the caller can release it after the constructor returns.
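
A hedged sketch of the raw-Model form, assuming the serialized model has already been read into a std::string named buffer (the buffer must outlive the interpreter, since no FlatBufferModel owns the bytes here):

#include "tensorflow/lite/schema/schema_generated.h"

// Sketch only: `buffer` holds the serialized flatbuffer model bytes.
const ::tflite::Model* raw_model = ::tflite::GetModel(buffer.data());
tflite::ops::builtin::BuiltinOpResolver resolver;

// Passing a null error_reporter falls back to DefaultErrorReporter().
tflite::InterpreterBuilder builder(raw_model, resolver,
                                   /*error_reporter=*/nullptr);

std::unique_ptr<tflite::Interpreter> interpreter;
builder(&interpreter);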

InterpreterBuilder

 InterpreterBuilder(
  const InterpreterBuilder &
)=delete

SetNumThreads

TfLiteStatus SetNumThreads(
  int num_threads
)

Sets the number of CPU threads to use for the interpreter.

Returns kTfLiteOk on success, kTfLiteError on error.
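
For example, a sketch that configures threading before building (assuming builder was constructed as shown earlier):

if (builder.SetNumThreads(4) != kTfLiteOk) {
  // SetNumThreads reported an error; handle it before building.
}
std::unique_ptr<tflite::Interpreter> interpreter;
builder(&interpreter);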

SetTelemetryProfiler

void SetTelemetryProfiler(
  std::unique_ptr< telemetry::TelemetryProfiler > profiler
)

operator()

TfLiteStatus operator()(
  std::unique_ptr< Interpreter > *interpreter
)

Builds an interpreter and stores it in *interpreter.

On success, returns kTfLiteOk and sets *interpreter to a valid Interpreter. On failure, returns an error status and sets *interpreter to nullptr.
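
A sketch of checking the result and then allocating tensors, which is the typical next step before running inference:

std::unique_ptr<tflite::Interpreter> interpreter;
if (builder(&interpreter) != kTfLiteOk) {
  // Build failed; *interpreter is nullptr at this point.
} else {
  interpreter->AllocateTensors();
}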

operator()

TfLiteStatus operator()(
  std::unique_ptr< Interpreter > *interpreter,
  int num_threads
)

Same as above, but also sets the number of CPU threads to use (overriding any previous call to SetNumThreads).

Deprecated: use the SetNumThreads method instead.

operator=

InterpreterBuilder & operator=(
  const InterpreterBuilder &
)=delete

~InterpreterBuilder

 ~InterpreterBuilder()