tflite

Summary

For documentation, see tensorflow/lite/core/interpreter.h.

Memory management for TF Lite. This provides a few C++ helpers that are useful for manipulating C structures in C++.
Main abstraction controlling the tflite interpreter. Do NOT include this file directly; instead include third_party/tensorflow/lite/interpreter.h. See third_party/tensorflow/lite/c/common.h for the API for defining operations (TfLiteRegistration).
Provides functionality to construct an interpreter for a model.
WARNING: Users of TensorFlow Lite should not include this file directly, but should instead include "third_party/tensorflow/lite/interpreter_builder.h". Only the TensorFlow Lite implementation itself should include this file directly.
Deserialization infrastructure for tflite. Provides functionality to go from a serialized tflite model in flatbuffer format to an in-memory representation of the model.
WARNING: Users of TensorFlow Lite should not include this file directly, but should instead include "third_party/tensorflow/lite/model_builder.h". Only the TensorFlow Lite implementation itself should include this file directly.
Typedefs

| Name | Definition |
|---|---|
| `FlatBufferModel` | `using impl::FlatBufferModel` |
| `Interpreter` | `typedef ::tflite::impl::Interpreter` An interpreter for a graph of nodes that input and output from tensors. |
| `InterpreterBuilder` | `using impl::InterpreterBuilder` Build an interpreter capable of interpreting `model`. |
Functions

| Name | Return type |
|---|---|
| `DefaultErrorReporter()` | `ErrorReporter *` |
| `GetRegistrationFromOpCode(const OperatorCode *opcode, const OpResolver & op_resolver, ErrorReporter *error_reporter, const TfLiteRegistration **registration)` | `TfLiteStatus` |
Classes

| Name | Description |
|---|---|
| `tflite::Allocation` | A memory allocation handle. This could be a mmap or shared memory. |
| `tflite::ErrorReporter` | A functor that reports error to supporting system. |
| `tflite::FileCopyAllocation` | |
| `tflite::MMAPAllocation` | Note that not all platforms support MMAP-based allocation. |
| `tflite::MemoryAllocation` | |
| `tflite::MutableOpResolver` | An OpResolver that is mutable, also used as the op in gen_op_registration. |
| `tflite::OpResolver` | Abstract interface that returns TfLiteRegistrations given op codes or custom op names. |
| `tflite::TfLiteIntArrayView` | Provides a range iterable wrapper for TfLiteIntArray* (C lists) that TfLite C api uses. |
Structs

| Name | Description |
|---|---|
| `tflite::StderrReporter` | |
Namespaces

| Name | Description |
|---|---|
| `tflite::impl` | An RAII object that represents a read-only tflite model, copied from disk, or mmapped. |
| `tflite::ops` | |
Typedefs
FlatBufferModel
using FlatBufferModel = impl::FlatBufferModel;
Interpreter
typedef ::tflite::impl::Interpreter Interpreter;
An interpreter for a graph of nodes that input and output from tensors.
Each node of the graph processes a set of input tensors and produces a set of output Tensors. All inputs/output tensors are referenced by index.
Usage:

// Create model from file. Note that the model instance must outlive the
// interpreter instance.
auto model = tflite::FlatBufferModel::BuildFromFile(...);
if (model == nullptr) {
  // Return error.
}

// Create an Interpreter with an InterpreterBuilder.
std::unique_ptr<tflite::Interpreter> interpreter;
tflite::ops::builtin::BuiltinOpResolver resolver;
if (InterpreterBuilder(*model, resolver)(&interpreter) != kTfLiteOk) {
  // Return failure.
}
if (interpreter->AllocateTensors() != kTfLiteOk) {
  // Return failure.
}

auto input = interpreter->typed_tensor<float>(0);
for (int i = 0; i < input_size; i++) {
  input[i] = ...;
}
interpreter->Invoke();
Note: For nearly all practical use cases, one should not directly construct an Interpreter object, but rather use the InterpreterBuilder.
Warning: This class is not thread-safe. The client is responsible for ensuring serialized interaction to avoid data races and undefined behavior.