The wrapper class for a TFLite model and a TFLite interpreter.
Note: A Model can only hold one TFLite model at a time, and it always holds a TFLite
interpreter instance to run it.
Nested Classes
| Type | Name | Description |
|---|---|---|
| class | Model.Builder | This class is deprecated. Please use Model.createModel(Context, String, Options). |
| enum | Model.Device | The runtime device type used for executing classification. |
| class | Model.Options | Options for running the model. |
Public Methods
| Return type | Method | Description |
|---|---|---|
| void | close() | |
| static Model | createModel(Context context, String modelPath, Model.Options options) | Loads a model from assets and initializes the TFLite interpreter with the given options. |
| static Model | createModel(Context context, String modelPath) | Loads a model from assets and initializes the TFLite interpreter. |
| MappedByteBuffer | getData() | Returns the memory-mapped model data. |
| Tensor | getInputTensor(int inputIndex) | Gets the Tensor associated with the provided input index. |
| Tensor | getOutputTensor(int outputIndex) | Gets the Tensor associated with the provided output index. |
| int[] | getOutputTensorShape(int outputIndex) | Returns the output shape. |
| String | getPath() | Returns the path of the model file stored in Assets. |
| void | run(Object[] inputs, Map<Integer, Object> outputs) | Runs model inference on multiple inputs, and returns multiple outputs. |
Public Methods
public void close ()
public static Model createModel (Context context, String modelPath, Model.Options options)
Loads a model from assets and initializes the TFLite interpreter with the given options.
Parameters
| Parameter | Description |
|---|---|
| context | The app context. |
| modelPath | The path of the model file. |
| options | The options for running the model. |
Throws
| Exception | Condition |
|---|---|
| IOException | if any exception occurs when opening the model file. |
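For illustration, here is a minimal sketch of loading a model with explicit options. It assumes the Model.Options.Builder from the TFLite Support Library with setDevice and setNumThreads, and uses a hypothetical asset name (mobilenet_v1.tflite):

```java
import android.content.Context;
import java.io.IOException;
import org.tensorflow.lite.support.model.Model;

final class ModelWithOptions {
  // Hypothetical asset name; replace with the model bundled in your app's assets.
  private static final String MODEL_PATH = "mobilenet_v1.tflite";

  // Loads the model on the NNAPI delegate with 4 interpreter threads.
  static Model load(Context context) throws IOException {
    Model.Options options =
        new Model.Options.Builder()
            .setDevice(Model.Device.NNAPI)
            .setNumThreads(4)
            .build();
    return Model.createModel(context, MODEL_PATH, options);
  }
}
```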
public static Model createModel (Context context, String modelPath)
Loads a model from assets and initializes the TFLite interpreter.
The default options are: (1) CPU device; (2) one thread.
Parameters
| Parameter | Description |
|---|---|
| context | The app context. |
| modelPath | The path of the model file. |
Throws
| Exception | Condition |
|---|---|
| IOException | if any exception occurs when opening the model file. |
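A sketch of the default path (CPU, one thread), again with a hypothetical asset name; the only checked exception to handle is IOException:

```java
import android.content.Context;
import java.io.IOException;
import org.tensorflow.lite.support.model.Model;

final class ModelWithDefaults {
  // Loads the model with default options: CPU device, one thread.
  static Model load(Context context) {
    try {
      return Model.createModel(context, "mobilenet_v1.tflite"); // hypothetical asset name
    } catch (IOException e) {
      throw new IllegalStateException("Could not open the model file from assets.", e);
    }
  }
}
```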
public Tensor getInputTensor (int inputIndex)
Gets the Tensor associated with the provided input index.
Parameters
| Parameter | Description |
|---|---|
| inputIndex | The index of the input tensor. |
Throws
| Exception | Condition |
|---|---|
| IllegalStateException | if the interpreter is closed. |
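As a usage sketch, the returned Tensor can be inspected before preparing input buffers; shape() and dataType() below come from org.tensorflow.lite.Tensor, and the index 0 assumes a single-input model:

```java
import java.util.Arrays;
import org.tensorflow.lite.DataType;
import org.tensorflow.lite.Tensor;
import org.tensorflow.lite.support.model.Model;

final class InputInspector {
  // Prints the shape and data type of the first input tensor; assumes `model` is still open.
  static void describeFirstInput(Model model) {
    Tensor input = model.getInputTensor(0);
    int[] shape = input.shape();        // e.g. [1, 224, 224, 3] for an image classifier
    DataType type = input.dataType();   // e.g. DataType.FLOAT32 or DataType.UINT8
    System.out.println("input 0: shape=" + Arrays.toString(shape) + ", type=" + type);
  }
}
```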
public Tensor getOutputTensor (int outputIndex)
Gets the Tensor associated with the provided output index.
Parameters
| Parameter | Description |
|---|---|
| outputIndex | The index of the output tensor. |
Throws
| Exception | Condition |
|---|---|
| IllegalStateException | if the interpreter is closed. |
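Similarly, a hedged sketch that checks the first output's data type, which is useful for deciding whether to collect results into a float-based or byte-based container before calling run(Object[], Map):

```java
import org.tensorflow.lite.DataType;
import org.tensorflow.lite.Tensor;
import org.tensorflow.lite.support.model.Model;

final class OutputInspector {
  // Returns true when output 0 is quantized (UINT8), false for float models.
  static boolean isFirstOutputQuantized(Model model) {
    Tensor output = model.getOutputTensor(0);
    return output.dataType() == DataType.UINT8;
  }
}
```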
public int[] getOutputTensorShape (int outputIndex)
Returns the output shape. Useful if the output shape is only determined when the graph is created.
Parameters
| Parameter | Description |
|---|---|
| outputIndex | The index of the output tensor. |
Throws
| Exception | Condition |
|---|---|
| IllegalStateException | if the interpreter is closed. |
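A short sketch that sizes an output container from the reported shape; it assumes output 0 is a 2-D float tensor such as [1, numClasses] for a classifier:

```java
import org.tensorflow.lite.support.model.Model;

final class OutputAllocator {
  // Allocates a float[batch][numClasses] array sized from the model's reported output shape.
  static float[][] allocateFirstOutput(Model model) {
    int[] shape = model.getOutputTensorShape(0);  // e.g. [1, 1001]
    return new float[shape[0]][shape[1]];
  }
}
```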
public void run (Object[] inputs, Map<Integer, Object> outputs)
Runs model inference on multiple inputs, and returns multiple outputs.
Parameters
| Parameter | Description |
|---|---|
| inputs | An array of input data. The inputs should be in the same order as the inputs of the model. Each input can be an array or multidimensional array, or a ByteBuffer of primitive types including int, float, long, and byte. ByteBuffer is the preferred way to pass large input data, whereas string types require using the (multi-dimensional) array input path. When a ByteBuffer is used, its content should remain unchanged until model inference is done. |
| outputs | A map from output indices to multidimensional arrays of output data or ByteBuffers of primitive types including int, float, long, and byte. It only needs to keep entries for the outputs to be used. |
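Putting the pieces together, an end-to-end sketch under stated assumptions: the asset name is hypothetical, the caller supplies a ByteBuffer already preprocessed to the model's expected size and type, and output 0 is a 2-D float tensor.

```java
import android.content.Context;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.HashMap;
import java.util.Map;
import org.tensorflow.lite.support.model.Model;

final class ClassifierSketch {
  static float[][] classify(Context context, ByteBuffer preprocessedInput) throws IOException {
    Model model = Model.createModel(context, "mobilenet_v1.tflite"); // hypothetical asset name
    try {
      // Size the output array from the reported shape, e.g. [1, 1001].
      int[] outShape = model.getOutputTensorShape(0);
      float[][] probabilities = new float[outShape[0]][outShape[1]];

      Object[] inputs = new Object[] {preprocessedInput};
      Map<Integer, Object> outputs = new HashMap<>();
      outputs.put(0, probabilities); // only the outputs you actually need must be present

      model.run(inputs, outputs);    // fills `probabilities` in place
      return probabilities;
    } finally {
      model.close();
    }
  }
}
```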