com.google.ai.edge.litert

Interfaces

ModelProvider

An interface that supplies a model file and relevant information about it.

NpuAcceleratorProvider

An interface to provide the NPU libraries.

NpuCompatibilityChecker

An interface to check whether the device is compatible with the NPU.

Classes

BuiltinNpuAcceleratorProvider

An implementation of NpuAcceleratorProvider, which provides the NPU libraries without dynamic downloading.

CompiledModel

Class that represents a compiled LiteRT model.

CompiledModel.Options

Options to specify hardware acceleration for compiling a model.
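For orientation, a minimal sketch of compiling a bundled model with explicit Options follows; the asset name is a placeholder, and the exact create overload and the Accelerator value used should be verified against the class and enum references.

```kotlin
import android.content.Context
import com.google.ai.edge.litert.Accelerator
import com.google.ai.edge.litert.CompiledModel

// Sketch: compile a bundled .tflite asset for a chosen accelerator.
// "model.tflite" is a placeholder asset name.
fun compileModel(context: Context): CompiledModel =
  CompiledModel.create(
    context.assets,
    "model.tflite",
    CompiledModel.Options(Accelerator.CPU), // swap in another Accelerator value as needed
  )
```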

Environment

Environment that holds configuration options for the LiteRT runtime.

JniHandle

A base class for all Kotlin types that wrap a JNI handle.

Model

Model represents a LiteRT model file.

ModelSelector

ModelSelector dynamically selects a ModelProvider from a given set of providers.

TensorBuffer

TensorBuffer represents the raw memory where tensor data is stored.
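A sketch of the buffer workflow is shown below, assuming the buffer-creation helpers and float read/write methods documented on CompiledModel and TensorBuffer; buffers wrap native memory and should be closed when no longer needed.

```kotlin
import com.google.ai.edge.litert.CompiledModel

// Sketch: run inference through preallocated TensorBuffers and release them afterwards.
fun runOnce(model: CompiledModel, input: FloatArray): FloatArray {
  val inputBuffers = model.createInputBuffers()
  val outputBuffers = model.createOutputBuffers()
  try {
    inputBuffers[0].writeFloat(input)      // copy data into the first input tensor
    model.run(inputBuffers, outputBuffers) // synchronous invocation
    return outputBuffers[0].readFloat()    // copy the first output tensor out
  } finally {
    inputBuffers.forEach { it.close() }    // TensorBuffers wrap native memory
    outputBuffers.forEach { it.close() }
  }
}
```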

TensorBufferRequirements

Requirements for allocating a TensorBuffer.

Exceptions

LiteRtException

Exception for various LiteRT API errors.
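A hedged sketch of handling this exception during model compilation; the asset name and accelerator choice are placeholders, and the status code accompanying such an error is described by the Status enum.

```kotlin
import android.content.Context
import android.util.Log
import com.google.ai.edge.litert.Accelerator
import com.google.ai.edge.litert.CompiledModel
import com.google.ai.edge.litert.LiteRtException

// Sketch: compilation and inference calls throw LiteRtException on failure;
// see the Status enum for the codes such an error can carry.
fun compileOrNull(context: Context): CompiledModel? =
  try {
    CompiledModel.create(
      context.assets,
      "model.tflite", // placeholder asset name
      CompiledModel.Options(Accelerator.GPU),
    )
  } catch (e: LiteRtException) {
    Log.e("LiteRt", "Compilation failed: ${e.message}")
    null // e.g. fall back to a CPU-compiled model here
  }
```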

Enums

Accelerator

Hardware accelerators supported by LiteRT.

Environment.Option

Options configurable in the LiteRT environment.

ModelProvider.Type

A model file can be either an Android asset or a regular file.

Status

Status codes for LiteRtException.

TensorBufferType

The type of the tensor buffer.