Build LiteRT for Android

This document describes how to build the LiteRT Android library on your own. Normally, you do not need to build it locally; if you just want to use LiteRT, see the Android quickstart for details on using it in your Android projects.

Use Nightly Snapshots

To use nightly snapshots, add the following repo to your root Gradle build config.

allprojects {
    repositories {      // should be already there
        mavenCentral()  // should be already there
        maven {         // add this repo to use snapshots
          name 'ossrh-snapshot'
          url 'https://oss.sonatype.org/content/repositories/snapshots'
        }
    }
}

Then add the nightly snapshot dependencies to your module's build.gradle (edit as needed):

...
dependencies {
    ...
    implementation 'com.google.ai.edge.litert:litert:0.0.0-nightly-SNAPSHOT'
    implementation 'com.google.ai.edge.litert:litert-gpu:0.0.0-nightly-SNAPSHOT'
    implementation 'com.google.ai.edge.litert:litert-support:0.0.0-nightly-SNAPSHOT'
    ...
}
...

Build LiteRT locally

In some cases, you might wish to use a local build of LiteRT. For example, you may be building a custom binary that includes operations selected from TensorFlow, or you may wish to make local changes to LiteRT.

Set up build environment using Docker

  • Download the Docker file. By downloading the Docker file, you agree that the following terms of service govern your use thereof:

By clicking to accept, you hereby agree that all use of the Android Studio and Android Native Development Kit will be governed by the Android Software Development Kit License Agreement available at https://developer.android.com/studio/terms (such URL may be updated or changed by Google from time to time).

You must acknowledge the terms of service to download the file.

  • You can optionally change the Android SDK or NDK version. Put the downloaded Docker file in an empty folder and build your docker image by running:
docker build . -t tflite-builder -f tflite-android.Dockerfile
  • Start the docker container interactively by mounting your current folder to /host_dir inside the container (note that /tensorflow_src is the TensorFlow repository inside the container):
docker run -it -v $PWD:/host_dir tflite-builder bash

If you use PowerShell on Windows, replace "$PWD" with "pwd".

If you would like to use a TensorFlow repository on the host, mount that host directory instead (-v hostDir:/host_dir).
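
For example, if your TensorFlow checkout on the host lives at /path/to/tensorflow (an illustrative path), the mount might look like this:

docker run -it -v /path/to/tensorflow:/host_dir tflite-builder bash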

  • Once you are inside the container, you can run the following to download additional Android tools and libraries (note that you may need to accept the license):
sdkmanager \
  "build-tools;${ANDROID_BUILD_TOOLS_VERSION}" \
  "platform-tools" \
  "platforms;android-${ANDROID_API_LEVEL}"

Now you should proceed to the Configure WORKSPACE and .bazelrc section to configure the build settings.

After you finish building the libraries, you can copy them to /host_dir inside the container so that you can access them on the host.
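
For example, assuming you built the tensorflow-lite AAR target described later in this document, the copy might look like this:

cp bazel-bin/tensorflow/lite/java/tensorflow-lite.aar /host_dir/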

Set up build environment without Docker

Install Bazel and Android Prerequisites

Bazel is the primary build system for TensorFlow. To build with Bazel, you must have it, the Android NDK, and the Android SDK installed on your system.

  1. Install the latest version of the Bazel build system.
  2. The Android NDK is required to build the native (C/C++) LiteRT code. The current recommended version is r25b, which is available from the Android NDK downloads page.
  3. The Android SDK and build tools are available from the Android developer site, or alternatively as part of Android Studio. Build tools API >= 23 is the recommended version for building LiteRT.

Configure WORKSPACE and .bazelrc

This is a one-time configuration step that is required to build the LiteRT libraries. Run the ./configure script in the root TensorFlow checkout directory, and answer "Yes" when the script asks to interactively configure the ./WORKSPACE for Android builds. The script will attempt to configure settings using the following environment variables:

  • ANDROID_SDK_HOME
  • ANDROID_SDK_API_LEVEL
  • ANDROID_NDK_HOME
  • ANDROID_NDK_API_LEVEL
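
For example, you might export these before running ./configure; the paths and API levels below are illustrative and should match your own installation:

export ANDROID_SDK_HOME=/usr/local/android/android-sdk-linux
export ANDROID_SDK_API_LEVEL=30
export ANDROID_NDK_HOME=/usr/local/android/android-ndk-r25b
export ANDROID_NDK_API_LEVEL=21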

If these variables aren't set, they must be provided interactively in the script prompt. Successful configuration should yield entries similar to the following in the .tf_configure.bazelrc file in the root folder:

build --action_env ANDROID_NDK_HOME="/usr/local/android/android-ndk-r25b"
build --action_env ANDROID_NDK_API_LEVEL="21"
build --action_env ANDROID_BUILD_TOOLS_VERSION="30.0.3"
build --action_env ANDROID_SDK_API_LEVEL="30"
build --action_env ANDROID_SDK_HOME="/usr/local/android/android-sdk-linux"

Build and install

Once Bazel is properly configured, you can build the LiteRT AAR from the root checkout directory as follows:

bazel build -c opt --cxxopt=--std=c++17 --config=android_arm64 \
  --fat_apk_cpu=x86,x86_64,arm64-v8a,armeabi-v7a \
  --define=android_dexmerger_tool=d8_dexmerger \
  --define=android_incremental_dexing_tool=d8_dexbuilder \
  //tensorflow/lite/java:tensorflow-lite

This will generate an AAR file in bazel-bin/tensorflow/lite/java/. Note that this builds a "fat" AAR with several different architectures; if you don't need all of them, use the subset appropriate for your deployment environment.
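
Because an AAR is a zip archive, you can check which architectures a given build actually contains by listing its native libraries, for example:

unzip -l bazel-bin/tensorflow/lite/java/tensorflow-lite.aar | grep '\.so$'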

You can build smaller AAR files targeting only a set of models as follows:

bash tensorflow/lite/tools/build_aar.sh \
  --input_models=model1,model2 \
  --target_archs=x86,x86_64,arm64-v8a,armeabi-v7a

The script above will generate the tensorflow-lite.aar file, and optionally the tensorflow-lite-select-tf-ops.aar file if one of the models uses TensorFlow ops. For more details, please see the Reduce LiteRT binary size section.
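
For instance, a build restricted to one model and one architecture might look like this (the model path is hypothetical):

bash tensorflow/lite/tools/build_aar.sh \
  --input_models=/host_dir/my_model.tflite \
  --target_archs=arm64-v8a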

Add AAR directly to project

Move the tensorflow-lite.aar file into a directory called libs in your project. Modify your app's build.gradle file to reference the new directory and replace the existing LiteRT dependency with the new local library, e.g.:

allprojects {
    repositories {
        mavenCentral()
        maven {  // Only for snapshot artifacts
            name 'ossrh-snapshot'
            url 'https://oss.sonatype.org/content/repositories/snapshots'
        }
        flatDir {
            dirs 'libs'
        }
    }
}

dependencies {
    implementation(name: 'tensorflow-lite', ext: 'aar')
}
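
For example, assuming your application module is named app (adjust for your project layout), moving the AAR into place might look like this:

mkdir -p app/libs
cp bazel-bin/tensorflow/lite/java/tensorflow-lite.aar app/libs/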

Install AAR to local Maven repository

Execute the following command from your root checkout directory:

mvn install:install-file \
  -Dfile=bazel-bin/tensorflow/lite/java/tensorflow-lite.aar \
  -DgroupId=org.tensorflow \
  -DartifactId=tensorflow-lite -Dversion=0.1.100 -Dpackaging=aar
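
If the install succeeds, the artifact lands in your local Maven repository, which you can verify with something like:

ls ~/.m2/repository/org/tensorflow/tensorflow-lite/0.1.100/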

In your app's build.gradle, ensure you have the mavenLocal() repository and replace the standard LiteRT dependency with your locally installed version:

allprojects {
    repositories {
        mavenCentral()
        maven {  // Only for snapshot artifacts
            name 'ossrh-snapshot'
            url 'https://oss.sonatype.org/content/repositories/snapshots'
        }
        mavenLocal()
    }
}

dependencies {
    implementation 'org.tensorflow:tensorflow-lite:0.1.100'
}

Note that the 0.1.100 version here is purely for the sake of testing/development. With the local AAR installed, you can use the standard LiteRT Java inference APIs in your app code.