# Build LiteRT for ARM boards

Last updated (UTC): 2025-07-24.

This page describes how to build the LiteRT libraries for ARM-based
computers.

LiteRT supports two build systems, and the features supported by each build
system are not identical. Check the following table to pick the appropriate
build system.

| Feature | Bazel | CMake |
|---|---|---|
| Predefined toolchains | armhf, aarch64 | armel, armhf, aarch64 |
| Custom toolchains | harder to use | easy to use |
| [Select TF ops](../models/ops_select) | supported | not supported |
| [GPU delegate](../performance/gpu) | only available for Android | any platform that supports OpenCL |
| XNNPack | supported | supported |
| [Python Wheel](../build/cmake_pip) | supported | supported |
| [C API](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/c/README.md) | supported | [supported](./cmake#build_tensorflow_lite_c_library) |
| [C++ API](../inference#load_and_run_a_model_in_c) | supported for Bazel projects | supported for CMake projects |

Cross-compilation for ARM with CMake
------------------------------------

If you have a CMake project, or if you want to use a custom toolchain, CMake
is the better choice for cross-compilation.
See the dedicated
[Cross compilation LiteRT with CMake](./cmake_arm)
page for instructions.

Cross-compilation for ARM with Bazel
------------------------------------

If you have a Bazel project, or if you want to use TF ops, Bazel is the better
choice. You'll use the integrated
[ARM GCC 8.3 toolchains](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/tools/toolchains/embedded/arm-linux)
with Bazel to build an ARM32/64 shared library.

| Target Architecture | Bazel Configuration | Compatible Devices |
|---|---|---|
| armhf (ARM32) | --config=elinux_armhf | RPI3, RPI4 with 32-bit Raspberry Pi OS |
| AArch64 (ARM64) | --config=elinux_aarch64 | Coral, RPI4 with 64-bit Ubuntu |

| **Note:** The generated shared library requires glibc 2.28 or higher to run.

The following instructions have been tested on an Ubuntu 16.04.3 64-bit PC
(AMD64) and with the TensorFlow devel Docker image
[tensorflow/tensorflow:devel](https://hub.docker.com/r/tensorflow/tensorflow/tags/).

To cross-compile LiteRT with Bazel, follow these steps:

#### Step 1. Install Bazel

Bazel is the primary build system for TensorFlow. Install the latest version of
the [Bazel build system](https://bazel.build/versions/master/docs/install.html).

| **Note:** If you're using the TensorFlow Docker image, Bazel is already available.

#### Step 2. Clone the TensorFlow repository

    git clone https://github.com/tensorflow/tensorflow.git tensorflow_src

| **Note:** If you're using the TensorFlow Docker image, the repo is already provided in `/tensorflow_src/`.

#### Step 3. Build the ARM binary

##### C library

    bazel build --config=elinux_aarch64 -c opt //tensorflow/lite/c:libtensorflowlite_c.so

You can find the shared library at:
`bazel-bin/tensorflow/lite/c/libtensorflowlite_c.so`.

| **Note:** Use `elinux_armhf` for a [32-bit ARM hard-float](https://wiki.debian.org/ArmHardFloatPort) build.

Check the
[LiteRT C API](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/c/README.md)
page for details.

##### C++ library

    bazel build --config=elinux_aarch64 -c opt //tensorflow/lite:libtensorflowlite.so

You can find the shared library at:
`bazel-bin/tensorflow/lite/libtensorflowlite.so`.

Currently, there is no straightforward way to extract all the header files
needed, so you must include all header files in `tensorflow/lite/` from the
TensorFlow repository. Additionally, you will need header files from
FlatBuffers and Abseil.

##### Etc

You can also build other Bazel targets with the toolchain. Here are some useful
targets:

- //tensorflow/lite/tools/benchmark:benchmark_model
- //tensorflow/lite/examples/label_image:label_image
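The `--config` flag you pass to any of these builds depends on the target
board's architecture. As a minimal sketch (this helper function is
hypothetical and not part of the TensorFlow repository), you could map the
output of `uname -m` on the target board to the matching Bazel configuration:

```shell
#!/bin/sh
# bazel_config_for_arch: print the LiteRT Bazel --config flag for a target
# architecture string, as reported by `uname -m` on the board.
# Illustrative helper only; the mapping follows the table above.
bazel_config_for_arch() {
  case "$1" in
    aarch64)        echo "--config=elinux_aarch64" ;;  # Coral, RPI4 with 64-bit Ubuntu
    armv6l|armv7l)  echo "--config=elinux_armhf" ;;    # 32-bit Raspberry Pi OS
    *)              echo "unsupported architecture: $1" >&2; return 1 ;;
  esac
}

# Example: compose a build command for a 64-bit board.
CONFIG="$(bazel_config_for_arch aarch64)"
echo "bazel build $CONFIG -c opt //tensorflow/lite/c:libtensorflowlite_c.so"
```

Run `uname -m` on the target device (not the build host) to get the
architecture string to pass in.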