# Convert TensorFlow models

*Last updated 2025-07-24 UTC*

This page describes how to convert a TensorFlow model
to a LiteRT model (an optimized
[FlatBuffer](https://google.github.io/flatbuffers/) format identified by the
`.tflite` file extension) using the LiteRT converter.

| **Note:** This guide assumes you've both [installed TensorFlow 2.x](https://www.tensorflow.org/install/pip#tensorflow-2-packages-are-available) and trained models in TensorFlow 2.x. If your model is trained in TensorFlow 1.x, consider [migrating to TensorFlow 2.x](https://www.tensorflow.org/guide/migrate/tflite). To identify the installed TensorFlow version, run `print(tf.__version__)`.

Conversion workflow
-------------------

The diagram below illustrates the high-level workflow for converting
your model:

**Figure 1.** Converter workflow.

You can convert your model using one of the following options:

1. [Python API](#python_api) (***recommended***): This allows you to integrate the conversion into your development pipeline, apply optimizations, add metadata, and perform many other tasks that simplify the conversion process.
2. [Command line](#cmdline): This only supports basic model conversion.

| **Note:** In case you encounter any issues during model conversion, create a [GitHub issue](https://github.com/tensorflow/tensorflow/issues/new?template=60-tflite-converter-issue.md).

Python API
----------

*Helper code: To learn more about the LiteRT converter
API, run `print(help(tf.lite.TFLiteConverter))`.*

Convert a TensorFlow model using
[`tf.lite.TFLiteConverter`](../../api/tflite/python/tf/lite/TFLiteConverter).
A TensorFlow model is stored using the SavedModel format and is
generated either using the high-level `tf.keras.*` APIs (a Keras model) or
the low-level `tf.*` APIs (from which you generate concrete functions). As a
result, you have the following three options (examples are in the next few
sections):

- [`tf.lite.TFLiteConverter.from_saved_model()`](../../api/tflite/python/tf/lite/TFLiteConverter#from_saved_model) (**recommended**): Converts a [SavedModel](https://www.tensorflow.org/guide/saved_model).
- [`tf.lite.TFLiteConverter.from_keras_model()`](../../api/tflite/python/tf/lite/TFLiteConverter#from_keras_model): Converts a [Keras](https://www.tensorflow.org/guide/keras/overview) model.
- [`tf.lite.TFLiteConverter.from_concrete_functions()`](../../api/tflite/python/tf/lite/TFLiteConverter#from_concrete_functions): Converts [concrete functions](https://www.tensorflow.org/guide/intro_to_graphs).

### Convert a SavedModel (recommended)

The following example shows how to convert a
[SavedModel](https://www.tensorflow.org/guide/saved_model) into a TensorFlow
Lite model.
    import tensorflow as tf

    # Convert the model
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)  # path to the SavedModel directory
    tflite_model = converter.convert()

    # Save the model.
    with open('model.tflite', 'wb') as f:
      f.write(tflite_model)

### Convert a Keras model

The following example shows how to convert a
[Keras](https://www.tensorflow.org/guide/keras/overview) model into a TensorFlow
Lite model.

    import tensorflow as tf

    # Create a model using high-level tf.keras.* APIs
    model = tf.keras.models.Sequential([
        tf.keras.layers.Dense(units=1, input_shape=[1]),
        tf.keras.layers.Dense(units=16, activation='relu'),
        tf.keras.layers.Dense(units=1)
    ])
    model.compile(optimizer='sgd', loss='mean_squared_error')  # compile the model
    model.fit(x=[-1, 0, 1], y=[-3, -1, 1], epochs=5)  # train the model
    # (to generate a SavedModel) tf.saved_model.save(model, "saved_model_keras_dir")

    # Convert the model.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    tflite_model = converter.convert()

    # Save the model.
    with open('model.tflite', 'wb') as f:
      f.write(tflite_model)

### Convert concrete functions

The following example shows how to convert
[concrete functions](https://www.tensorflow.org/guide/intro_to_graphs) into a
LiteRT model.
    import tensorflow as tf

    # Create a model using low-level tf.* APIs
    class Squared(tf.Module):
      @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
      def __call__(self, x):
        return tf.square(x)

    model = Squared()
    # (to run your model) result = model(tf.constant([5.0]))  # result is [25.0]
    # (to generate a SavedModel) tf.saved_model.save(model, "saved_model_tf_dir")
    concrete_func = model.__call__.get_concrete_function()

    # Convert the model.
    converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func],
                                                                model)
    tflite_model = converter.convert()

    # Save the model.
    with open('model.tflite', 'wb') as f:
      f.write(tflite_model)

### Other features

- Apply [optimizations](./model_optimization). A commonly used
  optimization is
  [post-training quantization](./post_training_quantization),
  which can further reduce your model's latency and size with minimal loss in
  accuracy.

- Add [metadata](./metadata), which makes it easier to create platform-specific
  wrapper code when deploying models on devices.

### Conversion errors

The following are common conversion errors and their solutions:

- Error: `Some ops are not supported by the native TFLite runtime, you can
  enable TF kernels fallback using TF Select.`

  Solution: The error occurs because your model has TF ops that don't have a
  corresponding TFLite implementation. You can resolve this by
  [using the TF op in the TFLite model](./ops_select)
  (recommended).
  If you want to generate a model with TFLite ops only, you can either add a
  request for the missing TFLite op in
  [GitHub issue #21526](https://github.com/tensorflow/tensorflow/issues/21526)
  (leave a comment if your request hasn't already been mentioned) or
  [create the TFLite op](./ops_custom#create_and_register_the_operator)
  yourself.

- Error: `.. is neither a custom op nor a flex op`

  Solution: If this TF op is:

  - Supported in TF: The error occurs because the TF op is missing from the
    [allowlist](./op_select_allowlist) (an exhaustive list of
    TF ops supported by TFLite). You can resolve this as follows:

    1. [Add missing ops to the allowlist](./op_select_allowlist#add_tensorflow_core_operators_to_the_allowed_list).
    2. [Convert the TF model to a TFLite model and run inference](./ops_select).

  - Unsupported in TF: The error occurs because TFLite is unaware of the
    custom TF operator you defined. You can resolve this as follows:

    1. [Create the TF op](https://www.tensorflow.org/guide/create_op).
    2. [Convert the TF model to a TFLite model](./op_select_allowlist#users_defined_operators).
    3. [Create the TFLite op](./ops_custom#create_and_register_the_operator) and run inference by linking it to the TFLite runtime.

Command Line Tool
-----------------

| **Note:** It is highly recommended that you use the [Python API](#python_api) listed above instead, if possible.

If you've
[installed TensorFlow 2.x from pip](https://www.tensorflow.org/install/pip), use
the `tflite_convert` command. To view all the available flags, use the
following command:

    $ tflite_convert --help

    `--output_file`. Type: string. Full path of the output file.
    `--saved_model_dir`. Type: string. Full path to the SavedModel directory.
    `--keras_model_file`. Type: string. Full path to the Keras H5 model file.
    `--enable_v1_converter`. Type: bool. (default False) Enables the converter and flags used in TF 1.x instead of TF 2.x.

You are required to provide the `--output_file` flag and either the
`--saved_model_dir` or `--keras_model_file` flag.

If you have the
[TensorFlow 2.x source](https://www.tensorflow.org/install/source)
downloaded and want to run the converter from that source without building and
installing the package,
you can replace `tflite_convert` with
`bazel run tensorflow/lite/python:tflite_convert --` in the command.

### Converting a SavedModel

    tflite_convert \
      --saved_model_dir=/tmp/mobilenet_saved_model \
      --output_file=/tmp/mobilenet.tflite

### Converting a Keras H5 model

    tflite_convert \
      --keras_model_file=/tmp/mobilenet_keras_model.h5 \
      --output_file=/tmp/mobilenet.tflite
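### Example: enabling post-training quantization

The post-training quantization mentioned under *Other features* is enabled by setting a flag on any of the converters shown earlier. The following is a minimal sketch of the dynamic-range variant; the tiny Keras model here is a hypothetical stand-in for your own model:

```python
import tensorflow as tf

# A tiny placeholder model (substitute your own Keras model or SavedModel).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=[1]),
    tf.keras.layers.Dense(units=1),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Optimize.DEFAULT enables the converter's default optimization set,
# which applies dynamic-range quantization to the model weights.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_quant_model = converter.convert()

# Save the quantized model.
with open('model_quant.tflite', 'wb') as f:
    f.write(tflite_quant_model)
```

For full-integer or float16 quantization, additional settings (such as a representative dataset) are required; see the post-training quantization guide linked above.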
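### Example: enabling TF Select

For the first conversion error listed above, the recommended fix (using TF ops in the TFLite model) amounts to enabling TF Select on the converter's target spec. A minimal sketch, again using a hypothetical placeholder model in place of one that actually contains unsupported ops:

```python
import tensorflow as tf

# Placeholder model; in practice this would be a model containing
# TF ops without a native TFLite implementation.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=[1]),
    tf.keras.layers.Dense(units=1),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Allow ops without a TFLite kernel to fall back to TensorFlow kernels.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # native TFLite ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # TF Select fallback
]
tflite_model = converter.convert()
```

Note that models converted this way require the Flex delegate at runtime, which increases binary size; see the [TF op](./ops_select) guide for the deployment details.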
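### Example: verifying a converted model

Whichever route you use, you can sanity-check the resulting model with the `tf.lite.Interpreter` bundled in TensorFlow before deploying it. This sketch converts the `Squared` module from the concrete-functions section and runs one inference on the result:

```python
import numpy as np
import tensorflow as tf

# Build and convert a small model in memory (the Squared module from above).
class Squared(tf.Module):
  @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
  def __call__(self, x):
    return tf.square(x)

model = Squared()
concrete_func = model.__call__.get_concrete_function()
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func],
                                                            model)
tflite_model = converter.convert()

# Load the converted model; model_path='model.tflite' works for a saved file.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# shape=[None] is a dynamic dimension, so fix the input size before invoking.
interpreter.resize_tensor_input(input_details['index'], [1])
interpreter.allocate_tensors()

interpreter.set_tensor(input_details['index'],
                       np.array([5.0], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(output_details['index'])  # result[0] == 25.0
```

Comparing the interpreter's output against the original TF model's output on the same inputs is a quick way to catch conversion problems early.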