# Supporting multiple frameworks with TFLite

The machine learning (ML) models you use with LiteRT can be trained using JAX, PyTorch, or TensorFlow and then converted to the TFLite flatbuffer format.

See the following pages for more details:
[[["Facile da capire","easyToUnderstand","thumb-up"],["Il problema è stato risolto","solvedMyProblem","thumb-up"],["Altra","otherUp","thumb-up"]],[["Mancano le informazioni di cui ho bisogno","missingTheInformationINeed","thumb-down"],["Troppo complicato/troppi passaggi","tooComplicatedTooManySteps","thumb-down"],["Obsoleti","outOfDate","thumb-down"],["Problema di traduzione","translationIssue","thumb-down"],["Problema relativo a esempi/codice","samplesCodeIssue","thumb-down"],["Altra","otherDown","thumb-down"]],["Ultimo aggiornamento 2025-07-24 UTC."],[],[],null,["# Supporting multiple frameworks with TFLite\n\nThe machine learning (ML) models you use with LiteRT can be trained\nusing JAX, PyTorch or TensorFlow and then converted to a TFLite flatbuffer\nformat.\n\nSee the following pages for more details:\n\n- [Converting from JAX](/edge/litert/models/convert_jax)\n- [Converting from PyTorch](/edge/litert/models/convert_pytorch)\n- [Converting from TensorFlow](/edge/litert/models/convert_tf)\n\nAn overview of the TFLite Converter which is an important component of\nsupporting different frameworks with TFLite is on [Model conversion\noverview](/edge/litert/models/convert)."]]