# Integrate BERT natural language classifier

*Last updated: 2025-07-24 (UTC).*

The Task Library `BertNLClassifier` API is very similar to the `NLClassifier` that classifies input text into different categories, except that this API is specially tailored for BERT-related models that require Wordpiece and Sentencepiece tokenizations outside the TFLite model.

Key features of the BertNLClassifier API
----------------------------------------

- Takes a single string as input, performs classification with the string, and outputs label/score pairs as classification results.

- Performs out-of-graph [Wordpiece](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/cc/text/tokenizers/bert_tokenizer.h) or [Sentencepiece](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/cc/text/tokenizers/sentencepiece_tokenizer.h) tokenization on input text.

Supported BertNLClassifier models
---------------------------------

The following models are compatible with the `BertNLClassifier` API.

- BERT models created by [TensorFlow Lite Model Maker for text classification](../modify/text_classification).

- Custom models that meet the [model compatibility requirements](#model-compatibility-requirements).
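The out-of-graph tokenization mentioned above is the main difference from the plain `NLClassifier`. As a rough illustration of what Wordpiece tokenization does, here is a toy greedy longest-match-first sketch in plain Python; the vocabulary and the `##` continuation-piece convention are illustrative only, not the production tokenizer:

```python
# Toy greedy longest-match-first Wordpiece tokenizer (illustrative only).
def wordpiece_tokenize(word, vocab):
    """Split a single lowercase word into Wordpiece subword tokens."""
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        piece = None
        # Try the longest remaining substring first, shrinking until a vocab hit.
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate  # continuation-piece marker
            if candidate in vocab:
                piece = candidate
                break
            end -= 1
        if piece is None:
            return ["[UNK]"]  # no subword of the remainder is in the vocabulary
        tokens.append(piece)
        start = end
    return tokens

vocab = {"charm", "##ing", "journey", "[UNK]"}
print(wordpiece_tokenize("charming", vocab))  # ['charm', '##ing']
print(wordpiece_tokenize("journey", vocab))   # ['journey']
```

The real tokenizers linked above additionally handle casing, punctuation splitting, and the Sentencepiece model format; this sketch only shows the subword-matching idea.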
Run inference in Java
---------------------

### Step 1: Import Gradle dependency and other settings

Copy the `.tflite` model file to the assets directory of the Android module where the model will be run. Specify that the file should not be compressed, and add the TensorFlow Lite library to the module's `build.gradle` file:

    android {
        // Other settings

        // Specify that tflite files should not be compressed for the app apk
        aaptOptions {
            noCompress "tflite"
        }
    }

    dependencies {
        // Other dependencies

        // Import the Task Text Library dependency
        implementation 'org.tensorflow:tensorflow-lite-task-text:0.4.4'
    }

| **Note:** Starting from version 4.1 of the Android Gradle plugin, `.tflite` is added to the `noCompress` list by default, and the `aaptOptions` block above is no longer needed.

### Step 2: Run inference using the API

    // Initialization
    BertNLClassifierOptions options =
        BertNLClassifierOptions.builder()
            .setBaseOptions(BaseOptions.builder().setNumThreads(4).build())
            .build();
    BertNLClassifier classifier =
        BertNLClassifier.createFromFileAndOptions(context, modelFile, options);

    // Run inference
    List<Category> results = classifier.classify(input);

See the [source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/java/src/java/org/tensorflow/lite/task/text/nlclassifier/BertNLClassifier.java) for more details.

Run inference in Swift
----------------------

### Step 1: Import CocoaPods

Add the TensorFlowLiteTaskText pod to your Podfile:

    target 'MySwiftAppWithTaskAPI' do
      use_frameworks!
      pod 'TensorFlowLiteTaskText', '~> 0.4.4'
    end

### Step 2: Run inference using the API

    // Initialization
    let bertNLClassifier = TFLBertNLClassifier.bertNLClassifier(
        modelPath: bertModelPath)

    // Run inference
    let categories = bertNLClassifier.classify(text: input)

See the [source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/ios/task/text/nlclassifier/Sources/TFLBertNLClassifier.h) for more details.
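In each binding, `classify` returns a list of categories, each pairing a label with a score. A minimal sketch of picking the top prediction, using plain Python tuples as hypothetical stand-ins for the `Category` objects:

```python
# Hypothetical (label, score) pairs standing in for Task Library Category results.
results = [("negative", 0.00006), ("positive", 0.99994)]

# The list is not assumed sorted; take the category with the highest score.
top_label, top_score = max(results, key=lambda pair: pair[1])
print(top_label)  # positive
```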
Run inference in C++
--------------------

    // Initialization
    BertNLClassifierOptions options;
    options.mutable_base_options()->mutable_model_file()->set_file_name(model_path);
    std::unique_ptr<BertNLClassifier> classifier =
        BertNLClassifier::CreateFromOptions(options).value();

    // Run inference with your input, `input_text`.
    std::vector<core::Category> categories = classifier->Classify(input_text);

See the [source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/cc/task/text/bert_nl_classifier.h) for more details.

Run inference in Python
-----------------------

### Step 1: Install the pip package

    pip install tflite-support

### Step 2: Using the model

    # Imports
    from tflite_support.task import text

    # Initialization
    classifier = text.BertNLClassifier.create_from_file(model_path)

    # Run inference
    text_classification_result = classifier.classify(text)

See the [source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/python/task/text/bert_nl_classifier.py) for more options to configure `BertNLClassifier`.

Example results
---------------

Here is an example of the classification results of movie reviews using the [MobileBert](../modify/text_classification) model from Model Maker.

Input: "it's a charming and often affecting journey"

Output:

    category[0]: 'negative' : '0.00006'
    category[1]: 'positive' : '0.99994'

Try out the simple [CLI demo tool for BertNLClassifier](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/examples/task/text/desktop/README.md#bertnlclassifier) with your own model and test data.

Model compatibility requirements
--------------------------------

The `BertNLClassifier` API expects a TFLite model with mandatory [TFLite Model Metadata](../../models/metadata.md).

The Metadata should meet the following requirements:

- input_process_units for the Wordpiece/Sentencepiece tokenizer

- 3 input tensors with names "ids", "mask" and "segment_ids" for the output of the tokenizer

- 1 output tensor of type float32, with an optionally attached label file. If a label file is attached, it should be a plain text file with one label per line, and the number of labels should match the number of categories the model outputs.
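The label-file requirement above is easy to check mechanically before packaging a model. A small sketch in plain Python; `num_model_categories` is an assumed value you would obtain from your model's output tensor shape:

```python
def check_label_file(label_text, num_model_categories):
    """Validate 'one label per line' and that the count matches model outputs."""
    labels = [line.strip() for line in label_text.splitlines() if line.strip()]
    if len(labels) != num_model_categories:
        raise ValueError(
            f"label file has {len(labels)} labels, "
            f"but the model outputs {num_model_categories} categories")
    return labels

# A two-category sentiment model, as in the example results above.
print(check_label_file("negative\npositive\n", 2))  # ['negative', 'positive']
```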