@@ -40,16 +39,14 @@ Thanks!
The TensorFlow Lite Flutter plugin provides a flexible and fast solution for accessing the TensorFlow Lite interpreter and performing inference. The API is similar to the TFLite Java and Swift APIs. It binds directly to the TFLite C API, making it efficient (low-latency), and offers acceleration support via NNAPI and GPU delegates on Android, Metal and CoreML delegates on iOS, and the XNNPack delegate on desktop platforms.
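As a quick illustration of the acceleration support described above, the following minimal sketch configures a multi-threaded interpreter and attaches a GPU delegate on Android. The class and method names (`InterpreterOptions`, `GpuDelegateV2`, `Interpreter.fromAsset`) and the asset path are assumptions based on the 0.10.x plugin API; adapt them to the delegates available on your target platform.

```dart
import 'dart:io' show Platform;

import 'package:tflite_flutter/tflite_flutter.dart';

Future<Interpreter> createAcceleratedInterpreter() async {
  // Multi-threaded CPU execution by default.
  final options = InterpreterOptions()..threads = 4;

  // Attach a GPU delegate on Android (assumed class name from the plugin API);
  // Metal/CoreML delegates play the same role on iOS.
  if (Platform.isAndroid) {
    options.addDelegate(GpuDelegateV2());
  }

  // Illustrative asset path; see the setup sections below.
  return Interpreter.fromAsset('assets/your_model.tflite', options: options);
}
```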
-
## Key Features
-* Multi-platform Support for Android and iOS
-* Flexibility to use any TFLite Model.
-* Acceleration using multi-threading.
-* Similar structure as TensorFlow Lite Java API.
-* Inference speeds close to native Android Apps built using the Java API.
-* Run inference in different isolates to prevent jank in UI thread.
-
+- Multi-platform support for Android and iOS
+- Flexibility to use any TFLite model
+- Acceleration using multi-threading
+- Structure similar to the TensorFlow Lite Java API
+- Inference speeds close to native Android apps built with the Java API
+- Inference can be run in a separate isolate to prevent jank on the UI thread
+
## (Important) Initial setup: Add dynamic libraries to your app
@@ -133,7 +130,7 @@ install(
## TFLite Flutter Helper Library
-The helper library has been deprecated. New development underway for a replacement at https://github.com/google/flutter-mediapipe. Current timeline is to have wide support by the end of August, 2023.
+The helper library has been deprecated. New development is underway for a replacement at https://github.com/google/flutter-mediapipe. The current timeline is to have wide support by the end of August, 2023.
## Import
@@ -144,11 +141,12 @@ import 'package:tflite_flutter/tflite_flutter.dart';
## Usage instructions
### Import the libraries
+
In the dependency section of the `pubspec.yaml` file, add `tflite_flutter: ^0.10.1` (adjust the version to the latest release).
### Creating the Interpreter
-* **From asset**
+- **From asset**
Place `your_model.tflite` in the `assets` directory. Make sure to include the assets in `pubspec.yaml`.
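A minimal sketch of creating the interpreter from that asset is shown below; the exact asset-path convention can differ between plugin versions, so treat the path as an assumption:

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

Future<Interpreter> loadInterpreter() async {
  // The path assumes the model is declared under `assets/` in pubspec.yaml.
  return Interpreter.fromAsset('assets/your_model.tflite');
}
```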
@@ -160,9 +158,10 @@ Refer to the documentation for info on creating interpreter from buffer or file.
### Performing inference
-* **For single input and output**
+- **For single input and output**
Use `void run(Object input, Object output)`.
+
```dart
// For ex: if input tensor shape [1,5] and type is float32
var input = [[1.23, 6.54, 7.81, 3.21, 2.22]];
@@ -177,7 +176,7 @@ Refer to the documentation for info on creating interpreter from buffer or file.
print(output);
```
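Putting the snippet above together end to end, here is a sketch of the full single input/output flow; it assumes an `interpreter` created as in the previous section and an output tensor of shape `[1,2]` (both shapes are purely illustrative):

```dart
// Input tensor of shape [1,5], type float32 (illustrative values).
var input = [
  [1.23, 6.54, 7.81, 3.21, 2.22]
];

// Output buffer matching an assumed output tensor shape [1,2].
var output = List.generate(1, (_) => List.filled(2, 0.0));

// `run` performs inference and fills the output buffer in place.
interpreter.run(input, output);

print(output);
```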
-* **For multiple inputs and outputs**
+- **For multiple inputs and outputs**
Use `void runForMultipleInputs(List