A Flutter plugin to use the Firebase ML Kit.
⭐ Only your stars motivate me! ⭐
Note: This plugin is still under development, and some APIs might not be available yet. Feedback and Pull Requests are most welcome!
| Feature | Android | iOS |
|---|---|---|
| Recognize text (on device) | ✅ | ✅ |
| Recognize text (cloud) | not yet | not yet |
| Detect faces (on device) | ✅ | ✅ |
| Scan barcodes (on device) | ✅ | ✅ |
| Label images (on device) | ✅ | ✅ |
| Label images (cloud) | not yet | not yet |
| Recognize landmarks (cloud) | not yet | not yet |
| Custom model | not yet | not yet |
See the Firebase ML Kit documentation for details on which features run on device and which run in the cloud.
To use this plugin, add `mlkit` as a dependency in your `pubspec.yaml` file.
Check out the example directory for a sample app using Firebase ML Kit.
To integrate your plugin into the Android part of your app, follow these steps:
- Using the Firebase Console, add an Android app to your project: follow the assistant, download the generated `google-services.json` file and place it inside `android/app`. Next, modify the `android/build.gradle` file and the `android/app/build.gradle` file to add the Google services plugin as described by the Firebase assistant.
To integrate your plugin into the iOS part of your app, follow these steps:
- Using the Firebase Console, add an iOS app to your project: follow the assistant, download the generated `GoogleService-Info.plist` file, open `ios/Runner.xcworkspace` with Xcode, and within Xcode place the file inside `ios/Runner`. Don't follow the steps named "Add Firebase SDK" and "Add initialization code" in the Firebase assistant.
- Remove the `use_frameworks!` line from `ios/Podfile` (workaround for flutter/flutter#9694).
From your Dart code, you need to import the plugin and instantiate it:
```dart
import 'package:mlkit/mlkit.dart';

FirebaseVisionTextDetector detector = FirebaseVisionTextDetector.instance;

// Detect from a file/image by path
var textsFromPath = await detector.detectFromPath(_file?.path);

// Detect from the binary data of a file/image
var textsFromBinary = await detector.detectFromBinary(_file?.readAsBytesSync());
```
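The sketch below shows one way the results might be consumed. It is a minimal example that assumes `detectFromPath` resolves to a list of `VisionText` objects exposing the recognized string via a `text` field, and that the image `File` comes from elsewhere in your app (for example, an image picker):

```dart
import 'dart:io';

import 'package:mlkit/mlkit.dart';

// Prints every piece of text recognized in the given image file.
// Assumes detectFromPath returns a List<VisionText> whose elements
// expose the recognized string through a `text` field.
Future<void> printRecognizedText(File imageFile) async {
  final detector = FirebaseVisionTextDetector.instance;
  final texts = await detector.detectFromPath(imageFile.path);
  for (final visionText in texts) {
    print(visionText.text);
  }
}
```

The other detectors (faces, barcodes, image labels) are expected to follow the same pattern: obtain the singleton instance and call `detectFromPath` or `detectFromBinary` on it.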