Google’s ML Kit: Machine Learning For Mobile Made Easy

Deval Shah
Image Source: Google

MOUNTAIN VIEW, CALIF.—Google is launching “ML Kit,” a new machine learning SDK for its Firebase developer platform. The new SDK offers ready-to-use APIs for some of the most common computer-vision use cases, allowing developers who aren’t machine learning experts to still add some ML magic to their apps. This isn’t just an Android SDK; it works on iOS apps, too.

Typically, setting up a machine learning environment is a ton of work. You’d have to learn how to use a machine learning library like TensorFlow, acquire a ton of training data to teach your neural net to do something, and at the end of the day you’d need it to spit out a model light enough to run on a mobile device. ML Kit simplifies all of this by making certain machine learning features a simple API call on Google’s Firebase platform.
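To get a feel for what “an API call” means in practice, here is a minimal Kotlin sketch of on-device text recognition using the firebase-ml-vision Android library. The detector class names are from the launch-era SDK and shifted in later releases, so treat the exact identifiers as assumptions.

```kotlin
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Sketch: recognize text in a photo with a single ML Kit call,
// no model training or manual model management required.
fun recognizeText(bitmap: Bitmap) {
    // Wrap the photo or camera frame in an ML Kit image object.
    val image = FirebaseVisionImage.fromBitmap(bitmap)

    // Grab the on-device text detector (launch-era API name).
    val detector = FirebaseVision.getInstance().visionTextDetector

    detector.detectInImage(image)
        .addOnSuccessListener { result ->
            // Each block is a chunk of recognized text with its bounding box.
            result.blocks.forEach { block -> println(block.text) }
        }
        .addOnFailureListener { e -> println("Text recognition failed: $e") }
}
```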

The ML Kit section in the Firebase Console.

The new APIs support text recognition, face detection, barcode scanning, image labeling, and landmark recognition. There are two versions of each API: a cloud-based version that offers higher accuracy in exchange for using some data, and an on-device version that works even without an Internet connection. For photos, the local version of the API could identify a dog in a picture, while the more accurate cloud-based API could determine the specific dog breed. The local APIs are free, while the cloud-based APIs use the usual Firebase cloud API pricing.
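The on-device/cloud split shows up directly in the code. Below is a rough sketch of image labeling both ways, again using launch-era firebase-ml-vision class names (which may differ in later SDK versions); the `useCloud` flag and `labelImage` function are illustrative, not part of the SDK.

```kotlin
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Sketch: label an image either on-device (free, offline, coarser labels
// like "dog") or in the cloud (more accurate, e.g. the specific breed).
fun labelImage(bitmap: Bitmap, useCloud: Boolean) {
    val image = FirebaseVisionImage.fromBitmap(bitmap)

    if (useCloud) {
        // Cloud API: higher accuracy, requires a network connection.
        FirebaseVision.getInstance().visionCloudLabelDetector
            .detectInImage(image)
            .addOnSuccessListener { labels ->
                labels.forEach { println("${it.label}: ${it.confidence}") }
            }
    } else {
        // On-device API: works offline and costs nothing.
        FirebaseVision.getInstance().visionLabelDetector
            .detectInImage(image)
            .addOnSuccessListener { labels ->
                labels.forEach { println("${it.label}: ${it.confidence}") }
            }
    }
}
```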

If developers do use the cloud-based APIs, none of the data stays on Google’s cloud. As soon as the processing is done, the data is deleted.

In the future, Google will add an API for Smart Reply, the machine learning feature that debuted in Google Inbox and scans emails to generate several short replies to your messages, which you can send with a single tap. The API will first launch in an early preview, and the computing will always be done locally on the device. There’s also a “high-density face contour” feature coming to the face detection API, which will be perfect for augmented reality apps that stick virtual items on your face.

ML Kit will also offer an option to decouple a machine learning model from an app and store the model in the cloud. Since these models can be “tens of megabytes in size,” according to Google, offloading them to the cloud should make app installs a lot faster. The models are downloaded at runtime, so they work offline after the first run, and the app will download any future model updates.
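For developers bringing their own TensorFlow Lite model, the cloud-hosted setup looks roughly like the sketch below. The class names come from the launch-era firebase-ml-model-interpreter API (they were renamed in later SDKs), and “my_remote_model” is a hypothetical model name registered in the Firebase console.

```kotlin
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseModelOptions
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource
import com.google.firebase.ml.custom.model.FirebaseModelDownloadConditions

// Sketch: host a custom TensorFlow Lite model in the cloud so it is
// downloaded at runtime instead of being bundled into the APK.
fun setUpRemoteModel(): FirebaseModelInterpreter? {
    // Only pull the model (and later updates) over Wi-Fi.
    val conditions = FirebaseModelDownloadConditions.Builder()
        .requireWifi()
        .build()

    // Point ML Kit at the model hosted in the Firebase console.
    val cloudSource = FirebaseCloudModelSource.Builder("my_remote_model")
        .enableModelUpdates(true)                 // pick up new versions automatically
        .setInitialDownloadConditions(conditions)
        .setUpdatesDownloadConditions(conditions)
        .build()
    FirebaseModelManager.getInstance().registerCloudModelSource(cloudSource)

    // The interpreter runs the model once it has been downloaded and cached.
    val options = FirebaseModelOptions.Builder()
        .setCloudModelName("my_remote_model")
        .build()
    return FirebaseModelInterpreter.getInstance(options)
}
```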

The huge size of some of these machine learning models is a problem, and Google is trying to address it in a second way with a future cloud-based model compression scheme. The plan is to eventually take a full uploaded TensorFlow model and spit out a compressed TensorFlow Lite model with similar accuracy.

This also works well with Firebase’s other features, like Remote Config, which enables A/B testing of machine learning models across a user base. Firebase can also switch or update models on the fly, without the need for an app update.
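One way to wire that up is to let Remote Config decide which hosted model an app loads, as in the hedged sketch below. The “active_model_name” parameter, the fallback model name, and the `chooseModelName` helper are all hypothetical; only the Remote Config calls themselves are standard Firebase API.

```kotlin
import com.google.firebase.remoteconfig.FirebaseRemoteConfig

// Sketch: pick a model name from Remote Config so different users
// (A/B test variants) can be served different hosted models.
fun chooseModelName(onChosen: (String) -> Unit) {
    val remoteConfig = FirebaseRemoteConfig.getInstance()

    // Fetch the latest config; the A/B test framework can serve different
    // parameter values to different user segments.
    remoteConfig.fetch().addOnCompleteListener { task ->
        if (task.isSuccessful) {
            remoteConfig.activateFetched()
        }
        // Fall back to a default model if nothing was fetched.
        val modelName = remoteConfig.getString("active_model_name")
            .takeIf { it.isNotEmpty() } ?: "my_remote_model"
        onChosen(modelName)
    }
}
```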

Developers looking to try out ML Kit can find it in the Firebase console.

Read More: Google I/O 2018: What’s New with Google Assistant, Android P, Maps and More
