There’s no question that the opening keynote of the annual Google I/O developer conference is one of the most action-packed days of the year for Google fans. The company runs through all the exciting new products it plans to launch in the near future, and it also covers plenty of improvements and new features coming to existing Google products. Hardware and software were both covered during the Google I/O 2017 keynote, as we anticipated, and Google executives discussed plenty of exciting things on stage during the event.
The Google I/O keynote covers so much in such a short period of time that it’s always good to have a simple roundup of the day’s biggest and most important announcements. That’s exactly what you’ll find in this post, which is your one-stop shop for news on everything Google covered on stage on Wednesday.
Smart Reply in Gmail
Google’s machine learning system analyzes the contents of an email in the Android or iOS Gmail app and suggests quick, simple replies, similar to what smartphones do with text messages. Learn more right here.
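Under the hood, suggesting replies is essentially a ranking problem: a model scores a set of candidate replies against the message text and surfaces the best few. Here is a heavily simplified, hypothetical sketch of that idea, with keyword overlap standing in for the neural networks Gmail actually uses:

```python
import re

# Toy illustration of the Smart Reply idea: score a small set of
# canned replies against the incoming message and return the best
# matches. The trigger-word scoring below is purely a stand-in for
# a real learned model.
CANDIDATE_REPLIES = {
    "Sounds good, see you then!": {"meet", "meeting", "lunch", "dinner", "tomorrow"},
    "Thanks for the update.": {"update", "status", "fyi", "report"},
    "Yes, that works for me.": {"work", "works", "ok", "confirm", "available"},
}

def suggest_replies(email_text, max_suggestions=3):
    words = set(re.findall(r"[a-z]+", email_text.lower()))
    # Score each candidate by how many of its trigger words appear.
    scored = [(len(words & triggers), reply)
              for reply, triggers in CANDIDATE_REPLIES.items()]
    scored = [(score, reply) for score, reply in scored if score > 0]
    scored.sort(reverse=True)
    return [reply for _, reply in scored[:max_suggestions]]

print(suggest_replies("Are you available for lunch tomorrow?"))
```

The real system, of course, learns its candidate replies and scoring from data rather than from a hand-written table.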
Google Lens
Google Lens is a set of vision-based computing capabilities that can see what you’re looking at and offer information about the object. For example, you can point your phone’s camera at a flower and Google Lens will tell you what kind of flower it is. Even more impressively, you can scan the label on a Wi-Fi router and connect to the network instantly!
Find My Device app
Google’s answer to Apple’s Find My iPhone app has gotten a new name and a big design overhaul. Check out all the details in this post.
Google Assistant
Google is making huge updates to Google Assistant. First and foremost, Google is adding the ability to type to the Assistant on your phone. This might seem like a step backward, but it’s obviously going to come in handy when you’re in public.
Google Assistant will also tie into Google Lens, so you can make conversational requests to Assistant about things you see. For example, you can point the camera at a dish on a menu, and then ask Assistant what it looks like without ever naming the dish. Assistant will then return photos of the dish using a Google image search.
Actions on Google is the toolkit that allows third-party developers to integrate Google Assistant support into their services. This functionality used to be limited to Google Home, but now it’s available on Android and iOS devices as well.
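Conceptually, an Actions on Google integration maps a recognized user intent to a response the Assistant can speak. The sketch below is hypothetical: the function and field names are illustrative and do not follow the real Actions on Google request/response schema.

```python
# Hypothetical sketch of what a fulfillment handler does conceptually:
# map an incoming intent name (plus extracted parameters) to a
# response for the Assistant to speak. The "speech" field and intent
# names here are made up for illustration.
def handle_intent(intent, params):
    if intent == "order_pizza":
        size = params.get("size", "medium")
        return {"speech": f"Okay, ordering a {size} pizza."}
    return {"speech": "Sorry, I can't help with that yet."}

print(handle_intent("order_pizza", {"size": "large"}))
```

In a real integration, this logic would live behind a webhook that the Assistant calls after parsing the user’s speech.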
Google Assistant on the iPhone
Assistant is also coming to the iPhone for the first time. Google Assistant will get its own app on iOS, and it’s already available for download in the US App Store.
Google Home
Proactive assistant support is finally coming to Google Home. This means the Google Home will light up to get your attention and deliver information without the user having to request it. For example, the Home will light up if you have an upcoming meeting and it’s time to leave.
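The time-to-leave example boils down to simple arithmetic: alert once the clock passes the meeting start time minus the estimated travel time. A minimal sketch of that check, with the buffer value chosen arbitrarily for illustration:

```python
from datetime import datetime, timedelta

# Illustrative "time to leave" check: alert when the current time
# passes the meeting start minus estimated travel time plus a small
# buffer. Travel-time estimation itself (traffic, routing) is assumed.
def should_alert(now, meeting_start, travel_minutes, buffer_minutes=5):
    leave_at = meeting_start - timedelta(minutes=travel_minutes + buffer_minutes)
    return now >= leave_at

meeting = datetime(2017, 5, 17, 10, 0)
print(should_alert(datetime(2017, 5, 17, 9, 30), meeting, travel_minutes=25))  # True
```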
The second big addition is hands-free calling. Just ask Google Home to make a call, and the speaker will place the call. Calls within the US are also completely free, which is a nice perk.
Spotify’s free service will be supported on Google Home for the first time beginning this summer (only paid subscriptions were supported previously). SoundCloud and Deezer support is coming as well, and all models will soon be updated with support for Bluetooth audio streaming from any device.
Finally, Google is adding support for visual responses on Google Home. Home doesn’t have a screen, of course, but it will connect to other devices with screens — an Android phone, a Chromecast, or even an iPhone. For example, you can ask your Google Home for directions to a destination, and Home can send those directions to your phone and open Google Maps.
Google Photos
Google is adding three new features to Google Photos. The first is Suggested Sharing, which will use machine learning to remind you to share photos with people who appear in them. For example, if you take a group shot with five people, Google Photos will later pop up a reminder suggesting that you share the pic with each of the five people who appear in the photo.
As an extension of Google Photos’ new sharing feature, other people who take photos at the same event as you will be able to share recommended photos to a group album that all included users can access.
Shared Libraries is another new feature that will let you configure your phone to share any images you capture of a certain person. So, for example, all of the photos you take of your children can automatically be shared with your husband or wife.
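Once face grouping has tagged who appears in each photo, suggesting recipients is essentially a lookup. A toy sketch of that last step (the face recognition itself, which is Google’s actual machine learning work, is assumed as input here):

```python
# Toy sketch of the Suggested Sharing idea: given photos already
# tagged with the people detected in them, recommend sharing each
# photo with everyone pictured except the owner. Tag data is assumed
# to come from an upstream face-grouping model.
def suggest_recipients(photo_tags, owner):
    return {photo: sorted(people - {owner})
            for photo, people in photo_tags.items()
            if people - {owner}}

tags = {
    "group_shot.jpg": {"me", "alice", "bob", "carol"},
    "selfie.jpg": {"me"},
}
print(suggest_recipients(tags, owner="me"))
```

Photos with no one but the owner in them (like the selfie above) produce no suggestion at all.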
These new features will roll out to iOS, Android and the web in the coming weeks.
One final feature Google is adding to Google Photos is Photo Books, which uses machine learning to help compile photo albums that can then be printed in a physical softcover or hardcover book that’s shipped to your door in just a few days. Google’s Photo Books product launches today on the web, and each book costs $9.99.
YouTube
Google had plenty to say about YouTube, but the star of the show is the upcoming arrival of 360-degree video support in the YouTube smart TV app. Using your TV’s standard remote, you’ll be able to pan around 360-degree videos on the big screen. This will also extend to live events, which can stream 360-degree video directly to your TV through the YouTube app.
Android O
Google first released an Android O developer preview in March, and on Wednesday we learned about new features that will be added to Android O before it’s released later this summer. Here are some highlights:
Picture-in-picture support will allow Android phone users to shrink a video into the corner of the screen and use the phone while the video continues to play. Of course, this feature isn’t limited to videos.
Notifications are also being enhanced in Android O. Notification Dots are basically tiny chat heads for notifications: a long-tap on a Notification Dot will open up a preview of the notification, which can then be swiped to dismiss, dragged down to see the rest of the notification, or tapped to open the appropriate app.
There’s also a new auto-fill feature in Android O that will work across third-party apps, making things like logins much easier.
Then there’s Smart Text Selection, which makes selecting words and phrases much easier. Google is using on-device machine learning to allow single taps to select long phrases as appropriate, such as addresses.
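Smart Text Selection expands a single tap into the most useful surrounding span. The real feature relies on an on-device machine learning model; as a crude, hypothetical stand-in, the sketch below expands a tapped character position to an address-like span found with a regex, falling back to the word under the tap:

```python
import re

# Crude stand-in for Smart Text Selection: if the tapped position
# falls inside something that looks like a street address, select
# the whole address; otherwise select just the word under the tap.
# The real feature uses an on-device ML model, not a regex.
ADDRESS = re.compile(r"\d+\s+[A-Z][a-z]+\s+(?:St|Ave|Rd|Blvd)\.?")

def smart_select(text, tap_index):
    for m in ADDRESS.finditer(text):
        if m.start() <= tap_index < m.end():
            return m.group()  # expand the tap to the whole address
    for m in re.finditer(r"\S+", text):
        if m.start() <= tap_index < m.end():
            return m.group()  # fall back to the single tapped word
    return ""

print(smart_select("Dinner at 123 Main St. at 8pm", 12))
```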
Android O will also include enhancements in an area Google calls Vitals.
Google Play Protect is the first new feature announced in the Vitals category, and it’s a virus checker of sorts that scans apps to ensure there’s no suspicious or malicious code. Android O will also speed things up dramatically using features like “wise limits,” which stop apps from using too many resources and draining too much battery.
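Google didn’t detail how “wise limits” work internally; the following is a hypothetical sketch of the general idea of background limits: give each backgrounded app a budget of work units and refuse further work once the budget is exhausted.

```python
# Hypothetical sketch of background limits: each app gets a fixed
# budget of background work units, and requests beyond the budget
# are denied. The class name and unit model are invented for
# illustration, not Android's actual mechanism.
class BackgroundBudget:
    def __init__(self, max_units):
        self.max_units = max_units
        self.used = {}

    def request(self, app, units=1):
        spent = self.used.get(app, 0)
        if spent + units > self.max_units:
            return False  # over budget: deny the background work
        self.used[app] = spent + units
        return True

budget = BackgroundBudget(max_units=3)
print([budget.request("chatapp") for _ in range(4)])
```

The payoff of any scheme like this is that a misbehaving app can only waste a bounded amount of CPU and battery in the background.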
Play Console Dashboards, a tool for developers, analyzes third-party apps for issues that might cause excessive resource consumption or battery drain. The tool can also help developers overcome those issues as they build their apps. Android O also adds official support for the Kotlin programming language.
A new Android O beta is available for download today, and you can learn more in this post.
Android Go
Android Go is essentially a new iteration of the company’s old Android One initiative. It will offer a lighter-weight version of Android O and future Android builds, and it will facilitate lighter-weight apps. This will allow device makers to build cheaper phones with less expensive components that will still be able to offer a quick and smooth Android experience.
VR and AR
Google announced Daydream last year to help vendors bring a high-quality virtual reality experience to mobile devices. On Wednesday, Google announced that Daydream support is coming to LG’s next flagship phone, as well as Samsung’s Galaxy S8 and S8+ in an upcoming update. So what’s next?
Google announced the addition of a standalone Daydream VR headset spec. This will allow vendors to build mobile VR headsets that do not need a smartphone to run. They’ll have no wires, and they will not need to be connected to a PC. HTC Vive and Lenovo are both currently working with Google to build and launch standalone Daydream headsets later this year.
Where augmented reality is concerned, Google announced a Visual Positioning Service (VPS, like GPS) that will enable next-generation AR experiences and indoor guidance. For example, VPS can guide you through a store to the exact product you came to purchase. Google is also adding AR features to Expeditions, its education product that allows students to enjoy interactive learning experiences in the classroom.
Google for Jobs
According to Google CEO Sundar Pichai, Google for Jobs is Google’s suite of products that will help people find jobs, and will help employers find the best candidates. Everything from retail jobs to C-suite executive roles can now be searched for right in the Google search box, and a card-based interface will display available jobs without the searcher ever having to leave Google. Results are enhanced in a number of ways; for example, the job results for a search will show the searcher commute times from his or her home.
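Enriching job results with commute times amounts to joining each listing with a travel-time estimate and sorting by it. A small sketch of that idea, where the commute lookup table is a hypothetical precomputed input rather than a real Google API:

```python
# Illustrative sketch of ranking job listings by commute time, as
# described above. The commute_minutes lookup stands in for a real
# travel-time estimate; unknown locations sort last.
def rank_by_commute(jobs, commute_minutes):
    enriched = [(commute_minutes.get(job["location"], float("inf")), job["title"])
                for job in jobs]
    return [title for _, title in sorted(enriched)]

jobs = [
    {"title": "Retail associate", "location": "downtown"},
    {"title": "Store manager", "location": "suburbs"},
]
commutes = {"downtown": 35, "suburbs": 12}
print(rank_by_commute(jobs, commutes))
```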
Google for Jobs will roll out in the coming weeks, and it will be available initially only in the US.
Updated to clarify the timing of free Spotify support on Google Home, and to correct an earlier error stating Google Assistant would be available in Google’s main iOS app.