Google I/O 2019 keynote: Everything you need to know (updating)
Google I/O 2019 is upon us, and one of the most important events is the opening keynote. This year’s keynote has a lot of promise, with expected topics ranging from a new phone to the next Android Q beta and plenty more. We’ll be watching live via the video above and updating this article as the keynote progresses.
In addition, we expect to see some stuff about apps and games along with Google Home products. We also have Eric Zemen, David Imel, and Justin Duino on the floor to bring even more Google I/O 2019 coverage over the next several days.
The live stream
You can watch the live stream via the video above starting at 1PM EDT, when the event begins. We expect the link to keep working after the event ends as well. Below, we’ll update the article as more information becomes available.
Google I/O 2019 opening
We open Google I/O 2019 with a montage featuring gaming, virtual reality, augmented reality, and some Star Trek and Knight Rider. Sundar Pichai hits the stage to begin the presentation by talking about what Google has been up to over the last few weeks, and he makes a quip about an upcoming Liverpool soccer match. He talks about this year’s I/O app, which uses augmented reality to help visitors get around the venue. AR navigation is also a new feature in Google Maps.
Pichai talks briefly about all of Google’s useful services, including Google Photos, Google Maps, and Google Assistant. He further talks about products that billions of people use all around the world, with an emphasis on Google Search and Google News. The full coverage feature in Google News is also heading to Google Search, including a full timeline of events and news from a variety of sources. Google is also indexing podcasts in Google Search, and you can listen to them directly from the search results.
Aparna Chennapragada, Google Search, Camera, and augmented reality
Aparna enters the stage to talk about augmented reality and the camera in Google Search. The first new Search feature lets you view 3D models directly from Google Search and use augmented reality to place those objects in your camera app. You can even do neat things like look up shoes and see how they fit with your outfit. The 3D models in the camera app appear to scale to size against whatever else is in the frame; the on-stage demo of a great white shark was very impressive.
Aparna moves on to Google Lens, which is available on most newer Android phones these days and has been built into Google Photos, Assistant, and the camera. Over one billion people have used Lens already. Lens can now work natively with the camera and, using data from Google Maps, highlight things like popular dishes on a menu without the user doing anything. Lens can also calculate the tip and split the total on restaurant receipts in real time from the camera app with minimal user input. Google is partnering with many companies to improve these visual experiences.
Finally, Google is integrating Google Translate and the camera into the Google Search bar to read signs out loud in your native language or translate them in real time, as you already can in the Google Translate app. Aparna throws to a video clip of a woman in India who never had a formal education using the feature in her daily life. The new feature works on phones that cost as little as $35 and uses a very small amount of storage, to make it accessible in as many places as possible.
Pichai retakes the stage
Sundar takes the stage again and talks about Google Duplex and Google Search in the context of making reservations. You can ask Assistant to make reservations for you and, well, it does. The demo on stage was very impressive. The feature works with Calendar, Gmail, Assistant, and more. It’s called Duplex on the web, and Google will share more information about it later this year.
Sundar also announced that Google’s voice models have shrunk from 100GB to 0.5GB, small enough to store directly on the phone. This should make Assistant faster. Pichai throws it to Scott Huffman for more.
Scott Huffman, Google Assistant, and voice models
Scott comes out to talk about making Google Assistant faster than ever. Another Googler, Maggie, then rattles off a couple dozen commands, and Assistant handles them all with aplomb to show how much faster it can get. She then demos Google Assistant working without the hot word, using only her voice to reply to a text, find a photo of an animal at Yellowstone, and send that picture in her reply. She goes on to use Assistant to find a flight time and send that information over text as well, all by voice with no touch input whatsoever. It’s very impressive to watch Assistant understand when Maggie is dictating and when she is asking it to complete a command.
Scott also announced Picks for You, a new Google Home feature that personalizes your results based on things Assistant has helped you with before. This covers directions, recipes, and other areas where results may differ because of your personal preferences. Google calls this Personal References: you can ask Google what the weather is like at your mom’s house, and it will know where you mean, including the weather there and the traffic between you and that place. Google Assistant will just get it.
Finally, Scott touches on improvements to Google products in the car, including easy commands for music, Maps, and more. Oh, and Assistant on Google Home devices can now silence alarms with a simple “stop” command. Scott concludes his segment with a fun little montage of people using Google Assistant.
Wrap up
What was your favorite part of this year’s Google I/O 2019 keynote? Tell us in the comments!
from Android Authority http://bit.ly/2DTwA1f