Whether you’re a student learning about photosynthesis or a parent researching the best cars for your growing family, people turn to Google with all sorts of curiosities. And we can help you understand in different ways—through text, your voice or even your phone’s camera. Today, as part of the SearchOn event, we’re announcing new ways you can use Google Lens and augmented reality (AR) while learning and shopping.
Visual tools to help you learn
For many families, adjusting to remote learning hasn’t been easy, but tools like Google Lens can help lighten the load. With Lens, you can search what you see using your camera. Lens can now recognize 15 billion things—up from 1 billion just two years ago—to help you identify plants, animals, landmarks and more. If you’re learning a new language, Lens can also translate text in more than 100 languages, such as Spanish and Arabic, and you can tap to hear words and sentences pronounced out loud.
If you’re a parent, your kids may ask you questions about things you never thought you’d need to remember, like quadratic equations. From the search bar in the Google app on Android and iOS, you can use Lens to get help on a homework problem. With step-by-step guides and videos, you can learn and understand the foundational concepts to solve math, chemistry, biology and physics problems.
Sometimes, seeing is understanding. For instance, visualizing the inner workings of a plant cell or the elements in the periodic table in 3D is more helpful than reading about them in a textbook. AR brings hands-on learning home, letting you explore concepts up close in your space. Here’s how Melissa Brophy-Plasencio, an educator from Texas, is incorporating AR into her lesson plans.
Shop what you see with Google Lens
Another area where the camera can be helpful is shopping—especially when what you’re looking for is hard to describe in words. With Lens, you can already search for a product by taking a photo or screenshot. Now, we’re making it even easier to discover new products as you browse on your phone. When you tap and hold an image in the Google app or Chrome on Android, Lens will find exact or similar items and suggest ways to style them. This feature is coming soon to the Google app on iOS.
Lens uses Style Engine technology, which combines the world’s largest database of products with millions of style images. It then uses pattern matching to understand concepts like “ruffle sleeves” or “vintage denim” and how they pair with different apparel.
Bring the showroom to you with AR
When you can’t go into stores to check out a product up close, AR can bring the showroom to you. If you’re in the market for a new car, for example, you’ll soon be able to search for it on Google and see an AR model right in front of you. You can easily check out what the car looks like in different colors, zoom in to see intricate details like buttons on the dashboard, view it against beautiful backdrops and even see it in your driveway. We’re experimenting with this feature in the U.S. and working with top auto brands, such as Volvo and Porsche, to bring these experiences to you soon.
Everyone’s journey to understand is different. Whether you snap a photo with Lens or immerse yourself in AR, we hope you find what you’re looking for... and even have some fun along the way.
by Aparna Chennapragada via The Keyword