Artificial intelligence isn't just the future — it's the present. At Google's I/O conference today, CEO Sundar Pichai announced that the company is working on a new AI-powered feature called Google Lens, which won't just see what you see, but will also understand what you see and help you take action on it.

With Lens, Google’s Assistant will be able to identify objects around you and perform actions based on Google’s various apps.


During its demo, Google showed how pointing your camera at something lets Lens tell you what it is — for example, it could identify the flower you're about to photograph.

Scott Huffman, Vice President of Engineering for Assistant, demonstrated using Google Lens to translate a Japanese street sign he couldn't read. Huffman said Lens acts as a facilitator for a conversation between the user and Assistant, powered by machine learning.

Google Lens.


Google Lens can handle even more impressive tasks: if you want to connect your phone to Wi-Fi, you can simply point your camera at the network's login credentials, and Assistant will do the rest.

The company said Google Lens will arrive soon but didn't announce a release date.
