Google I/O 2024: New AI Assistant Can Read Everything Through Your Phone’s Camera – News18




Google demoed its new Gemini AI assistant via the camera

Google I/O 2024 has been mostly about the company's AI developments and its next-gen Gemini AI assistant coming soon.

The Google I/O 2024 keynote was, surprise surprise, all about AI. The company has a lot of catching up to do, with OpenAI having taken ChatGPT to the GPT-4o model earlier this week. The I/O 2024 keynote showed us the work Google has been doing behind the scenes with the help of the Google DeepMind AI team.

And one of the products to roll out from the lab is called Project Astra, a new-gen AI assistant that promises to integrate AI into mobile devices with spatial understanding and video processing to give you accurate information.

Project Astra – The Everyday AI Assistant

The multimodal assistant from Google is based on Gemini and is essentially its way of telling OpenAI that it is here for the fight. So, how does this version of the AI assistant work? Google uses the camera on your phone to guide the AI assistant and help you understand the things around you.

It can even read code written on a PC screen and help you determine its purpose or solve the complex code as well. That's not all; you can point the camera at the street and ask the AI assistant to tell you where you are located and get more localised details if needed.

Google says these capabilities via Project Astra will be available in Gemini Live within the main app later this year. The tech will initially work on Pixel phones, and Google sees the AI assistant coming to more devices, including smart glasses and even TWS earbuds some day.


