Google just announced loads of AI search updates, including video search capabilities in Google Lens, adding a whole new way for people to search the internet.
The new Video Search feature comes at a time when every major tech company is looking to one-up its competitors in the race to have truly useful AI functionality that sticks with consumers - could searching the web via video be it?
Rolling out to all Google app users on iOS and Android, Video Search flexes Google's AI muscles just in time for the launch of Apple's Lens competitor, the Apple Intelligence feature Visual Intelligence. Apple's offering is yet to receive a release date, but it's at the core of the company's marketing for the iPhone 16 and iPhone 16 Pro, taking advantage of Camera Control.
Visual Intelligence lets you snap an image of something and quickly get information on whatever you're looking at. Whether you're snapping a photo of a closed restaurant to check opening times (apparently places don't show opening hours in their windows anymore) or aiming your iPhone's camera at a friend's dog to check the breed (we don't ask questions anymore either), Visual Intelligence is essentially Apple's competitor to Google Lens - but new video and voice features in Lens leave it behind before the feature even launches.
[HEADING=1]Search what you record[/HEADING]
[IMG alt="AI video search results Google Lens"]https://cdn.mos.cms.futurecdn.net/Kz...dZeDjUF6CZ.png
(Image credit: Google)
So how does Video Search work? And would you even want to use it? You're now able to record videos in Google Lens and quickly ask questions related to what you're seeing. The example Google gave was a person recording a school of fish in an aquarium and asking Lens to identify the species based on search results. It's pretty cool stuff, but how much more useful is video recording than snapping a quick photo?
At the time of writing, I've not been able to test Google's new Video Search functionality, which is available globally for users enrolled in the Search Labs 'AI Overview and more' experiment. I've also not had the opportunity to test Visual Intelligence, and as far as I'm aware no one outside the walls of Apple has had the pleasure either. With new video search functionality and even voice search functionality coming to Lens, I can't help but feel that Visual Intelligence is already lagging behind, in the same way Siri was compared to other voice assistants when it launched back in 2011.
There are a lot of questions here, and we won't get answers for at least a few months. But I have to ask: do people even care about Video Search anyway? Or will Visual Intelligence's dedicated launch button on the side of all iPhone 16s be enough to make people start searching without typing?
[HEADING=2]You might also like…[/HEADING]
[ul]
[li]Google Gemini Live's AI voice now comes in ten more styles that take inspiration from the stars[/li][li]Google is rolling out Gemini AI to older Pixel Buds models[/li][li]Gemini in Gmail will now provide smarter quick replies for your emails[/li][/ul]