Originally unveiled at Google’s I/O developer conference in May, “Ask Photos” will begin rolling out to users as soon as today. The feature uses the company’s AI technology to improve image search and discovery within the Google Photos app, letting users pose more complex queries, like asking the AI to find photos from a dinner in August 2019. Availability will be limited at first: the feature launches in “early access” with select users in the U.S. before rolling out to a broader user base later on.
Built on Google’s Gemini AI model, Ask Photos lets you search your photos with natural language queries that draw on what the AI understands about each image from its content and other metadata. Google Photos users can already search for specific people or objects in their photos, but this upgrade allows natural language questions across a broader range of query types, including some that require more contextual understanding of the images themselves.
In one example Google showed at I/O, you could ask for the “best photo from all my National Parks visits.” The AI decides which photo in a batch is the “best” based on signals such as lighting, how blurry an image is, and whether there is clutter in the background. It would then cross-reference the geolocation of the images with the locations of National Parks to identify which photos were taken there.
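Conceptually, that kind of request could be modeled as a simple scoring pass over each image’s metadata. The sketch below is purely illustrative and is not Google’s implementation; the `NATIONAL_PARKS` bounding boxes and the precomputed sharpness, exposure, and clutter values are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical park boundaries: name -> (min_lat, max_lat, min_lon, max_lon).
# A real system would use proper polygon data, not coarse bounding boxes.
NATIONAL_PARKS = {
    "Yosemite": (37.49, 38.19, -119.89, -119.20),
    "Zion": (37.10, 37.50, -113.27, -112.85),
}

@dataclass
class Photo:
    path: str
    lat: float
    lon: float
    sharpness: float   # 0-1, higher = less blur (assumed precomputed)
    exposure: float    # 0-1, higher = better lighting (assumed precomputed)
    clutter: float     # 0-1, higher = busier background (assumed precomputed)

def park_for(photo: Photo) -> str | None:
    """Return the park whose bounding box contains the photo's geotag, if any."""
    for name, (lat_min, lat_max, lon_min, lon_max) in NATIONAL_PARKS.items():
        if lat_min <= photo.lat <= lat_max and lon_min <= photo.lon <= lon_max:
            return name
    return None

def quality(photo: Photo) -> float:
    """Toy 'best photo' score combining the kinds of signals Google mentions."""
    return 0.4 * photo.sharpness + 0.4 * photo.exposure + 0.2 * (1 - photo.clutter)

def best_per_park(photos: list[Photo]) -> dict[str, Photo]:
    """Pick the highest-scoring photo taken inside each park."""
    best: dict[str, Photo] = {}
    for p in photos:
        park = park_for(p)
        if park and (park not in best or quality(p) > quality(best[park])):
            best[park] = p
    return best
```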
Google also said the feature can do more than simply retrieve a photo; it can answer questions, too. A parent might ask, for instance, what themes were used for their child’s last four birthday parties. The AI could look at photos of the parties, infer each theme from what it sees, such as costumes or decorations matching a “mermaid” theme, and then tell the parent which themes had already been used.
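As a rough illustration of how such a question could be decomposed, the sketch below groups party photos by year and asks an image-understanding model to name the theme. The `describe_party_theme` function is a hypothetical stand-in for a real vision-model call, not a real API.

```python
from collections import defaultdict
from datetime import date

def describe_party_theme(image_path: str) -> str:
    """Hypothetical stand-in for a vision-model call that returns a theme
    keyword (e.g. 'mermaid') inferred from costumes and decorations."""
    raise NotImplementedError("replace with a real image-understanding call")

def themes_by_birthday(photos: list[tuple[str, date]], last_n: int = 4) -> dict[int, set[str]]:
    """Collect the theme words inferred for each of the most recent
    `last_n` years of party photos, keyed by year."""
    by_year: dict[int, set[str]] = defaultdict(set)
    for path, taken_on in photos:
        by_year[taken_on.year].add(describe_party_theme(path))
    recent_years = sorted(by_year)[-last_n:]
    return {year: by_year[year] for year in recent_years}
```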
More practical questions might ask the AI to recall an event, such as “What did we order last time at this restaurant?” (provided, of course, you’re the kind of person who photographs their meals) or “Where did we camp last time at Yosemite?” You might also use the feature to help select photos for an album or to make a collage of everything you did on vacation.
The AI understands your photo gallery in context, for example who the important people in your life are and what you like to do or eat, so it can surface photos and make something like a New Year’s greeting more personal.
For now, Ask Photos lives in Google Labs and will be available to U.S. users. Google says the AI Principles that guide all its development efforts also apply to this new Photos feature, and that it won’t use private data in Photos for ad targeting. Queries, however, may be reviewed by Google employees to improve the AI over time. The AI’s answers won’t be reviewed by humans unless a user reaches out to give feedback or to report abuse or harm.