Apple Intelligence will introduce many new image-related features, whether it’s generative image editing or creating a movie from photos stored on your iPhone using just a text prompt. But the most useful change coming via Apple Intelligence may be the one that helps you find images you already own. And it has implications that could extend beyond devices capable of running Apple Intelligence.
I’m talking about natural language search, which arrives in the Photos app as part of the first round of Apple Intelligence features coming to the iPhone next month. According to Apple, you’ll be able to find photos and videos simply by describing what you’re looking for: not keywords, but actual descriptive phrases, as if you were talking to another human being.
Natural language search is among the Apple Intelligence features included in the public beta of iOS 18.1, so anyone who has installed the update on a compatible iPhone can try it out. That’s exactly what I did, using this new form of search to plumb the depths of my image library. Here’s what I learned about Apple’s changes to photo search.
What’s new in photo search
In the old days, meaning iOS 17, you searched for images in the Photos app one search term at a time. If I wanted to find a particular photo of my daughter drinking a milkshake on a trip to Hollywood a few years ago, I could start by searching for her by name. Then I might try adding terms like “milkshake” or “Hollywood” or “2022.” Sometimes I’d get lucky and the photo I was looking for would appear; most of the time it wouldn’t. (In the case of the milkshake photo, no luck, at least on an iPhone 15 still running iOS 17.)
But on an iPhone 15 Pro running iOS 18.1, all I have to do is start typing “Prunella drinking a milkshake.” All the photos matching that description appear, letting me choose exactly the one I want.
Natural language search makes it much easier to find exactly the shot you’re looking for. You may get more than one photo in the results, but they’re far more accurate than anything the old Photos search method could produce.
I should also note that results start appearing as you type, so you may find your photo before you’ve even finished entering your search. I wanted to track down a photo of a baseball game in Houston that I attended with a friend, so I searched for “Jason in the Astros shirt.” All the photos from that game appeared by the time I had typed “Jason in an astr,” which saved me an incredible amount of time.
Before natural language search, I probably wouldn’t even have used the search function in Photos to find that photo. Instead, I would have browsed my photo library back to the year I thought that game took place. Since human memory has a habit of forgetting precise dates, especially when you’re an old man like me, you can see the inherent flaw in that approach. In iOS 18, searching now produces many more hits than misses.
Some hiccups in the search
As natural as searching for photos with Apple Intelligence may be, not every phrase will work. I tried searching for all the photos I had of my daughter at night, and apparently the search function has trouble distinguishing night from day, because no results appeared. Other variations of the term, such as “at sunset,” “after sunset” and “in the evening,” came up empty too. (Though when I started typing “eve,” photos from various Christmas Eves appeared, only to vanish when I added “ning.”)
I wanted to find a video of a mariachi band that I captured during a 4th of July parade several years ago, but “mariachi band” didn’t yield any results. A search for “music at the parade” did, however, which runs contrary to my experience that more specific search terms work better. With natural language search, that doesn’t appear to be the case. (For what it’s worth, “mariachi band at the parade” produced results, some of which didn’t actually feature a mariachi band. Your guess is as good as mine.)
You may have noticed that the last search included a video, so yes, natural language search handles videos as well as photos. In fact, Apple’s preview page for Apple Intelligence says the AI feature “can even find a particular moment in a video clip that fits the search description and take you directly there.”
I’ve had no luck with that specific use case in my testing, though I think that says more about the nature of the videos I capture on my phone. The videos in my library tend to be short clips of people singing Happy Birthday at a reunion or of my daughter dancing weirdly to Alice Cooper music on the last day of school. I imagine the feature Apple describes works better on longer clips where something visually distinct happens later in the video: a clown enters the scene, say, or a dog jumps into view. Unfortunately, I don’t own a dog, and I don’t have any videos of clowns. That you know of, anyway.
What about other iOS 18 phones?
If these are the kinds of changes people with Apple Intelligence-ready iPhones can expect, what happens to search on iPhones that can run iOS 18 but don’t have the silicon and memory for Apple’s new suite of AI tools? After all, with only the iPhone 15 Pro models and the new iPhone 16 lineup able to run Apple Intelligence, there are many more iPhones out there that won’t get these features.
I tried searching for photos on an iPhone 12 running the iOS 18.1 beta (which, remember, doesn’t benefit from Apple Intelligence). And I’m happy to report that the interface for searching photos is the same as on the iPhone 15 Pro, right down to entering natural language search terms into a search field. Even better, most of the time the results on my iPhone 12 matched those on my iPhone 15 Pro, even though the latter supports Apple Intelligence and the former doesn’t.
“Waterfalls in Hawaii” showed the same video clips on both phones. When I searched for my daughter at the Eiffel Tower, multiple results appeared on the iPhone 12, and all of them were accurate.
I’m not going to pretend that all searches worked the same on the iPhone 12 as they did on the iPhone 15 Pro. Searching for photos of me with Darth Vader produced, on the iPhone 15 Pro, two different shots of me standing in front of a Darth Vader statue at Lucasfilm headquarters in San Francisco. The iPhone 12 came up blank, so maybe you need Apple Intelligence to track down intellectual property.
I have mixed feelings about photo search on older iPhones running iOS 18 getting most of this new way of doing things, even if the experience is a bit more refined on Apple Intelligence phones. On one hand, the old method of searching for photos wasn’t very useful, so it’s nice that you don’t need the latest Apple hardware to get a better experience. On the other hand, I’m not sure it’s good for the platform that some phones running iOS 18 get the full search experience while others get something close but slightly different.
It also weakens the argument for upgrading to a phone capable of running Apple Intelligence. Maybe there are other things that iPhone 12 can’t do, but if I can still take advantage of better search, maybe that’s enough to keep my current phone without upgrading. Then again, I guess that’s a Cupertino problem, not a Phil problem.
Photo search perspective
Like many Apple Intelligence features, natural language search feels like a work in progress: a generally useful feature with some rough edges. Even with the occasional search miss, I think this is a much better approach to finding photos, and it works on more iPhones than you might think.