Earlier this week, during the WWDC keynote, Apple showed off its new iOS 26. For the first time since iOS 7 in 2013, Apple is overhauling the look and feel of the operating system, introducing a very Windows Aero-esque design language called “Liquid Glass” (RIP, Windows Vista). Since it was the shiny new thing in the keynote, it has been the hot topic of the week.
However, we also saw teasers of other new features that aren’t getting the same level of attention. For example, during the segment on iOS, Billy Sorrentino showed off a new ability of Apple’s AI-powered Visual Intelligence called, quite simply, Image Search. The way it works is that you take a screenshot of anything you see on your iPhone’s screen. Once you have the screenshot, you can hit the Image Search button in the lower right. Using AI, Visual Intelligence will scan the screenshot and find things to shop for, or create calendar events based on dates and times revealed in the image.
If that sounds familiar, it’s because Google’s Circle to Search does exactly the same thing and has been available for more than a year. However, I’m not bringing it up for the usual “stealing it from Android, lol, Apple!” take. I’m bringing it up because Image Search in iOS 26 appears to be remarkably bad.
Visual Intelligence in iOS 26: Circle to Search, but worse
During the keynote (in the video embedded at the top, the segment starts at 38:27), Sorrentino makes Image Search look easy and powerful. In his first demo, he pulls up a social media feed. There are several posts that are text-only, and then one with an image. He takes a screenshot, launches Image Search, and tells us, the audience, that he’s interested in the jacket the model is wearing in the social media post.
Even Apple’s own canned demo exposed poor results and a clunky UI for its Circle to Search-esque feature.
Image Search does its thing and pulls up a collection of images that share similarities with the one in the social media post. Note that it does not find the jacket. The software doesn’t even know Sorrentino is interested in jackets, because he never indicated that. All the software does is find images that look similar to the one in his screenshot, and Sorrentino acts like this is revolutionary. Sir, I’ve been using TinEye to do this since 2008.
Also, note that Image Search ignored everything else going on in the screenshot. It didn’t find the emoji that appears in one of the posts, nor did it find anything related to the many avatar images. Somehow, it just knew to search using only the one image of something that exists in real life.
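Apple hasn’t said how Image Search works under the hood, but to be clear about what “finds images that look similar” means in practice: whole-image similarity matching is a long-solved problem. Below is a minimal sketch of the general technique using the Vision framework’s feature prints, which have been available to iOS developers since iOS 13. This is my own hypothetical illustration, not Apple’s implementation; the helper names and the distance cutoff are made up for the example.

```swift
import Foundation
import Vision

// Hypothetical sketch of whole-image similarity, NOT Apple's actual
// Image Search pipeline. Vision's feature prints embed an image into a
// vector; smaller distances between vectors mean more similar images.
func featurePrint(for imageURL: URL) throws -> VNFeaturePrintObservation {
    let request = VNGenerateImageFeaturePrintRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])
    guard let observation = request.results?.first as? VNFeaturePrintObservation else {
        throw NSError(domain: "FeaturePrintExample", code: 1)
    }
    return observation
}

// Compare a screenshot against a catalog of candidate images and return
// the closest matches. The 1.0 cutoff is an arbitrary placeholder, not a
// tuned value.
func similarImages(to screenshot: URL, in catalog: [URL]) throws -> [URL] {
    let query = try featurePrint(for: screenshot)
    var scored: [(url: URL, distance: Float)] = []
    for candidate in catalog {
        let candidatePrint = try featurePrint(for: candidate)
        var distance: Float = 0
        try query.computeDistance(&distance, to: candidatePrint)
        scored.append((candidate, distance))
    }
    return scored.sorted { $0.distance < $1.distance }
        .filter { $0.distance < 1.0 }
        .map { $0.url }
}
```

Notice that nothing in a pipeline like this knows what a “jacket” is; it simply ranks whole images by visual distance, which is consistent with the results the demo actually produced.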

In the next demo, Sorrentino pulls up an image of a room with a mushroom-shaped lamp. He launches Image Search again, but this time he wants the system to examine the lamp specifically. He does this by scribbling over the lamp with his finger. Note that he doesn’t circle the lamp, as that would be a dead giveaway of Apple’s intentions here, but whatever.
Once he does his Circle to Search-style scribble on the lamp, he’s presented with another list of images. Notice anything strange, though? None of the visible results is the lamp from the original photo! Even the first result he taps is quite clearly not the lamp he was looking for, yet Sorrentino proceeds to add it to his Etsy favorites as if this were a great success. My guy, that is not the lamp. The system failed, and you’re showing it off as though it succeeded.
You have to use your hands? That’s like a baby’s toy!

In Sorrentino’s final demo, he uses Visual Intelligence to explain what a photo depicts and to ask a question about it. In the example, the photo is of a small stringed instrument. He grabs a screenshot and types out a question to ChatGPT. He finds out that the photo is of a mandolin and that the instrument has been used in plenty of popular rock songs.
The glaring thing here is that Sorrentino types out his question. That doesn’t seem very convenient. With Circle to Search, I can simply ask my question aloud. Even during the demo, this comes off as awkward, because we watch him type out a message asking which rock songs use the instrument.
Ultimately, this entire section was just so worrying. This is a pre-recorded Apple keynote demo, so you know it will work better here than it will in real life. But even the demo suggests it lags behind Circle to Search in both form and function. I shudder to think how well it will perform when it actually lands.
This whole demo was another example of Apple falling behind the curve with subpar implementations of AI tools.
This is just one more thing to note when it comes to Apple dropping the AI ball. It was late to the game, and everything it has tried to do is either a straight lift from Google, Android, or other Android OEMs, or relies on OpenAI to do the real work. Watching this Image Search demo was like watching a football player stumble through a big game and still act as though he nailed it.
If nothing else, though, this segment proved a hundred times over that Circle to Search is one of Google’s biggest recent successes. How many times has Google created something that Apple then tried to copy and failed this hard? Granted, I’ll give Apple the benefit of the doubt for now. It’s possible Image Search will be much better when it lands on the stable iPhone 17 series in September. But based on today’s demo, its Circle to Search clone is a dud.