Apple's relationship with search partner Google is taking a new turn. Google has reportedly paid Apple around $20 billion per year to be the default search engine in Apple's Safari browser. Now, iPhone 16 users will also be able to access Google's search engine and visual search features using the device's new Camera Control button. OpenAI's ChatGPT, which is reached through Siri, also appeared as a third party in the demonstration.
When first introduced, the iPhone 16's Camera Control seemed like Apple lingo for "shutter button," but as the event continued, Apple explained there's more you can do with this new hardware feature. With Camera Control, users can quickly take a photo or record video, and they can slide a finger across the button to frame their shot and adjust options like zoom, exposure, or depth of field in a new camera preview experience. The button also gives iPhone 16 users access to Apple's new "visual intelligence" search feature, which is where the Google partnership comes in.
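For developers, Apple surfaces the same press-and-slide gestures through AVFoundation. Here's a minimal sketch, assuming iOS 18's AVCaptureControl API (the system zoom and exposure-bias sliders are AVFoundation types; the delegate wiring shown is illustrative, not Apple's demo code):

```swift
import AVFoundation

// Sketch: attach system-provided Camera Control sliders to a capture
// session so a light press-and-slide on the hardware button adjusts
// zoom or exposure bias. Requires iOS 18's AVCaptureControl API.
func attachCameraControls(to session: AVCaptureSession,
                          device: AVCaptureDevice,
                          queue: DispatchQueue,
                          delegate: AVCaptureSessionControlsDelegate) {
    guard session.supportsControls else { return }

    session.beginConfiguration()

    // System sliders mirror the built-in Camera app's zoom and
    // exposure behavior on the Camera Control button.
    let zoom = AVCaptureSystemZoomSlider(device: device)
    let exposure = AVCaptureSystemExposureBiasSlider(device: device)

    for control in [zoom, exposure] where session.canAddControl(control) {
        session.addControl(control)
    }

    // The delegate is notified when the user starts interacting with
    // the button, so the app can present its camera preview overlay.
    session.setControlsDelegate(delegate, queue: queue)

    session.commitConfiguration()
}
```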
With Visual Intelligence, users don't just have an easy way to learn about the things in their camera's view; they also gain another way to access third-party services without having to launch standalone apps. Essentially a visual search feature similar to Google Lens or Pinterest Lens, Visual Intelligence is described by Apple as a way to instantly understand everything you see. Apple showed several examples: clicking the Camera Control button to get information about a restaurant you spot around town, or using the feature to identify a dog you see on a walk. The feature can also turn a wall-mounted event poster into a calendar entry with all the details.
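Apple hasn't published a developer API for Visual Intelligence, but the poster-to-calendar step maps onto standard EventKit calls once the details have been read off the image. A hypothetical sketch, assuming the title and dates have already been extracted (the extraction itself is Apple's black box, and this function is a stand-in, not an Apple API):

```swift
import EventKit

// Hypothetical sketch: after a visual search pipeline has pulled a
// title and dates off an event poster, writing the result to the
// user's calendar is ordinary EventKit work (iOS 17+).
func saveExtractedEvent(title: String, start: Date, end: Date) async throws {
    let store = EKEventStore()

    // Write-only access is enough for creating a calendar entry.
    guard try await store.requestWriteOnlyAccessToEvents() else { return }

    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = start
    event.endDate = end
    event.calendar = store.defaultCalendarForNewEvents

    try store.save(event, span: .thisEvent)
}
```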
Craig Federighi, Apple's senior vice president of software engineering, then casually mentioned that the feature could also be used to run Google searches. "Camera Control is also your gateway to third-party tools that make it super quick and easy to tap into their domain-specific expertise. So if you find a bike that looks exactly like the kind you're looking for, just click to search Google for where you can buy something similar," he said. The demo showed a person pressing the Camera Control button while pointing the iPhone at a bike, then seeing several similar purchase options in a pop-up window at the top of the camera view. Below the grid of matching bike photos and descriptions is a smaller on-screen button that says "More Google Results," indicating that you can click again to continue your Google search.
Apple didn't explain how or when a Camera Control press would be routed to a third-party partner for results rather than to a native Apple service like Apple Maps, which handled the restaurant demo. The company also didn't fully explain how users will be able to control or configure the feature.
What makes the feature interesting is that it proposes a new paradigm for interacting with software and services on Apple's iPhone, one where the concept of the app store starts to feel outdated. Using AI technology, users can ask questions, perform productivity tasks, get creative with images and videos, and more. These are things consumers used to do in apps, but now they can do them through a new interface: talking and texting with an AI assistant. Apple is in no rush to build its own ChatGPT competitor, instead positioning itself as a platform for accessing third-party services, including AI technology, search, and possibly other service providers in the future.
Additionally, Apple can make these connections through behind-the-scenes deals with partners, such as its arrangement with OpenAI for certain AI features, rather than relying on in-app transactions to generate revenue. The approach also smartly keeps Apple's reputation from taking a hit when a third party like ChatGPT gets things wrong (as AIs tend to do) or when a Google search doesn't yield helpful results.