Revolutionizing Mobile Photography: How Apple's iPhone 16 Now Matches Up with Google Lens Capabilities, Insights Analysis
Screenshot by Kayla Solino/ZDNET
During Monday’s iPhone 16 event, Apple announced a brand-new lineup of iPhones, along with new AirPods, AirPods Pro, AirPods Max, and the Apple Watch Series 10. Apple also took the opportunity to expand on Apple Intelligence, the suite of artificial intelligence (AI) tools coming to the iPhone. This includes a new Visual Intelligence feature, which essentially gives the iPhone camera Google Lens capabilities.
Apple’s Visual Intelligence lets you capture a photo of things around you, like a flyer or a restaurant, and then uses the iPhone’s AI capabilities to search for the subject and surface more information about it.
Apple says captured data will remain private when used with Apple Intelligence and the company’s Private Cloud Compute, but users can opt for third-party integrations with the new camera experience.
Third-party integrations include the ability to search Google for whatever the camera captures, much like opening Google Lens straight from the iPhone camera app. Users can also enable ChatGPT integration with Visual Intelligence, letting the AI chatbot process the image data captured by the camera.
These third-party integrations require the user to give permissions on an opt-in basis.
The iPhone 16 and iPhone 16 Plus are available for pre-order starting at $799 and $899, respectively, but the Apple Intelligence features won’t be available right away. Apple says some of its AI features will begin rolling out in beta next month, with more to follow over the next several months.
- Title: Revolutionizing Mobile Photography: How Apple's iPhone 16 Now Matches Up with Google Lens Capabilities, Insights Analysis
- Author: Brian
- Created at : 2024-10-13 17:36:40
- Updated at : 2024-10-15 07:12:54
- Link: https://tech-savvy.techidaily.com/revolutionizing-mobile-photography-how-apples-iphone-16-now-matches-up-with-google-lens-capabilities-insights-analysis/
- License: This work is licensed under CC BY-NC-SA 4.0.