In a recent ZDNET article, my friend and colleague David Gewirtz explains why he thinks the upcoming iPhone 16, with its focus on iOS 18 and Apple Intelligence, is a significant upgrade.
I respect David’s perspective but I disagree with it.
Plus: 6 reasons why iOS 18 makes the iPhone 16 a must-have upgrade for me
David argues that the incorporation of artificial intelligence (AI) in iOS 18 makes the iPhone 16 a necessary upgrade for him, and he highlights the potential for Apple Intelligence to revolutionize how we interact with our devices. I agree with him in the long term. But I'm not convinced that the first version of Apple Intelligence represents the big leap in usability that many are hoping for.
Every year, my wife and I eagerly await the release of the new iPhone. Through the Apple Upgrade Program, we return the device, reset the loan with Citizens Bank, and buy the latest model. For the past few years, I've gone with the Pro Max and my wife with the base model. The improvements each year are incremental but appreciable.
But despite all the buzz about the iPhone 16’s new features and Apple Intelligence integration, a few concerns dampen my enthusiasm for the upgrade.
What Apple isn't saying about Apple Intelligence
Apple Intelligence represents a major leap in on-device AI capabilities, bringing advanced machine learning and natural language processing directly to your phone. However, this technology is still in its early stages. On-device LLMs and generative AI are essentially at the alpha or beta stage, and there is a lot of uncertainty about how well they will perform on current Apple mobile hardware.
David sees the integration of AI into iOS 18 as a big step forward. But let's face it: AI features on these devices are still in their early stages and may not deliver the seamless experience Apple users expect. When Apple Intelligence is released to the public in fall 2024, it will still be a beta, not a finished product.
It's important to note that Apple Intelligence isn't just another boilerplate iOS or macOS feature upgrade: The device loads a scaled-down version of Apple's Foundation Models, its homegrown large language models (LLMs), which are several gigabytes in size and have roughly 3 billion parameters (compare that to the hundreds of billions of parameters used in models like GPT-3.5 and GPT-4, or the larger models Apple runs in its own data centers for Apple Intelligence's Private Cloud Compute feature).
And while Apple Intelligence is expected to bring some Siri improvements in 2024, most of the major Siri updates aren't expected until 2025.
Apple has yet to share details with developers about how this will work on iOS, iPadOS, and macOS, but the model will have to be at least partially loaded into memory, and current estimates are that it could take up between 750MB and 2GB of RAM at runtime, depending on how well Apple's memory compression tech works and other factors.
That is a significant amount of memory to allocate to core OS functions that are not always in use, so some of it will have to be dynamically loaded and evicted as needed, adding new system constraints for applications and potentially putting additional strain on the CPU.
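For a rough sense of where those runtime estimates come from, here is a back-of-envelope sketch in Swift. The bit-widths are assumptions chosen purely for illustration; Apple has not published the quantization scheme or exact memory footprint of its on-device Foundation Models.

```swift
import Foundation

// Back-of-envelope weight-memory estimate for a ~3-billion-parameter on-device model.
// The bit-widths below are hypothetical; Apple hasn't disclosed its quantization details.
let parameters = 3_000_000_000.0

for bitsPerWeight in [2.0, 3.5, 4.0, 8.0] {
    let bytes = parameters * bitsPerWeight / 8.0      // total bytes needed for the weights
    let gibibytes = bytes / 1_073_741_824.0           // convert bytes to GiB
    print(String(format: "%.1f bits/weight -> %.2f GiB of weights", bitsPerWeight, gibibytes))
}
```

At roughly 2 to 4 bits per weight, the weights alone land between about 0.7GB and 1.4GB, which lines up with the 750MB-to-2GB runtime estimates once you account for activations, caches, and other overhead.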
Current iPhone hardware is not enough
Earlier this month, I explained that older and current-generation iOS devices simply don't have enough power to handle generative AI tasks on-device. The base iPhone 15, with only 6GB of RAM, may struggle to keep up with the demands of Apple Intelligence as it evolves and becomes more integrated with iOS, core Apple apps, and developer apps. Older iPhones have 6GB of RAM or less.
The iPhone 15 Pro, with 8GB of RAM, may be better suited for these tasks; leaving aside Macs and recent iPad Pros, it's the only device developers can use to test Apple Intelligence before the iPhone 16 presumably ships in October. Even so, many end users may experience suboptimal performance on an 8GB device after Apple Intelligence is fully implemented.
Plus: The best phones you can buy: tested by experts
Early adopters may find the AI features more useful for developers than for everyday users, as the system will likely require tweaks and updates to reach its full potential. And as with owners of base iPhone 15 models and older iPhones, who won't get the feature at all when they upgrade to iOS 18, we expect Apple Intelligence to be something end users can easily turn off to conserve memory for apps.
The upcoming iPhone 16 may have more advanced hardware, but it could still struggle with the new AI features because its design cycle didn't fully account for them. It may take one or two more product cycles for the hardware to fully support the AI features rolling out in iOS 18 and beyond. As a result, users may experience suboptimal performance and a less-than-seamless experience.
Why Apple Intelligence isn't a reason to buy the iPhone 16
For these reasons, I view the iPhone 16 (and possibly the iPhone 17 as well) as a transitional product in Apple’s in-device AI efforts.
In addition to other silicon optimizations, future iPhones will likely need more RAM to fully support these AI features, which could lead to higher prices. If the base iPhone 16 requires 8GB of RAM to run Apple Intelligence effectively, that could push the starting price to $899 or more. Pro models could require 12GB or 16GB of RAM, which would raise prices further. This would also mean the Pro models would get the new A18 chip, while the base iPhone 16 would likely stay on the current A17, although Apple could produce an "A17X" paired with 10GB of RAM to give the phone more memory headroom.
Plus: All the iPhone models that support Apple's upcoming AI features (for now)
In addition to memory issues, AI processing requires significant amounts of power and additional computing resources. Without significant advances in battery and power-management technology, users may see increased battery drain, shorter battery life, and performance issues, and may have to charge their phones more frequently. The additional processing power required to run an LLM on-device can also tax the CPU and cause the device to overheat, affecting overall performance and reliability.
How will Apple Intelligence evolve?
Apple’s AI capabilities are expected to improve significantly over the next few years. By 2025, we may see more advanced and reliable integration of Apple Intelligence into mobile devices, Macs, and even products like the Apple Watch, HomePod, Apple TV, and consumer versions of the Vision headset.
Just as Apple handles Apple Intelligence's more advanced LLM processing through Private Cloud Compute, running secure, Darwin-based servers in its own data centers, it will likely lean on cloud resources, both its own growing data center capacity and partnerships with companies like OpenAI and Google, to extend Apple Intelligence to these less powerful devices.
Plus: To save the Vision Pro, Apple needs to do three things
Alternatively, we could consider a distributed or “mesh” AI processing system, where idle devices in a home or enterprise can assist less powerful devices with LLM queries.
Apple has laid the groundwork by including Apple Intelligence and on-device LLMs in macOS 15 Sequoia, iOS 18, and iPadOS 18. Subsequent changes to iCloud, iOS, iPadOS, and macOS could let devices advertise their generative AI capabilities and idle processing states, allowing them to act as proxies for each other's Apple Intelligence requests.
Enterprises could also employ mobile device management solutions to facilitate on-device LLM access for business Macs, and could use iPhones and Macs as proxies for Apple Watch and HomePod requests for mobile users. And more powerful Apple TVs with more on-board memory and processing power could act as the Apple Intelligence “hub” for all Apple devices in the home.
Imagine your iPhone tapping into the unused processing power of your Mac or iPad with on-device LLM to tackle complex AI tasks, increasing the accessibility of AI capabilities across Apple’s product line.
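To make that idea concrete, here is a minimal, purely hypothetical sketch in Swift of how such mesh routing might work: a constrained device looks for an idle peer with memory headroom and falls back to answering locally otherwise. None of these types or APIs exist in Apple's SDKs; they are illustrative stand-ins for the concept.

```swift
import Foundation

// Hypothetical sketch of "mesh" Apple Intelligence routing.
// These protocols and types are illustrative only; Apple offers no such API today.
protocol IntelligenceNode {
    var isIdle: Bool { get }
    var availableMemoryGB: Double { get }
    func respond(to prompt: String) async -> String
}

struct MeshRouter {
    let peers: [any IntelligenceNode]

    // Prefer an idle peer with memory headroom; otherwise answer on the local device.
    func route(_ prompt: String, localNode: any IntelligenceNode) async -> String {
        if let helper = peers.first(where: { $0.isIdle && $0.availableMemoryGB >= 8 }) {
            return await helper.respond(to: prompt)
        }
        return await localNode.respond(to: prompt)
    }
}
```

In practice, the hard parts would be device discovery, security, and local-network latency, none of which this sketch addresses.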
Important points to consider before upgrading to iPhone 16
Apple’s AI features are essentially in beta: Apple Intelligence is still in its early stages and may not deliver the seamless experience Apple users expect. The potential of Apple Intelligence will be realized in future iterations with more mature hardware and software optimizations.
Hardware limitations: The iPhone 16 may struggle to meet Apple Intelligence demands due to a design cycle that didn’t initially consider these features. The iPhone 16 is a transitional product, and it may take another product cycle or two before the hardware is fully compatible with the new AI features.
Battery and performance concerns: AI processing is power intensive, which can increase battery drain and cause performance issues.
Broader improvements: Weigh the gains in camera quality, display, and overall performance alongside the AI features.
But I’m still optimistic
Despite the Apple Intelligence hype, there are plenty of other reasons to consider upgrading to the iPhone 16. The improvements in camera quality, display, and overall performance are still noteworthy. The iPhone 16 is likely to feature better sensors, enhanced computational photography, and better video capabilities. The display brightness, color accuracy, and refresh rate will also be improved, making it a better device for media viewing and gaming.
Plus: 3 Apple products you shouldn’t buy this month (including this iPad)
However, if you're considering buying the iPhone 16 solely for its AI capabilities, you might want to temper your expectations: The AI features are still evolving and are unlikely to live up to the expectations set during the WWDC 2024 keynote.