Tucked into Apple’s announcements at this year’s WWDC conference — especially iOS 11 — are juicy clues about what the software update for your iPhone and iPad might also bring to the next major iPhone.
Apple’s iPhone is due for a major overhaul. Last year’s iPhone 7 and 7 Plus drew fire for having a design and features that were too similar to previous models. The Siri voice assistant didn’t go far enough, and Apple failed to match competitors on features like wireless charging and virtual reality. (There was, however, “waterproofing.”)
In September, we expect Apple to take the wraps off of a 10th anniversary iPhone stuffed with cutting-edge hardware and a fresh new design. We’re calling it the iPhone 8 for now, but rumors also point to iPhone X or iPhone Edition.
Here are the possible iPhone 8 features that jump out at me. Remember, this is pure speculation, or at best somewhat educated guesses, that may never come to be.
Almost definite: iPhone 8 will have AR
Apple spent a damn long time pumping up AR in iOS 11 (it’s called ARKit), and the company is making some heady claims that it’ll have the largest AR platform in the world overnight, based on the iPhone’s rampant global success.
It’ll also be backward-compatible with the iPhone 6S and 7 models. So that pretty much guarantees AR for the iPhone 8. By September, Apple could raise the bar and unveil something wild like AR overlays on Apple Maps (skip to the end for more on that).
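For the curious, the ARKit flow Apple demoed boils down to a few calls. Here’s a minimal Swift sketch of placing a virtual object on a detected surface; the lamp geometry and class names other than the ARKit/SceneKit APIs are hypothetical, and this is an illustration, not Apple’s demo code:

```swift
import ARKit
import SceneKit

// Sketch of the basic ARKit flow: run world tracking with plane
// detection, then anchor a virtual object (a stand-in "lamp")
// on a horizontal surface ARKit finds.
class LampViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Plane detection is what lets ARKit map the tabletop.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        sceneView.session.run(config)
    }

    // Called when ARKit detects a surface; attach our object to it.
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let lamp = SCNNode(geometry: SCNCylinder(radius: 0.05, height: 0.3))
        node.addChildNode(lamp)
    }
}
```

Notably, none of this requires extra hardware, which is why it can reach back to the 6S.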
Likely: Triple cameras, and more camera modes
It’s nearly a given that all iPhone 8 models will have dual rear cameras, and we know they’ll be AR-capable with ARKit. Apple could easily go a step further to add a third camera and make its AR on par with Google’s.
AR, which allows you to see virtual objects interact with the real world through your phone screen, can rely on software alone. We saw that with Pokemon Go, last summer’s hit mobile game that could plop characters in the world in front of you when you toggled it on.
Software-only AR is pretty crude. Hardware-based AR — which uses an infrared camera, a wide-angle camera, and an RGB camera — is much more precise. You need those cameras in order to map the dimensions of a room so you can place furniture in it.
That’s what Google’s using with Project Tango, and that’s the type of use case that Apple demoed at the event when it showed how you could place a virtual lamp on the real-life wooden table in front of you.
In addition to AR, we could also see some new camera modes. Apple showed us a new long exposure mode, which can capture some cool scenes of moving water, streaks of light and star trails in the night sky — that could be just the beginning. Who knows what Apple’s holding back. If you remember last year, Portrait Mode was a surprise on the iPhone 7 Plus.
Apple also talked about some behind-the-scenes work that will help photos and videos take up less space. That means more free storage space for you. But I wonder if that move could also free up space for new complex photo or video options that create larger files, like 360-degree images and video, which eat up a lot of space.
Likely: Siri could play a bigger hands-free role
Apple sped through its Siri announcements at WWDC and didn’t get too deep into new capabilities, other than new voice tones, language translation and support for more third-party apps. (“Hey Siri, call me a ride.”)
That makes me suspicious, and I have a feeling you’ll be able to do more with Siri on the iPhone 8 than Apple showed off in iOS 11. I’m especially thinking of the rumor that the iPhone 8 could ditch the home button and move the fingerprint sensor either under the display (you’d touch the screen, not a button) or to the back.
You can already go hands-free with Siri (if you turn it on in Settings). Without a physical home button to double-click for recent apps, I’m guessing that Apple could introduce some new Siri commands to control more aspects of the phone. Like, “Hey Siri, show me my recent apps.”
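The third-party hooks Apple did show run through SiriKit, which has offered domains like ride booking since iOS 10. As a rough sketch of what backs the “call me a ride” example, here’s a minimal Swift intent handler; the intent types are real SiriKit API, but the class name and trivial success response are placeholders for what an actual app would do:

```swift
import Intents

// Hypothetical handler for the SiriKit ride-booking domain.
// A real ride-hailing app would contact its backend here; this
// sketch just reports success so Siri can confirm the request.
class RideRequestHandler: NSObject, INRequestRideIntentHandling {
    func handle(intent: INRequestRideIntent,
                completion: @escaping (INRequestRideIntentResponse) -> Void) {
        let response = INRequestRideIntentResponse(code: .success,
                                                  userActivity: nil)
        completion(response)
    }
}
```

If Apple does add system-level commands like “show me my recent apps,” those would be built in, not something developers wire up this way.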
Possible: NFC could go beyond Apple Pay
Apple is expanding what the iPhone can do with NFC, which it uses right now for Apple Pay (and just Apple Pay, really).
Apparently you’ll be able to pair the iPhone to the Apple Watch with a tap, and there’s also support in iOS 11 for making NFC work with other things, not just Apple Pay. Called Core NFC, it lets apps spit out more information about a thing or place. Apple’s example: learning more about a product in a store or a museum exhibit. So if you’re at the Smithsonian and tap your phone on a placard, the screen could fill with details about what you’re looking at.
That’s not a very exciting example. What if Apple went as far as Android does with NFC, and announced that the iPhone 8 could be used to pair Bluetooth speakers, launch a phone call or become your bus or train pass? It’s far from definite, but it could bring the iPhone one step closer to being a true replacement for your wallet.
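Under the hood, Core NFC exposes tag reading through a short session-and-delegate dance. Here’s a minimal Swift sketch of the museum-placard scenario; the session and delegate APIs are the real iOS 11 framework, while the class name and payload handling are hypothetical:

```swift
import CoreNFC

// Illustrative Core NFC reading flow: start a session, get a
// callback when a tag is detected, read its NDEF records.
class PlacardReader: NSObject, NFCNDEFReaderSessionDelegate {
    var session: NFCNDEFReaderSession?

    func beginScanning() {
        // invalidateAfterFirstRead ends the session after one tag.
        session = NFCNDEFReaderSession(delegate: self,
                                       queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                // Each record carries a payload, e.g. text
                // describing the exhibit you tapped.
                print(String(data: record.payload, encoding: .utf8) ?? "")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        // Called when the session ends or the user cancels.
    }
}
```

Worth noting: Core NFC as announced is read-only, which is part of why the Android-style tricks above remain a “possible,” not a “likely.”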
Possible: The iPhone could get its first optimized VR headset
You can already buy cheap VR headsets for your iPhone, but much like Google Cardboard, they’re not that great, especially compared to, say, Samsung Gear VR. So I say it’s only a matter of time before Apple unveils a better VR option for the iPhone.
Apple’s new software for the Mac, called High Sierra, can create VR experiences. To wit, we got a very convincing Star Wars demo at WWDC that used the HTC Vive headset. Apple’s new iMacs will also have the horsepower to run these VR environments, and Apple is working with partners, like Valve, to make VR on Macs a thing.
The next logical step: making the iPhone more VR-friendly, too.
An OLED display on the new iPhone could help, and improved cameras could give it depth-sensing and tracking, as I mentioned above for AR. OLED screens are known for displaying true blacks and eliminating a backlighting effect, which are two things you want when the screen is pressed so close to your face.
Whether that’s an Apple-made VR headset or tools to help hardware partners build one for the iPhone, Apple is clearly interested in exploring virtual realities.
Very likely: HomePod integration
HomePod is Apple’s answer to the Amazon Echo and Google Home. And it’s just begging for meaningful integration with the iPhone and iPad. Apple didn’t go over those plans in detail, but it’s likely to revisit HomePod tricks alongside a hot new iPhone.
Here’s one I think we’re likely to see: Let’s say you’ve set your phone down somewhere in the house and can’t find it. Instead of firing up iCloud to use Find My iPhone, what if you could just ask the HomePod: “Hey Siri, find my iPhone.”
Or maybe you’ll be able to cast the songs and audiobooks you have queued up on the phone to the HomePod, so you can seamlessly move from one device to the other without starting over with a new request.
And if the HomePod doesn’t have its own built-in calling feature, perhaps it will act as a speaker for your at-home conference calls or family chats with the grandparents.
Long shot: AR overlay on Apple Maps?
iOS 11 is giving Apple Maps guidance that tells you which lane to move into before a turn. That got me thinking of another possibility. It’s a long shot, I know, but wouldn’t it be interesting to see that kind of guidance merge with AR?
Let’s say you’re walking down the street following directions. Imagine being able to open Apple Maps and see turn-by-turn directions superimposed on the real-world landmarks in front of you.
And if the phone is mounted on the windshield and the camera has a view of the road, you could see lane guidance overlaid on the street you’re on, quick enough to take in at a glance. That could completely eliminate some of the confusion (“Wait, is it this little street right here, or that one?”) when trying to find an address.
We’ll know more in a few months
Some other hints may bubble up from the developer preview and public beta. And of course come September, when we think Apple will hold its iPhone event, we’ll see how spot on or completely off-base my predictions are.
Until then, there are always the rumors.
Apple declined to comment on this story.