With a crisper screen that takes up nearly the entire front, Apple has tested the complete removal of the home button—even a digital one—in favor of new gesture controls for tasks like going to the main app grid and opening multitasking, according to the people and the images. Across the bottom of the screen there’s a thin software bar in lieu of the home button. A user can drag it up to the middle of the screen to open the phone. When inside an app, a similar gesture starts multitasking. From there, users can continue to flick upwards to close the app and return to the home screen. An animation in testing sucks the app back into its icon. The multitasking interface has been redesigned to appear as a series of standalone cards that can be swiped through, versus the stack of cards on current iPhones.

Curious what it all looks like? Prominent iOS 11 leakers Guilherme Rambo and Steve Troughton-Smith give us a hint:
This is what the floating dock looks like on an iPhone pic.twitter.com/BbKVIL7yO8— Guilherme Rambo (@_inside) August 30, 2017
In one fell swoop, Apple is ushering us into an era where the only hardware buttons are volume and power. If Bloomberg’s reporting is accurate (and there’s no reason to doubt it), the mobile paradigm is shifting: instead of mashing buttons, we will be swiping whenever we're not using voice control. Fortunately, this possible future requires little to no work from third-party developers: gestures and full-screen app support are standard features of Apple's frameworks, as is Siri. Currently, SiriKit is only available to a select few types of apps. VoIP, ride-sharing, payments, messaging, and a handful of other categories are ‘white-listed,’ though users can launch any app via Siri.

Users currently have to unlock their devices using Touch ID or a passcode, but the next iPhone is set to change that: rumors point to facial recognition rather than Touch ID, potentially enabling a truly hands-free experience. All this activity also dovetails with Handoff, Apple’s technology that predicts which device you’re attempting to use. Open a webpage on your iPhone, and an iPad or Mac will offer to let you pick up where you left off. AirPods behave similarly; they can switch between an Apple Watch and an iPhone depending on which device you’re using at the time.

With HomePod launching this fall/winter, voice becomes a much larger player for users. While Apple is focusing on the device's music and connected-home features, HomePod is another conduit for voice interaction: it has a touch surface, but no buttons. In the future, you could tell a HomePod to open an app on an iPhone, or ask it to send that picture your aunt just sent you to your sibling – all without ever touching your phone.

Google and Microsoft are also in play when it comes to voice, but both have their own shortcomings.
Google's "Okay, Google" command does much of what Siri can, but Android's software buttons don't push us into a voice-and-gesture future, and there's no guarantee that Google's manufacturing partners would go along with that particular vision. Cortana is limited to Microsoft's ecosystem, which no longer has a mobile presence thanks to the demise of Windows Phone. Amazon's Alexa service is weaving its way into a variety of hardware and apps, but it is often hard to interact with outside of Amazon's Echo speaker lineup; it remains to be seen whether the announced partnership with Microsoft's Cortana will bear fruit.

The small step of removing the home button from the next iPhone coincides with the tech industry's increased reliance on voice activation as a whole. As a result, how developers and companies build apps and software, especially in the mobile arena, may change radically over the next several years.
And this is what the floating dock looks like on iPhone when invoked in apps pic.twitter.com/542r1SWMAr— Guilherme Rambo (@_inside) August 30, 2017