Hamburgers, Thumb Zones and Dark Modes: A Retrospective of Mobile Interface Design from 2008-2021
When designing apps, companies must focus on more than just quality and functionality ― they need to deliver a human experience through a digital interface.
The Evolution of Mobile Experience Design
Ultimately, adapting to evolving user needs and pain points is key to human-centered design. So is embracing new trends and finding ways to bring people together. As Douglas Rushkoff puts it:
“Being human is a team sport and anything that connects us to one another is good and pro-human. Anything that isolates us from each other is anti-human.”
As we look back at the last ten-plus years of mobile design for Apple ― in which the Human Interface Guidelines have compelled designers to make their apps learnable, intuitive, and consistent ― we see that many of the most successful applications have delivered the most human experiences.
Opening the floodgates of innovation
With the introduction of third-party apps came the introduction of new third-party designs and the creation of never-before-seen features. Pull-to-refresh functionality, for example, first debuted in Tweetie ― an iOS application for Twitter ― in 2008. The feature eliminated the need for a refresh button, opening up more valuable real estate for navigation and user actions.
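At its core, pull-to-refresh is a simple threshold decision: how far has the user dragged past the top of the list, and has the drag passed the point where releasing should trigger a reload? A minimal, framework-free sketch of that logic (names like `PullState` are illustrative, not Tweetie's actual code):

```swift
// Sketch of pull-to-refresh state logic, independent of any UI framework.
enum PullState {
    case idle          // list is at rest
    case pulling       // user is dragging down, below the trigger threshold
    case triggered     // drag passed the threshold; releasing will refresh
}

// Maps the current downward drag distance (in points) to a pull state.
// A threshold around 60-80 points approximates the feel of early iOS apps.
func pullState(dragDistance: Double, threshold: Double = 70) -> PullState {
    if dragDistance <= 0 { return .idle }
    return dragDistance < threshold ? .pulling : .triggered
}
```

In a real app this state would drive the spinner animation and the "release to refresh" hint; Apple later formalized the pattern in UIKit as UIRefreshControl.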
Plenty of developers borrowed from past iterations of product designs, too: Facebook is credited with reviving the hamburger menu icon ― first designed for the Xerox Star, one of the first commercial graphical user interfaces ― with its 2009 app and mobile web experience. (Hamburger menus and other forms of drawer navigation are loved by some designers and reviled by others, since they require users to first open a new menu before they can reach their objective.)
Still, the early years of the App Store saw lots of innovation. WhatsApp launched in 2009, enabling international one-to-one chat (and later, free voice messaging, photo messaging, and calls). The app was designed to show statuses next to the individual names of users, helping humanize the interface.
Angry Birds also launched in 2009, and quickly gained mass appeal thanks to its humorous approach and bite-sized gaming experiences that users could pick up and play any time. Angry Birds used spring-loaded animations ― animations that move toward a target point, overshoot slightly, and bounce back ― to create the experience of “slingshotting” birds at pigs and other targets.
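That overshoot-and-settle feel is the signature of an underdamped spring. A toy integrator makes the behavior concrete ― this is a generic physics sketch under assumed constants, not Angry Birds' actual code:

```swift
// A damped spring: stiffness pulls the value toward its target, damping
// bleeds off velocity. With damping below the critical value, the motion
// overshoots the target and bounces back before settling.
struct Spring {
    var position: Double
    var velocity: Double = 0
    let target: Double
    let stiffness: Double   // pull toward the target
    let damping: Double     // resistance that kills the oscillation

    // Advance the simulation by one time step (semi-implicit Euler).
    mutating func step(dt: Double) {
        let force = stiffness * (target - position) - damping * velocity
        velocity += force * dt
        position += velocity * dt
    }
}

var spring = Spring(position: 0, target: 100, stiffness: 120, damping: 10)
var peak = 0.0
for _ in 0..<600 {          // simulate ~6 seconds at 100 steps/second
    spring.step(dt: 0.01)
    peak = max(peak, spring.position)
}
// The position overshoots 100 on the way in, then settles near it.
```

The same math underlies the spring animations iOS later exposed directly to developers (for example UIKit's spring-based animation APIs).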
Sight, voice & swiping
In 2010, Apple debuted the iPhone 4 with its so-called “Retina Display” ― the first Apple display with a pixel density so high that the human eye is (supposedly) unable to discern individual pixels at a normal viewing distance. While many considered the reference to human biology to be false marketing, the “retina” claims hammered home Apple’s emphasis on making the iOS experience as human-compatible as possible.
With Retina Display also came a camera on the iPhone 4 that was finally good enough to get people to leave their digital cameras at home. Instagram, which debuted in late 2010, capitalized on the shift: its UX reduced the number of actions needed to post a photo compared to other apps ― making it simpler to share photos instantly ― and quickly earned what its founder called “Facebook-level engagement.” (Facebook would go on to buy Instagram in 2012.)
Many successful apps to follow focused their UI features on facilitating human-to-human connections:
(2011) Apple introduced voice recognition with Siri in the iPhone 4S
(2011) Snapchat pioneered the sharing of “disappearing” messages that are only available for a short time before they become inaccessible to their recipients
(2011) Uber (formerly UberCab) became the first ride-sharing service connecting users to drivers on-demand via smartphone
(2012) Tinder was among the first apps to use on-screen “swiping” gestures to control what content the user sees when browsing
Skeuomorphic to flat
Skeuomorphic designs mimic their physical counterparts ― like the Notes app in iOS 6. In 2013, Apple released iOS 7, which marked a major shift away from skeuomorphic design, as well as from the gradients and textures that had characterized Apple’s smartphone experiences to date. The shift to more minimal, “flat” designs freed developers from relying on old, physical imagery to create new, digital experiences.
A prime example of flat design was Vine, a short-form video sharing app founded in 2012 and purchased by Twitter late that year, shortly before its public launch. Vine deployed an innovative press-and-hold-to-record function that went on to be replicated by Snapchat and many other apps. As Vine’s co-founder and creative director Rus Yusupov explained on the company’s blog in 2013:
"Old things are beautiful, but new things should look, well... new. That's why Vine doesn't have a play button. It also doesn't have a pause button, a timeline scrubber, a blinking red light, or dials and a brushed-metal finish to give you the impression that you're using a dusty video camera."
The shift to flat design took place partially because people had become familiar enough with smartphone devices that UI design no longer needed to reference physical objects so literally. Over the years, people had become comfortable touching glass and manipulating their devices that way. This trend continued as Apple introduced TouchID in late 2013 with the iPhone 5S, enabling users to unlock their phones with their fingerprints.
Designing flat apps also got easier for developers in 2014. That year, Brandon Walkin, a product designer at Facebook, created Origami, a free prototyping tool for designing interactive interfaces.
Cross-device continuity, wearables, messaging & AR
Flat designs were also easier to standardize across devices, paving the way for the Apple Watch and WatchKit in 2015.
The first Apple Watch apps functioned mostly like their iOS counterparts. A wave of fitness apps leveraging the Apple Watch’s capabilities emerged around 2015, including Strava ― which showed runners and bikers real-time stats for elevation gain, average speed, distance, and heart rate.
Messaging apps also proliferated for iOS and the Apple Watch. WeChat ― which would eventually grow into one of the most-used apps in the world ― built its 2015 Apple Watch app to allow users to message contacts, accept friend requests, view nearby users, and share to its “Moments” feature. Slack limited its Apple Watch app to direct messages and mentions, with options to respond using emojis, canned messages, or voice replies. (Slack discontinued its Apple Watch app in 2018, as did Twitter and others, once users could get iOS notifications via Apple Watch.)
In 2016, Apple teased its ambitions toward augmented reality with the dual-camera technology of the iPhone 7 Plus. The company wouldn’t release its ARKit ― a set of APIs for augmented reality (AR) apps ― until 2017, but by then there had already been a breakout AR experience on iOS in 2016: Pokémon Go. In the app-based game, Pokémon characters are superimposed on maps of real-world spaces, in specific locations; to “catch” them, users “toss” a ball via the touchscreen. Integrating one’s digital and physical environments became irresistibly entertaining.
Gestures & next-gen storytelling
2017 and 2018 saw a raft of Apple changes that affected UI design approaches. As home buttons disappeared and users had to familiarize themselves with new ways of operating their screens, designers focused more on gesture-driven UI.
The late 2017 release of the iPhone X marked the dawn of facial recognition with FaceID, as TouchID and the home button went away.
Haptic Touch ― which provides tactile feedback when long-pressing the screen ― was introduced with the iPhone XR in 2018.
Without a home button to view, open, and close recently used apps, iPhone X users switched to swipe gestures ― including swipe-to-close, which originated in the Facebook app in 2012 ― to toggle through and close them.
Using the bottom of the screen for app-closing functionality is a prime example of designing for the thumb zone: keeping important actions within the natural arc a user’s thumb covers when operating the device.
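One simple way to reason about the thumb zone is as a reach radius around a pivot near the bottom corner of the screen. The sketch below hit-tests a point against that arc for a right-handed grip ― the pivot placement and reach radius are illustrative guesses, not published ergonomic metrics:

```swift
// A rough thumb-zone check: is a touch target within comfortable reach
// of a right-handed thumb pivoting near the bottom-right of the screen?
struct Point { var x: Double; var y: Double }

func isInThumbZone(point: Point,
                   screenWidth: Double,
                   screenHeight: Double,
                   reach: Double = 400) -> Bool {
    // Approximate the thumb's pivot at the bottom-right corner.
    let pivot = Point(x: screenWidth, y: screenHeight)
    let dx = point.x - pivot.x
    let dy = point.y - pivot.y
    return (dx * dx + dy * dy).squareRoot() <= reach
}
```

On a 390×844-point screen, a tab bar item near the bottom falls inside this zone while a top-left back button does not ― which is why primary actions migrated downward as screens grew taller.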
Contextual swiping functionality, offering swipe-to-delete or swipe-to-reveal, proliferated in 2017. Many apps ― such as Spotify and Apple’s Mail app ― started using swiping as a way to remove an item from a list or to expose an entire set of contextual actions.
The rise of live social video
“Human design" is about bringing people together, too.
First released in 2017, HQ Trivia is a live trivia game streamed at 9PM daily, with an additional 3PM stream on weekdays. HQ Trivia combined a social media experience with a live video host ― and a shot at real prize money ― to achieve huge popularity as interactive mobile “appointment viewing.”
Mobile-social-video app TikTok (developed by ByteDance) could be called “algorithm viewing” for the famously powerful system behind its “For You” feed ― which delivers highly relevant content to each user. The app, which first debuted in China in 2016 but took roughly two years to enter the U.S. market, has become wildly popular not only for its recommendations, but for a UI designed to attract new users and keep existing users hooked with no interruption between 15-second videos.
Greater comfort & virtual experiences
In 2019, with iOS 13, Apple introduced Dark Mode ― which reduces battery consumption, light emitted by screens, and eye strain ― as a Display option on its devices. Today, Dark Mode works systemwide and with most third-party apps. Apps like Slack automatically operate in Dark Mode once a user has changed their iOS Display setting from Light to Dark.
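For an app to "automatically operate in Dark Mode," its colors have to be defined per appearance rather than as fixed values, so the system can resolve them when the setting changes. A minimal sketch of that idea ― `AppAppearance` and `DynamicColor` are illustrative stand-ins for the trait-collection machinery iOS actually provides:

```swift
// A color that resolves differently depending on the active appearance.
enum AppAppearance { case light, dark }

struct DynamicColor {
    let light: String   // hex value used in Light Mode
    let dark: String    // hex value used in Dark Mode

    func resolved(for appearance: AppAppearance) -> String {
        appearance == .dark ? dark : light
    }
}

// Define the pair once; the system picks the right variant at draw time.
let background = DynamicColor(light: "#FFFFFF", dark: "#1C1C1E")
```

In real iOS apps the same pattern is expressed through asset-catalog colors with Any/Dark variants, or UIColor initializers that resolve against the current trait collection.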
And while AR has been around on iPhones since 2017, and big in gaming ever since, applications of AR have evolved into more “human” day-to-day use cases. Tools like the Home Depot app allow users to see what products would look like in their spaces, for example.
AR and virtual reality are expected to be huge priorities for Apple in 2021 and beyond, leading to more complex interactions and a need for “beyond the screen” design thinking. Apps like YouTube, which features a Virtual Reality channel in its iPhone app, already allow users to watch virtual reality content via smartphone and manipulate the image they see by tapping, dragging, or simply moving their phone screens.
In April 2021, Apple released iOS 14.5, which lets users wearing masks unlock their iPhones with a paired Apple Watch: when Face ID detects a mask, an unlocked Watch on the user’s wrist authenticates the phone instead. This is a great example of Apple using its ecosystem to accommodate a newly evolved user need and audience pain point.
Embracing what’s human
As the Human Interface Guidelines point out, designing for humans means making apps learnable, intuitive, and consistent. As our needs and pain points continue to evolve, it’s up to designers and developers to build “pro-human” Apple app experiences that bring us together ― via smartphone or in virtual reality ― to enjoy high-quality experiences.
Big Human is a startup studio based in New York focused on digital product design for the future.