Improving the notification and control centers on modern phones
- Milen Yanachkov
- Dec 17, 2023
- 14 min read
Updated: Aug 11

FOREWORD
The research methods, findings, and general approach to solving the problem outlined in this case study are – for the most part – applicable to both iOS and Android, due to the very similar usability patterns for Notifications and Controls. That said, there are many Android skins out there, and some of them have their own intricacies, so this case study is focused primarily on iOS, due to its more controlled and predictable nature.
TOOLS
Figma | Survicate
MY ROLE
Product Design | UX & UI | Research
This has been a long time coming
I’ve been thinking about this problem for a number of years, ever since edge-to-edge displays and gesture navigation became the norm on smartphones. I don’t know why I waited this long to actually explore it. Guess I expected either Google or Apple to address it at some point so I just kept it tucked away at the back of my mind. But, alas, it remains virtually unchanged.
I’m talking about a problem with modern smartphone usability and ergonomics. A problem that resides at the intersection of smartphone hardware, interfaces, and the limitations of human hands.
Why has it remained unsolved for such a long time? Is it THAT difficult to solve? Or is it one of those things where you don’t know how much better it could be until the better solution arrives?
Let’s find out.
The problem (in short)
Smartphone screens have grown dramatically over the past decade; human hands, however, haven’t.
Amazing revelation, I know, but let me explain.
Interfaces have responded to this change by clustering many of the most frequently used functions, like various controls and navigational elements, towards the bottom of the screen, where they are easily accessible with one hand. Yet two of those functions — two of the most used ones on any smartphone on a daily basis — have remained firmly out of reach ever since phone displays went edge-to-edge and gesture-based navigation became the primary way of getting around a phone.
I’m talking about the Control Center and Notifications on iOS, or their counterparts on Android.
Evaluating the current one-hand usability of the iOS Control & Notification Centers
If we take a look at a one-handed reachability heatmap of a modern iPhone, we will immediately notice that both the Control and Notification Center are deep in the red, unreachable zones of the heatmap.
Furthermore, they both have separate entry points on opposite sides of the screen, which requires even more finger movement. To top things off, once you open either the Control Center or the Notification Center, the information inside is either at the bottom (Notification Center) or you still have to move your finger to the bottom of the screen to close it (Control Center). So you are constantly going up and down in a triangular pattern as pictured below:

Even leaving aside one-handed usability, the triangular navigation pattern demonstrated in the video above calls for unneeded motion and reaching to different parts of the screen, be it with one or two hands. Perhaps ironically, opening either the Notification or Control Center has to happen from the least reachable area of the screen, while closing them happens from the easiest-to-reach area.
Here is a video of the current state of one-hand use of the Notification and Control Center on iOS:
What is Apple’s current solution?
Apple is well aware of the less-than-optimal one-hand usability of the Notification and Control Center and other iOS features. That’s why iOS has the “Reachability” mode, which is triggered by swiping down on the navigation pill. This scrunches the whole screen into the bottom half of the display, effectively improving the reachability of the Notification and Control Center, but at the expense of… pretty much everything else on the screen. You then have to exit Reachability to continue using your phone as normal, which is rather cumbersome.
To add to this, I also found the downward motion that activates Reachability rather awkward, as it needs to be performed from the very bottom of the screen, where there isn’t enough room to reliably swipe down.
A brief history lesson
To be fair, iOS used to have a bottom-based Control Center, accessible with a swipe up from the bottom of the screen. In fact, Apple under Steve Jobs always viewed the one-hand usability of iPhones as something very important, as depicted in this interesting iPhone 5 ad from 2012.
This is not to say that Jobs’ infamous quote from back then, that “no one wants a big phone”, holds true.
On the contrary, Jobs was very much wrong. People want big phones with big screens. That said, the core principle of ensuring a good one-handed usability is still very much relevant.
Even after Jobs’ passing, and after iPhones started following the trend and becoming bigger and bigger, iOS still had a bottom-based control center (granted, notifications were at the top, but still). This all changed with the release of the iPhone X with its new gesture navigation.
Gesture navigation seems to be one of the reasons iOS switched to a top-based Control and Notification Center. Can the two truly not coexist? Or has the problem simply not been explored in full depth before? Let’s find out.
How people hold their phones
The largest commercially available study on how people hold and interact with their phones reveals three predominant ways people hold their phones:
One hand portrait (either left or right)
Cradling portrait (either index finger or thumb leading)
Two-handed portrait or landscape
Of those, 49% usually hold the phone with one hand when interacting with it. That’s both the biggest group and the one that has the most trouble reaching the Control and Notification Center. This group is therefore our main target, but in the process, we will also improve things for the other holding styles.

Behind the data: reevaluation and new findings
The study cited above dates back to 2013. That’s at least the largest-scale commercially available study that I was able to find.
Now, it’s a decade old at this point, so it needs to be calibrated for modern use cases, but I believe the findings of the study are not to be discredited and are not outdated at their core. Why? Well, because smartphone displays may have grown bigger, but our hands remain exactly the same. In other words, while the overall screen area has expanded, the reachable area within it has remained the same. This allows us to reevaluate the heatmaps from 2013 for the current year. In doing so, we will also discover new things!
Reevaluation of the 2013 study
Obviously, back in 2013, phones were a lot smaller than they are now. Luckily for us, the study uses the Samsung Galaxy S III to illustrate its findings. So, knowing the dimensions of the Galaxy S III, we can scale it against a modern smartphone — like the iPhone 15 Pro Max — and see how the heatmap aligns with a modern display.
Remember, screen sizes may have changed a lot since 2013, but our hands haven’t. So, the reachable area remains the same, but it is now within the boundaries of a much bigger screen.

Knowing the dimensions of the Samsung Galaxy S III, we can easily bring the iPhone 15 Pro Max to the same scale and reliably rescale the heatmap along with it, seeing as the reachable area is the same — only the screen around it has grown.
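To make the rescaling concrete, here is a minimal sketch of the underlying arithmetic, using the published spec-sheet body heights of both devices (the ~70 mm green-zone boundary is an assumed figure, purely for illustration):

```swift
// Spec-sheet body dimensions in millimetres.
struct Device {
    let name: String
    let bodyHeightMM: Double
}

let galaxyS3 = Device(name: "Samsung Galaxy S III", bodyHeightMM: 136.6)
let iPhone15ProMax = Device(name: "iPhone 15 Pro Max", bodyHeightMM: 159.9)

// The grip anchors the reachable area to the bottom of the device, so a
// heatmap boundary is best expressed as a fixed physical distance (in mm)
// from the bottom edge — that distance carries over between devices 1:1.
func coverage(mmFromBottom: Double, on device: Device) -> Double {
    mmFromBottom / device.bodyHeightMM
}

// Take the top of the green zone at ~70 mm up the body (assumed figure):
let greenZoneTop = 70.0
print(coverage(mmFromBottom: greenZoneTop, on: galaxyS3))       // ≈ 0.51 — half the phone
print(coverage(mmFromBottom: greenZoneTop, on: iPhone15ProMax)) // ≈ 0.44 — noticeably less
```

The same physical reach that once covered about half of the Galaxy S III covers well under half of a modern flagship, which is exactly what the rescaled heatmap shows.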
Do we still hold our phones with one hand?
One of my suspicions was that, given the increase in screen size and some interface elements being way out of reach at the top, the percentage of people who hold their phone with one hand must have decreased on average.
In order to validate this, I launched a poll that gathered a total of 352 votes. The aim was not to explore all the intricacies of one- and two-handed styles (e.g. specific use cases like typing or watching a video), but rather to evaluate whether the one-handed style was still predominant during general phone use 10 years later.
Surprisingly, the results indicate that it still is.

The poll could be followed by a more detailed survey in the future that looks into the specifics of different use cases. However, for the time being, the results are indicative enough that people still generally use their phones with one hand.
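For the statistically inclined, a quick back-of-the-envelope check on the sample size — a sketch that assumes a random sample, which a self-selected poll is not, so treat it as a rough bound:

```swift
// Worst-case (p = 0.5) margin of error for a simple proportion at 95% confidence.
func marginOfError(sampleSize n: Int, proportion p: Double = 0.5) -> Double {
    let z = 1.96  // z-score for 95% confidence
    return z * (p * (1 - p) / Double(n)).squareRoot()
}

print(marginOfError(sampleSize: 352))  // ≈ 0.052 → roughly ±5 percentage points
```

A margin of roughly ±5 points is far too small to overturn a clear majority, which is why I treat the result as directionally reliable even without a formal survey.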
The way we hold our phones with one hand has changed
While reevaluating the heatmaps from the 2013 study, it occurred to me that — while I am primarily a one-handed user — I don’t hold my phone in the exact same way shown in the diagrams of the study. In fact, I rely much more on my pinky to prop the phone from below.
That’s because maintaining the distance from your thumb to the bottom of the screen is crucial — after all, that’s the easiest-to-reach area, home to many of the most important controls and functions.
Let me explain.
Looking at a modern phone next to a decade-old one, the most striking thing is just how big the screen is. There is so much more real estate to work with! The top and bottom bezels are almost completely gone, and the whole front side of the phone is effectively a screen, so the screen has uniformly increased in size vertically, right?
Well, no. I mean, yes, depending on how you look at it, but effectively no. Again, let me explain.

The thing is, the distance between the tip of our thumbs and the bottom of the screen is crucial, because many important interactions are clustered at the bottom of the screen. And since the screen has grown in both directions – up and down – we have adjusted to this by holding our phones higher up to make up for the difference and ensure that the bottom of the screen remains easily reachable.
In other words, since we are maintaining the same distance between our thumb and the bottom of the screen, the screen has effectively grown in size only upwards!
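Putting rough numbers on this — the same spec-sheet dimensions as before, plus the assumption that the thumb-to-bottom distance stays fixed:

```swift
// Spec-sheet body heights in millimetres.
let galaxyS3Height = 136.6      // Samsung Galaxy S III (2012)
let iPhoneProMaxHeight = 159.9  // iPhone 15 Pro Max (2023)

// If the grip shifts so that the thumb-to-bottom-of-screen distance stays
// fixed, every extra millimetre of body height ends up *above* the thumb.
let effectiveUpwardGrowth = iPhoneProMaxHeight - galaxyS3Height
print(effectiveUpwardGrowth)  // ≈ 23.3 mm of growth — all of it upwards
```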
Validation and observational analysis of the PINKY PROP method
To confirm my suspicions about this adjusted grip — aside from the classic “I asked 5 friends” — I did an observational study of 50 randomly selected YouTube videos to determine whether the “pinky prop” method is, indeed, a widespread way of interacting with a phone one-handed.
The videos were selected regardless of the specific phone showcased in them or of the YouTube channel in general. The only requirement was that the person in the video uses the phone with one hand at some point.

The observational study confirms that the pinky-prop method has risen to prominence in recent years, perhaps as widely and as quickly as bigger screens have been adopted by phone makers. Obviously, this data needs to be further quantified with a survey, but for the time being it seems indicative enough of the relatively widespread adoption of this type of grip.
Let's be real, not everything can go in the green area
Obviously, the green area of the heatmap is the easiest to reach when holding the phone with either one or two hands. However, it would be foolish to think that we can just put everything in the green area. We need to think about the heatmap as more of a flexible outline. Quite literally, the lines that it draws are not as sharp and as defined as they are in the illustrations above.
In reality, the outlines of the heatmap are actually much blurrier. Depending on the size of your hands and even the slightest differences in how you hold your phone — with or without propping it with your pinky, higher or lower down — the green area will vary in size. It could be a bit bigger or a bit smaller.

We also need to be mindful that the orange area is not necessarily bad, it's just harder to reach — but still reachable — when using the phone one-handed. So we don't need to take an extreme, and likely unfounded, stance on putting EVERYTHING in the green area of the heatmap. Instead, we need to prioritize.
Proposal
The solution seems (deceptively) simple — all we need to do is position both the Control Center and the Notification Center firmly in the green area of the heatmap, where they are easily reachable with one or two hands, with a thumb or an index finger.
We need to follow three (again, deceptively) simple rules:
Rule 1.
The solution needs to be triggered from the green area of the heatmap
Rule 2.*
All interactive elements must fall within the reachable area of the heatmap, including the areas in orange
Rule 3.
The solution must not be at the expense of any existing features
* Rule 2 — It would be impossible to have everything fall squarely into the green area of the heatmap at all times. We will be utilizing the full real estate of the screen; however, things will be laid out so that the more you interact with a certain feature, the more precedence it takes in the easiest-to-reach area, pushing other things away. And in their initial state, all available features will be reachable within the green and orange areas of the heatmap.
Leveraging existing knowledge to teach new lessons
When introducing new features, Apple often relies on a philosophy of evolving existing features that people are already familiar with, like adding Touch ID to the home button or using the same pinch gesture across the new Apple Watches and Vision Pro. We can apply this philosophy here as well and look into evolving one of the existing navigation gestures on iOS.
Currently, iOS navigation recognizes four gestures triggered by linear motion:

We can go about this in two different ways: we can either, as mentioned above, evolve one of these gestures, or create a new one. Actually, if you think about it, the only gesture (if you can call it that) that iOS navigation doesn’t support is the tap.
Simply tapping the navigation pill would probably work great, but it doesn't exist as an interaction pattern currently. That's not a blocker, per se, but I really want to explore the boundaries and push the limits of one of these existing gestures, because I think there is a lot of untapped (pun not intended) potential there.
I'm talking about the long swipe up, which triggers the App Switcher. This screen could be so much more than a simple deck of apps. It could be an all-around information and control hub, accessible with a single swipe!
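To make the distinction concrete, here is a rough sketch — purely hypothetical, not Apple’s actual implementation — of how gestures on the navigation pill could be told apart by displacement alone:

```swift
import CoreGraphics

// A hypothetical, simplified classifier showing how the four existing linear
// pill gestures (plus the unused tap) could be distinguished by displacement.
// UIKit coordinates: y grows downwards.
enum PillGesture {
    case tap            // negligible movement — currently unused by iOS
    case goHome         // short swipe up, released quickly
    case appSwitcher    // long swipe up (or swipe up and hold)
    case quickSwitch    // horizontal swipe along the pill
    case reachability   // swipe down from the pill
}

func classify(translation: CGPoint, isHeld: Bool) -> PillGesture {
    let deadZone: CGFloat = 10   // assumed tap tolerance, in points
    if abs(translation.x) < deadZone && abs(translation.y) < deadZone {
        return .tap
    }
    if abs(translation.x) > abs(translation.y) {
        return .quickSwitch
    }
    if translation.y > 0 {       // downward motion
        return .reachability
    }
    // Upward motion: distance (or a hold) decides home vs. App Switcher.
    return (isHeld || -translation.y > 120) ? .appSwitcher : .goHome
}
```

Note how much headroom the long swipe up has: it already terminates in a dedicated screen, which makes it the natural candidate to evolve.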
Demo and prototypes
Since the App Switcher has so much underutilised, prime real estate, we can comfortably (at least try to) combine it with the Notification and Control Centers, thus creating an all-in-one hub, accessible with either thumb or index finger, right from the bottom of your screen.
Okay, without further ado — thanks for sticking around this far, and believe me, there’s more to say about this — here’s the demo:
Now that we are in the solution space, I’d like to expand a little on the core principles and base features of the combined Control and Notification Center. But before I do that, and because you’ve already seen the video, below you will find the two prototypes that I made for this project. One of them is made for the web (laptop or PC) and simulates an iPhone; the other you can run straight on your phone with the Figma app.
Now, keep in mind that the mobile prototype is just that — a prototype. It has its quirks, like the navigation being only an approximation of the real experience (you will trigger the native iOS navigation more than a few times), but that’s what you get when dealing with a proof of concept.
DISCLAIMER
This proof-of-concept prototype was made in Figma for the iPhone 15 and 15 Pro. As such, it may not properly scale on other devices. Given the high complexity of the prototype, it may take some time to fully load, depending on the speed of your connection. For the best experience, view in the Figma mobile app.
Core principles
Exploration and learning through motion
Currently, iOS navigation follows an (unwritten) philosophy of natural learning through motion and interactivity. Try this: swipe up from the bottom navigation, as though to minimise the app, but don’t release it! Instead, keep swiping upwards. See how the app “attaches” to the deck of other apps? If you continue bringing the app upwards, it detaches from the deck and appears to move further away from you. Go wild. Flick it in any direction, see how it bounces, but inevitably contracts into its icon with a satisfying and visceral motion.
I want to apply this philosophy to my idea as well. This is why navigating the proposed solution relies entirely on swipe gestures. Every component is interactive and dynamically responds to inputs as they are happening, thus implicitly teaching a lesson in flow and direction through natural motion. Since everything is laid out to fully utilize the vertical expanse of the screen, everything is controlled with simple upward and downward motions.
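In SwiftUI terms, continuous response to a drag might look something like the sketch below — illustrative only, since the actual prototype was built entirely in Figma:

```swift
import SwiftUI

// A minimal sketch of "learning through motion": the hub tracks the finger on
// every frame instead of snapping between discrete states.
struct HubSheet: View {
    @State private var dragOffset: CGFloat = 0  // live offset while dragging
    @State private var expanded = false         // settled state

    var body: some View {
        VStack {
            Text("Controls · Notifications · App deck")  // placeholder content
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .offset(y: (expanded ? 0 : 500) + dragOffset)
        .gesture(
            DragGesture()
                // The layout follows the finger continuously, teaching
                // direction and flow through the motion itself.
                .onChanged { dragOffset = $0.translation.height }
                .onEnded { value in
                    withAnimation(.spring()) {
                        // Settle based on how far the user dragged.
                        expanded = value.translation.height < -100
                        dragOffset = 0
                    }
                }
        )
    }
}
```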
Interaction-based precedence
Since not everything can fall in the green area at all times, different things need to take precedence based on user input. In its initial state, the Notifications and Controls Hub has all interactive elements within the green and orange areas of the heatmap.
When you start interacting with a particular element — be it the Controls or Notifications — it expands and takes precedence in the green and orange areas. Simply pulling it down (regardless of which one it is) makes it collapse and return to its initial state.
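Modeled as state, the precedence rule is tiny. Here’s a hypothetical sketch — the section names and weights are illustrative, not final:

```swift
// Interaction-based precedence: whichever section the user pulls up expands
// into the easy-to-reach zone; pulling it down collapses it again.
enum HubSection { case controls, notifications, appDeck }

struct HubState {
    var focused: HubSection? = nil  // nil = initial state, everything compact

    // Relative share of the reachable area each section gets.
    func weight(of section: HubSection) -> Double {
        guard let focused else { return 1 }   // equal shares initially
        return section == focused ? 3 : 0.5   // the focused section dominates
    }

    mutating func pullUp(_ section: HubSection) { focused = section }
    mutating func pullDown() { focused = nil }  // collapse to initial state
}
```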

Fully customizable Control Center
When you first get to this new space, you’ll notice that part of the Control Center is visible, containing some of the most frequently used controls. However, as you pull it up, you realise that there’s a lot more beneath the surface. Via the ‘Edit’ button, you can customize this space to your heart’s content. You can even change the shortcuts at the top of the Control Center, so they are always a swipe away!
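Under the hood, this kind of customization boils down to an ordered, user-editable list where position encodes reachability priority. A hypothetical data model might look like this (the identifiers are illustrative, not real iOS control ids):

```swift
// A hypothetical model for the customizable Control Center.
struct ControlItem: Codable, Identifiable {
    let id: String              // e.g. "wifi", "flashlight"
    var isPinnedShortcut: Bool  // shown in the always-visible top row
}

struct ControlCenterLayout: Codable {
    var items: [ControlItem]

    // Moving a control toward the front puts it deeper into the green zone.
    mutating func move(from source: Int, to destination: Int) {
        let item = items.remove(at: source)
        items.insert(item, at: destination)
    }
}
```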

New App Switcher with tabs
You may (or may not, given the small size) have noticed that the App Switcher deck also shows the icon and name of each app — in a very modest way, at a very small size, above each app screen.
I found that interesting and wondered:
“What are the icons and app names good for? Do they serve any navigation or findability purpose?”
I started playing around with it and noticed a pattern. When there were only a few apps open, or the app I was looking for was within the first 10 or so cards in the deck, I relied primarily on the app screens. However, when there were many more apps open, and I was looking for an app deeper down the deck — which required swiping through a lot of apps quickly — I noticed that I was relying on the immediately recognisable (even at this small size) app icons.
This seemed to merit more research. I’d be happy to do a larger-scale study on it, but for the time being I had to stick with the “bother 5 friends” thing.


Conclusion
This project took a lot of thinking. Not a lot of doing, although it may not look like it, but a lot of thinking.
I wanted to properly explore the problem space, roughly following the Double Diamond process, before jumping into the solution space — hence the rather lengthy dive into the various issues of one-hand usability on modern smartphones.
Of course, now that this project and the concepts therein are in the wild, it is up to the potential user — you, yes YOU — to play around with the proof-of-concept prototypes and potentially (hopefully) chime in with opinions, thoughts, and suggestions.
If you'd like to reach out, feel free to do so using the contact form!




