Sailfish OS: UX Case Study
asked 2018-12-06 17:56:02 +0200
This post is a wiki. Anyone with karma >75 is welcome to improve it.
I've already shared some of the ideas included below, but they are now more complete and easier to grasp. And sorry for opening another UX/UI thread.
This is not criticism, just something for consideration.
As with most case studies, the idea is to explore the current state in a particular area and make some propositions about improving it.
And the area of this case study is the UX of Sailfish OS (which is already pretty good). Of course, the proposed ideas shouldn't be too drastic and drive away the current users.
First, let's talk about context. We have to deal with multiple contexts, so let's find the borders and limitations we have to work with.
1. Hardware Limitations
Now let's start with what we know. Jolla is not making hardware anymore, so they don't have much control here. Taking this into consideration, let's move on to the currently supported devices.
Mobile phones:
- Jolla (4.5 inch, 16:9 ratio)
- Jolla C (5.0 inch, 16:9 ratio)
- Sony Xperia X (5.0 inch, 16:9 ratio)
- Sony XA2 (5.2 inch, 16:9 ratio)
- Sony XA2 Ultra (6.0 inch, 16:9 ratio)
- Sony XA2 Plus (6.0 inch, 18:9 ratio)

Tablets:
- Jolla Tablet (7.85 inch, 4:3 ratio)
Based on the screen sizes we can limit the scope a bit. Phone makers are probably going to stay with the current trend of bigger screens for some time. (Possibly the upcoming folding phones will disrupt this trend).
In the context of hardware, the differences are mostly dimensional. Sadly, different and more innovative types of hardware for interacting with the devices are hard to find. So let’s focus just on the screen dimensions for now.
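To make the dimensional comparison more concrete, here is a small sketch (plain Python; it assumes flat, rectangular displays and the nominal diagonals listed above) that converts a diagonal and aspect ratio into approximate physical width and height:

```python
import math

def screen_dimensions(diagonal_inches, ratio_w, ratio_h):
    """Return (width, height) in millimetres for a flat w:h display."""
    # The diagonal of a w:h rectangle scales with sqrt(w^2 + h^2).
    unit = diagonal_inches / math.sqrt(ratio_w ** 2 + ratio_h ** 2)
    return unit * ratio_w * 25.4, unit * ratio_h * 25.4  # inches -> mm

devices = {
    "Jolla":        (4.5,  9, 16),
    "Jolla C":      (5.0,  9, 16),
    "Xperia X":     (5.0,  9, 16),
    "XA2":          (5.2,  9, 16),
    "XA2 Ultra":    (6.0,  9, 16),
    "XA2 Plus":     (6.0,  9, 18),
    "Jolla Tablet": (7.85, 3, 4),
}

for name, (diag, w, h) in devices.items():
    width_mm, height_mm = screen_dimensions(diag, w, h)
    print(f"{name:13s} {width_mm:5.1f} x {height_mm:5.1f} mm")
```

Roughly speaking, the width grows from about 56 mm on the original Jolla to about 68 mm on the XA2 Plus, and it is this extra width (and height) that the reachability discussion below is really about.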
Let's take the current line of phones in the ecosystem to find potential weak points.
Current device line
Since I don't have proper crowdsourced data, I will be using screenshots (from review videos) of one-handed use by actual people holding the various versions, not just posing for a picture with the product. We can see the following:
- XA2: Most people hold the phone with their pinky on the bottom lip.
- XA2 Plus: Most people hold the phone more to the middle. Actually most people use the phone with two hands.
- XA2 Ultra: The same as XA2 Plus.
With the "one-handed" information (which to be honest is not much, but still something) we can start analyzing:
- XA2: Keeping your pinky on the bottom lip of the phone still gives enough grip, and the phone can be used for somewhat normal operation. The top of the screen is unreachable, and so is the opposite horizontal edge.
- XA2 Plus: Users hold the phone more towards the middle. The most likely explanation is weight balance; otherwise there would not be enough grip. The bottom of the screen is harder to reach (more on that later). The top of the screen is unreachable, and so is the opposite horizontal edge.
- XA2 Ultra: The same as XA2 Plus.
So we can see the most common weak point these days:
- Reachability - in all versions of the XA2
Check the weak points against Sailfish OS 3:
1. One-handed use is worse because of reachability issues
2. Quick closing an app on Sailfish 3 - with the top of the screen being unreachable, this is a problem. Also, this action is only possible from the left or right portion of the top edge, which makes it tricky to use.
Other issues can mainly come from the size of the device and not the OS itself.
Thinking about the worst-case scenario, I tried to make a heatmap of the reachability situation on the XA2 Plus.
Again, a disclaimer: this was done in limited time and without proper crowdsourced data, so it shouldn't be taken too seriously. It is based on my own hand size (which I consider average).
- Green area: the reachable part of the screen
- Orange area: the trickier-to-reach part of the screen
- Red area: the unreachable part of the screen
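For what it's worth, here is a rough sketch of how such zones could be generated programmatically: classify each point of the screen by its distance from an assumed thumb pivot near the gripping hand. The pivot position and the two radii are guesses for illustration, not measured values.

```python
import math

# Rough model of one-handed reach on an XA2 Plus-sized screen.
# All numbers below are assumptions for illustration, not measurements.
SCREEN_W_MM, SCREEN_H_MM = 68.0, 136.0           # approximate visible area, 18:9
THUMB_PIVOT = (SCREEN_W_MM, SCREEN_H_MM * 0.70)  # right-hand grip, lower right edge
COMFORT_MM, STRETCH_MM = 60.0, 75.0              # assumed comfortable / maximum thumb arcs

def reach_zone(x_mm, y_mm):
    """Classify a screen point as 'green', 'orange' or 'red'."""
    dist = math.dist((x_mm, y_mm), THUMB_PIVOT)
    if dist <= COMFORT_MM:
        return "green"    # reachable without shifting the grip
    if dist <= STRETCH_MM:
        return "orange"   # reachable with a stretch, trickier to hit
    return "red"          # effectively unreachable one-handed

# Crude text rendering of the zones, top of the screen first.
for row in range(10):
    y = SCREEN_H_MM * row / 9
    print("".join(reach_zone(SCREEN_W_MM * col / 19, y)[0] for col in range(20)))
```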
2. Software Limitations
Again, let's start with what we already have and then see if something can be proposed.
Sailfish uses gesture-based navigation. In its current state it looks like this:
Let's examine the navigation screens which may have reachability issues.
- Home:
  - With just 2 App Covers per row there will be no problem.
  - But with 3 App Covers per row it may get tricky to hit the covers near the opposite top corner of the screen (a rough comparison follows after this list).
- Events:
  - Since the top part is used as a presentational component with no controls, everything is perfect.
- Apps:
  - App icons at the top may not be reachable.
- Top Menu:
  - Some of the Quick Toggles at the top may not be reachable.
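To back the App Cover point with at least a rough number, here is a back-of-the-envelope Fitts's law comparison (the Shannon formulation, ID = log2(D/W + 1)) for the hardest target: the cover in the top corner opposite the thumb. The thumb position and distances are guesses for an XA2 Plus-sized screen, not measurements.

```python
import math

# Back-of-the-envelope Fitts's law check for the hardest Home target: the
# App Cover in the top corner opposite the thumb. The thumb position and
# distances are rough guesses for an XA2 Plus-sized screen (about 68 mm
# wide), not measurements.
def index_of_difficulty(distance_mm, target_width_mm):
    """Shannon formulation of Fitts's law: higher means harder to hit."""
    return math.log2(distance_mm / target_width_mm + 1)

SCREEN_W_MM = 68.0
THUMB_AT = (SCREEN_W_MM, 95.0)                      # assumed right-hand resting point
for columns in (2, 3):
    cover_w = SCREEN_W_MM / columns                 # covers get narrower with 3 per row
    top_left_cover = (cover_w / 2, 17.0)            # rough centre of the corner cover
    distance = math.dist(THUMB_AT, top_left_cover)
    print(f"{columns} covers per row: ID = {index_of_difficulty(distance, cover_w):.2f} bits")
```

With these guesses the index of difficulty comes out noticeably higher for the 3-column layout, mainly because each cover is a third narrower while the corner target sits slightly further away.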
Propositions
Some propositions will be given to improve the current navigation. Let's start with the Quick Close.
Quick Close
Currently the Quick Close is triggered with a swipe down from the left or right part of the top edge. Since the top edge of the screen is hardly reachable (especially on the XA2 Plus and Ultra), this is an area which can be improved. The problem is where to move this action and how it will be triggered.
Where: it needs to go lower. This leaves us with the left, right and bottom edges to initiate it. The left and right edges give us the best reachability regardless of how the phone is held, so let's try the left and right edges.
How: the left and right edges are already used for navigation between Home and Events and for minimizing apps, so we will need a new gesture, one that doesn't add too much complexity and prevents accidental closing. Let's check the natural swipe direction and go from there. The natural direction of a swipe from the edge is sideways and going down. So if the opposite direction (sideways and going up) is used for Quick Closing an app, it will prevent accidental closing. Let's check it.
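For illustration, a minimal sketch (plain Python; edge width, minimum travel and the up/down bias are invented thresholds) of how such a reversed-direction edge swipe could be told apart from today's sideways/downward edge swipes, assuming the shell knows the start and end points of each touch track:

```python
# Hypothetical classifier for the proposed Quick Close gesture, written as
# plain Python over (x, y) points in millimetres with y growing downwards.
# Edge width, minimum travel and the up/down bias are invented thresholds.

EDGE_MM = 8.0          # how close to the left/right edge the swipe must start
MIN_TRAVEL_MM = 25.0   # minimum horizontal travel to count as an edge swipe
UP_BIAS = 0.4          # upward/horizontal ratio needed to call it "going up"

def classify_edge_swipe(start, end, screen_w_mm):
    """Return 'quick-close', 'minimize/navigate' or None for a touch track."""
    (x0, y0), (x1, y1) = start, end
    if not (x0 <= EDGE_MM or x0 >= screen_w_mm - EDGE_MM):
        return None                        # did not start at a side edge
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < MIN_TRAVEL_MM:
        return None                        # too short to be intentional
    if -dy > UP_BIAS * abs(dx):            # clearly travelling upwards
        return "quick-close"               # the proposed closing gesture
    return "minimize/navigate"             # today's sideways/downward swipe

# A swipe from the right edge that moves inwards and upwards would close:
print(classify_edge_swipe((67.0, 100.0), (30.0, 75.0), screen_w_mm=68.0))
```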
And here is a more complete overview, with a hint at the top of the screen telling the user what is going to happen.
Now the top edge is decluttered, which leads us to the Top Menu.
Top Menu
What can be done here? The Top Menu can be accessed from three places: Home, Events and in-app. What about one-handed usage? What if the swipe direction from Home to Events were stored and used for rearranging the Top Menu in a more compact form? A demonstration will clear things up.
Left swipe from Home
Right swipe from Home
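Purely as a sketch of the bookkeeping involved (the class, the callback name and which edge maps to which swipe direction are my assumptions, not part of the proposal), remembering the last Home/Events swipe and deriving the compact menu's anchor side could look like this:

```python
# Hypothetical bookkeeping for the proposed compact Top Menu: the shell
# remembers the direction of the last Home <-> Events swipe and uses it to
# decide which edge the compact menu column should hug. The actual mapping
# (left swipe -> right anchor) is an assumption; the point is only that the
# last swipe hints at where the thumb currently is.

class TopMenuLayout:
    def __init__(self, default_side="right"):
        self.anchor_side = default_side

    def on_home_events_swipe(self, direction):
        """Called with 'left' or 'right' whenever Home/Events are switched."""
        # A leftward swipe is usually made with the thumb on the right half
        # of the screen, so anchor the compact menu there (and vice versa).
        self.anchor_side = "right" if direction == "left" else "left"

    def layout(self):
        return f"compact Top Menu anchored to the {self.anchor_side} edge"


menu = TopMenuLayout()
menu.on_home_events_swipe("left")
print(menu.layout())   # -> compact Top Menu anchored to the right edge
```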
This is not about the UI (colors, icon shapes, etc.) but more about the UX, so here is a little comparison of the old vs the proposed:
Landscape
And the presentational component from the Events screen can be leveraged, leaving the top part of the screen just for information/metrics:
- On-change notifications for the Quick Toggles (ON/OFF)
- Available memory
- Media player information
- Temperature
The final proposition is about a feature which would be nice to have.
System Search
Triggered only from the Home and Events screens.
Two 'paddles' will appear on Swipe Down + Hold.
From there, the swipe is continued to either the left or the right.
Search and results views
View from above of the proposed navigation improvements:
App Drawer
The App Drawer will be hard to improve without reducing the interactive area.
Comments

rozgwi (2018-12-06 20:34:54 +0200): @rhodos: W.O.W. Thanks a bunch for this impressive study!

AkiBerlin (2018-12-06 20:37:59 +0200): @rhodos: Very well done! Thanks a lot!

juiceme (2018-12-06 23:23:13 +0200): Very thoughtful, good analysis and workflows. I hope Jolla will take these under consideration.

Mohjive (2018-12-07 09:56:47 +0200): Impressive! I concur on the lack of reachability of large screens, and it would be great if Jolla considered the ideas presented here. Personally I don't care for the large-screen hype and would like smaller phones (the N9 was perfect imo), but with the variety of phones available now, second best would be a UI that handles the large screens in an ergonomic way.

Mariusmssj (2018-12-07 11:45:44 +0200): That's very detailed and it makes perfect sense. The original Jolla was small enough for one-handed use, but the Xperia phones are way too big for it; your ideas would work much better!