
Sailfish OS: UX Case Study

asked 2018-12-06 17:56:02 +0200

this post is marked as community wiki

This post is a wiki. Anyone with karma >75 is welcome to improve it.

updated 2018-12-08 10:58:06 +0200

rhodos

I've already shared some of the ideas included below, but they are now more complete and easier to grasp. And sorry for opening another UX/UI thread.

This is not a critique, but something for consideration.

Sailfish OS: UX Case Study

As with most case studies, the idea is to explore the current state in a particular area and make some propositions about improving it.

And the area of this case study is the UX of Sailfish OS (which is already pretty good). Of course, the proposed ideas shouldn't be too drastic and drive away the current users.

First, let's talk context. We have to deal with multiple contexts. So let's find the borders/limitations we have to work with.

1. Hardware Limitations

Now let's start with what we know. Jolla are not making hardware anymore, so they don't have much control here. Let's take this into consideration and move on with the currently supported devices.

Mobile Phones: 
- Jolla (4.5 inch, 16:9 ratio)
- Jolla C (5.0 inch, 16:9 ratio)
- Sony Xperia X (5.0 inch, 16:9 ratio)
- Sony XA2 (5.2 inch, 16:9 ratio)
- Sony XA2 Ultra (6.0 inch, 16:9 ratio)
- Sony XA2 Plus (6.0 inch, 18:9 ratio)

- Jolla Tablet (7.85 inch, 4:3 ratio)

Based on the screen sizes we can limit the scope a bit. Phone makers are probably going to stay with the current trend of bigger screens for some time. (Possibly the upcoming folding phones will disrupt this trend).

In the context of hardware, the differences are mostly dimensional. Sadly, different and more innovative types of hardware for interacting with the devices are hard to find. So let’s focus just on the screen dimensions for now.

Let's take the current line of phones in the ecosystem to find potential weak points.

Current device line

Current Devices

Since I don't have proper crowdsourced data, I will be using screenshots (from review videos) of one-handed use by actual people holding the various versions, not just posing for a picture with the product. We can see the following:

  • XA2: Most people hold the phone with their pinky on the bottom lip.
  • XA2 Plus: Most people hold the phone more to the middle. Actually most people use the phone with two hands.
  • XA2 Ultra: The same as XA2 Plus.

With the "one-handed" information (which to be honest is not much, but still something) we can start analyzing:

  • XA2: Keeping your pinky on the bottom lip of the phone still gives enough grip, and the phone can be used for somewhat normal operation. The top of the screen is unreachable, as is the opposite horizontal edge.
  • XA2 Plus: Users hold the phone more towards the middle. The most likely explanation is weight balance; otherwise there would not be enough grip. The bottom of the screen is harder to reach (more on that later). The top of the screen is unreachable, as is the opposite horizontal edge.
  • XA2 Ultra: The same as the XA2 Plus.

So we can see the most common weak point these days:

  • Reachability - in all versions of the XA2

Check the weak points against Sailfish OS 3:

1. One-handed use is worse because of reachability issues

2. Quick-closing an App on Sailfish 3 - with the top of the screen being unreachable, this is a problem. This action is also only possible from the left or right portion of the top edge, which makes it tricky to use.

Other issues can mainly come from the size of the device and not the OS itself.

Thinking about the worst-case scenario, I tried to make a heatmap of the reachability situation on the XA2 Plus.

Now, I will excuse myself again: this was done in limited time, and without proper crowdsourcing it shouldn't be taken too seriously. It is based on my hand size (which I consider average).

Green Area - the reachable part of the screen

Orange Area - the trickier to reach part of the screen

Red Area - unreachable

XA2 Plus HeatMap
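The zones in the heatmap can be sketched as a simple distance check from the thumb's resting point. This is purely a hypothetical illustration: the screen dimensions, the grip anchor, and the two reach radii are all assumed numbers, not measured data.

```python
import math

# Hypothetical sketch: classify screen points into the three reachability
# zones from the heatmap, assuming a right-handed grip with the thumb
# anchored near the right edge, around the middle of the device (matching
# the "held more to the middle" observation). All numbers are illustrative
# assumptions, not measured values.

SCREEN_W_MM, SCREEN_H_MM = 68.0, 136.0           # rough XA2 Plus display size
THUMB_ANCHOR = (SCREEN_W_MM, SCREEN_H_MM * 0.6)  # grip "more to the middle"
COMFORT_MM, MAX_MM = 55.0, 75.0                  # assumed thumb sweep radii

def reachability(x_mm, y_mm):
    """Return 'green', 'orange' or 'red' for a point on the screen."""
    d = math.dist((x_mm, y_mm), THUMB_ANCHOR)
    if d <= COMFORT_MM:
        return "green"    # comfortably reachable
    if d <= MAX_MM:
        return "orange"   # reachable, but trickier
    return "red"          # unreachable one-handed
```

With these assumed radii, the top corners come out red and the area around the grip comes out green, which is roughly the pattern in the heatmap.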

2. Software Limitations

Again, let's start with what we already have and then see if something can be proposed.

Sailfish uses gesture-based navigation. And in its current state it looks like this:

Sailfish Navigation

Let's examine the navigation screens which may have reachability issues.

  • Home:
    • With just 2 App Covers on a row there will be no problem.
    • But with 3 App Covers on a row, it may get tricky to hit the App Covers near the top corners of the screen.

  • Events
    • Since the Top part is used as a Presentational component with no controls, everything is perfect.

  • Apps
    • App Icons on the Top may not be reachable.

  • Top Menu
    • Some of the Quick Toggles on the Top may not be reachable.


Some propositions will be given to improve the current Navigation.

Let's start with the Quick Close.

Quick Close

Sailfish Quick Close

Currently the Quick Close is triggered with a swipe down from the Left/Right part of the Top edge. Since the Top edge of the screen is hard to reach (especially on the XA2 Plus and Ultra), this is an area that can be improved. The question is where to move this action and how it should be triggered.

Where? It needs to go lower. This leaves the Left, Right, or Bottom edge to initiate it. The Left and Right edges give the best reachability regardless of how the phone is held. So let's try the Left and Right edges.

How will it be triggered? The Left and Right edges are already used for navigation between Home and Events and also to minimize Apps, so we need a new gesture: something that doesn't add too much complexity and prevents accidental closing. So let's check the natural swipe direction and go from there. The natural direction of an edge swipe is sideways and slightly downwards. So if the opposite direction is used for Quick Closing an App, accidental closing is prevented. Let's check it.

Sailfish Swipe Comparison

And here is a more complete overview, with a hint at the top of the screen telling the user what is going to happen.

Sailfish Close App

Now the Top Edge is decluttered.
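The proposed gesture could be recognised roughly as follows. This is a hypothetical sketch, not Sailfish's actual gesture code: the thresholds, the millimetre coordinate system, and the sample format are all assumptions for illustration.

```python
# Hypothetical recogniser for the proposed Quick Close: a swipe that starts
# at the left or right edge, travels inward (the natural direction), and
# then reverses into an upward "flick". All thresholds are invented.

EDGE_MM = 5.0          # how close to a side edge the gesture must start
MIN_INWARD_MM = 15.0   # minimum sideways travel before the flick counts
MIN_FLICK_MM = 10.0    # minimum upward travel of the final flick

def is_quick_close(points, screen_w_mm):
    """points: list of (x, y) touch samples in mm, with y growing downwards."""
    if len(points) < 3:
        return False
    x0, y0 = points[0]
    from_left = x0 <= EDGE_MM
    from_right = x0 >= screen_w_mm - EDGE_MM
    if not (from_left or from_right):
        return False  # must start from a side edge
    # maximum inward travel away from the starting edge
    inward = max(p[0] - x0 if from_left else x0 - p[0] for p in points)
    if inward < MIN_INWARD_MM:
        return False
    # the tail of the gesture must rise (y decreases) by at least MIN_FLICK_MM
    y_turn = max(p[1] for p in points)   # lowest point of the arc on screen
    y_end = points[-1][1]
    return (y_turn - y_end) >= MIN_FLICK_MM
```

Because a plain sideways-and-down edge swipe never produces the final upward travel, it would still be classified as ordinary navigation rather than a close.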

Which leads us to the Top Menu.

Top Menu

What can be done here? The Top Menu can be accessed from three places: from Home, from Events, and In-App. What about one-handed usage? What if the swipe direction from Home to Events were stored and used for rearranging the Top Menu into a more compact form? A demonstration will clear things up.

Left Swipe Top Menu - Left Swipe

From Home Top Menu - From Home

Right Swipe Top Menu - Right Swipe
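The demonstration above could be expressed as a tiny piece of state. This is an illustrative sketch only; the names are made up, and which side the compact menu should lean toward for each swipe direction is itself an open design choice.

```python
# Hypothetical sketch: remember the direction of the last Home/Events swipe
# and use it to lean the compact Top Menu toward the thumb. Opening the
# menu straight from Home keeps the full centered layout.

LAST_SWIPE = {"direction": None}   # "left", "right", or None (from Home)

def record_swipe(direction):
    """Called whenever the user swipes between Home and Events."""
    LAST_SWIPE["direction"] = direction

def top_menu_alignment():
    """Return where the Top Menu layout should lean when opened."""
    if LAST_SWIPE["direction"] == "left":
        return "left"
    if LAST_SWIPE["direction"] == "right":
        return "right"
    return "center"    # opened from Home: keep the full layout
```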

This is not about the UI (colors, icon shapes etc.) but about the UX, so here is a little comparison of the old vs the proposed:

Top Menu Comparison


Top Menu Landscape

And the Presentational Component from the Events Screen can be leveraged, leaving the Top part of the screen just for information/metrics:

  • onChange Notifications for the Quick Toggles (ON/OFF)
  • Available Memory
  • Media player information
  • Temperature

The final proposition is about a feature which will be nice to have.

Triggered only from the Home Screen and Events.

Two 'paddles' will appear on Swipe Down + Hold. From there, the swipe can be continued to either the Left or the Right.

Search Sailfish Search

Results Sailfish Search Results

View from above of the proposed navigation improvements:

Sailfish Navigation Comparison

App Drawer

The App Drawer will be hard to improve without reducing the interactive area.




@rhodos: W.O.W. Thanks a bunch for this impressive study!

rozgwi ( 2018-12-06 20:34:54 +0200 )

@rhodos: Very well done! Thanks a lot!

AkiBerlin ( 2018-12-06 20:37:59 +0200 )

Very thoughtful, good analysis and workflows.

I hope Jolla will take these under consideration.

juiceme ( 2018-12-06 23:23:13 +0200 )

Impressive! I concur on the lack of reachability of large screens, and it would be great if Jolla considered the ideas presented here. Personally I don't care for the large-screen hype and would like smaller phones (the N9 was perfect imo), but with the variety of phones available now, second best would be a UI that handles the large screens in an ergonomic way.

Mohjive ( 2018-12-07 09:56:47 +0200 )

That's very detailed and it makes perfect sense. The original Jolla was small enough for one-handed use, but the Xperia phones are way too big for it. Your ideas would work much better!!!

Mariusmssj ( 2018-12-07 11:45:44 +0200 )

2 Answers


answered 2018-12-08 18:58:45 +0200

Spark

updated 2018-12-08 19:01:39 +0200

I agree on the general approach to make SFOS one-hand usable again. But in the details I disagree with two suggestions you made:

1) Have you tried your suggested app-closing gesture on a phone? The outward and then upward movement is entirely against our thumb's motion physiology. It's a horrible gesture to perform.

2) Concerning the top-menu: I see a benefit in creating a top-menu which leans towards the side of execution (thus accommodating left-handed and right-handed use of the phone). I don't see a benefit in creating two differently laid-out top-menus, like you suggest with your left/right vs. centered top-menu. This creates confusion.

The global search would be a wonderful function; I hope we will see it. I would preferably realize it with a pulley integrated in the top menu: if you pull the top menu down only to the first bar (where the "switch-off" symbol appears), it would lock in a search bar ready for input. The results found would then appear below it. If you don't stop after the first "pulley bar", the top menu would extend fully, as it does now.
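A minimal sketch of this pulley idea, assuming the release distance of the pull is the deciding input; the threshold values are invented for illustration.

```python
# Hypothetical sketch: decide what a top-edge pull does based on how far
# it travelled before release. Both distances are assumed numbers.

FIRST_BAR_MM = 12.0    # pull distance of the first "pulley bar"
FULL_MENU_MM = 25.0    # pulling past this opens the full Top Menu

def pull_result(distance_mm):
    """Map a pull-down distance to the resulting action."""
    if distance_mm < FIRST_BAR_MM:
        return "cancel"      # too short: nothing happens
    if distance_mm < FULL_MENU_MM:
        return "search"      # lock in the search bar at the first bar
    return "top_menu"        # keep going: the full Top Menu extends
```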

I have another suggestion following your colored areas reachable in one-handed use: the app grid could extend only across the lower two-thirds of the screen, making every app reachable by the thumb of the phone-holding hand. The upper third could be used for other useful (but non-clickable) information, e.g. provider name, clock, data usage, or other customized things.
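As a rough sketch of that layout rule (the pixel sizes and the cell height are illustrative assumptions, not Sailfish's actual grid geometry):

```python
# Hypothetical sketch: reserve the top third of the screen for
# non-interactive information and fit the app grid into the rest.

def grid_rows(screen_h_px, icon_cell_px, top_fraction=1/3):
    """How many icon rows fit below the reserved top area."""
    usable_px = screen_h_px * (1 - top_fraction)
    return int(usable_px // icon_cell_px)

def first_row_y(screen_h_px, top_fraction=1/3):
    """Y offset (px) where the grid starts."""
    return int(screen_h_px * top_fraction)
```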




Re 1: I tried to make this gesture a few times and it seems sufficiently simple. I think the idea is that this gesture is somewhat unnatural precisely to avoid accidental closing of applications. So, to me it looks like it could work very well.

rinigus ( 2018-12-08 19:45:01 +0200 )


Thank you for your comment. I like this kind of positive criticism where someone can actually point out potential weak points. It's much easier to make something properly, working this way.

As always, there are many other possibilities. And more or less I have tried to justify my propositions without making the post too long.

1) I've tried the suggestion as much as you did, considering this is just a suggestion and not a real-world working example. As said, it's intentionally opposite to the normal motion, with the idea of preventing accidental closes. And it would be more of a 'flick' upwards, not a whole swipe to the top edge. You can start the gesture more towards the middle (or below it) and just 'flick' upwards.

2) I am with you on the confusion part. It's more of a cheap gimmick that one may want (or not want) to have. Of course, it can be a static layout without the top 1/3 of the screen.

A potential problem we could both have with making the top 1/3 of the screen just 'Presentational' is power usage for updating the information/metrics. This is not something that can be tested without a real working example. And of course it would differ based on the kind of information presented there and the way it's fetched.

rhodos ( 2018-12-08 19:46:35 +0200 )

I wonder if it's possible to let users configure most of the gestures themselves. Every user has different habits, abilities, hardware, and experiences with software user interfaces. I, for example, hold my Xperia X by locking the strap of the booklet-shaped leather case between my index and middle finger, have big hands, have long-term training with Opera browser's mouse gestures, and even love the patch "swipe to lock". There are some things in the UI _I_ would change for myself and some others I would not. And whenever an innovation is introduced, some people moan.

Yes, creating a UI is some kind of artwork, but each Linux desktop comes with some individually unacceptable defaults I need to eliminate, sometimes in opposition to the developers' primary intention. For me it should just work.

So, why not a f*ing settings page where you can define and reconfigure the gestures and assign them to actions and pages? Why are these basics always sacred? My phone is not a car, where the brake pedal is supposed to be located in the middle. My phone is a very private and personal thing, and no one else should touch it. And if anyone does, I have to explain every action anyway, because none of the people in question owns a Sailfish device....

Robomike ( 2018-12-08 22:26:12 +0200 )

I am with you on that. The two-direction gesture to close the app is wrong. It's about making things simpler, not complicating them. One gesture, one action.

ApB ( 2018-12-09 12:01:26 +0200 )

answered 2018-12-07 21:54:27 +0200

deloptes

A very impressive study. However, I opt for a smaller size, as I use the phone mostly for talking to people. I do not want to carry a book-sized device with me just to be able to call someone. From time to time, when reading news or writing an SMS, it is good to have a bigger screen. IMO the screen should be > N9 and < Xperia X.





That's why I opted for an Xperia X Compact. And thanks a lot to @g7 for making this option viable.

Vieno ( 2018-12-08 11:22:22 +0200 )

Hi Vieno, can you elaborate on this please? I am very interested in replacing the Xperia X with something else, and I heard/read that it could be possible to run Sailfish on the X Compact.

thanks in advance

deloptes ( 2018-12-09 00:42:45 +0200 )
