
Proposal for additional system-wide gestures (swipes and taps)

asked 2019-01-13 23:07:21 +0200 by BH

updated 2019-12-31 19:13:10 +0200

Introduction

The following proposal aims to make the gesture-based user experience of Sailfish OS more consistent throughout the entire system and to eliminate the need for “makeshift” solutions in certain apps that do not fit the basic concepts of this user experience or that hamper the “all screen real estate is yours” paradigm.

Although some of the described gestures may already exist as patches, having them natively (out of the box) from Jolla in Sailfish OS would be best.


Swipes FROM the screen edge

In Sailfish OS, we currently have “Swipe from the screen edge towards the screen center” gestures (a rough sketch of the dispatch logic follows the list):

  • A swipe from the right screen edge takes you to “Home”, with the covers of all your currently running apps.
  • A swipe from the left screen edge takes you to “Events”, where you can see your current notifications.
  • A swipe from the bottom screen edge takes you to the app grid, from where you can launch apps.
  • A swipe from the top screen edge closes the currently used app (when done from either corner), or takes you to the top menu (when done from the middle), where you can switch ambiences, trigger quick actions and toggle settings.
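
For readers who prefer code: a rough sketch of how such an edge-swipe dispatch could look, written as plain Python for illustration. This is not Jolla's actual compositor code; the threshold and all names here are assumptions.

    # Hypothetical sketch of the current "swipe FROM the edge" dispatch.
    EDGE_ZONE_PX = 20  # assumed: how close to an edge a swipe must start

    def classify_from_edge(start, end, screen_w, screen_h):
        """Return the system action for a swipe starting at a screen edge."""
        x, y = start
        dx, dy = end[0] - start[0], end[1] - start[1]
        if x <= EDGE_ZONE_PX and dx > 0:
            return "show_events"            # left edge -> Events
        if x >= screen_w - EDGE_ZONE_PX and dx < 0:
            return "show_home"              # right edge -> Home (app covers)
        if y >= screen_h - EDGE_ZONE_PX and dy < 0:
            return "show_app_grid"          # bottom edge -> app grid
        if y <= EDGE_ZONE_PX and dy > 0:
            # top edge: corners close the app, the middle opens the top menu
            third = screen_w / 3
            return "close_app" if (x < third or x > 2 * third) else "show_top_menu"
        return "pass_to_app"                # not an edge swipe

    print(classify_from_edge((5, 400), (200, 400), 1080, 1920))  # -> show_events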

Swipes AT the screen edge

In addition to those “Swipe FROM the screen edge” gestures, this proposal introduces “Swipe AT the screen edge” gestures. They are performed by placing one finger at the desired screen edge and moving it along this edge (instead of moving it towards the screen center). Here is what they will do (two illustrative sketches follow this list):

  • Right screen edge: One out of two possible actions, depending on whether you are currently in an item view (e.g. picture, web page, document) or in a list view (e.g. gallery grid, message list, artist/album list):
  1. Zoom in/out (for item views):
    An additional and much more practical alternative to the well-known two-finger pinch gesture, because you need only one hand to do it conveniently. Whether you have only one hand free because you are carrying or holding something with your other hand, or you cannot use your other hand at all because it is injured, disabled or missing, this gesture greatly improves accessibility.
  2. Aimed scrolling (for list views):
    Quickly and precisely scroll to a desired location within the current list with the help of displayed clues (see the clue-mapping sketch after this list). This is especially helpful for very long lists and partly eliminates the need for a search or filter feature. When you start moving your finger, the clues appear (relative to your current location in the list). Continue moving your finger until you reach the desired clue, then lift your finger to scroll immediately to that location. The displayed clues depend on the list content. A few examples:
    .) Gallery grid: Date on which the picture/video was taken, user-configurable as absolute date (January 2018, August 2018, etc.) or as relative date (6 months ago, 1 year ago, etc.).
    .) Message list: Date on which a message has been sent/received. Clues are similar to those of the gallery grid.
    .) Artist/Album list, Contact list: Letters of the alphabet.


  • Left screen edge: Open top/bottom pulley menu
    No need to scroll up/down the entire page in order to access a pulley menu (if one is available). With this gesture, it's immediately accessible no matter where you currently are on the page.


  • Bottom screen edge: Actions specific to the app currently in the foreground
    Every app can have different actions here. Five examples:
  1. Browser: Go backwards/forwards one page (no need to bring up the navigation bar)
  2. Gallery: Go to previous/next picture or video (no need to entirely zoom out or scroll to the picture edge)
  3. Email: Go to previous/next e-mail
  4. Media: Skip to previous/next track
  5. Documents: Go to previous/next document page (no need for scrolling to get there)


  • Top screen edge: Actions specific to a “chosen” (background) app
    The best example for this gesture is the Media app. If you are listening to music while using a different app in the foreground, you can instantly skip to the previous/next track in your playlist without having to switch to the Media app first. The app that receives these swipe events can be chosen in different ways:
  1. By selecting an app in “Settings” that will automatically get this status once it is started (ideally with the additional option to perform the assignment only if no other running app is currently the chosen one).
  2. By adding a corresponding button to every app cover in the “Close/Rearrange apps” view (the one you get by long-pressing in “Home”), that allows both selecting/switching and deselecting an app.
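
To make the difference between “FROM the edge” and “AT the edge” concrete, here is a minimal recognizer sketch that compares the movement direction with the edge orientation. Plain Python, purely illustrative; the 30-degree threshold is an assumption, not a measured value.

    import math

    EDGE_ZONE_PX = 20
    ALONG_MAX_DEG = 30  # assumed: movement within 30 degrees of the edge is "along"

    def classify_edge_swipe(start, end, screen_w, screen_h):
        """Distinguish 'swipe FROM edge' (inward) from 'swipe AT edge' (along)."""
        x, y = start
        dx, dy = end[0] - start[0], end[1] - start[1]
        if x <= EDGE_ZONE_PX or x >= screen_w - EDGE_ZONE_PX:
            edge, vertical = ("left" if x <= EDGE_ZONE_PX else "right"), True
        elif y <= EDGE_ZONE_PX or y >= screen_h - EDGE_ZONE_PX:
            edge, vertical = ("top" if y <= EDGE_ZONE_PX else "bottom"), False
        else:
            return None  # not an edge gesture at all
        angle = math.degrees(math.atan2(abs(dy), abs(dx)))  # 0 = horizontal
        along = (angle > 90 - ALONG_MAX_DEG) if vertical else (angle < ALONG_MAX_DEG)
        return edge, ("at_edge" if along else "from_edge")

    print(classify_edge_swipe((1075, 800), (1075, 400), 1080, 1920))  # ('right', 'at_edge')
    print(classify_edge_swipe((1075, 800), (700, 800), 1080, 1920))   # ('right', 'from_edge')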
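
And the “aimed scrolling” clues for the right edge could be derived directly from the list model. A sketch for the simplest case, a contact list with letter clues; the function names are hypothetical, and the list is assumed to expose one sortable key per item.

    def build_letter_clues(names):
        """Map each initial letter to the first list index starting with it."""
        clues = {}
        for i, name in enumerate(sorted(names, key=str.casefold)):
            clues.setdefault(name[0].upper(), i)
        return clues

    def clue_for_finger(clues, finger_y, edge_height):
        """Translate the finger position along the edge into a clue and target index."""
        letters = sorted(clues)
        slot = min(int(finger_y / edge_height * len(letters)), len(letters) - 1)
        return letters[slot], clues[letters[slot]]

    contacts = ["Anna", "Arto", "Bea", "Carl", "Mia", "Zoe"]
    clues = build_letter_clues(contacts)
    print(clue_for_finger(clues, finger_y=960, edge_height=1920))  # ('C', 3)

A date-based clue list (gallery grid, message list) would work the same way, just with month buckets instead of letters.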

Double-taps AT the screen edge

By introducing “Tap AT the screen edge” gestures, another set of actions becomes available. For the sake of a simple user experience, this proposal only considers double-taps. As with the swipes, every screen edge triggers a different action when double-tapped (a detection sketch follows this list):

  • Right screen edge: One out of three possible actions:
  1. Reset zoom level to default (for item views):
    Straightforward, and it fits perfectly with the swipe gesture of this screen edge (see above).
  2. Manually enter desired scrolling location (for list views):
    Display a date selector, letter selector, etc. (depending on the list content) for even more precise control compared to the swipe gesture of this screen edge (see above).
  3. Alternative function (user-configurable – instead of the two actions described above):
    Trigger one of the left screen edge actions (see below).


  • Left screen edge: 2 possible actions (user configurable):
  1. Switch between the 2 last used apps (convenient if you want to do copy-and-paste) or between the currently used app and the “chosen” app (see above).
  2. Go to the “Permissions” screen of the currently used app (yet to be introduced in Sailfish OS), where you can modify all access permissions for this app on the fly, and check when, how often and for how long this app has accessed or attempted to access certain resources of your device, such as your address book or positioning data (convenient for privacy management and for finding out whether an app deserves your trust).


  • Bottom screen edge: Actions specific to the app currently in the foreground
    Again 5 examples:
  1. Browser: Go to the tab selection screen
  2. Gallery: Close the current picture/video and go back to the gallery grid
  3. Email: Close the current e-mail and go back to the containing folder
  4. Media: Pause/resume playback
  5. Documents: Close the current document and go back to the document list


  • Top screen edge: Actions specific to the “chosen” app
    To stay with the example above: with this gesture, you can pause/resume playback of the Media app without having to leave your currently used app at all.
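
Detecting a double-tap at an edge is a small state machine: two quick taps, close together in time and space. A hedged sketch; the 300 ms and 40 px windows are assumptions.

    import time

    DOUBLE_TAP_MS = 300  # assumed: max time between the two taps
    SLOP_PX = 40         # assumed: max distance between the two taps

    class EdgeDoubleTapDetector:
        """Reports a double-tap when two quick taps land near the same edge spot."""
        def __init__(self):
            self.last_pos, self.last_time = None, 0.0

        def on_tap(self, pos, edge):
            now = time.monotonic()
            is_double = (self.last_pos is not None
                         and (now - self.last_time) * 1000 <= DOUBLE_TAP_MS
                         and abs(pos - self.last_pos) <= SLOP_PX)
            self.last_pos, self.last_time = (None, 0.0) if is_double else (pos, now)
            return f"double_tap_{edge}" if is_double else None

    det = EdgeDoubleTapDetector()
    det.on_tap(500, "right")         # first tap: nothing yet
    print(det.on_tap(510, "right"))  # quick second tap -> double_tap_right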

Tooltips for screen edge swipes and taps

To assist the user in knowing/remembering which gesture does what, tooltips are used. If the user places one finger at any of the screen edges without swiping or tapping, icons appear that tell what will happen if the user swipes in a certain direction from the current position or double-taps at this position. If the user lifts the finger without having moved it, the icons automatically disappear and no action is triggered.
In the case of the app-specific actions (bottom and top screen edges), these tooltips can also include additional information, such as the title, elapsed time and remaining time of the track currently played by the Media app.
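
The “place finger, see icons, lift to cancel” behaviour boils down to a hold timer plus a movement threshold. A minimal sketch; both values are assumptions:

    HOLD_MS = 350  # assumed: delay before the tooltip icons appear
    MOVE_PX = 15   # assumed: movement beyond this turns the touch into a swipe

    def edge_touch_outcome(held_ms, moved_px, tapped_twice=False):
        """Classify a finished touch at a screen edge, per the tooltip rules above."""
        if moved_px > MOVE_PX:
            return "swipe"       # finger moved: normal gesture handling takes over
        if tapped_twice:
            return "double_tap"
        if held_ms >= HOLD_MS:
            return "tooltips_shown_then_dismissed"  # icons appeared, nothing triggered
        return "ignored"

    print(edge_touch_outcome(held_ms=600, moved_px=3))  # tooltips, no action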


Swipes and taps on "Home"

Last but not least, this proposal also introduces additional gestures for the app covers.
Currently, we have the following possibilities on “Home”:

  • Tapping an app cover switches to that app.
  • Tapping the cover button of an app triggers the respective cover button action.
  • Long-pressing anywhere on the screen switches to “Close/Rearrange apps” view.
  • Swiping left or right from anywhere inside the screen takes you to “Events”.
  • Swiping up from anywhere inside the screen takes you to the app grid.

With the exception of “swipe up”, all of these gestures remain unmodified. For the “swipe up” gesture, it now makes a difference whether the swipe is done on a free spot inside the screen or on an app cover (a hit-test sketch follows the two lists below):

  • Swiping up on a free spot inside the screen takes you to the app grid.
  • Swiping up on an app cover will display up to 4 additional cover buttons (depending on the app).

A similar differentiation is made for the “swipe down” gesture:

  • Swiping down on an app cover dismisses the additional cover buttons (if they were displayed) or closes the app (if the additional cover buttons were not displayed).
  • Swiping down on a free spot inside the screen takes you to the top menu.
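
Whether an upward swipe on “Home” opens the app grid or reveals the extra cover buttons is then just a hit test against the cover rectangles. A sketch with invented names:

    def swipe_up_target(pos, cover_rects):
        """Decide what an upward swipe starting at `pos` should do on Home."""
        x, y = pos
        for app, (rx, ry, rw, rh) in cover_rects.items():
            if rx <= x < rx + rw and ry <= y < ry + rh:
                return f"show_extra_cover_buttons:{app}"  # started on a cover
        return "show_app_grid"                            # started on a free spot

    covers = {"browser": (40, 200, 300, 300), "media": (380, 200, 300, 300)}
    print(swipe_up_target((100, 350), covers))  # on the browser cover
    print(swipe_up_target((900, 900), covers))  # free spot -> app grid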

Comments


I like the way it is at the moment. I just found a patch to restore the MeeGo-style closing of apps. I don't mind adding more gestures, but I think it should be up to the user to decide which, let's say, type or style of gestures should be used.

Nature is pragmatic and as such it is economic - so keeping it simple is always a winning strategy and should be applied here as well. I already commented on this in another place. At least let me decide in the end what I want to use - for example, old style vs. new style. Having choice == having freedom!

thanks

deloptes ( 2019-01-14 02:13:33 +0200 )

While the idea is sound, a lot of gestures can confuse users and will require more mental power when using the phone.

addydon ( 2019-01-14 07:20:49 +0200 )

@addydon, not really.

Complex gestures and shortcuts are difficult only for a beginner and will help power users a lot.

juiceme ( 2019-01-14 08:44:35 +0200 )

I think overloading the UI with multiple gestures, making the outcome dependent on where exactly the gesture was performed, is a certain road to failure. If you remember the very first reviews by tech journalists when Jolla 1 came out, many of them just couldn't use the phone because they failed to grasp the concept of "swipe from outside the screen" versus "swipe from inside the screen". The community was more intelligent, but just recently I read here on TJC that somebody claimed he never managed to use the gesture for pulley menus.

In Sailfish 3, they moved in the direction you propose here with “1 gesture, 2 possible results”: both the top menu and closing an app are triggered by the same gesture. This was not a good decision, and it makes me see the top menu whenever I want to close an app. I'm certain that when moving my finger down the screen, I don't want to pay attention to where exactly it is, or to whether the gesture could be misread as “open pulley menu” when my intention was to scroll through a list.

ossi1967 ( 2019-01-14 11:13:29 +0200 )

I see it as a failure if system usage is geared towards the least common denominator. Should we really adopt the philosophy that the dumbest people should dictate how a UI is to be used???

At least there should be “advanced configuration options” that could be activated when a user wants to enable features that could confuse casual tech journalists and other complete idiots.

juiceme ( 2019-01-14 11:47:31 +0200 )

2 Answers


answered 2019-01-14 22:35:23 +0200 by figgis-diggis

updated 2019-01-14 22:49:49 +0200

Oh please no.

Especially double-taps on the screen edges. Mixed with the edge swipes this is gonna be one hell of a hurdle.

Current layout (four swipe directions with different results depending on whether the swipe started on the edge or in-app) is right on the brink of being overcomplicated, especially for newcomers (and, IMHO, with the top menu the top edge layout actually became overcomplicated).

As for cover swipes — we already had it in Sailfish 1. Turned out to be not exactly the best idea.

Gestures are great, but it doesn't mean you have to do everything with them. Personally, I think the system-wide gestures should follow the "one direction — one function" rule without any of the "turn right in the middle and open a new window instead of closing the current one" or "double-tap some empty space to launch settings" (or "start from the corner and the gesture result is different") stuff. The system-wide controls must be clear, coherent and predictable.

Of course, in-app gestures can be more complicated — I really wouldn't mind Opera-style tab closure in browser.


Comments

My wife can't do an edge swipe :-) but she needs only one app at a time, so after multiple attempts the phone is locked or she uses the button. You don't want to watch that :-) However, the along-the-edge swipe seems a good idea. Too much different behaviour between system-wide and in-app gestures is not good. But that is a question of what a new gesture does, not of whether it is there.

pawel ( 2019-01-15 02:38:54 +0200 )

@figgis-diggis: OT in a way, but just to add it: The cover swipes were a stroke of genius and worked exceptionally well, a lot better than the current tiny-button-madness. It is sad they had to make room for a concept that was never really implemented in practice, the 3-pane carousel with the so-called "partner space". A tragedy.

ossi1967 ( 2019-01-15 10:32:49 +0200 )

@ossi1967: I don't think that your comment is off-topic. I find the cover button concept very interesting. However, it wasn't mature enough back then:

  1. The fact that cover button actions could be triggered just by swiping over the app cover led to the risk of unintentionally triggering those actions.
  2. Having just 2 possible cover button actions per app made them unattractive for many use cases.


In my proposal, I therefore tried to overcome these "flaws":

  1. Cover button actions can only be triggered by tapping the respective cover button, not by swiping over the app cover. Result: No risk of unintentional triggers.
  2. Every app cover can have up to 5 different cover buttons (a main one that is displayed by default, and 4 additional ones that are displayed when swiping upwards on the respective app cover). Result: The whole concept becomes attractive for many more use cases.


The presence of those additional cover buttons can be indicated to the user by displaying a glowing line at the bottom of the respective app cover, similar to the glowing line that indicates the presence of a pulley menu.
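
Just to make the proposed interaction explicit, here is a tiny model of the “one default button plus up to four revealed ones” idea, including the two-stage swipe-down behaviour from the proposal. All names are hypothetical.

    MAX_EXTRA_BUTTONS = 4

    class AppCover:
        """Cover with one default button and up to four swipe-revealed extras."""
        def __init__(self, main_action, extra_actions=()):
            self.main_action = main_action
            self.extra_actions = list(extra_actions)[:MAX_EXTRA_BUTTONS]
            self.extras_shown = False

        def swipe_up(self):
            self.extras_shown = bool(self.extra_actions)  # reveal extras, if any

        def swipe_down(self):
            if self.extras_shown:
                self.extras_shown = False                 # only dismiss the extras
                return "dismissed_extras"
            return "close_app"                            # otherwise close the app

    cover = AppCover("play_pause", ["next", "previous", "shuffle", "repeat"])
    cover.swipe_up()
    print(cover.swipe_down())  # dismissed_extras
    print(cover.swipe_down())  # close_app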

BH ( 2019-01-16 01:02:19 +0200 )

answered 2019-12-31 19:09:03 +0200 by BH

Almost a year has passed since I posted my proposal here, and I am curious whether any of those ideas have found their way into the SFOS development pipeline.

One thing that goes in the direction of the proposed ideas is the introduction of a letter bar at the right edge of the screen in the People app. I think this makes the app more accessible to people who are new to SFOS and should therefore be presented by default. However, users who want the “pure” Sailfish user experience (“All screen real estate is yours.”, remember?) could IMHO get much more.

To stay with the example of the letter bar: this bar could be displayed only while the user performs the assigned swipe-at-the-screen-edge gesture (read my original post) and automatically disappear afterwards.

Yes, I fully agree: people coming to Sailfish from iOS or Android will likely be scared away if they are confronted out of the box with a UX that has no permanently visible UI elements. But that’s what makes SFOS so unique. What you don’t need for your current task is simply not displayed, but always immediately available if you know where to put your thumb.

Therefore, as an addition to my original proposal, I suggest that 2 standard UX configurations be implemented (technically more correct: 2 parameter sets), from which the user can choose:

  • Easy UX: This is the default configuration for fresh SFOS installations. Similar to how it is currently implemented, more UI elements are permanently displayed (such as the letter bar in the People app). The learning curve is gentle, and people who don’t want to learn a new user experience will feel at home more quickly.
  • Pure UX: All screen real estate is yours. No cluttering with UI elements (but again: they immediately appear when you want to use them and immediately disappear once you are done with them). Heavily gesture-based (read my original post). Ideally, everything can be operated with just one thumb.

Of course, since these 2 configurations are just parameter sets representing the “extremes”, all users can fine-tune them to their liking and physical abilities (more/fewer gestures, more/fewer permanently displayed UI elements, and so on). A sketch of this preset-plus-override idea follows.
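
Since the two configurations are explicitly described as parameter sets, they could literally be two dictionaries of defaults that the user's individual settings then override. A sketch; every parameter name here is invented.

    EASY_UX = {
        "edge_swipes_at": False,       # no along-the-edge gestures by default
        "edge_double_taps": False,
        "permanent_letter_bar": True,  # e.g. the letter bar in the People app
    }
    PURE_UX = {
        "edge_swipes_at": True,
        "edge_double_taps": True,
        "permanent_letter_bar": False,  # appears only while the gesture is performed
    }

    def effective_settings(preset, overrides):
        """Start from a preset and apply the user's fine-tuning on top."""
        return {**preset, **overrides}

    print(effective_settings(PURE_UX, {"edge_double_taps": False}))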

Then, Sailfish OS will become more attractive for more people:

  • It reaches out to those who want to get rid of Apple or Google without forcing a potentially unfamiliar user experience on them.
  • At the same time, it preserves and further develops its unique user experience that makes it stand out from the big mobile operating systems.

Have a happy new year!


Comments

I see a major issue with your thought process: you are discounting the factors of screen size and one-handed use.
Currently swipes from the left and right edge do the same thing and most of the UI is structured in such a way that you only rarely need to swipe from the top or bottom - basically only for closing an app.
Try holding one of the larger phones one-handed and swiping with your thumb from the side opposite your thumb. This is why both left and right edge swipes do the exact same thing.
Personally, any on-edge swipes would make my phone basically unusable. (The exception is the quick scroll bar in the contacts app, which I like, but which is such a rare interface item that it should stay visible for discoverability reasons - and which also needs a way to switch to left-handed mode: as a right-handed person, it would be impossible for me to use it one-handed if it were on the left side.)
I have already been struggling with current phone sizes and one of the aspects of Sailfish that appealed to me was that it allows me to use even my (by my standards) humongous Xperia XA2 mostly with just one hand.

And this is all without even throwing in the concerns about complexity and the like.
I do, however, like the idea of revisiting the cover interactions.

Anon ( 2020-01-01 16:00:41 +0200 )

@Anon: FYI: Take a look at Settings > Gestures. There you can configure the left edge swipe to behave differently compared to the right edge swipe.

My thumb can easily reach every edge of my phone's display when I hold it in the same hand, but your mileage may vary of course. And I agree with you: On large displays, it will get difficult and inconvenient.

However, the aim of all proposed additional gestures is not to replace already implemented methods, but to provide shortcuts and enhance accessibility. You are not forced to use them if you don't want to use them or cannot use them conveniently.

For example: You can disable all swipe-at-the-edge gestures. No problem. You can still zoom content using 2 fingers (but you need to use your second hand to be able to do it). You can still skip tracks in the media player (but you need to switch to it first). You can still access any pulley menu (but you need to scroll all the way up/down first). And so on...

It's like on a desktop PC: you can use only the mouse if you prefer, or only keyboard shortcuts, or both in combination. It's completely up to you. But you can only do these things if the software provides them, thus giving you freedom of choice.

BH ( 2020-01-01 22:37:52 +0200 )