The following proposal aims at making the gesture-based user experience of Sailfish OS more consistent throughout the entire system and at eradicating the need for “makeshift” solutions in certain apps that do not fit into the basic concepts of this user experience or hamper the “all screen real estate is yours” paradigm.
Although some of the described gestures may already exist as patches, having them natively (out of the box) from Jolla in Sailfish OS would be best.
In Sailfish OS, we currently have “Swipe from the screen edge towards the screen center” gestures:
In addition to those “Swipe FROM the screen edge” gestures, this proposal introduces “Swipe AT the screen edge” gestures. They are performed by simply placing one finger at the desired screen edge and moving it along this edge (instead of moving it towards the screen center). Here is what they'll do:
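Purely as an illustration, here is a minimal sketch (in plain Python rather than the platform's actual QML/compositor code) of how a completed drag could be classified as either of the two edge gestures. The 20 px edge zone, the 40 px travel threshold and all names are assumptions made for this sketch, not part of the proposal:

```python
from dataclasses import dataclass

EDGE_ZONE_PX = 20      # assumed width of the edge-detection zone, in pixels
MIN_SWIPE_PX = 40      # assumed minimum travel before a swipe is recognised


@dataclass
class Touch:
    x: float
    y: float


def classify_edge_swipe(start: Touch, end: Touch,
                        screen_w: float, screen_h: float) -> str:
    """Return 'from-edge', 'at-edge' or 'none' for a completed drag.

    'from-edge' - the existing Sailfish gesture: movement is mainly
                  perpendicular to the edge, towards the screen center.
    'at-edge'   - the proposed gesture: movement is mainly parallel
                  to the edge the drag started on.
    """
    dx, dy = end.x - start.x, end.y - start.y

    # Which edge (if any) did the drag start on?
    if start.x <= EDGE_ZONE_PX:
        edge = "left"
    elif start.x >= screen_w - EDGE_ZONE_PX:
        edge = "right"
    elif start.y <= EDGE_ZONE_PX:
        edge = "top"
    elif start.y >= screen_h - EDGE_ZONE_PX:
        edge = "bottom"
    else:
        return "none"

    # Split the travel into components perpendicular and parallel to that edge.
    if edge in ("left", "right"):
        perpendicular, parallel = abs(dx), abs(dy)
    else:
        perpendicular, parallel = abs(dy), abs(dx)

    if max(perpendicular, parallel) < MIN_SWIPE_PX:
        return "none"
    return "from-edge" if perpendicular > parallel else "at-edge"


# A drag that starts at the left edge and moves mostly downwards would be
# classified as the new "swipe AT the screen edge" gesture.
print(classify_edge_swipe(Touch(5, 100), Touch(15, 300), 1080, 1920))  # at-edge
```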
Introducing “TAP at the screen edge” gestures adds another set of actions. To keep the user experience simple, this proposal only considers double-taps. As with the swipes, every screen edge triggers a different action when double-tapped:
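Again only as an illustration, a rough sketch of how double-taps at a screen edge could be recognised; the 0.4 s double-tap timeout, the 30 px distance tolerance and the detector class are hypothetical and just show the idea:

```python
import time

EDGE_ZONE_PX = 20          # assumed edge-zone width, in pixels
DOUBLE_TAP_TIMEOUT = 0.4   # assumed maximum gap between the two taps, seconds
MAX_TAP_DISTANCE = 30      # assumed maximum distance between the two taps


class EdgeDoubleTapDetector:
    """Tracks taps and reports when two quick taps land on the same edge."""

    def __init__(self, screen_w, screen_h):
        self.screen_w, self.screen_h = screen_w, screen_h
        self._last_tap = None  # (timestamp, x, y, edge)

    def _edge_of(self, x, y):
        if x <= EDGE_ZONE_PX:
            return "left"
        if x >= self.screen_w - EDGE_ZONE_PX:
            return "right"
        if y <= EDGE_ZONE_PX:
            return "top"
        if y >= self.screen_h - EDGE_ZONE_PX:
            return "bottom"
        return None

    def on_tap(self, x, y, timestamp=None):
        """Feed a single tap; returns the edge name on a double-tap, else None."""
        timestamp = time.monotonic() if timestamp is None else timestamp
        edge = self._edge_of(x, y)
        if edge is None:
            self._last_tap = None
            return None

        last = self._last_tap
        self._last_tap = (timestamp, x, y, edge)
        if (last is not None
                and last[3] == edge
                and timestamp - last[0] <= DOUBLE_TAP_TIMEOUT
                and abs(x - last[1]) <= MAX_TAP_DISTANCE
                and abs(y - last[2]) <= MAX_TAP_DISTANCE):
            self._last_tap = None          # consume the pair
            return edge                    # e.g. trigger the bottom-edge action
        return None


detector = EdgeDoubleTapDetector(1080, 1920)
print(detector.on_tap(540, 1915, timestamp=0.00))   # None (first tap)
print(detector.on_tap(545, 1912, timestamp=0.25))   # 'bottom' -> double-tap
```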
To help the user know and remember which gesture does what, tooltips are used. If the user places one finger at any of the screen edges without swiping or tapping, icons appear that indicate what will happen if the user swipes in a certain direction from the current position or (double-)taps at this position. If the user lifts the finger without having moved it, the icons automatically disappear and no action is triggered.
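The press, hold and release behaviour described above could be modelled roughly as in the following sketch; the 0.3 s hold delay and the 15 px jitter tolerance are assumed values chosen for the example, not values from the proposal:

```python
TOOLTIP_DELAY = 0.3     # assumed hold time before the hint icons appear, seconds
MOVE_TOLERANCE = 15     # assumed jitter allowance before the hold counts as a swipe


class EdgeTooltipHint:
    """Minimal press/hold/release state machine for the proposed edge tooltips."""

    def __init__(self):
        self.visible = False
        self._down = None          # (timestamp, x, y) of the initial touch

    def on_press(self, x, y, t):
        self._down = (t, x, y)
        self.visible = False

    def on_update(self, x, y, t):
        if self._down is None:
            return
        t0, x0, y0 = self._down
        moved = abs(x - x0) > MOVE_TOLERANCE or abs(y - y0) > MOVE_TOLERANCE
        if moved:
            # The finger is swiping: hand over to gesture handling, no tooltip.
            self.visible = False
            self._down = None
        elif t - t0 >= TOOLTIP_DELAY:
            # Finger is resting on the edge: show the per-direction hint icons.
            self.visible = True

    def on_release(self, x, y, t):
        # Lifting without having moved: hide the icons, trigger no action.
        self.visible = False
        self._down = None


hint = EdgeTooltipHint()
hint.on_press(5, 500, t=0.0)
hint.on_update(7, 502, t=0.35)
print(hint.visible)        # True - icons are shown while the finger rests
hint.on_release(7, 502, t=0.5)
print(hint.visible)        # False - no action was triggered
```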
In the case of the app-specific actions (bottom and top screen edges), these tooltips can also include additional information, such as the title, elapsed time and remaining time of the track currently playing in the Media app.
Last, but not least, this proposal also introduces additional gestures for the app covers.
Currently, we have the following possibilities on “Home”:
With the exception of “swipe up”, all of these gestures remain unmodified. For the “swipe up” gesture, it now makes a difference whether the swipe is done on a free spot on the screen or on an app cover:
A similar differentiation is made for the “swipe down” gesture:
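To illustrate how both differentiations could work, here is a rough sketch that hit-tests the starting point of a vertical swipe on Home against the cover grid; the cover geometry, the helper names and the placeholder actions are invented for the example and do not prescribe the actual actions of the proposal:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Cover:
    """Geometry of one app cover on the Home screen (hypothetical layout data)."""
    app: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def cover_at(covers: List[Cover], x: float, y: float) -> Optional[Cover]:
    """Hit-test the swipe's starting point against the cover grid."""
    for cover in covers:
        if cover.contains(x, y):
            return cover
    return None


def dispatch_home_swipe(direction: str, x: float, y: float,
                        covers: List[Cover]) -> str:
    """Pick the action for a vertical swipe on Home.

    Swipes that start on a free spot keep a system-wide meaning; swipes that
    start on an app cover become cover-specific actions. The concrete action
    strings below are placeholders for the proposal's lists.
    """
    cover = cover_at(covers, x, y)
    if direction == "up":
        return f"cover-specific action for {cover.app}" if cover else "system-wide swipe-up action"
    if direction == "down":
        return f"cover-specific action for {cover.app}" if cover else "system-wide swipe-down action"
    return "ignored"


covers = [Cover("Media", 60, 300, 400, 400), Cover("Browser", 540, 300, 400, 400)]
print(dispatch_home_swipe("up", 700, 450, covers))   # cover-specific action for Browser
print(dispatch_home_swipe("up", 200, 1400, covers))  # system-wide swipe-up action
```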