
[Idea] Make Jolla accessible for blind people

asked 2014-02-14 10:07:19 +0300

KRM

There are some projects for iOS and Android to make smartphones accessible for blind people; see e.g. the eyes-free project. I am not sure whether this is just a question of apps or whether something can be done in the system itself. Wouldn't that be really unlike?



This is a great idea indeed, and a special TOH would allow many possibilities. I'd give 5 votes if I could.

t0mps0 ( 2014-02-14 13:21:09 +0300 )

@chemist you should really turn that into an answer - awesome idea

Stskeeps ( 2014-02-15 21:48:30 +0300 )

I think Jolla does not care that much about handicapped people. Some time ago there was a guy with a handicap presenting the Jolla 1. He could do it very well, because Sailfish OS 1 was simply ingenious for one-handed use. After the change to Sailfish OS 2 (the tablet version) it was no longer possible to use the phone safely with one hand, because it could accidentally be dropped. So I think they were so excited about the tablet that they just forgot their roots.

And if your eyes are anything less than perfect, Sailfish OS is just horrible: it is not even possible to increase the font size, or to see the links in the email app, with its white background and ambience-coloured URLs.

I do not see much effort in this direction...

poddl ( 2016-12-23 02:23:16 +0300 )

8 Answers


answered 2014-02-14 11:12:16 +0300

chemist

updated 2014-02-16 03:10:04 +0300

Braille TOH - I have Mr. Knopper's (of Knoppix) wife in mind; she did some work on braille displays afaik.





A Braille TOH is something I've also been thinking about since I saw kimmoli's OLED TOH. I already challenged my watchmaker friend to think of different ways to pack the 6-dot mechanism (or 8 dots, as in computer braille displays) into as thin a space as possible. Watchmakers tend to be able to utilize and implement micro-mechanics in clever ways. Let's see if that could lead to ideas for making, say, an 8-character Braille TOH one day.

tw ( 2014-02-16 14:00:46 +0300 )

http://spie.org/x37076.xml?ArticleID=x37076 this sounds cool, but I'm a little skeptical about the activation voltage, 5.68 kV

kimmoli ( 2014-02-16 17:56:39 +0300 )

answered 2014-02-16 02:35:39 +0300

sebsauer

Qt and QtQuick come with proper support out of the box. What we need is the at-spi D-Bus daemon [1] and the QAccessible plugin [2]. Using that framework it would be possible to write all kinds of accessibility tools, including a screen reader (using something like speech-dispatcher) [3].

[1] http://www.linuxfoundation.org/collaborate/workgroups/accessibility/atk/at-spi/at-spi_on_d-bus
[2] Delivered together with Qt 5 as the official a11y Linux backend; see http://community.kde.org/Accessibility/qt-atspi
[3] https://blogs.kde.org/2012/09/08/accessibility-kde-410-and-beyond - a proper Qt at-spi client-helper library for writing all kinds of agents exists, iirc in kde-playground; it depends only on qmake + QtCore + QtDBus and works well.
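As a rough illustration of what a screen-reader agent would do once it receives the accessibility tree, here is a toy pure-Python sketch. This is not the real at-spi API (which is queried over D-Bus); the node roles and names are invented for illustration only:

```python
# Toy model of a screen-reader agent walking an accessibility tree
# and producing the phrases a speech backend (e.g. speech-dispatcher)
# would speak. Roles/names are invented, not real QAccessible values.
from dataclasses import dataclass, field

@dataclass
class A11yNode:
    role: str                       # e.g. "button", "label", "list item"
    name: str                       # accessible name exposed by the app
    children: list = field(default_factory=list)

def announce(node):
    """Depth-first walk returning the utterances in reading order."""
    phrases = [f"{node.role}: {node.name}"]
    for child in node.children:
        phrases.extend(announce(child))
    return phrases

ui = A11yNode("window", "Messages", [
    A11yNode("list item", "New message from Anna"),
    A11yNode("button", "Reply"),
])
print(announce(ui))
```

In a real agent, each phrase would be queued to the speech synthesizer instead of printed, and the tree would be refreshed via at-spi events as the UI changes.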


answered 2016-12-21 19:07:05 +0300

PhDore

Making Sailfish OS based devices accessible to blind users is certainly feasible, but the required development effort could be quite large relative to the number of interested people. In this thread I have read interesting ideas addressing both aspects: device feedback and interaction for non-sighted users.

Information presentation

Screen readers are responsible for presenting to visually impaired users what is displayed on the screen. All of them rely on the accessibility technology (AT) provided by the UI; the most popular are Windows MSAA, GNOME/Linux AT-SPI, and the accessibility interfaces provided by Apple. Most screen readers provide feedback through text-to-speech and are able to handle braille display terminals. Among them are JAWS, NVDA, TalkBack and VoiceOver.

Even if the AT provides interfaces and mechanisms to the screen reader, this can only work properly if the home screen and the apps follow a set of accessibility guidelines.

Interacting with the device

Today the question is more how to interact with a touchscreen device when you have no real idea where the objects are located on the screen. This requires a new set of gestures when accessibility is on, and it has to be done without impacting, as much as possible, the apps or the home screen. The main principles come from well-known mouse usage: one tap to select, two to activate. Gestures to navigate from one displayed object to another have to be added, and gestures that act on the object holding the "focus" may be performed anywhere on the touchscreen. The only acceptable exceptions are references to screen edges and corners.

As far as I know, only JAWS, NVDA, TalkBack and VoiceOver handle gestures and translate them into regular app UI events. Personally I have had the opportunity to try both a Nexus 7 running Android 4.4 and an iPhone 4. While both have weaknesses, the one I was able to use for 3 months without additional equipment was the iPhone; with TalkBack I was unable to edit text even in the tutorial app. Not surprising, since Apple claims that everybody can use their handset with the "screen curtain" on. Voice command, already available on various platforms, is very helpful, but it cannot be considered the primary input method since it cannot be used in all circumstances.
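The tap/double-tap focus model described above can be sketched in a few lines of Python. All gesture and object names here are invented for illustration; this is not a real Sailfish or screen-reader API:

```python
# Sketch of accessibility-mode gestures: flicks move the reading focus,
# a single tap announces the focused object, and a double tap activates
# it from anywhere on the touchscreen.
class AccessibleFocus:
    def __init__(self, items):
        self.items = items          # on-screen objects in reading order
        self.index = 0              # currently focused object

    def handle(self, gesture):
        if gesture == "flick_right":            # move to next object
            self.index = min(self.index + 1, len(self.items) - 1)
            return f"focus: {self.items[self.index]}"
        if gesture == "flick_left":             # move to previous object
            self.index = max(self.index - 1, 0)
            return f"focus: {self.items[self.index]}"
        if gesture == "tap":                    # announce current object
            return f"selected: {self.items[self.index]}"
        if gesture == "double_tap":             # activate, position-independent
            return f"activated: {self.items[self.index]}"
        return "unknown gesture"

nav = AccessibleFocus(["Phone", "Messages", "Settings"])
```

Because the double tap acts on whatever holds the focus, it can be performed anywhere on the screen, which is exactly what makes the scheme usable without sight.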

Danger of home screen replacement

The idea of replacing the Sailfish home screen with an audio desktop (or any other alternative) addresses only desktop accessibility and will not address app accessibility. Going this way may result in building a complete software suite that replaces the regular phone apps. Such a suite would not benefit from an already widely deployed one and would offer limited features compared to regular applications. Maintaining the accessibility part of a regular application can be expected to require less effort than maintaining a full alternative to it.

Braille vs. speech

Even if braille may be preferred to speech by a large number of blind users, it still requires an external device connected to the handset. While that is not a problem with a computer (even a laptop), using one with a smartphone is not so easy (e.g. on public transport), and the idea is that such a small device is primarily intended to be used standalone. Even though there are research initiatives around enhanced touchscreens with haptic capabilities, allowing physical buttons/dots to appear on the surface as in the BLITAB prototype, those alternatives are not currently available at an affordable price.

Sailfish advantages in addressing accessibility for blind users

First, since the Sailfish OS UI is based on Qt, it will be possible to take advantage of what is already being done there in terms of accessibility. Second, for ergonomic reasons, the Sailfish UI already makes use of gestures that are not tied to a particular screen area, like the edge swipe, which bodes well for accessibility.
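The point about edge swipes can be made concrete with a small sketch that classifies a touch from its coordinates alone, so the gesture works no matter where on-screen objects sit. The screen size and the 20 px edge band below are made-up values, not Sailfish constants:

```python
# Sketch: position-independent edge-swipe classification. A gesture that
# starts in an edge band is a system gesture regardless of the UI layout,
# which is what makes it usable without seeing the screen.
WIDTH, HEIGHT, EDGE = 540, 960, 20   # illustrative values only

def classify(start, end):
    """Classify a touch by its start/end coordinates (x, y)."""
    x0, y0 = start
    x1, y1 = end
    if x0 <= EDGE and x1 > x0:
        return "edge swipe from left"
    if x0 >= WIDTH - EDGE and x1 < x0:
        return "edge swipe from right"
    if y0 <= EDGE and y1 > y0:
        return "edge swipe from top"
    return "in-app gesture"

print(classify((5, 400), (300, 400)))
```

A gesture set built on edges and corners needs no knowledge of object positions, which is why the answer singles it out as a good starting point for a non-visual interaction mode.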




I think voice commands are the natural way this would work together with swipes. For example, when you get a call:

call incoming

the phone says "swipe up to answer"

new email in inbox

the phone says "you got mail, swipe up and I will read the most recent email"; after reading it, the phone can tell you "you have three additional unread emails from so-and-so, so-and-so 2 and so-and-so 3; tap once for so-and-so, twice for so-and-so 2".

Also, at any point gestures should allow repeat, skip, delete and create (write, call, watch, find, search etc.).

Jolla should do these things better; they seem somewhat mechanical and clumsily implemented in the Microsoft and Apple OSes. VoiceOver is horrible and doesn't allow good gesture interactions in addition to voice interactions.

DarkTuring ( 2016-12-23 02:46:04 +0300 )
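The notification-driven prompts suggested in the comment above could be modelled roughly like this. All strings, gesture names and the `PROMPTS` table are illustrative, not a real Sailfish API:

```python
# Sketch: map each notification type to a spoken suggestion plus the
# gestures it accepts; an unknown gesture makes the phone repeat the
# prompt instead of doing something unexpected.
PROMPTS = {
    "incoming_call": ("swipe up to answer, swipe down to decline",
                      {"swipe_up": "answer", "swipe_down": "decline"}),
    "new_email":     ("you got mail, swipe up and I will read it",
                      {"swipe_up": "read_aloud", "double_tap": "skip"}),
}

def on_notification(kind, gesture=None):
    utterance, actions = PROMPTS[kind]
    if gesture is None:
        return utterance                       # speak the suggestion first
    return actions.get(gesture, "repeat")      # unknown gesture: repeat prompt

print(on_notification("incoming_call"))
```

The key design point from the comment is that the phone always announces the available gestures before expecting one, so the user never has to guess the current interaction state.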

answered 2014-02-16 17:29:48 +0300

this post is marked as community wiki

This post is a wiki. Anyone with karma >75 is welcome to improve it.

updated 2014-02-16 17:29:48 +0300

MSameer

I personally had this in mind. The braille screen is one idea, but it does not stop there.

If we could have a special voice assistant mode that would at least read the menus of various applications, text messages, etc., that would be good too.



Knoppix ADRIANE comes with braille support, but iirc most of the implementation is audio-assisted

chemist ( 2014-02-17 13:08:23 +0300 )

answered 2014-02-15 21:50:30 +0300

Alternatively, there's nothing stopping anybody from replacing the home screen with another one, and with the same middleware building haptic feedback or other interesting touch interfaces for the phone. The framework is called 'lipstick', and you use QML to build home screens on it.

What requirements would you see to make it more accessible to people who are blind or vision impaired?



This can get long: simple things like contrast-constraint settings (ambiences would have specific rules about readability, fonts, etc. which should be enforced in this mode); global and per-app font settings; button contrast changes; haptic feedback on system nav elements; voice-over on nav and frame elements, separate from text-to-speech content; and probably some ability to plug into existing assistive systems (JAWS, etc.) when connected to PC, TV or kiosk interfaces.

arjwright ( 2014-02-15 22:03:49 +0300 )
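The contrast-constraint idea above is easy to make concrete: WCAG 2.x defines a relative-luminance formula and a minimum contrast ratio of 4.5:1 (level AA) for normal text, which an ambience checker could enforce. A minimal sketch, with example colour values:

```python
# Sketch: enforcing a contrast constraint on ambience colours using the
# WCAG 2.x relative-luminance formula; 4.5:1 is the level-AA threshold
# for normal-size text.
def luminance(rgb):
    """Relative luminance of an sRGB colour given as 0-255 components."""
    def channel(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def passes_aa(fg, bg):
    return contrast_ratio(fg, bg) >= 4.5

# black on white gives the maximum possible ratio of 21:1
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))
```

An accessibility mode could run this check against an ambience's text and background colours and reject (or auto-adjust) combinations such as the ambience-coloured links on white mentioned earlier in the thread.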

answered 2015-02-02 10:10:27 +0300

tw

Just read about the BLITAB. How cool would a BLITAB TOH be!



answered 2017-01-15 14:18:25 +0300

PhDore

I agree with your voice/gesture interaction approach. This suggestion mechanism could primarily be used to handle notifications: depending on the environment, the user interacts either with the suggested gesture or through voice commands, and the suggestion corresponds to the most likely action.


answered 2017-01-16 09:01:37 +0300

N9Sailfish

Jolla should also bear in mind senior phones, in addition to blind people. How would a Doro-style phone look with Sailfish OS?





Asked: 2014-02-14 10:07:19 +0300

Seen: 1,078 times

Last updated: Jan 16 '17