Oh, I just discovered that since version 1.14 of /e/OS (degoogled Android), we have a FOSS TalkBack (Android's screen reader, but degoogled)!!!
It's so great!!!
🥳🎉🎊
So one of the things for me about Android is the TTS engines. Apple has, basically, three speech engines built into every one of their operating systems: Vocalizer, which is what VoiceOver starts off using; MacinTalk, which is Alex, Fred, and all that; and Eloquence. Three different ways of speaking, pronunciation sets, all that. Actual choice.
On Android, though, a Pixel comes with Google TTS. And if you've ever been somewhere with no Internet and heard a low-quality, robotic voice from Google Maps, you've heard what we have to deal with on a Pixel every single day unless we get something different. So on Android, there are a few more options: RHVoice, which honestly doesn't sound so good to me in English; eSpeak, which is as robotic as you can get and was last updated on Oct 23, 2022 (almost a year ago); Vocalizer, which had its last update on Oct 30, 2021 (which is better than I thought but still feels unmaintained); and that's about all I know of. On Samsung phones, you get Samsung TTS out of the box, and it's pretty good. Of course, then you get Samsung's TalkBack, Samsung's version of everything, but also all the goodies that come with Samsung phones. Oh, and Samsung TTS has a longer pause between everything because it was made for reading text aloud, not for screen reading, so everything feels slower than it is.
So it's really sad. Eloquence is still a 32-bit app, so it will not work on the newest Pixels. Google TTS's newer local models are sluggish with TalkBack and cannot speak quickly, as many have found out when working around the fact that TalkBack doesn't use the newer model natively. And it's sluggish when reading long pieces of text, like this one. And there iOS is, with tons of voices to choose from. And I get it, I should be thankful that we even have eSpeak, but when you come home from a stressful day at work, what do you want to hear?
I'm not gonna lie, using YouTube Music, at least the audio player, is really nice. When a song starts playing, TalkBack tells me what the song is. I know, some people won't like that, and I hope it becomes configurable. But when I go to the next-song button, then lock the screen, then unlock it and get back into the app, TalkBack focus is still on the next button, ready for me to just double tap. I can also use the Good Lock Sound Assistant module to set a long press of the volume buttons to go to the previous or next track. Ugh, I just love that kind of stuff. I can't wait for TalkBack to catch up to VoiceOver.
I've been reacquainting myself with #Android #accessibility this last week after mostly ignoring it for a decade, and have some thoughts about my experiences.
The good parts so far are that it's considerably easier to set up a new device as a #blind person, and thanks to the recently added #TalkBack #Braille keyboard, I haven't needed to hook up a Bluetooth keyboard at all. Having grown accustomed to Apple's Direct Touch experience with their on-screen keyboard, I find Android's to be unwieldy and slow by comparison. The dedicated number row at the top is nice, though, and is something Apple should have been doing years ago.
On the topic of #Braille, I'm glad it's now baked in and that somebody at #Google finally dispensed with the old, ridiculous commands and added usable, memorable defaults. A huge flaw, however, is that if you're typing in a text field, none of the #TalkBack navigation commands work, except for panning the display back and forth. What the actual F, #Google?
To compound the problem, #TalkBack #Braille has no command to stop interacting with a control, so it's genuinely possible to get stuck. Even so, #Android #Braille is at a level I would personally deem acceptable nowadays: a statement I never thought I'd write.
Another thing I never thought I'd say is that for accessibility and efficiency, Gmail appears to be the winner among the mail clients I've re-evaluated (including K-9 Mail and AquaMail). If the latter start supporting #TalkBack actions, I might change my mind.
Audio of me showing that TalkBack speaks while I'm trying to talk to the Google Assistant; I eventually ask it to play some music, and it plays a nice piano remix of Chrono Trigger's Corridor of Time track on YouTube Music. The description is here because no Mastodon client for the blind can add media descriptions, that I know of.
We're showing you the security risks of AI-generated texts. In one case, we fed a well-known AI tool some information and inserted its answer unchanged for you. We also had our editor write two texts using the same information.
The differences are not easy to spot. So be attentive, especially with emails, even if at first glance they don't look like spam.
What do you think: A, B, or C? Which text was written by the AI?
Time to see how well I can replicate the #Chromevox and/or #Talkback keyboard experience with the #jieshuo screen reader's 'Hotkey Scheme' feature. The default has 'waaaaaaaay' too many conflicts in my view (particularly if using web browsers that support keyboard shortcuts).
And since people are able to share said keyboard layouts, has anyone thought of creating a #WindowEyes or #SuperNova/#Hal layout?
So I am loving #talkback 14.0, which rolled out yesterday and is almost entirely focused on #braille, #BrailleScreenInput, and #BrailleDisplay support, including additional editing functions for displays, granular control, and additional gestures within braille input.
So, I know I like to mrrrr at TalkBack a bit here and there, but one of the nicest things they added in TalkBack 14 is Braille element descriptions. That full cell and dots 3-6 after it? That actually means something: an element you can tap and hold, using a Braille command, Space with Enter or something like that. And you don't have to go to the commands list just to figure out what commands you have; there's a categorized list of a good 50 commands now. No, it's not perfect, but the idea of documenting everything for the user, within the screen reader, is just... well, it's nice. So nice not to have to scrub every corner of the screen-reading experience just to find out what's new or what something means.
So, there's a version of TalkBack 14 going around that works with Android 13. I thought I'd test out the NLS eReader support. So I plugged in a USB-C to USB-C cable, from the phone to the Braille display. Nothing worked. I did hear the phone say "USB connector connected" and "USB connector disconnected" a few times, though, so I thought maybe it was a short in the wire. Since I didn't have any more C-to-C cables, I plugged it into my little USB hub thing. It still didn't work. On a burst of inspiration, I tried plugging a power cable into one of the C connectors. And then it worked. It was slow, as expected from a beta, but it worked fairly well.
Now, to use this, I'll have to be connected to power until I can find another C-to-C cable. Hopefully that'll work. Still, it's super sad that Google doesn't communicate enough internally to simply put Braille HID support into a Play system update and there we go. I am glad, though, that they now have a place in Braille settings where they describe what each element stands for. I don't think VoiceOver has that.
Hey @objectinspace Sorry, about the whole Google Assistant thing, I was wrong. I turned "show spoken feedback" on, and it did the same thing. When on speaker, TalkBack doesn't talk over you while you're speaking to Assistant. With headphones in, it does talk over you. Bah. I thought I'd found a cure.
I'd love to know why, 9 times out of 10, when I try to get #TalkBack to describe a highlighted image, it doesn't work. It seems like a poorly implemented feature, or user error on my part; I don't know which.
Mastodon servers can report information about the hashtags that are trending, and #Tusky now shows this information for you. It's accessible from the left-hand navigation menu, and you can also add it as a dedicated tab.
I've attached a screenshot so you can see what it's like.
@nikclayton @Tusky Y'all do amazing work. May more #Fediverse & #Threadiverse applications follow your lead, especially with adding custom accessibility actions within menus for #Talkback & other screen reader apps such as #Jieshuo.
Action menus (an Android #Accessibility feature) help greatly with replying, boosting/reblogging, etc., making the app all the more useful, especially if using a Bluetooth keyboard.
Curious what #blind people are using for #accessible #Mastodon apps on #Android? I'm using the stock app at the moment, but it's a bit meh.
@philsherry It's quite long-winded to use with #Talkback. It doesn't make use of TalkBack's actions, i.e. like/boost/reply, etc. Navigating through it would be far quicker if it did.
Quick thought, scrolling on Android is like turning a page. You quickly swipe right to grab the current page, then left to turn to the next one. Left to grab the previous page, and right to turn it. This is not meant to be exact or anything, just an observation and way for me to remember before I forget. And I know I’ll forget if I don’t write it down.
What's new with TalkBack 14.0 (support.google.com)