🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
Admin
completely blind computer geek, lover of science fiction and fantasy (especially LitRPG). I work in accessibility, but my opinions are my own, not that of my employer. Fandoms: Harry Potter, Discworld, My Little Pony: Friendship is Magic, Buffy, Dead Like Me, Glee, and I'll read fanfic of pretty much anything that crosses over with one of those.
keyoxide: aspe:keyoxide.org:PFAQDLXSBNO7MZRNPUMWWKQ7TQ
Location
Ottawa
Birthday
1987-12-20
Pronouns
he/him (EN)
xmpp fastfinge@im.interfree.ca
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @matt @tunmi13 And if you want to get a sense of how unsatisfactory live regions are, compare Mudlet, which uses live regions to read new text, with MUSHclient running MUSHreader, which uses the screen reader API. Notice how Mudlet misses some text if it comes in too fast, doesn't always read text, and can't control whether new text interrupts the previous text it sent or is added to the end of the queue. MUSHreader has none of these problems.
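The queue-versus-interrupt distinction is the crux here: a direct screen reader API lets the caller decide, per message, whether to cut off current speech or append to the queue, while a live region offers no such control. A minimal sketch of those semantics (the `SpeechQueue` class is illustrative only, not any real screen reader's interface):

```python
# Illustrative model of the per-message control a screen reader API
# gives a client like MUSHreader; not any real screen reader's API.

class SpeechQueue:
    """Queue of pending utterances, with optional interruption."""

    def __init__(self):
        self.pending = []

    def speak(self, text, interrupt=False):
        # interrupt=True: drop everything queued and start fresh --
        # useful for urgent alerts that make stale text worthless.
        # interrupt=False: append, so no text is ever lost.
        if interrupt:
            self.pending.clear()
        self.pending.append(text)

q = SpeechQueue()
q.speak("You enter the tavern.")
q.speak("A goblin attacks!", interrupt=True)   # urgent: cancel stale text
q.speak("You take 3 damage.")                  # queued after the alert
```

A live region, by contrast, only gives the author `polite`/`assertive` hints and leaves queuing, interruption, and delivery timing to the browser and screen reader, which is where the dropped-text behaviour comes from.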
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @matt @tunmi13 As an example, the person I'm currently in a meeting with has three monitors. One for the meeting, one for social media and dashboards, and one for what she's working on. Invisible interfaces and alerts from non-foreground apps are the only way I have to be even slightly as fast as her. And I'm already slower at a lot of things, because of the nature of inaccessible GUIs, so further friction and speed decreases would not be acceptable at all. If I didn't have these features I guess I'd have to have three laptops and a mixer? I don't know.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @matt @tunmi13 Apple kind of does with its allow script control checkbox. But it should be on a per app basis. I agree that more control is needed. But I am strongly convinced this isn't a feature I could live without.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@matt @jscholes @tunmi13 Yup, you would be in the minority. Sighted people can have multiple windows on screen, and quickly glance at them. We can't. Getting alerts from non-focused windows is the only way to replace this experience.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@matt @jscholes @tunmi13 Notifications are far too verbose for something like reading out a chat message where your username is mentioned. I depend on that feature every day for both my job and my hobbies. And there's no way any of the popular apps like TweeseCake or TWBlue would work without it.
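Clients in this family typically reach the screen reader through a wrapper library rather than raw platform calls; accessible_output2 is one such real Python wrapper. A hedged sketch of the pattern (the fallback stub and the `announce_mention` helper are mine, for illustration; the exact wrapper call may vary by version):

```python
# Sketch: a chat client announcing a mention directly via the screen
# reader instead of a system notification.  accessible_output2 is a
# real Python wrapper over several screen reader APIs; the fallback
# class below is an illustrative stand-in for systems without it.

try:
    from accessible_output2.outputs.auto import Auto as Speaker
except ImportError:
    class Speaker:
        """Stand-in used when no screen reader wrapper is installed."""
        def speak(self, text, interrupt=False):
            print(text)

def announce_mention(speaker, username, message):
    # Terse on purpose: a full notification ("New message from X in
    # channel Y at 3:42 PM: ...") is far too verbose to follow a
    # busy chat in real time.
    speaker.speak(f"{username}: {message}", interrupt=False)

announce_mention(Speaker(), "fastfinge", "are you around?")
```

The key design point is `interrupt=False`: mentions queue behind whatever is already being spoken, so a burst of chat never silently swallows a message.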
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @matt @tunmi13 I would consider these to not work. They're Windows-only, and they have a whole bunch of strange timing and delay issues between when they fire and when they get read.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @matt @tunmi13 So would I. But this gets hard to do in a way that's cross-platform and cross-language, while also preserving ease of use. Most of these abstraction libraries just don't even support Braille at all.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@matt @jscholes @tunmi13 And what happens if the main window ever gets destroyed or recreated? While I can often hook into app startup, most mod frameworks don't allow detailed hooks into window creation. It's possible I'm missing things, and smarter people than me can come up with a way to make this generally viable. But based on my research and skill level, I just don't see a path to avoid screen reader libraries in the majority of cases. Live regions are only useful when you're writing your own app from scratch or modifying an open source app, and you never need to alert the user to things while your window doesn't have the focus. This is a vanishingly small number of cases. As far as I can see, screen reader APIs, and robust libraries to call them, are going to be useful for years to come.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@matt @jscholes @tunmi13 Better, but still not going to work for 99 percent of mods. In general, you don't get to spawn a new window, or modify properties on existing ones. The only place I could make this work is adispeak; I can write a full C# DLL there and do whatever I want. But if I do that, I lose the ability to notify the user if they have the IRC client in the system tray, or even just on the taskbar. Far from ideal.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@matt @jscholes @tunmi13 So would I. But the various game mods are developed by people mostly like me: hobbyists with jobs, who are just skilled enough to find solutions and get things done. Without clear documentation and an easy-to-call API we can plug in, we're stuck. So I wouldn't expect this any time soon. All of the output systems in the above poll require one, maybe as many as three, lines of code to use.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @matt @tunmi13 It's possible my understanding could be out of date. I'd love a better way to do things. However, as far as I know, live regions require the window to have focus, and require the app to be a web app. That's just not the case for any of my use cases. Sometimes I'm using an app's built-in scripting language to add accessibility, sometimes I'm patching an app to send text to the screen reader, and sometimes I'm creating an entirely separate app that runs in the background, reads log files, and outputs alerts that way. In none of these cases would live regions work.
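The log-file case is the clearest illustration of why focus-bound live regions can't help: a separate process tails a file the inaccessible app writes and forwards matching lines to the screen reader while that app stays in the foreground. A minimal sketch (the alert pattern is invented for the example, and the returned matches would be handed to a real screen reader speech call):

```python
import re

# Sketch of the polling core of a background log-watcher.  The
# pattern here is a made-up example; a real watcher would match
# whatever the monitored app actually logs.
ALERT = re.compile(r"ALERT|ERROR")

def new_alerts(log_file, offset, pattern=ALERT):
    """Read lines appended since `offset`.

    Returns (matching lines, new offset).  Meant to be called
    periodically from a background loop, then each match passed to
    the screen reader -- something a live region, which needs a
    focused web document, cannot do.
    """
    log_file.seek(offset)
    lines = log_file.readlines()
    matches = [line.strip() for line in lines if pattern.search(line)]
    return matches, log_file.tell()
```

Tracking the offset instead of re-reading the whole file keeps each poll cheap, and means a restart of the monitored app (which truncates or recreates the log) is the only case needing extra handling.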
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@matt @jscholes @tunmi13 My understanding is that I'd also need to be comfortable in Rust. I'm not. 99 percent of the time, these APIs are the only thing allowing medium-skilled programmers like myself to plug accessibility holes.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@matt @jscholes @tunmi13 If only platform APIs were actually reliable in the real world. If only toolkits like Unity and others supported them. The days of the screen reader API are far from over.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
Another thing I enjoy: getting all the pipes and parameters for a command just exactly perfect, and thinking to myself: "I should add an alias for this in my profile!" Then opening my profile, and finding an alias I added to do that exact thing two years ago.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
Customizing my terminal makes me feel so productive! Of course, all of the time-saving aliases, hotkeys, and configuration changes I just made I'm going to forget within two days, and they'll never make it into my muscle memory. But hey, if I change several of the habits I've developed over a lifetime of computing, the modifications I just spent an hour making could save me as much as 0.2 seconds per month!
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @andrew @klittle667 My favourite was a documentary I watched where whenever someone was speaking a different language, the describer would announce "Subtitles appear." and then...not read them! Gee, thanks.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @andrew @klittle667 Oh! Do they do the deeply silly thing that our French channels sometimes do here? Where they air the French dub of the movie, but mess up and put the English version of the movie with Audio Description on the second stream?
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @andrew @klittle667 We do it rarely. It's mostly for live stuff like political debates or ceremonies or whatever. Sports, in one case. We also have language-specific channels. But at a political debate, for example, people might be frequently switching between French and English.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @andrew @klittle667 So in other words, in order to even know what the overly complicated standard is, you have to pay for the documentation. And then you actually have to implement the thing. And then, of course, the fact that there is no public open-source reference implementation means that everyone does it slightly differently, so if you want to build your own equipment to work with AD tracks, you have to account for every possible way the documentation could ever be interpreted by anyone, along with some impossible ones. And absolutely none of this infrastructure could be reused to offer multilingual dubs of programs in different audio streams. Whereas in Canada described audio is effectively just another language; you will sometimes encounter a program with four different audio streams: English, English AD, French, and French AD. And here I thought the UK was better at this than North America.
🇨🇦Samuel Proulx🇨🇦 @fastfinge@interfree.ca
5mo
@jscholes @andrew @klittle667 But in exchange, it means you can't just use the audio description that already exists when you're airing shows from the US and Canada, because we don't master our AD that way. It also explains to me why, when Canadian TV channels import Audio Description from the UK, the mix is an utter and total mess. I thought you guys were just really bad at that. One UK show I watch, for example, has all program audio on the left channel, and all audio description on the right channel. It's the worst of all possible worlds!