Previously Software Quality Engineer at Apple, now full-stack dev using Swift and Python 🚀
Here to meet new people, feel free to stop by and say hi!
| GitHub URL | https://github.com/robin24 |
The text mode lie: why modern TUIs are a nightmare for accessibility
The mythical "it's text, so it's accessible" [...]
https://xogium.me/the-text-mode-lie-why-modern-tuis-are-a-nightmare-for-accessibility
Slop drives me crazy and it feels like 95+% of bug reports are slop, but man, AI code analysis is getting really good. There are users out there reporting bugs who don't know ANYTHING about our stack but are great AI drivers, producing some high-quality issue reports.
This person (linked below) was experiencing Ghostty crashes and took it upon themselves to use AI to write a Python script that can decode our crash files, match them up with our dSYM files, and analyze the codebase to try to find the root cause, and they extracted that into an Agent Skill.
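The post doesn't share the actual script, but the core of that kind of tool (pulling unsymbolicated frame addresses out of an Apple crash report and resolving them with `atos` against a dSYM) might look roughly like this sketch. The frame format, binary name, and paths here are assumptions for illustration, not the user's real code:

```python
import re
import shutil
import subprocess

# Unsymbolicated Apple crash report frames typically look like:
#   3   ghostty   0x000000010245cafc 0x102400000 + 379644
# i.e. frame index, binary name, frame address, load address, "+ offset".
FRAME_RE = re.compile(
    r"^\d+\s+(\S+)\s+(0x[0-9a-f]+)\s+(0x[0-9a-f]+)\s+\+\s+\d+"
)

def parse_frames(crash_text, binary_name):
    """Extract (frame_address, load_address) pairs for frames in our binary."""
    frames = []
    for line in crash_text.splitlines():
        m = FRAME_RE.match(line.strip())
        if m and m.group(1) == binary_name:
            frames.append((m.group(2), m.group(3)))
    return frames

def symbolicate(dwarf_path, arch, frames):
    """Resolve each address to a symbol via atos (macOS only).

    dwarf_path points at the DWARF binary inside the dSYM bundle,
    e.g. Ghostty.dSYM/Contents/Resources/DWARF/ghostty (hypothetical path).
    Returns None when atos is unavailable.
    """
    if shutil.which("atos") is None:
        return None
    names = []
    for addr, load in frames:
        out = subprocess.run(
            ["atos", "-o", dwarf_path, "-arch", arch, "-l", load, addr],
            capture_output=True, text=True, check=True,
        )
        names.append(out.stdout.strip())
    return names
```

From there, feeding the symbolicated stack plus the relevant source files to an AI is what turns a raw crash dump into the kind of root-cause analysis described above.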
They then came into Discord, warned us they don't know Zig at all, don't know macOS dev at all, don't know terminals at all, and that they used AI, but that they thought critically about the issues and believed they were real and asked if we'd accept them. I took a look at one, was impressed, and said send them all.
This fixed 4 real crashing cases that I was able to manually verify and write a fix for from someone who -- on paper -- had no fucking clue what they were talking about. And yet, they drove an AI with expert skill.
I want to call out that in addition to driving AI with expert skill, they navigated the terrain with expert skill as well. They didn't just toss slop up on our repo. They came to Discord as a human, reached out as a human, and talked to other humans about what they'd done. They were careful and thoughtful about the process.
People like this give me hope for what is possible. But it really, really depends on high quality people like this. Most today -- to continue the analogy -- are unfortunately driving like a teenager who has only driven toy go-karts.
Viewpoint 2.0 has just been released.
The update includes support for Gemini 3, improved speed, the removal of the deprecated 2.0 models, better PDF support, drag-and-drop support, and more.
https://viewpoint.nibblenerds.com/patch-notes
(Please don't associate me with the development or leadership of the product; I just decided to post it as I saw it.)
A severe #accessibility issue I've seen very few people talking about is the widespread adoption (in my country at least) of touch-only card payment terminals with no physical number buttons.
Not only do these devices offer no tactile affordances, but the on-screen numbers move around to limit the chances of a customer's PIN being captured by bad actors. In turn, this makes it impossible to create any kind of physical overlay (which itself would be a hacky solution at best).
When faced with such a terminal, blind people have only a few ways to proceed:
* switch to cash (if they have it);
* refuse to pay via inaccessible means;
* ask the seller to split the transaction into several to facilitate multiple contactless payments (assuming contactless is available);
* switch to something like Apple Pay (again assuming availability); or
* hand over their PIN to a complete stranger.
Not one of these solutions is without problems.
If you're #blind, have you encountered this situation, and if so how did you deal with it? It's not uncommon for me to run into it several times per day.
Why do you think this is not being talked about or made the subject of action by blindness organisations? Is it the case that it disproportionately affects people in countries where alternative payment technology (like paying via a smart watch) is slower to roll out and economically out of reach for residents?