Eye contact is not limited to full facial portraits of people looking directly into the camera.

Eye contact is not even limited to looking directly into the camera at all.

Eye contact is whenever there is at least one eye anywhere in the image. No matter where it is. No matter how small the eye and how big the image is.

Ask autistic people, and they'll likely confirm. And they'll also likely confirm that it triggers them.

In fact, it even counts as eye contact when you, as a neurotypical person, cannot see the eye at all because it's less than a pixel.

Imagine an image of 20 megapixels. Now imagine there's a person somewhere in the image, only four pixels high and about one pixel wide. This means the head is half a pixel high and a third of a pixel wide.

Even if the person is looking directly at the camera, this still means that each individual eye is 1/15 of a pixel wide and maybe 1/30 of a pixel high. That's 1/450 or a bit over 0.2% of a pixel. That's about 1/9,000,000,000 or a bit over 0.000,000,01% of the whole image. If the person is looking directly at the camera.
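For what it's worth, the arithmetic holds up. A quick sketch, assuming the hypothetical 20-megapixel image and the pixel fractions given above:

```python
# Worked check of the numbers above: a hypothetical 20 MP image with a
# person 4 px high and 1 px wide, so an eye about 1/15 px wide and 1/30 px high.
image_pixels = 20_000_000

eye_w = 1 / 15                            # eye width, in pixels
eye_h = 1 / 30                            # eye height, in pixels
eye_area = eye_w * eye_h                  # area of the eye, in pixels

share_of_pixel = eye_area                 # 1/450, a bit over 0.2 % of a pixel
share_of_image = eye_area / image_pixels  # 1/9,000,000,000 of the whole image
```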

Nonetheless, this may trigger some autistic people even if the person is not even looking in the general direction of the camera.

It doesn't even have to be a person. It may just as well be an animal or a fantasy creature or a robot or a sculpture or a stylised face or even only a single stylised eye.

I've actually had all this confirmed by @Yohan Yukiya Sese Cuneta 사요한🦣, who knows enough actually diagnosed autistic people to know.

So it doesn't matter how big or infinitely small the eye is. It doesn't matter where it's looking. If there's at least one eye in your image, it counts as eye contact.

If you, as the user who posts the image, know for certain that there is at least one eye in the image, you're obliged to
  • have the image automatically blanked or blurred
  • make sure that Mastodon will blank the image, too
  • add the content warning "CW: eye contact" to your post
  • add the hashtags #EyeContact and #CWEyeContact to your post, especially the former, which some people out there may have filtered

You're only excused not to do so if you yourself honestly don't know that there is at least one eye in the image.

#Long #LongPost #CWLong #CWLongPost #FediMeta #FediverseMeta #CWFediMeta #CWFediverseMeta #CW #CWs #CWMeta #ContentWarning #ContentWarnings #ContentWarningMeta #Hashtag #Hashtags #HashtagMeta #CWHashtagMeta #EyeContactMeta #CWEyeContactMeta #Autism #Autistic #Neurodivergent #Neurodivergence #Inclusion #Inclusivity #A11y #Accessibility
Hubzilla.de


#vendredilecture #mastolivres #MesLectures2026 #CW

Le passage by Mathieu PERSAN.

Well, this book is something of a small bombshell, in my humble opinion, because I lived through this too, with the student...
It tackles a difficult subject: adolescent #dépression, in the form of a graphic novel.
The tone, the illustrations, and the portrayal of that descent into hell, which nothing ever really manages to explain, all ring very true.

In short, I recommend it.

For instance, in Chapter D-I, section "The sound image method", one reads: "The training begins also here with low speed, but the individual Morse code characters are sent from the start at a higher speed." Years before Farnsworth's records (which, by the way, didn't use what almost everyone now wrongly calls #Farnsworth spacing), Koch had already described it.
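That idea — characters keyed at full speed, with extra spacing to slow the overall rate — is what today's "Farnsworth" timing implements. A sketch, using the standard PARIS dit timing and the commonly cited ARRL delay formula (my assumption for the spacing split; this is not from Koch's thesis):

```python
# Characters are keyed at character speed c (wpm), with stretched gaps
# so the text flows at a slower effective speed s (wpm).

def dit_seconds(wpm: float) -> float:
    """Standard PARIS timing: one dit lasts 1.2 / wpm seconds."""
    return 1.2 / wpm

def farnsworth_spacing(c: float, s: float) -> tuple[float, float]:
    """Return (inter-character, inter-word) gaps in seconds for
    character speed c wpm and effective speed s wpm."""
    if s >= c:
        # No stretching needed: standard 3-dit and 7-dit gaps.
        u = dit_seconds(c)
        return 3 * u, 7 * u
    # Total extra delay budget per "PARIS" word (ARRL formula),
    # split 3:7 over the 19 dit-units of spacing in the word.
    t_total = (60 * c - 37.2 * s) / (s * c)
    return 3 * t_total / 19, 7 * t_total / 19

# Example: characters at 18 wpm, slowed to an effective 5 wpm.
char_gap, word_gap = farnsworth_spacing(18, 5)
```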

(2/2)

#CW #Koch #Farnsworth

Ludwig Koch's thesis (1936) is a treasure trove of empirical data and methodological proposals for the learning of Morse code. And Quentin Santos has made it available together with an English translation: https://github.com/qsantos/koch-dissertation

(1/2)

#CW #Koch #Farnsworth


I picked up a used Bencher BY-2 off FB marketplace today. I think it was a good deal for $80. It cleaned up very nicely and has some great stories behind it. I bought it from the son of a silent key KL7LF, who had stations in Colorado, Fairbanks AK, and Johnston Island. His son gave me a few of his dad's #QSL cards as well.

#cw
#hamradio
#amateurradio

New #blog #post: Mind the Shards

https://rldane.space/mind-the-shards.html

552 words

Mild #CW: I'm discussing mental health, and briefly, faith.

cc: my wonderful #chorus: @joel @dm @sotolf @thedoctor @pixx @orbitalmartian @adamsdesk @krafter @roguefoam @clayton @giantspacesquid @Twizzay @stfn

(I will happily add/remove you from the chorus upon request! :)

#rlDaneWriting #blost #podcasts #MentalHealth #MandyPatinkin #frailty


Made an iOS / macOS app that decodes CW with four decoders / settings combos. Using ggmorse as the algorithm, which seems to work nicely. Tried fldigi and Bayesian-style decoding, but ggmorse is a clear winner.

I made an enhancement to ggmorse to use Kalman filtering for frequency tracking; it didn't make it worse :)
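The idea can be sketched as a one-dimensional Kalman filter smoothing noisy per-frame tone-frequency estimates, modeling the CW carrier as a slow random walk. This is a minimal illustration of the technique, not ggmorse's actual code; the class name and noise parameters are my own placeholders:

```python
class FreqTracker:
    """Scalar Kalman filter tracking a slowly drifting CW tone frequency."""

    def __init__(self, f0: float, meas_var: float = 25.0, drift_var: float = 0.01):
        self.f = f0          # current frequency estimate (Hz)
        self.p = meas_var    # variance of the estimate
        self.r = meas_var    # measurement noise variance (Hz^2)
        self.q = drift_var   # process (drift) noise added per frame

    def update(self, z: float) -> float:
        """Fold in one noisy frequency measurement z (Hz)."""
        self.p += self.q                  # predict: variance grows with drift
        k = self.p / (self.p + self.r)    # Kalman gain
        self.f += k * (z - self.f)        # correct toward the measurement
        self.p *= (1 - k)                 # shrink the estimate variance
        return self.f

# Feed it jittery per-frame frequency estimates around a 600 Hz tone.
tracker = FreqTracker(f0=600.0)
for z in [612.0, 595.0, 604.0, 601.0]:
    est = tracker.update(z)
```

With a small `drift_var`, the filter ends up averaging heavily across frames while still following slow drift, which is exactly what you want for a keyed carrier.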

#morse #cw #ggmorse #radioamateur #amateurradio

From AD0WE's QRZ page, I found The Art and Skill of Radio-Telegraphy by N0HFF, which is the first Morse Code learning resource I've read that recommends a practice that I started doing naturally: vividly imagining morse as a learning technique.

Sit quietly in a chair, close your eyes, relax, and imagine you are hearing each letter sound (just as you heard it), taking them one at a time, and immediately recognizing it or writing it down with a pencil. Make the picture as realistic and vivid as you can, even to imagining the "feeling" the pencil writing on the paper. Feel a sense of satisfaction of doing it right. Three to five minutes practice this way at any one time is probably enough. You can then repeat this kind of mental practice with each new group of characters as you learn them, and it will greatly strengthen the habit you are trying to build.

I practice both sending and receiving code with vivid imagination. To this day, I have trouble with D/B and B/6, and I have improved by vividly imagining hearing them and speaking (rather than writing) them, but I do this practice with all symbols. I also vividly imagine sending. I'll use memorized text, and imagine sending the whole thing, down to imagining feeling the particular key I'm imagining that I'm using between my thumb and fingers.

I have found this practice helpful.

#HamRadio #MorseCode #CW

It's a neat idea, but there are a few drawbacks worth bringing up:


  • Many people would get "lazy" with #altText, as they might feel the responsibility for writing it had shifted away from them. If that happened, we could end up with less alt-text, not more.

  • A #CW suggestion system might end up being a spam and strife risk, because a lot of people would not want to CW certain topics, and they might get spammed by CW requests, especially if the requests are delivered privately, like DMs.


I'd take an alt-text suggestion system, but maybe not the CW suggestion system.

CC: @[email protected]

#a11y #accessibility #Mastodon #contentWarnings