In America, a picture is worth 1,163.5 words.
| blogosphere | https://coffeeonthekeyboard.com/ |
| siteosphere | https://jamessocol.com/ |
| twitosphere | https://twitter.com/jamessocol |
| gitosphere | https://github.com/jsocol |
You don't use open source software because it's better (it usually isn't).
You don't use open source software because it's freer (it only sometimes is).
You don't use open source software because it's got better politics (it isn't always).
You use open source software because *it is the only option*. In the long run, if it isn't open source, it doesn't exist.
image source: keithstack.com
i don't really want to hear any more about how ai "works for me" or "doesn't work for me" or anything like that
this is conceding the framing of the debate on totally self-centered terms and ignoring the massive societal effects of this hideous technology
this is how capitalism trains you to think, and it's wild to see how many people still have these individualism brainworms even when they can clearly see the societal cost, and when it impacts them specifically
RE: https://mas.to/@carnage4life/115989974643551129
I am shocked, SHOCKED.
How are these announcements not blatant market manipulation?
So the original #SBOM requirement for federal agencies in the US was just removed.
"OMB Memorandum M-22-18, Enhancing the Security of the Software Supply Chain through Secure Software Development Practices (M-22-18), imposed unproven and burdensome software accounting processes that prioritized compliance over genuine security investments. This policy diverted agencies from developing tailored assurance requirements for software and neglected to account for threats posed by insecure hardware. Accordingly, OMB Memoranda M-22-18 and M-23-16, a companion policy, are hereby rescinded."
Amazon has reported "hundreds of thousands" of pictures of child sexual abuse material found in shared AI training data... but is refusing to tell regulators which data sets.
If you're using generative AI tools, there's a pretty good chance you're generating imagery with child porn training data behind the scenes.
https://www.bloomberg.com/news/features/2026-01-29/amazon-found-child-sex-abuse-in-ai-training-data
This is an essential read from @pluralistic – well worth your time to fully understand the points he's making. He's spot on, and the excellent analogies will make it easier for you to discuss the topic with others.
https://www.theguardian.com/us-news/ng-interactive/2026/jan/18/tech-ai-bubble-burst-reverse-centaur