Tonight, on…
BlueSky: “Trump is destroying 80 years of American moral leadership”
X: “Zelensky is on cocaine and should never have started the war”
Mastodon: “Mozilla have removed a paragraph from their terms of service!!”
Time we switched all this brain-rot off.
Two seconds into logging on to Mastodon and at the top of my feed I see people lobbying for the Internet Archive to continue stealing the work of authors. There’s so much hypocrisy on the Fediverse, where people think they’re principled for stopping creators losing out to AI, but equally principled for demanding that we be able to steal those creators’ work directly. Freeloader culture is not the same as collaborative culture.
I’ve seen some academics recently admit that they’re not going to try to stop their students using LLMs to write their papers. So I guess industry will have to make junior interviews even more stressful, to weed out the people whose higher-education qualification doesn’t reflect their actual knowledge.
Once again I find myself furious that I can't get a photo from my Windows machine to my iPhone without emailing it. Why is iCloud on Windows so broken? Why is Dropbox so bad on iPhones? Why are these companies all so abusive to their customers?
I used to think that we could solve a lot of problems if we just got the regulation right. Then I started reading various laws and watching new ones get passed, and I’ve lost that confidence. It’s not necessarily that the laws are over- or under-strict but that they’re often so badly worded as to be useless, or to require decades of case law to settle their real scope. Great for lawyers, bad for citizens.
Who could have imagined that “building software by automatically downloading unaudited packages from the web” and “building software by asking a probabilistic text generator to write it for you” could possibly combine to make something even worse?
For those of you who use LLMs to help you code, here's a warning: these tools have been shown to hallucinate packages in a way that allows an attacker to poison your application. https://www.theregister.com/2024/03/28/ai_bots_hallucinate_software_packages/ #ai #gpt #chatgpt #security
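A minimal sketch of one defence against this attack (the names and allowlist here are purely illustrative, not a recommendation): never install a dependency an LLM suggests unless a human has vetted it first. An attacker who registers a plausible-sounding hallucinated name on a public registry controls whatever `pip install` fetches under that name.

```python
# Hypothetical allowlist of packages a human has actually reviewed.
# In a real project this would live in version control next to pinned,
# hash-checked requirements.
VETTED_PACKAGES = {"requests", "numpy", "pandas"}


def safe_to_install(suggested: str) -> bool:
    """Return True only if the suggested package name has been vetted.

    LLMs can confidently invent package names that don't exist; installing
    them blindly hands control to whoever registers that name first.
    """
    return suggested.strip().lower() in VETTED_PACKAGES


# A real, vetted package passes; a plausible-looking hallucination does not.
assert safe_to_install("requests")
assert not safe_to_install("reqeusts-pro")  # invented name for illustration
```

The point isn’t the allowlist mechanism itself so much as the policy: treat machine-suggested dependencies as untrusted input, exactly like anything else that arrives over the network.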
Good post about the situation with cookie banners and consent. Sadly, the pro-tech brigade will routinely try to mislead the public about regulatory law in order to lobby for its removal.
Another example is “DMCA takedowns” - these don’t exist as a legal concept, but platforms want you to believe that they do. It’s just a choice platforms are allowed to make to get ‘safe harbour’ status. https://mastodon.ie/@andrewg/112121475190940766
There is no EU cookie banner law
https://www.bitecode.dev/p/there-is-no-eu-cookie-banner-law
Why is it, when people complain that “the law hasn’t caught up with technology”, nine times out of ten it’s someone complaining that the crimes they commit with their computer with ever-increasing ease haven’t yet been legalised?
The #AI legislation just passed by the #EU looks pathetically weak on the issue of protecting individuals from having their work harvested. Model developers only have to document how they handle explicit opt-outs, which aren’t even available to most users who have their work on third-party platforms.
The expectation that data mining should "not unreasonably prejudice the legitimate interests of the rightholders" needed strengthening, but it’s not there. Very disappointing.
Life hack: have at least two separate email addresses, so that when the appalling state of tech interoperability means you still have to email something to yourself to get it on another device, it does at least /feel/ like you’re sending a real email.