I regret to inform you (again) that Dave Winer is still wrong.
http://scripting.com/2026/04/05/130334.html?title=theDiscourseAboutWordpress
I have an HTTP server. It's sitting just over the other side of the room at the moment, having recently been moved away from pride of place next to my left elbow.
Mind you, I don't put the massive attack surface of bloody WordPress on it.
It does GOPHER, too.
It's #djbwares httpd, and gopherd, and geminid, come to that. All static. All read-only. Running on NetBSD.
Just for kicks, I made the WWW pages downloadable over the FINGER and NICNAME protocols too.
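Serving WWW pages over FINGER works because the protocol is so simple: per RFC 1288, a client opens TCP port 79, sends a one-line query terminated by CRLF, and reads until the server closes the connection. Here's a minimal client sketch in Python to show the shape of it (the host and query are placeholders; this isn't the djbwares code):

```python
import socket

def finger_fetch(host: str, query: str, port: int = 79) -> bytes:
    """Fetch a document over FINGER (RFC 1288): send the query line
    terminated by CRLF, then read until the server closes."""
    with socket.create_connection((host, port), timeout=10) as s:
        s.sendall(query.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

# Hypothetical usage: finger_fetch("example.org", "index.html")
```

NICNAME (whois, port 43) has the same query-then-stream shape, which is why the same trick works for both.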
And I usually edit its pages in #NeoVIM. (-:
@jautero @ianbetteridge I'm not saying I agree with the contents of the article. But I assume the https thing is a mistake. He uses https everywhere else including the domain I see him most often promoting. https://daveverse.org/2026/04/05/the-discourse-about-wordpress/
That said, I'm not sure https is necessary in every case, especially now. What's more likely: a MITM attack on a recipe site to change the ingredients' weights, or an AI incorrectly parroting the contents?
No business should be without https though.
It's not a mistake.
https://this.how/googleAndHttp/
I wouldn't characterize HTTPS as solely a Google thing, myself.
But yes, as someone with an HTTP site, I can report that the bigger problem today is, as you say, #AI companies and their scrapers, not man-in-the-middle attacks. In fact, I've never suffered from the latter in a quarter of a century, whereas the scrapers and automated attacks are constant.
A couple of examples:
Within the past couple of minutes, according to my logs, someone in AS8075 has attempted 130 WordPress vulnerability attacks on my HTTP machine. (For some reason, the attacker is using #Scunthorpe in the host name of the URLs. I kid you not.)
Large numbers of low-speed scrapers are a constant presence. And the last high-speed scraper was someone, also from AS8075, who downloaded 210 pages from a single subdirectory in ~100 seconds on Good Friday afternoon.
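Tallying this sort of thing from the logs is straightforward. A sketch, assuming Common Log Format access logs and an illustrative (not exhaustive) list of paths that WordPress vulnerability scanners probe for:

```python
import re
from collections import Counter

# Paths commonly probed by WordPress vulnerability scanners
# (illustrative list, not exhaustive).
WP_PROBES = re.compile(r"/(wp-login\.php|xmlrpc\.php|wp-admin|wp-content)\b")

def tally_wp_probes(log_lines):
    """Count WordPress-probe requests per client, assuming Common Log
    Format: 'IP - - [date] "METHOD /path HTTP/1.x" status bytes'."""
    hits = Counter()
    for line in log_lines:
        m = re.match(r'(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)', line)
        if m and WP_PROBES.search(m.group(2)):
            hits[m.group(1)] += 1
    return hits
```

Mapping the resulting IPs back to an AS number (AS8075, say) is a separate lookup against routing data, left out here.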