[Gecko] Competition, Innovation, and the Future of the Web - Why Independent Browser Engines Matter

https://blog.mozilla.org/netpolicy/2026/03/23/competition-innovation-and-the-future-of-the-web/

đŸŠŽïž

Exceptionally well-written article on static closures in #php:

https://f2r.github.io/en/static-closures

I've been going to conferences for 20 years. Not one person I've spoken to enjoyed the loud music during socials. In fact, almost everyone tells me that they'd rather have no music at all. They just want to have conversations with other people.

#vendredilecture (Friday reading) with the apprentice assassin.
First volume of the Fitz saga, "L'assassin royal". I had long wanted to discover Robin Hobb's work, and so far I'm not disappointed. The universe and its stakes seem fairly classic at first glance, but most of the characters are very nuanced and the court intrigues gripping.
I recommend it to all fantasy fans.

#mastolivre #SFFF

a decade or so ago, I was writing an H.264 decoder (needed a custom one for stupid reasons which of course had to do with hardware reverse engineering).

the first order of business was to implement CABAC: the final entropy coding stage of H.264 (i.e. the first layer I had to peel starting from the bitstream), a funny variant of arithmetic coding. the whole thing is quite carefully optimized to squeeze bits out of video frames by exploiting statistics. in addition to carefully implementing the delicate core logic, I also had to copy-paste a few huge probability tables from the PDF, which of course resisted copy-paste as PDFs like to do, and I had to apply some violence until it became proper static initializers in C source code.
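
(the interval-splitting idea at the heart of arithmetic coding can be sketched with a toy, non-adaptive binary coder — nothing like real CABAC, which adds adaptive context models, renormalization, and bypass coding; the function names and the fixed 3/4 probability here are made up for illustration:)

```c
#include <stdint.h>

/* Toy non-adaptive binary arithmetic coder, sketching the
   interval-splitting idea behind entropy coders like CABAC.
   Assumed probability of a 0-bit: 3/4. No renormalization,
   so sequences must stay short enough for 64-bit precision. */

enum { P0_NUM = 3, P0_DEN = 4 };

uint64_t ac_encode(const int *bits, int n) {
    uint64_t low = 0, range = (uint64_t)1 << 62;
    for (int i = 0; i < n; i++) {
        uint64_t split = range / P0_DEN * P0_NUM; /* width of the 0-branch */
        if (bits[i] == 0) {
            range = split;          /* keep the lower (more probable) part */
        } else {
            low += split;           /* move into the upper part */
            range -= split;
        }
    }
    return low; /* any value in [low, low + range) identifies the sequence */
}

void ac_decode(uint64_t code, int *bits, int n) {
    uint64_t low = 0, range = (uint64_t)1 << 62;
    for (int i = 0; i < n; i++) {
        uint64_t split = range / P0_DEN * P0_NUM; /* same split as encoder */
        if (code - low < split) {   /* code fell in the 0-branch */
            bits[i] = 0;
            range = split;
        } else {
            bits[i] = 1;
            low += split;
            range -= split;
        }
    }
}
```

(note the punchline of the story: hand ac_decode any arbitrary 64-bit value and it still emits a perfectly plausible-looking bit sequence — only trying to reconstruct something from the output reveals whether the input was real.)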

furthermore, testing such code is non-trivial: the input is, of course, completely random-looking bits. and the way bitstreams work, I’d have to implement pretty much the whole thing before I got to the interesting part.

so, a few hours later, I figured I’m done with CABAC and reconstructing H.264 data structures, and pointed my new tool at some random test videos. and it worked first try! the structures my program spit out looked pretty much as expected, the transform coefficient matrices had pretty shapes and looked just as you’d expect them to, and I was quite happy with that.

and then I moved on to actually decoding the picture from the coefficients, and this time absolutely nothing worked. random garbage on screen. I spent a long time looking at my 2D transform code searching for bugs, but couldn’t find anything.

and then it hit me exactly what “entropy coding” means. I implemented something that intimately knows and exploits the statistical properties of what video transform coefficients and other structures look like, their probabilities and internal correlations, and uses that to squeeze out entropy and reconstruct it on the other end. my “looks good” testing meant absolute jack shit: I could’ve thrown /dev/urandom into the CABAC decoder instead of actual H.264 video, and it would still look like good video data at this stage until you actually tried to reconstruct the picture.

and sure enough, it turned out I fucked up transcribing some rows from the PDF around a page break or something.

10 years later, I think of this experience every time I see a vibecoded pull request, or other manifestation of AI bullshit. all the right shape, and no substance behind it.

and people really should learn to tell the fucking difference.

PSA: The Amazon wishlist doxing threat is much greater and more immediate than folks might realize. Attack works like this:

Stalker who wants your address opens an Amazon seller account and lists themselves as a third-party seller for any item on your public wishlist. Then, they order the item from themselves as a gift for you. Bam, they have your address.

In particular, the attack does not depend on an existing third-party seller having poor PII-handling hygiene, as some articles have implied.

Basically: If you run OpenClaw connected to any meaningful system, you are not fit to design, program, or run any kind of software. That disregard for security and quality should leave a black mark on you for many years.

If you're an English native speaker, do you ever use the word "minute" as a verbÂč to mean taking notes at a meeting?

A — I do
B — I would use the word, but that context normally doesn't come up in my life
C — I would not, because that meaning is archaic

(Boosts appreciated.)

Âč Emphasis on "as a verb". An example sentence would be to say "could you join us to minute the conversation?"

A: 31.5%
B: 22.6%
C: 46%
Poll ended.

@theorangetheme @davidgerard

I’ve never really believed in the 10x developer, but I was quite fortunate that I spent a long time in software development before I encountered a very real phenomenon (surprisingly common in big tech companies): the -1x developer. The person who creates so many problems that it is an entire other person’s job to clean up the results of the mistakes that they make.

In some rare cases, you may even encounter a -2x or -3x (or more) developer: someone who writes code very quickly and leaves a trail of devastation. They will do things like start a large-scale refactor that doesn’t solve any real problems and introduces some major design flaws, get halfway through it, and then use the fact that they are leading this major transition to get promoted and transferred to a different part of the company. The team left behind has to deal with either finishing the refactoring and ending up in a worse place, or undoing all of their work. Both approaches delay new features, and the team looks much less productive while they deal with the fallout. And the sudden drop in perceived productivity after they leave is used as evidence for how great they were (‘oh, yes, when I was in that team we were shipping new features and I was leading a transition to pay down a load of technical debt. As soon as I left, the whole thing fell apart. It’s a shame, but I had to move to a place where my skills could benefit the company more.’).

I can see LLMs allowing a -1x developer to easily become a -10x developer. And they honestly believe that they are more productive, because they never realised that their productivity was negative to start with. I would be entirely unsurprised to discover that the industry is now littered with LLM-enabled -10x developers. Technical debt is too weak a term for their output. Companies are accumulating technical nuclear waste, and it will be decades of work to fix all of the problems that they have caused.

I'm in the second year of a Master's in Library and Documentation Policy (Master 2 Politiques des BibliothÚques et de la Documentation), and I'm working on my thesis, which is about the absence of working-class people from libraries.
I need to conduct one-hour interviews with people who consider themselves working class and who don't go to libraries / have never been to one. Coffee is on me if you're up for it!
If you know people who fit this description, feel free to pass it along!

Boosts appreciated