Graham Smith

@cyberleagle
224 Followers
18 Following
429 Posts
IT and internet lawyer. Sceptical tech enthusiast. RTs and links are not endorsements. All views my own. No tweets are legal advice.
Blog: www.cyberleagle.com
@jim The Environment Agency was effectively a pollution licensing agency long before anyone squeezed its funding.
Been on a bit of a journey reading about Systems Thinking in the last few weeks. Why is that, you may ask? Well: point one, I wanted to understand the arguments for #SafetyByDesign as added to the UK #OnlineSafetyAct which says that services must be "safe by design".
@jim @cstross Not just business model, content or users, but also - according to the theory - technical features and functionalities. There is quite a heavy emphasis on that aspect in both the Act and Ofcom’s implementation.
@bjn @cstross @jim Lastly, an earlier piece on differing views of safety by design, written before the responses to the Ofcom Additional Measures consultation discussed in the more recent piece. https://www.cyberleagle.com/2024/12/safe-speech-by-design.html
@bjn @cstross @jim The piece to which I linked also discusses how the theory of safety by design was originally crafted with a few large social media companies in mind and tends to collapse if applied beyond that.
@bjn @cstross @jim Other than ‘think about safety at the design stage’ there is little clarity about what, in the context of the Online Safety Act, safety by design is supposed to mean. In the past its proponents have seen it as an alternative to content-focused measures, but now we have suggestions that e.g. automated content filtering is a safety by design measure. If, as has recently been suggested, the OSA should formally define it, we have to understand it first. https://www.cyberleagle.com/2026/02/safety-by-design-or-systems-for-content.html
Semi-seriously, I propose a moratorium on advocacy for online safety by design until its proponents have figured out what, concretely, it does and (perhaps more importantly) does not mean. Let’s face it, if it includes automated content filtering, is it really anything more than an empty slogan? 3/4
Even then, there is a more than lurking suspicion that the theory was crafted specifically with a few large social media platforms in mind, and is liable to collapse if extended beyond that. 4/4 https://www.cyberleagle.com/2026/02/safety-by-design-or-systems-for-content.html
Last year’s submissions to Ofcom’s Additional Measures consultation, however, left me thoroughly confused. E.g. 2/4 https://www.cyberleagle.com/2026/02/safety-by-design-or-systems-for-content.html