What is an Ecology of Protections?
In our paper, we illustrate how social media and information infrastructures can be imagined as an ecosystem of agents interacting through processes (policies) and behaviors (actions of users and automations).
This ecosystem model makes it possible to understand how our current protections for photosensitive users are inadequate, allowing for multiple vectors of accidental and malicious exposure to dangerous flashing content.
[Image 1: Narrative Description of Diagram:
Platform Executives and Marketing Clients have a cyclical revenue relationship.
Platform Executives influence Platform Designers and Developers.
WCAG guidelines are meant to support Designers/Developers, but may not always be followed.
Marketing Clients sometimes create flashing content.
Non-sensitive users who are unaware of photosensitivity may also create, share, and circulate flashing content, which can lead to accidental exposure for Photosensitive Users.
When Non-sensitive Users who are aware of photosensitivity are exposed to flashing content, they may warn photosensitive users.
Non-sensitive Users who are aware of photosensitivity may also intentionally or maliciously expose sensitive users.
If any user attempts to report a flashing graphic, reporting features lack appropriate categories, so platform developers are never notified of the problem.
Sensitive users may also experience resets or overrides to their safety settings on major app updates.]
A more robust ecology of protections centers the photosensitive user, even though they are not "the majority," providing multiple layers of user-controlled protection against exposure when other policies and community norms fail.
[Image 2: Narrative description of the aggregate re-worlded map.
The photosensitive user is protected from ambient, accidental, and malicious exposure by two layers of protection: device-level settings and app-level settings.
If a developer update overrides app-level settings, the device-level settings correct this.
Users who are aware of photosensitivity are able to educate other users to prevent circulation of dangerous content.
Advertisers and other users may still create and share flashing content, but an enforcement body can punish advertisers and platforms for creating or failing to prevent the circulation of flashing content.
Users are able to report dangerous content to both the platform and to regulatory bodies.
Malicious attacks are met with suspension or bans.]
Such a nested system of protections can prevent ambient, accidental, and malicious exposure to dangerous flashing media. Establishing these features and community norms may further protect all users from psychologically harmful content.
It is important to note that such provisions rely on centering user safety over corporate interests that depend on non-consensual auto-play to hook user attention and manipulate purchasing behavior. To this I say: get more creative. Or maybe... stop being a predator.
Additionally, the central layer of protection requires OS developers to implement an optional post-process filter that can interrupt dangerous luminance shifts at the pixel-buffer level. Other research attempts to classify and predict dangerous flashes with ML techniques, but this is not necessary. An always-on, simple algorithmic edit to the next array of pixels provides fail-safe protection at a minor aesthetic expense: a ghost or trailing effect.
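As a minimal sketch of what such an always-on filter could look like (not the paper's implementation; the function name, `max_step` threshold, and frame representation are assumptions for illustration), a compositor could clamp how far any pixel's value may move between consecutive frames, which is what produces the trailing "ghost" effect:

```python
import numpy as np

def limit_luminance_shift(prev_frame, next_frame, max_step=0.2):
    """Clamp per-pixel change between consecutive frames.

    Frames are float arrays in [0, 1]. A pixel is allowed to move at
    most max_step toward its intended value per frame, so a hard
    black-to-white flash becomes a gradual ramp (a 'ghost' trail)
    instead of a dangerous luminance jump.
    Hypothetical sketch; max_step=0.2 is an arbitrary illustrative value.
    """
    delta = next_frame - prev_frame
    clamped = np.clip(delta, -max_step, max_step)
    return prev_frame + clamped
```

Because the clamp is applied unconditionally to every frame pair, no flash classifier is needed for this fail-safe: static content passes through unchanged (its deltas are already within the limit), while strobing content is smoothed.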
[Image 3: An original sequence of frames which flash between pure black and pure white.
A flash detection mask is calculated and a pixel-attenuation filter is loaded.
In the final sequence, each frame is attenuated by the detection mask and filter, producing the impression of a flash without the actual danger.]
Image 3 demonstrates the proposed filter applied to a full-screen blink sequence. Between-frame flash detection using South et al.'s algorithm would produce a full white mask for each frame, indicating that every frame change was dangerous and needed adjustment. In other examples, the calculated mask would identify specific regions for pixel filtering. The mask is then applied to a calculation that adjusts how far the next frame transitions from the previous frame toward the intended next frame. In the example here, frame 2 becomes frame B, which is frame 2 plus the mask-weighted difference (Δ1) multiplied by a fraction (q). Frame 3 becomes frame C, which is the original frame 3 plus the calculated difference between frames 2 and 3 (Δ2), minus the pixel values in frame B. The first flashing sequence is significantly faded, and over time the flashing sequence averages out to show level changes without strobing.
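Under one reading of the scheme above (not the exact formula from the paper; the function name, blend form, and q=0.25 are assumptions), each displayed frame moves only a fraction q from the previously displayed frame toward the intended frame inside masked regions, while unmasked pixels pass through unchanged:

```python
import numpy as np

def attenuated_frame(displayed_prev, intended_next, mask, q=0.25):
    """Blend toward the intended frame by fraction q in masked regions.

    displayed_prev: last frame actually shown, float array in [0, 1]
    intended_next:  next frame the content intends to show
    mask:           flash-detection mask in [0, 1] (1 = dangerous region)
    q:              attenuation fraction; smaller q = stronger damping

    Where mask == 0 the intended frame is shown unchanged; where
    mask == 1 only a q-sized step toward it is taken, so repeated
    black/white flashes fade toward an average level instead of strobing.
    Hypothetical sketch, not the authors' published formula.
    """
    delta = intended_next - displayed_prev
    return displayed_prev + delta * (1.0 - mask * (1.0 - q))
```

Iterating this over an alternating black/white sequence with a full white mask produces exactly the behavior described: the first flash is heavily faded, and subsequent frames settle toward a mid-level tone rather than strobing.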
This effect may look strange, but photosensitive users already use "unaesthetic" filters to limit the overall brightness of their devices. Such filters may even encourage content creators to change their aesthetic choices to eliminate ghosting, signaling safer content for all users.
https://dl.acm.org/doi/abs/10.1145/3663548.3675610
#ASSETS2024 #GraphicsProgramming #TechnologyPolicy #Migraine #Epilepsy #photosensitivity #photophobia

