#SocialMedia #SocialSciences #BigTech #AlgorithmicRecommendation: "In downplaying the role of algorithmic content curation in issues such as misinformation and political polarisation, the study became a beacon for sowing doubt and uncertainty about the harmful influence of social media algorithms.
To be clear, I am not suggesting the researchers who conducted the original 2023 study misled the public. The real problem is that social media companies not only control researchers’ access to data, but can also manipulate their systems in a way that affects the findings of the studies they fund.
What’s more, social media companies have the power to promote certain studies on the very platform the studies are about. In turn, this helps shape public opinion. It can create a scenario where scepticism and doubt about the impacts of algorithms can become normalised – or where people simply start to tune out.
This kind of power is unprecedented. Even Big Tobacco could not shape the public's perception of its own industry so directly.
All of this underscores why platforms should be mandated to provide both large-scale data access and real-time updates about changes to their algorithmic systems."