MegaLoL:

"Beijing has said the US spin-off of TikTok to be sold to American investors in a deal orchestrated by President Donald Trump will use parent company ByteDance’s Chinese algorithm.

Wang Jingtao, deputy head of China’s powerful cyber security regulator, on Monday said US and Chinese officials had agreed a framework that included “licensing the algorithm and other intellectual property rights”.

He said ByteDance would “entrust the operation of TikTok’s US user data and content security”, without elaborating.

Trump, who is expected to finalise the deal when he speaks to China’s President Xi Jinping on Friday, has again extended the deadline for the US app to be divested from its Chinese owner.
(...)
An Asia-based investor in ByteDance said the new US TikTok entity would use at least part of the Chinese algorithm but train it in the US on American user data.

“Beijing’s bottom line is a licensing deal,” said the investor. “Beijing wants to be seen as exporting Chinese technology to the US and the world.”

“It’s the ultimate Taco trade,” said one US adviser close to the deal, referring to the acronym “Trump always chickens out”. “After all this, China keeps the algorithm.”

https://www.ft.com/content/550e4680-89e7-4b59-bb5d-2064cd6799c7

#USA #Trump #SocialMedia #TikTok #China #Algorithms #RecommendationEngines #ByteDance

"The algorithmic recommender systems that select, filter, and personalize experiences across online platforms and services play a significant role in shaping user experiences online. These systems largely determine what users see, read, and watch, fueling debates around their potential to amplify harmful content, foster societal division, and prioritize engagement over user well-being. In reaction, some policymakers have turned to blanket bans on personalization or to the promotion of chronological feeds. But there are many better alternatives. Suggesting that users must choose between today’s default feeds and chronological or non-personalized feeds creates a false choice.

This report, prepared by the KGI Expert Working Group on Recommender Systems, offers comprehensive insights and policy guidance aimed at optimizing recommender systems for long-term user value and high-quality experiences. Drawing on a multidisciplinary research base and industry expertise, the report highlights key challenges in the current design and regulation of recommender systems and proposes actionable solutions for policymakers and product designers.

A key concern is that some platforms optimize their recommender systems to maximize certain forms of predicted engagement, which can prioritize clicks and likes over stronger signals of long-term user value. Maximizing the chances that users will click, like, share, and view content this week, this month, and this quarter aligns well with the business interests of tech platforms monetized through advertising. Product teams are rewarded for showing short-term gains in platform usage, and financial markets and investors reward companies that can deliver large audiences to advertisers."
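The optimization tension the report describes can be sketched as a toy re-ranking exercise. Everything below is invented for illustration (item names, probabilities, and the 0.3/0.7 blend weights are not from the report or any platform's actual system); it only shows how the same candidate items order differently under the two objectives.

```python
# Hypothetical candidate posts: (item, predicted_click_probability,
# predicted_long_term_value). All numbers are made up for illustration.
candidates = [
    ("outrage_bait_video", 0.90, 0.20),
    ("friend_photo_update", 0.40, 0.70),
    ("in_depth_explainer", 0.25, 0.85),
]

def rank(items, key):
    """Return item names ordered best-first under the given scoring key."""
    return [name for name, *_ in sorted(items, key=key, reverse=True)]

# Objective 1: maximize short-term predicted engagement (clicks, likes).
by_engagement = rank(candidates, key=lambda c: c[1])

# Objective 2: blend in a stronger signal of long-term user value.
by_value = rank(candidates, key=lambda c: 0.3 * c[1] + 0.7 * c[2])

print(by_engagement)  # engagement-first ordering
print(by_value)       # value-weighted ordering
```

Under the engagement-only objective the high-click "outrage bait" item tops the feed; once long-term value gets most of the weight, the ordering inverts.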

https://kgi.georgetown.edu/research-and-commentary/better-feeds/

#SocialMedia #SocialNetworks #Algorithms #AlgorithmicRecommendation #RecommendationEngines

Better Feeds: Algorithms That Put People First – Knight-Georgetown Institute

Knight-Georgetown Institute

#SocialMedia #Algorithms #Instagram #RecommendationEngines #CoffeeShops: "Social media acumen requires awareness of each platform’s recommendation algorithm. Walsh observed that some companies may have great stories to tell, but they “are not attempting to keep up with these algorithmic patterns that will allow them to be visible to a larger audience”. Maybe they don’t post often enough, or they don’t keep up with shifts, such as Instagram promoting videos more than still images, a particularly stark change that occurred around 2022 as the platform attempted to mimic TikTok. Staying on top of what the algorithm demands is not easy, and even well-informed guesswork doesn’t always produce results. As Walsh told me: “We’ve put a lot of time and energy into creating beautiful content. But as a result of that algorithm, we find we’re not necessarily hitting as many eyeballs as we think we could or should, and sometimes that can be a little disheartening.”

“I hate the algorithm. Everyone hates the algorithm,” said Anca Ungureanu, the owner and founder of Beans & Dots, a coffee company in Bucharest, Romania, with its original location in a former printing plant. Her goal was to build “something that did not exist at that moment in Bucharest” – a space that was, at least aesthetically, non-local. It draws an international crowd; when someone searches Google for speciality coffee shops in Bucharest, Beans & Dots pops up. Ungureanu developed an Instagram account full of cappuccino snapshots and more than 7,000 followers, but grew frustrated when she felt that the platform was taking away her ability to access her audience through the feed. When her cafe started selling coffee online, Facebook and Instagram seemed to throttle its reach – unless it bought advertising and boosted the social media company’s own profits. It felt like algorithmic blackmail: pay our toll or we won’t promote you.
(...)
Other cafe owners I spoke to made the same complaint."

https://www.theguardian.com/news/2024/jan/16/the-tyranny-of-the-algorithm-why-every-coffee-shop-looks-the-same

The tyranny of the algorithm: why every coffee shop looks the same

The long read: From the generic hipster cafe to the ‘Instagram wall’, the internet has pushed us towards a kind of global ubiquity – and this phenomenon is only going to intensify

The Guardian
When I visit my son in Cleveland next month, we're going to Cedar Point for one day. To prepare, he sent me links to the official POV videos for their roller coasters on YouTube. Now I fear the YouTube algorithm will be recommending roller coaster videos for me for the next 6 months. 😅 #YouTube #recommendationEngines

#YouTube #Algorithms #News #Media #Journalism #RecommendationEngines: "Huang and Yang used a data set of 1.7 million of YouTube’s “Up Next” recommended videos in 2019, using automated incognito browsing to eliminate any individual watch histories. They used network analysis, mathematical modeling, and Markov chains to determine the likelihood of news videos being recommended versus other topical categories.

They found that the topical filter bubble effect was stronger for most types of entertainment videos than for news (the stickiest topic in YouTube’s recommendations: cars), and that algorithmic redirection worked much more in entertainment videos’ favor, too. In other words, if you watch an entertainment video, you’re far more likely to be recommended the same genre of video than if you watch a news video.

The result is that as a user, you might start out watching news, but you’re likely to see more and more entertainment videos pop up as recommendations until you eventually watch one of them instead. On average, the researchers wrote, an entertainment video was three times more likely to be recommended than a news video, “indicating that no matter what users start with on YouTube, they are more likely to end up watching entertainment than news videos.”

Of course, recommendation algorithms don’t determine what people watch by themselves. Users can choose which of several recommended videos to click on, or what to type into the search bar, or whether to stop watching entirely. But Huang and Yang’s study isolates the influence of YouTube’s recommendation algorithm itself."
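The study's Markov-chain framing can be illustrated with a toy two-state chain. The transition probabilities below are invented, not the paper's data; the point is only the mechanism: if recommendations lead from news to entertainment more often than the reverse, the long-run viewing mix drifts toward entertainment no matter where you start.

```python
# States: 0 = news, 1 = entertainment.
# transition[i][j] = probability the next watched video is in state j
# given the current video is in state i (hypothetical numbers).
transition = [
    [0.4, 0.6],   # from news: recommendations often pull toward entertainment
    [0.1, 0.9],   # from entertainment: recommendations mostly stay there
]

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]          # start by watching a news video
for _ in range(20):        # iterate toward the stationary distribution
    dist = step(dist, transition)

print(dist)  # ≈ [0.143, 0.857]: mostly entertainment in the long run
```

With these made-up numbers the stationary distribution is (1/7, 6/7): starting from pure news, the chain still settles at roughly 86% entertainment.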

https://www.niemanlab.org/2024/06/how-youtubes-recommendations-pull-you-away-from-news/

How YouTube’s recommendations pull you away from news

Plus: News participation is declining, online and offline; making personal phone calls could help with digital-subscriber churn; and partly automated news videos seem to work with audiences.

Nieman Lab

#SocialMedia #SocialNetworks #ContentModeration #Algorithms #RecommendationEngines #Messaging: "So you joined a social network without ranking algorithms—is everything good now? Jonathan Stray, a senior scientist at the UC Berkeley Center for Human-Compatible AI, has doubts. “There is now a bunch of research showing that chronological is not necessarily better,” he says, adding that simpler feeds can promote recency bias and enable spam.

Stray doesn’t think social harm is an inevitable outcome of complex algorithmic curation. But he agrees with Rogers that the tech industry’s practice of trying to maximize engagement doesn’t necessarily select for socially desirable results.

Stray suspects the solution to the problem of social media algorithms may in fact be … more algorithms. “The fundamental problem is you've got way too much information for anybody to consume, so you have to reduce it somehow,” he says."

https://www.wired.com/story/latest-online-culture-war-is-humans-vs-algorithms/

#SocialMedia #Facebook #Algorithms #RecommendationEngines #AI #Spam: "Facebook’s recommendation algorithms are promoting bizarre, AI-generated images being posted by spammers and scammers to an audience of people who mindlessly interact with them and perhaps don’t understand that they are not real, a new analysis by Stanford and Georgetown University researchers has found. The researchers’ analysis aligns with what I have seen and experienced over the course of months of researching and reporting on these pages, many of which have found a novel way to link to off-platform, AI-generated “news” sites that are littered with Google ads or which are selling low-quality products.

Last week the world was introduced to Shrimp Jesus, a series of AI-generated images in which Jesus is melded with a crustacean, and which have repeatedly gone viral on Facebook. The images are emblematic of a specific type of AI image being used by spammers and scammers, which I first wrote about in December but have repeatedly made the masses go “WTF” and “WHY?” when shared away from an audience of Facebook users who are seemingly unable to detect them as AI, or don’t care that they are AI. “WHAT IS HAPPENING ON FACEBOOK,” a viral tweet about Shrimp Jesus read."

https://www.404media.co/facebooks-algorithm-is-boosting-ai-spam-that-links-to-ai-generated-ad-laden-click-farms/

Facebook’s Algorithm Is Boosting AI Spam That Links to AI-Generated, Ad-Laden Click Farms

Viral 'Shrimp Jesus' and AI-generated pages like it are part of spam and scam campaigns that are taking over Facebook.

404 Media

#Ireland #AI #GenerativeAI #Algorithms #RecommendationEngines: "Ireland cannot put its faith in "voluntary action by tech corporations" to safeguard young people online from the rise of generative artificial intelligence (AI) technologies, the Oireachtas children’s committee will hear today.

The Irish Council for Civil Liberties (ICCL) is set to speak to the committee on Tuesday regarding the safe use of AI by young people.

“Technology corporations have a very poor record of self-improvement and responsible behaviour, even when they know their technology is harmful, and even when lives are at stake,” Dr Johnny Ryan, director of the ICCL’s Enforce unit, is due to tell the committee.

“Tech corporations will not save our children,” Dr Ryan is expected to say."

https://www.irishexaminer.com/news/arid-41330389.html

AI is already shaping the world our children see, ICCL to warn committee

Irishexaminer.com

#Journalism #Media #News #BBC #RecommendationEngines #PublicBroadcasting: "The BBC is the world’s largest public service broadcaster. Every week it reaches more than 90% of the UK’s adult population and 489 million people worldwide. To ensure our audiences get the most engaging experience, our team develops recommender systems which aim to provide users with the most relevant pieces of content among the thousands the BBC publishes every day. All BBC output should serve the organization’s mission to “act in the public interest, serving all audiences through the provision of impartial, high-quality and distinctive output and services which inform, educate, and entertain.” Recommendations are no exception and, since they determine what our audiences see, they are in effect editorial choices at scale. How can we ensure that our recommendations are consistent with our mission and public service values, avoiding some of the harmful effects that might be associated with recommenders? In addressing this question, we identified two main challenges: 1) methodological challenges: public service values are hard to measure through specific metrics, therefore we have no clearly defined optimization function for our recommenders; 2) cultural/operational challenges: domain knowledge around public service values sits with our editorial staff, whereas data scientists are the recommendations specialists. We need to create a shared understanding of the problem and a common language to describe objectives and solutions across data science and editorial. Our paper describes the approach we devised to tackle these challenges, presenting a use case from our work on a BBC product, and reporting the lessons learned."

https://knightcolumbia.org/content/recommenders-with-values-developing-recommendation-engines-in-a-public-service-organization

Recommenders With Values: Developing recommendation engines in a public service organization

Platformed! How #Streaming, #Algorithms and Artificial Intelligence are Shaping #Music Cultures

Tiziano Bonini and Paolo Magaudda

(Palgrave Macmillan, 2024)

"Offers an up-to-date overview of how the music industry has transformed in response to digitalization and online platforms

Shows how digital music platforms and the cultural value of music in today’s society are mutually constructed

Presents an analysis of emerging music technologies, like artificial intelligence and blockchain for music circulation."

#AI #RecommendationEngines

https://link.springer.com/book/10.1007/978-3-031-43965-0

Platformed! How Streaming, Algorithms and Artificial Intelligence are Shaping Music Cultures

This book examines the impact of contemporary digital technologies like algorithms and streaming platforms on music and the music industry.

SpringerLink