What does “good” coverage of AI mean to you? I wrote about how the widely divergent views that very smart people hold about existential risk make it hard to calibrate coverage of advances in artificial intelligence.

https://www.platformer.news/p/why-im-having-trouble-covering-ai

@caseynewton I think that the people with seemingly non-overlapping views are the most important to put in (metaphorical) conversation with each other. People who believe that current approaches to machine learning are on a path to “AGI” and “superintelligence” should not get forums where their baseline views are accepted without substantive challenge. But these views shouldn’t be ignored or casually dismissed either.
@caseynewton As a layperson, having thought about all this stuff a bit obsessively, I think the skeptics seem to have the better argument, but I would give at least a 10% chance (totally arbitrary) that I’m wrong about that. So I think that’s important to explore.
@caseynewton I think the biggest risk for you as a journalist is getting captured by the industry’s view of itself and its work. Even in your linked piece, which I respect, both positions you present as counterpoints sit squarely within that industry ideology.