I see everyone is (rightly so) ripping on Musk's new all-male xAI effort to "understand the true nature of the universe."

But the real problem is AI can't tell us anything new beyond what scientists have already figured out. It doesn't do any "thinking" or actual research; it simply regurgitates reformulated ideas based on the science we already know.

What it MIGHT produce, especially since Musk is leading this, is garbage speculation unsupported by evidence.

@petergleick this is patently wrong on its face, considering how much new science AlphaFold has already enabled
@pmcarlton @petergleick AlphaFold is, so far, *the* singular example of a real advance, vs. a convenience (e.g. code completion). It seems like ML is best used for domain-specific tasks, at least at this time.
@mglo @pmcarlton @petergleick AlphaFold does not produce ideas, scientists do. AlphaFold produces good structure predictions because it has large training sets from extant research. That's useful, but it's not intelligence. It's not a new paradigm for thinking about protein structure. It's a sophisticated algorithm that searches for correlations in large data sets. Some algorithms in medical imaging do that too, and so does ChatGPT. That doesn't give us the "true nature of the universe".
@JoseEdGomes @pmcarlton @petergleick that’s my point (I use AF all the time). But it is the poster child for useful AI systems. The point being that the utility of these systems seems to depend on areas where there is really good data, and that only covers a small portion of the applicable problem space (i.e. it’s good at interpolation, not extrapolation beyond its training data).