@fnord "one of the most innovative companies in the world" -- brilliant. At the current discount that sounds like a wonderful opportunity -- innovation will drive future returns. I hope you are fully invested, with leverage.
@guyreading From first principles, this seems highly unlikely. It's circumstantially confirmed by the fact that, when you translate ARC problems to sequences, the largest LLMs out there (not just GPT-3, but much larger models as well) don't work at all.
I was happy that it was this easy to do. I didn't even have to make it a Model or a Layer (though I did make it a Layer for a separate reason: to make sure a subclassed layer could own a FeatureSpace and have it tracked for later saving).