derpyunicorn

I work in ML model validation, and I'm particularly interested in monitoring models of any kind.
@Riedl

LeCun's arguments about why LLMs are limited seem fine to me. However, his last slide is just plain wrong. He claims that "almost nothing is learned through supervision or imitation". Babies and toddlers learn almost everything by imitation, and later of course comes a lot of supervision (school). Animals also learn through imitation and some light supervision.

Of course, all living beings are also born with sets of behavioral patterns, so they are "pre-trained" in that sense. But that doesn't seem to be what LeCun is talking about.

https://drive.google.com/file/d/1BU5bV3X5w65DwSMapKcsr0ZvrMRU_Nbi/view?usp=drivesdk