Workaccount2 on Hacker News just coined the term "context rot" to describe the way the quality of an LLM conversation drops as the context fills up with accumulated distractions and dead ends: https://news.ycombinator.com/item?id=44308711#44310054
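One way to see the effect concretely: hold a probe question fixed and watch how the answer changes as junk accumulates in the conversation. Below is a rough sketch, not a real benchmark; `ask` stands in for whatever chat-completion call you use, and the fact/probe/distractor strings are invented for illustration.

```python
# Minimal sketch: re-ask the same probe question as simulated
# dead-end turns pile up in the conversation history.
# `ask` is any chat-completion callable you supply: ask(messages) -> str.

FACT = "The deploy token lives in infra/secrets/deploy.env."
PROBE = "Where does the deploy token live?"
DEAD_END = "Never mind, that approach didn't work. Forget it, let's try something else."

def probe_context_rot(ask, max_dead_ends=50, step=10):
    messages = [{"role": "user", "content": FACT}]
    for n in range(max_dead_ends + 1):
        if n % step == 0:
            # same question every time; only the accumulated junk changes
            answer = ask(messages + [{"role": "user", "content": PROBE}])
            print(f"{n:>3} dead-end turns: {answer!r}")
        # simulate one abandoned dead end remaining in the context
        messages.append({"role": "user", "content": DEAD_END})
        messages.append({"role": "assistant", "content": "Okay, starting over..."})
```

If context rot is real for a given model, answer quality should visibly drift or degrade as the count of dead-end turns grows, even though the original fact is still in the context.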

@simon Interesting, Simon. I feel like this concept of "context rot" describes my life too:

When I have lots of distractions, discouraging dead ends, and useless info filling my brain, my output quality falls off rapidly too! 🤯

@clairegiordano @simon Same! When I talk to my friends, the main topic somehow always fades into the background and we end up going down a million useless paths, but that doesn't mean it's not fun!
@simon Would context rot be the same issue described by the "Lost in the Middle" paper, i.e. that LLMs have a hard time locating answers as context length increases?
Or do you feel there is a distinction?
@pamelafox I think context rot is more of an end-user concern: it's not so much about the model missing details, it's about it getting distracted by poor-quality content that's already made it into the context

@pamelafox @simon

"Lost in the middle" is more of a retrieval problem (needle in a haystack), and that's largely fixed already. Context rot is fundamentally a long-context reasoning issue: models still can't properly do multi-step reasoning over information scattered across long contexts, but they're getting better with each iteration.
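The distinction shows up in how the two evals are constructed: a single needle dropped into filler versus several facts that must be chained together. A rough sketch below; the filler text and facts are invented, and the returned context-plus-question pair would be fed to whatever model you're testing.

```python
import random

# Invented junk padding standing in for long irrelevant context.
FILLER = "The sky was a pleasant shade of blue that day. " * 50

def single_needle_haystack():
    # Retrieval ("lost in the middle"): one fact buried in filler;
    # the model only has to locate it.
    needle = "The magic number is 7481."
    chunks = [FILLER] * 10
    chunks.insert(random.randrange(len(chunks)), needle)
    return "".join(chunks), "What is the magic number?"

def multi_hop_haystack():
    # Long-context reasoning: facts scattered far apart must be
    # chained (badge -> red car -> vault code) to produce the answer.
    facts = [
        "Alice's badge number is 42.",
        "The vault code is the badge number of whoever owns the red car.",
        "Alice owns the red car.",
    ]
    chunks = [FILLER] * 10
    for fact in facts:
        chunks.insert(random.randrange(len(chunks)), fact)
    return "".join(chunks), "What is the vault code?"
```

Models that ace the first test can still fail the second, which is the gap being described here.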

@simon I've definitely seen this in Cursor when giving it an iterative task that involves producing output across many files
@simon Hmm, on an unrelated note: could this be applied to coin a term "agent rot", where the growing number of child agents expands beyond the search space the parent nodes can cover, so that each new child node yields diminishing returns?
I am talking about Darwin Gödel machines: https://news.ycombinator.com/item?id=44174856

@simon How many more datacenters running off a town's electricity and water would it take to fix this problem?