If the ultimate purpose of memory is to guide our actions in future, what is the point of episodic memory?

Why do we remember details of our past experiences?

@cian Perhaps useful not just to predict future events but to imagine them, such that we feel almost as though they were happening.
@cian Why important? Perhaps because conscious feelings give ideas motive force.
@cian Detail or uniqueness is especially helpful for keeping track of novel experiences (e.g., being able to distinguish rare things) -- and probably we don't remember all things with as much detail.
@BayesForDays that's a good point too. Need some level of detail for specificity. I guess intuitively it feels like I remember a lot of completely useless details though? But maybe that's just the brain overestimating what it needs to store
@cian @BayesForDays I would guess we need that "overestimate" to stay flexible, so we can use information from previous experience for unforeseen inference. If we only brought forward what seemed like useful details at the time, we'd miss out on a lot of possibilities to mine that previous experience for new insights once we'd gained a new perspective. I think our ability to do that is limited by the capacity you mentioned elsewhere, but there are benefits to using that capacity for this.

@cian
If (a big if) we performed generalisation at retrieval (rather than at storage, as in almost all current artificial neural networks) then the episodic memories would be the essential input to the generalisation (and inference) process. You are best placed to know what dimensions to abstract over when you have a specific current task and goal to drive generalisation and inference.

(Of course, having arrived at some specific generalisation from the current retrieval, that generalisation might be stored as part of the current episodic memory and be available to guide future generalisations on retrieval.)
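A toy sketch of the contrast (my own illustration, not anything from the thread): "generalisation at storage" compresses episodes into fixed parameters and discards them, while "generalisation at retrieval" keeps the raw episodes and generalises only when queried. The data and both predictors here are made up for illustration.

```python
# Hypothetical sketch: generalisation at storage vs. at retrieval,
# on toy 1-D (situation, outcome) "episodes".
import math

episodes = [(0.0, 0.0), (1.0, 1.0), (2.0, 4.0), (3.0, 9.0)]

# Generalisation at STORAGE: fit a fixed summary (a least-squares line)
# once, then the episodes could be thrown away.
n = len(episodes)
mean_x = sum(x for x, _ in episodes) / n
mean_y = sum(y for _, y in episodes) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in episodes)
         / sum((x - mean_x) ** 2 for x, _ in episodes))

def predict_at_storage(query):
    # Only the stored parameters are available at query time.
    return mean_y + slope * (query - mean_x)

# Generalisation at RETRIEVAL: keep every episode verbatim and
# generalise per query, weighting episodes by similarity to the query.
def predict_at_retrieval(query, bandwidth=0.5):
    weights = [math.exp(-((x - query) ** 2) / bandwidth)
               for x, _ in episodes]
    return (sum(w * y for w, (_, y) in zip(weights, episodes))
            / sum(weights))
```

The at-retrieval version can use whichever dimensions of the stored episodes the current query makes relevant, which is the flexibility being argued for above; the at-storage version is stuck with whatever abstraction was chosen at encoding time.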

What are the implications if episodic memory is the primary form of memory and other (declarative/procedural/etc) memories are epiphenomena arising out of the episodic memories?

#CogSci #CognitiveScience #MathPsych #MathematicalPsychology @cogsci

@RossGayler @cogsci that's very interesting. I guess people historically dismissed this for two reasons: 1) we assume we have a limited storage capacity, so we aren't good at raw memorisation; 2) even after learning something, we tend to forget details over time. Neither of these really applies to ANNs? (I think)

@cian @cogsci
Just off the top of my head (*speculation alert*)

"1) assume we have a limited storage capacity so aren't good at raw memorisation"
Maybe everything gets encoded and stored, but we aren't so good at retrieval/recall of specific episodes.
Maybe that poor exact episodic retrieval is a consequence of generalisation at retrieval.

"2) even after learning something, we tend to forget details over time."
Assuming we store new episodic memories over time, the accumulation of new memories might make it harder to retrieve specific old episodes through a generalisation at retrieval mechanism. (Also, even if parts of old episodes were to randomly disappear over time, that wouldn't necessarily stop a generalisation at retrieval mechanism. A good generalisation mechanism should be able to cope with partial records of episodes.)

"Neither of these really apply to ANNs?"
Well, you could include weight decay in an ANN and there is the phenomenon of catastrophic forgetting. However, I take the relevance of most current ANNs (feedforward, weight optimising networks) to cognitive concerns with a fairly large pinch of salt.
IMO the theoretical conceptual framework of most current ANNs doesn't make contact with cognitive concerns so you can't really ask these questions of them.

@cian @cogsci

Plus there are cognitive science people who argue that analogy is the core of cognition.

Analogy is a mechanism for generalisation at retrieval and the stored episodes are the input to the analogical mechanism.

@cian what are you thinking of as the alternative?
@jonny good q! I guess that we remember fewer details than we seem to do. Or no details at all?