@Garwboy Interesting... I've always wondered why evolution would do that. Maybe because it didn't...
I had heard/remembered it as 14%, which I realise now is an oddly specific number. I'd also understood it as a percentage of capacity rather than of use of the whole thing, and as applying to memory rather than to the whole brain.
I'm trying to remember (from decades ago) what the argument was, and I think it was this: if you model the human memory parts of the brain as a simplified associative neural network (which they aren't), treat memories as isolated events (which they definitely aren't), know how many nodes the memory parts of the brain have (I assume we can make a reasonable estimate here), and somehow estimate how many things a person experiences in a lifetime, you can calculate that only 14% of capacity is used.
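For what it's worth, 14% is suspiciously close to the classic storage capacity of a Hopfield-style associative network, roughly 0.138 patterns per node, which may be where the figure originally came from. Here's the shape of that back-of-envelope argument as a sketch; every number in it (node count, events per day, lifespan) is an illustrative guess on my part, not a real estimate:

```python
# Back-of-envelope sketch of the "percentage of memory capacity" argument.
# All numbers below are illustrative guesses, NOT real neuroscience estimates.
# A Hopfield-style associative net stores roughly 0.138 * N distinct patterns.

N = 100_000_000              # hypothetical node count for the "memory parts"
capacity = 0.138 * N         # ~13.8 million storable patterns

experiences_per_day = 50     # hypothetical count of "isolated events" per day
lifetime_days = 80 * 365     # hypothetical 80-year lifespan
used = experiences_per_day * lifetime_days

print(f"used / capacity = {used / capacity:.0%}")
```

Plug in different guesses and you get a different percentage, which is rather the point: the "14%" depends entirely on the (dubious) inputs.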
The way I understand it is that in simplistic artificial associative neural networks, every node is simultaneously a memory node, an input node, and an output node. The first memory is stored across all nodes, the second is stored 'on top of it', and so on until capacity is reached. So the network always uses 100% of its nodes and only a fraction of its capacity, depending on what you mean by 'use'. More nodes means more capacity, but more nodes also means more detail in the memories (losing nodes loses detail rather than whole memories), which I think may be an evolutionary advantage. Also, I can imagine that being close to capacity produces many false associations.
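The 'stored on top of each other' idea can be sketched with a tiny Hopfield-style network: Hebbian learning superimposes every pattern onto the same weight matrix, so each memory lives in all connections at once, and a degraded cue is pulled back to the stored pattern. This is a toy illustration of distributed associative storage, not a claim about how biological memory works:

```python
import random

random.seed(0)
N = 64  # number of nodes (each is input, output, and memory at once)

def train(patterns):
    # Hebbian rule: every pattern is added onto the SAME weight matrix,
    # so each memory is spread across all connections, "on top of" the rest.
    w = [[0.0] * N for _ in range(N)]
    for p in patterns:
        for i in range(N):
            for j in range(N):
                if i != j:
                    w[i][j] += p[i] * p[j] / N
    return w

def recall(w, state, steps=5):
    # Repeatedly set each node to the sign of its weighted input.
    for _ in range(steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(N)) >= 0 else -1
                 for i in range(N)]
    return state

def noisy(p, flips):
    # Degrade a pattern by flipping a few nodes.
    q = p[:]
    for i in random.sample(range(N), flips):
        q[i] = -q[i]
    return q

# Store 5 random +/-1 patterns: well under the ~0.138*N capacity limit.
patterns = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(5)]
w = train(patterns)
out = recall(w, noisy(patterns[0], 6))   # cue with the first memory, degraded
overlap = sum(a == b for a, b in zip(out, patterns[0])) / N
print(f"overlap with stored pattern: {overlap:.2f}")
```

Push the pattern count past roughly 0.138 × N and recall degrades into exactly the kind of false associations mentioned above; lose some nodes and every memory gets a bit blurrier rather than any single memory vanishing.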
Obviously all these assumptions are wildly wrong, and biology is far more complicated than the artificial models that were explained to us electrical engineering students...