Models and frameworks for assessing reality must reduce complexity in order to be useful. There is a fine balance to strike between simplification and accuracy: simplify too little and the model is unwieldy; simplify too much and it leaves out so much detail that it becomes useless.

The question to ask yourself is: "What is being left out of the model?"

As an example: if we are modeling the social power of a group of individuals by looking at their economic standing, educational backgrounds, and social interconnectedness, but we leave out their cultural background and historic access to resources, then our model will produce skewed results that badly misrepresent some individuals in the group. This kind of modeling has often fed into things like racism.

So being aware of what is excluded is often even more important than being aware of what is included.

#CoordinationAsPower

@robcayman

I'm extremely biased by my physics background, but one more "danger" that comes with modeling is that even when a model "works well" (i.e., explains reality and predicts experiments/events accurately), it generates complacency and makes it hard to replace that model with a more comprehensive one when unexplained events arise.

"Don't fix it if it's not broken... and even if it's broken, can we use it still?"

@clockwooork so you're saying I can't use Newtonian models of the universe to predict the behaviour of muons?

also... I want a luminiferous aether to replace spooky action at a distance...

#EinsteinWasWrong
.
.
.#Sarcasm

@robcayman

During our Masters we had a whole-ass course to learn the "patches" to the Standard Model, since physicists are so attached to it that they refuse to come up with anything different.