I feel like the theme of the 2020s is epistemology.

For those who aren't aware, epistemology is the branch of philosophy that deals with the nature of knowledge, as in,

- What does it mean to say you "know" something?
- What does it mean for something to be "truth"?
- What is "real"? How do you "know" that?
- What is the difference between "belief" and "knowledge"?
- What's the best way to go about finding more of this "knowledge" stuff, whatever it is?
- What are the limits of knowledge?

For various reasons, these are questions that everyone has begun to confront on a day-to-day basis. It's no longer a theoretical exercise for weird nerds to ponder; we have to deal with epistemology at an extremely practical level now, and few people are equipped for it.

I had an exchange on Discord with someone in the astronomy club that I'm a part of that made me realize a common epistemological pitfall that a lot of people fall into nowadays (including myself, at times).

I posted a video about some research into dark matter (look back in my feed), and this guy was like "Neat idea, but dark matter can't interact with normal matter, so it'll never work."

When pressed, he said it was because the YouTubers he watches say that.

/1

Ignoring the fact that his statement indicates he did not actually *watch* the video, and that a YouTube video is not a reliable source of truth, there are a couple of bigger issues at play here.

One is a fundamental misunderstanding of how knowledge works. It's not a binary thing; knowing something is not like flipping a switch. There are levels to it, gradations.

The other is that knowledge is not a thing you *have*; it's a thing you *use*.

/2

What do I mean by this? Knowledge is an active thing, not a passive thing. How knowledgeable you are on a subject is not directly related to how much information about the subject that you have absorbed. Reading a book or an article or watching a video on a topic does not mean you actually *know* anything about that topic. Reciting facts is not knowledge, or more precisely, it's an infinitesimal amount of knowledge.

/3

How well you know something is related to your ability to *use* that knowledge.

To use the dark matter example, an astrophysicist doesn't know more about dark matter than me because they have read more books or taken more classes than me. It's not because they can recite more facts about it.

They know more because they can actually *use* that information to do meaningful things. They can design and execute experiments. Interpret data. Make predictions. Come up with explanations for things.

/4

Nowadays, it's possible to read an article (or watch a video, or skim the Wikipedia page) about something like dark matter and _feel_ like you know something about the topic. It makes sense to you. You can recite facts, and maybe even form an opinion and argue with other laypeople about it.

But that's *all* you can do with that level of knowledge. The experts can do so much more, to the point that you don't even have the knowledge to really grasp _how_ much more.

/5

Reading or watching videos about complex topics is great. I don't even think that people forming opinions on stuff they know little about and arguing about it on the internet is necessarily bad.

My only point is to keep things in perspective. Your opinions or understanding of things like dark matter are essentially irrelevant. Unless you are directly involved with astrophysics on a daily basis, it doesn't matter if your opinions or understanding of that topic are correct or not.

/6

This is fundamentally what separates a crank from a legit scientist. There are millions of YouTuber cranks going on about the magical theory they invented that explains all the stuff the scientists can't. They claim to know something the experts don't, and when the experts won't listen, it must be a conspiracy or something.

Okay, maybe you *do* know something the experts don't. It's totally possible. But that means you should be able to do something they can't. So demonstrate that. And not to me, a nobody, but to the experts.

/7

I'm always annoyed when people throw their favorite pet theory at me about some topic, as if I'm equipped to evaluate their claims. As if *my* opinion on the topic matters at all.

I'm not the person that needs to be convinced. If your idea is correct — if you can clearly demonstrate how your idea is valuable and useful — the experts will listen. If you do it right, they will not be able to ignore you. The fact that they do ignore you should tell you something.

/end

This all applies equally well to me and this entire thread. What do I know about epistemology? Very little, compared to professional philosophers who think and debate about such things every day. I might talk like an authority, asserting truths, but I don't know shit. I'm just thinking out loud and forming baseless opinions, as is my right as an intelligent agent.

Maybe I'm right, maybe I'm wrong. Does it matter? Hard to say. I'm not equipped to evaluate that.

/PS

@malcircuit You know (no pun intended), my reaction to your posts is *very* often... wow, I'm glad I read that. So, thank you. Carry on.
@malcircuit fantastic thread, thank you

@malcircuit

A meme I made some while back...

@malcircuit This thread is very much in my (entirely amateur) wheelhouse -- thank you for writing it :-)

I think there is something I've always (ish) understood about the pursuit of knowledge (aka epistemics), especially when it comes to determining the outcome of a debate, but which many (most?) people don't seem to understand (for whatever reason):

Deciding which of a set of ideas is "true" (or, more precisely, "the most likely to be true given what we know") depends on following the trail of logic for each argument back to its premises and reconciling contradictory statements -- not, as many people seem to think, on weighing the mere quantity of arguments on each side against the other, much less on which side sounds more "persuasive".

I have to wonder if there's any way to spread this understanding of epistemics, or if most people are just stuck in whatever way of thinking they learned growing up...

cc: @Athena @Fishercat @FormerlyStC

@malcircuit Thanks for this interesting thread. The example that comes to mind is flat earthers - they claim to have knowledge that they very clearly don’t. They’re not using the scientific method, they’re spouting bullshit.

How do I know? Because real scientists and mathematicians can use their knowledge to predict useful things like the time the sun rises, or tides, or lunar eclipses or planetary motion or the return date of Halley's Comet — and in the extreme, guide rockets to send robotic probes to orbit (or even land on) other planets.

Think about all those useful, amazing things…. Then picture some YouTube moron insisting a round earth is a conspiracy, despite the fact that humans accurately estimated the circumference of Earth over 2000 years ago — with a stick.

@malcircuit

> If you do it right, they will not be able to ignore you

big if true, but i definitely can't do it right, so i gotta crank at randos or go quietly in the night

@malcircuit

There are a few who ask the right questions: starting with "Where do I start reading?"

@malcircuit

I’m not an expert … but sometimes I can ask questions after learning about cool stuff!

Of course the question may not be interesting to experts but sometimes they answer.

https://ruby.social/@stepheneb/115112696824245370

Now I’m curious about what it means when event horizons overlap. Is any information exchanged when they overlap? And once they overlap, can they also detach later?

Questions are much more fun than arguing!

Stephen Bannasch (316 ppm) (@[email protected])

@[email protected] If two large black holes were orbiting around each other closely would there be a point where their event horizons overlapped before they got so close they combined. I’m assuming that when two black holes get close they orbit each other faster and faster as they get closer before they combine into one.

@stepheneb Yes, exactly. People get so attached to their beliefs. They are more afraid of being wrong than of learning something new, so they stop asking questions, stop admitting their own ignorance, stop being curious. It's kind of sad, in my view.

@malcircuit

Being able to quickly be wrong in the most interesting and useful way is soo cool!

https://ruby.social/@stepheneb/109887737387949575

I wish more folks knew how useful being wrong can be.

Stephen Bannasch (316 ppm) (@[email protected])

@[email protected] Something that might be adjacent ... when I am learning something complex and new I jump in as fast as I can with both a model I know is naive and with just the right amount of confidence to break it quickly and usefully. Deeper in I find myself needing to reason using multiple and conflicting models. It's a strange combination of being both extremely confident and very skeptical at the same time. I'm confident I'm both wrong and good at finding out as fast as possible.

@malcircuit @stepheneb The Scout Mindset by Julia Galef describes how people attach their identities to their beliefs, making it harder to update those beliefs. She presents a remedy that I try to live by, which is to attach "I pride myself on updating my beliefs" to your identity.
@malcircuit This articulates something that’s been bothering me. I wonder if this hyper-quantified, transactional, metricized culture — in business, in social media, and especially in teach-to-the-test education — corrals people into a pseudo-intellectual surface level of thought where they never learn *how it feels* to genuinely understand an idea.
A friend of mine said that the real point of college is to make sure that young minds are exposed to Big Ideas in a safe environment where they can explore them, compare them with other Big Ideas, and get used to the feeling of grasping Big Ideas and letting them go, so they don’t fall for the trap of encountering their first Big Idea in the wild and latching onto it forever.
@FormerlyStC No, no, you're thinking about it from the wrong direction. Our society, culture, and technology make it such that we are constantly awash in information. We are so immersed in it that it begins to feel like everything can be and is known, and people forget what *ignorance* feels like.