Fahrenheit is exactly what is wrong with USA

https://lemmy.world/post/44602363

So every time the Celsius vs. Fahrenheit debate comes up, someone always makes the point that “Fahrenheit makes more sense, it tells you how it is out there: 0 is freezing, 100 is roasting hot”. And yes, that might be accurate, but it showcases that US citizens only care about themselves; they don’t even care about physics or chemistry. The “it works for me, therefore it is good” mentality is what they bring to the world, and the clearest example is the framing they choose to justify Fahrenheit over Celsius.

Fahrenheit was literally devised by a physicist, Daniel Gabriel Fahrenheit, a European, mind you.

It was based on physical properties, too. Originally, 0 was the freezing point of a replicable brine solution (ice, water, and ammonium chloride), and 96 was set at human body temperature (96 was used because it made dividing a thermometer scale easier: it can be halved repeatedly). It was later recalibrated to put the boiling point of water at 212, exactly 180 degrees above freezing at 32, but that’s the original basis.

There is no god-ordained rule that states that 0 has to be the freezing point of water, nor 100 the boiling point.

Fahrenheit also has an inherent advantage over Celsius: for every 5 degrees Celsius there are 9 degrees Fahrenheit, so whole-degree readings carry more precision.
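(For context, the two scales relate linearly; here is a quick sketch of the conversion in Python, with hypothetical helper names, just to make the 5:9 ratio concrete:)

```python
def c_to_f(c: float) -> float:
    """Celsius to Fahrenheit: each degree C spans 1.8 degrees F."""
    return c * 9 / 5 + 32

def f_to_c(f: float) -> float:
    """Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

# A span of 5 degrees C covers 9 degrees F (20 C = 68 F, 25 C = 77 F):
assert c_to_f(25) - c_to_f(20) == 9.0
assert f_to_c(77) - f_to_c(68) == 5.0
```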

In metric, we aren’t scared of using a decimal if we have to. Our thermometers can be as precise as we need.

Yet weather reports rarely include them.

…because we don’t need them.

You have to use decimals if you want to be as precise as Fahrenheit is without them.

Weather forecasts are only accurate to 1 or 2 degrees Fahrenheit, or about 1 degree Celsius. So the only example you’ve given where Fahrenheit is “superior” is one where the accuracy is so low that we just shrug and give a number in the middle of the range. This doesn’t make Fahrenheit more accurate; it just makes the scale irrelevant, and we use whole numbers because a convention that skips some of them would be pointless.

As for being more precise without decimals, I live in a country with half-decent education standards, so decimals and fractions don’t scare me.

It isn’t about being “scared” of them; it’s about whether they actually get used.

And we don’t just make forecasts, we report actual temperatures.

But whatever, you just want an air of superiority.

Actual historical temperature data is recorded in tenths of a degree Celsius, because whole degrees, Celsius or Fahrenheit, aren’t precise enough. Those tenths still aren’t reported in the media, because they don’t matter in an everyday context.

Look, as far as imperial measurements go, Fahrenheit is pretty good. Any temperature scale is going to be arbitrary, and the reasons behind Fahrenheit are valid enough. But, frankly, 180 divisions between freezing and boiling is nonsensical: that precision isn’t necessary in daily life, and it isn’t enough in a scientific context. And if I’m going to use an arbitrary scale, I may as well use the same one as everyone else, unless there’s some real reason to divide it into odd segments, the way a circle is divided into 360 degrees. So at that point, you can go decimal, like virtually everything else in the metric system, or you can go with a multiple of 60 for no damned reason besides history.

My proposed compromise: a system with 0 at the freezing point but the same-sized degree increments as Fahrenheit, so the boiling point would be 180.
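(Numerically, that compromise scale would just be Fahrenheit shifted down by 32, or Celsius multiplied by 1.8; a minimal sketch, with a hypothetical function name:)

```python
def c_to_compromise(c: float) -> float:
    """Hypothetical compromise scale: 0 at freezing, Fahrenheit-sized degrees.
    Numerically this is Celsius * 9/5, i.e. Fahrenheit minus 32."""
    return c * 9 / 5

# Sanity checks for the proposed fixed points:
assert c_to_compromise(0) == 0      # freezing point of water
assert c_to_compromise(100) == 180  # boiling point, as proposed
```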