I'm not sure I agree with the take for Fahrenheit. It's an arbitrary choice, and as someone who grew up in a country that uses Celsius, I find that far easier to understand; Fahrenheit may as well be random numbers to me.
"Oh but 100 Fahrenheit means 100/100 on the hot scale, it just makes intuitive sense!"
WHAT DOES THAT EVEN MEAN?? Fahrenheit lovers literally don't know how ridiculous they sound
Nah it's more like, one degree fahrenheit is the smallest change in temp that the average human can sense.
I call bullshit. Yeah, I'm sure that's the smallest detectable degree or whatever, but how 'hot' or 'cold' something feels depends on way more than just temperature: humidity, wind chill, whether it's sunny or cloudy. So in a real example I doubt a person can notice the difference between a 66°F and a 65°F day, because there are so many other factors. And you know what it's actually really bad at? Telling people when stuff freezes. You think some person from Texas or Nevada or any place that usually doesn't get cold enough knows the exact freezing point in Fahrenheit? Most people will guess around 30, while pretty much everyone knows that the freezing point of water in Celsius is 0°.
When it's above 100, people who have options for something lower will generally go for them. Similarly for under 0. OK, so as PancakeLegend@mander.xyz pointed out, such sensitivities might be specific to US culture, but theoretically, how much would we have to expand the 0-100 Fahrenheit range so that 0 is too cold for pretty much everyone and 100 is too hot for pretty much everyone? 0 goes to -10, 100 to 140? A new-Fahrenheit degree would still be more precise than a Celsius degree.
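A quick sanity check on that arithmetic (a Python sketch; the -10 and 140 endpoints are the commenter's proposal, not an established scale):

```python
# Proposed "new Fahrenheit": the 0 point maps to -10 °F and the 100 point
# maps to 140 °F, so the 0-100 range spans 150 old Fahrenheit degrees.
span_f = 140 - (-10)             # 150 °F covered by 100 new degrees
new_degree_in_f = span_f / 100   # 1.5 °F per new-Fahrenheit degree

# One Celsius degree spans 1.8 Fahrenheit degrees, so the new degree
# (1.5 °F) would indeed still be finer-grained than a Celsius degree.
celsius_degree_in_f = 9 / 5
print(new_degree_in_f, celsius_degree_in_f)  # 1.5 1.8
```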
My point is "really hot" and "really cold" are not useful reference points to ascribe to, no matter what numbers you're using. If I were coming up with a measurement system for brightness and I said 1000 was "really bright", would you be able to tell me anything about 500? No, because you literally have no reference frame for what I mean by "really bright". It's the same thing when Americans describe Fahrenheit to the rest of the world. You have to experience the data points, and at that point, whether you use 0 to 100, -20 to 40, or 250 to 310, it doesn't matter. You will just intuitively understand the scale, so there's no inherent benefit.
Whatever you grew up with will always seem more intuitive to most people. But given that I grew up with Fahrenheit, the whole “0 is cold as fuck, 100 is hot as fuck” thing works for me.
Yeah, pretty much. I figured it was probably implied that I’m in the states. :)
I mean, SI units are objectively the best, and align with metric in most cases, but my brain is conditioned to accept Fahrenheit and miles per hour natively. Celsius and km/h have to go through an interpreter to convert them.
I have to say though, km/h has that “0 to 100” thing going for it that Fahrenheit does. 100 isn’t the fastest you’ll go, but it’s a typical highway speed.
but we could be using meters per second
Fahrenheit is our broken clock being right twice a day
Fahrenheit was not an entirely arbitrary choice: it was defined by two points of reference that could be measured at the time. The freezing temperature of an ammonium chloride brine was used as 0, and the best estimate of average human body temperature was set at 96.
Over time, as the freezing and boiling points of water at sea-level atmospheric pressure proved to be more accurate reference points, the Fahrenheit scale was adjusted so that it converts exactly to Celsius.
Fahrenheit was not an entirely arbitrary choice
it's not arbitrary, it's based on the uh, the freezing temperature of uh, ammonium chloride! we're all familiar with how cold that is! and, and, and, uh, the upper end is, uh ... they decided on 96. it's not arbitrary!!!
...96? How is 96 a point of reference when you are making a scale from scratch?
ah 7 and 9 are missing because 7 ate 9 and is on the run from the law
I know someone who knows both "natively", and Celsius is much more logical to them because 1. Kelvin has the same step size as Celsius, so for any science it's much easier, and 2. freezing is 0 Celsius, so for weather (the thing you use temperature for most commonly) it's really useful. Same with cooking.
Same.
Not to mention that the 0-100 range thingy really depends on local conditions. I mean, depending on where you live, there are parts of the scale you'll never use.
I've never in my entire life lived in a place where the lowest temp got anything close to 0°F.
My range of values is more like -5°C to 45°C, or 23°F to 113°F.
23°F for me is already fucking cold, and 100°F is nowhere near fucking hot anymore (thank the whole of humanity for climate change).
So whichever scale, for me they're still just a bunch of numbers. But at least Celsius is used in "science, bitch!"
Climate change is not all of humanity’s fault
It’s a very specific very very very tiny group of humans who are doing this to us
Fahrenheit measured human body temperature (which he thought was a constant) and called that 96 degrees. We now know normal body temperature is about 98.6 degrees F, but back then, his instruments weren't as accurate. The number 96 was chosen for its divisibility. It has many divisors (1, 2, 3, 4, 6, 8, 12, 16, 24, 32, 48, 96), making it easier to mark subdivisions on the thermometer.
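The divisibility claim is easy to verify (a quick Python check, just illustrating the arithmetic in that comment):

```python
# 96 = 2^5 * 3, which gives it an unusually rich set of divisors --
# handy when subdividing a thermometer scale by repeated halving.
divisors = [d for d in range(1, 97) if 96 % d == 0]
print(divisors)  # [1, 2, 3, 4, 6, 8, 12, 16, 24, 32, 48, 96]
```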
It's a scale partly defined by human body temperature, which is, I think, the point.
100 F is roughly a human's body temp. (Actually 98.6 avg, but close anyway)
0 F is goddam cold. (This one's pretty arbitrary ngl)
That probably isn't very helpful.
Fwiw, Celsius isn't much better if you didn't grow up with it. 0 C is pretty cold, 100 C can give you severe scalds. The actual range the people will encounter in weather in their day-to-day lives is all over the place regardless.
Perhaps we are destined to stay divided
I'm UK based, and ~0°C to ~30°C (32-86°F) covers 90% of the year for Celsius. It's still pretty unhelpful, but I don't think it feels any harder than using Fahrenheit in day-to-day use. I agree that it's largely all arbitrary, but that's as good a reason as any to just use the one that's scientifically useful too.
A useful way to think about it (and I think what the OOP is saying) is to think about it as a scale from 0-100, where 0 is like the coldest humans can deal with and 100 is the hottest humans can deal with. Obviously this isn't strictly true (it gets to like 115 in Death Valley), but as an imperfect generalization it's pretty useful.
Yeah, no, that's not helpful at all - what I consider cold and what my mum considers cold are very different temperatures, and what I consider hot and my neighbour considers hot has an even bigger difference.
You rationalise it with the "human scale" idea, but really you just know the range of temperatures you're personally comfortable in, just like everyone using Celsius does.
Personally I like Fahr better because there's more resolution
I'm also really good at guessing the current weather, often to within 1 degree F and 3% humidity
Also Fahrenheit 0 just feels more practical to me. Like 20°F isn't THAT cold for me, but that's equal to -7°C. In Fahrenheit, if I see a negative number I know it's really fucked outside. A negative Celsius number is often just a "typical" winter day and nothing to care about. Of course this in itself is very Eurocentric/temperate-centric, same with all the weather map colors that shade in 75°F as a "warm" temperature with the color red--like that's just an extremely comfortable temperature and should be something like green.
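For anyone double-checking the conversions thrown around in this thread, these are just the textbook Fahrenheit/Celsius formulas (nothing specific to any commenter):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

print(round(f_to_c(20)))  # -7, the "20 °F isn't THAT cold" example
print(c_to_f(0))          # 32.0, water's freezing point in Fahrenheit
print(round(f_to_c(0)))   # -18, so 0 °F really is deep-winter territory
```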
It's pretty vibes based--but the vibality could be improved