In Ali Baba and the 40 Thieves, the number of thieves wasn't necessarily 40. The number was likely chosen because 40 was an exaggeration, much like when we say "I've told you a hundred million times". So 40 as shorthand for "a huge amount" seems fitting in Celsius.
I like the saying "Fahrenheit is what you feel, Celsius is what water feels, and Kelvin is what the universe feels".
Fahrenheit is what Americans feel, Celsius is what everyone else feels, and Kelvin is just Celsius +273.
Good point, but to us Celsius fans, or "Celsilovers", anything over one hundred sounds like the apocalypse.
Which is the closest thing to a legitimate criticism of Celsius that exists. The entire top half of the scale (everything over ~50°, that is) is pretty much useless as far as judging the weather is concerned.
I am being forced to learn celsius by my non American friends. Call me an incelsius.
I present the temperature scale that I made up: the Human Scale (H°).
I thought about the Fahrenheit vs Celsius debate, and I think both have practical uses; however, combined they could make a very practical scale.
Fahrenheit: while my American sensibilities agree that 100° is a good marker for what % of my patience is used up to cut a bitch, I think a similar landmark would be the average human body temperature. For this reason, 100°H = 98.6°F. It's not a perfect match, but it can still give us the satisfaction of "IT'S 100°!?" while having practical implications for medical use: "your body temperature is 102°, 2° warmer than average".
Celsius: I think this scale makes a ton of sense for colder temperatures. When the thermometer reads 0°, that's when you can expect snow. For this reason, 0°H = 0°C.
The conversion rates are:
H = (F - 32) × 1.5
H = C × 2.7
More precisely:
H = (F - 32) × 1.501501501...
H = C × 2.7027027027...
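If anyone wants to play with it, here's a minimal sketch of the two conversions in Python. The function names are just mine; the factors fall straight out of the two anchor points:

```python
# A quick sketch of the proposed Human Scale (H°), built from the two
# anchors above: 0°H = 0°C (32°F) and 100°H = 98.6°F (37°C).

def fahrenheit_to_human(f: float) -> float:
    """Convert °F to °H: 32°F -> 0°H, 98.6°F -> 100°H."""
    return (f - 32) * 100 / 66.6  # 100 / (98.6 - 32) ≈ 1.5015...

def celsius_to_human(c: float) -> float:
    """Convert °C to °H: 0°C -> 0°H, 37°C -> 100°H."""
    return c * 100 / 37  # ≈ 2.7027...

print(fahrenheit_to_human(98.6))  # 100.0 (average body temp)
print(celsius_to_human(0))        # 0.0   (water freezes, expect snow)
print(celsius_to_human(20))       # ~54.1 (a pleasant spring day)
```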
While using the freezing point of water and the average human body temperature seems like an inconsistent and arbitrary pair of benchmarks, my goal is less about consistency and more about practicality for everyday use.
Now watch this scale grow as big as Esperanto.
the problem is that the average body temperature is slowly decreasing, so this isn't that well defined; we would need to link it to something that stays at a constant temperature
also, the Celsius scale isn't that good imo, because it's anchored to the freezing and boiling points of water at atmospheric pressure, so it isn't universal
I say we set the Boltzmann constant to a known value and define temperatures from there
after that we find a range of temperatures with useful round values and offset the scale for everyday use
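Fun fact: this is essentially what the 2019 SI redefinition did; the kelvin is now defined by fixing the Boltzmann constant at exactly 1.380649×10⁻²³ J/K. A toy sketch of "temperature from energy" under that definition (the function and variable names are just mine):

```python
# The kelvin is defined (since 2019) by fixing Boltzmann's constant:
K_B = 1.380649e-23    # J/K, exact by definition
EV = 1.602176634e-19  # J per electronvolt, also exact since 2019

def temperature_from_energy(e_joules: float) -> float:
    """Temperature (K) at which the characteristic thermal energy kT equals e."""
    return e_joules / K_B

# Room temperature corresponds to a thermal energy kT of about 25 meV:
print(temperature_from_energy(0.025 * EV))  # ≈ 290 K
```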
So I had to look up the Boltzmann constant and... That's a lot of math.
I think you have a point on the decreasing human temperature. It looks like the decrease is about 0.05°F every decade, which actually is quite a bit. If it were something like 0.005°F, I'd say that's a problem for the people of the year 2500 to solve.
That said, the reason it's been decreasing seems to be medical advances and not some change in the Earth's gravity or climate change. I would be surprised to see humans around the year 7,100 with an average body temperature of 72.9°F, or closing in on 0°F around the year 21,700, which is what a straight-line extrapolation of 0.05°F a decade gives you. I imagine there will be fluctuations, but there's got to be a lower limit to what is physically possible.
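For the curious, the straight-line extrapolation (which obviously won't hold) works out like this; the start year and starting temperature are just round assumptions:

```python
# Toy linear extrapolation of the decline mentioned above:
# 0.05°F per decade, starting from 98.6°F around the year 2000.
RATE_PER_YEAR = 0.05 / 10  # °F per year

def year_when(temp_f: float, start_year: int = 2000, start_temp_f: float = 98.6) -> int:
    """Year the average body temperature would hit temp_f if the decline stayed linear."""
    return start_year + round((start_temp_f - temp_f) / RATE_PER_YEAR)

print(year_when(72.9))  # ~7140
print(year_when(0.0))   # ~21720
```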
I'd still defend the Celsius number, since even though there are changes due to air pressure, it's changing over space and not time. In the year 2500, water at sea level will still freeze at 0°C.
I think my big thing is that I'm less concerned with a logically consistent scale and more with one geared to the emotional side of temperature.
Thinking out loud moment
If we are going for the emotional side of temperature specifically, we would also need to factor in wind, humidity, sunlight, what season it is, etc. That's a lot of variables, and even then, that's how you get the wind-chill factor. But even that is almost completely subjective. I feel like that scale would go from "IT'S GOTTA BE NEGATIVE A MILLION FUCKIN' DEGREES" to "I FEEL LIKE I'M ON THE SURFACE OF THE SUN, so like a bazillion degrees" and then we go to the traffic report.
Either way, it's not a perfect scale, but I'd still take that over the other two.
But really it is much better for human temperatures.
It's just intuitive: 0°F is 100% cold, and 100°F is 100% hot.
When the dry bulb gets above 100°F, wind only cools you down via sweat evaporation, and when the wet bulb gets above 100°F, even that can't cool you down; you will die if you don't get to a cooler or drier environment.
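For anyone who wants to check their local forecast against that, there's a well-known empirical formula (Stull, 2011) that approximates wet-bulb temperature from dry-bulb temperature and relative humidity. A rough sketch, with my own function name:

```python
import math

def wet_bulb_c(temp_c: float, rh_percent: float) -> float:
    """Stull's (2011) wet-bulb approximation (°C), valid for roughly
    5-99% relative humidity and -20 to 50 °C dry-bulb."""
    t, rh = temp_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh) - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# A 100°F (37.8°C) day at 80% humidity is already near the lethal line:
print(wet_bulb_c(37.8, 80) * 9 / 5 + 32)  # ≈ 94°F wet bulb
```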
I love it when it's -10% hot in winter nights or 110% hot around the equator. Makes perfect sense.
Yes, it does a better job of impressing that it's all of the hot (or cold), and then 10% more, than the difference between 38 and 43 does.
i assure you, we who grew up with celsius absolutely know the dire difference between 38 and 43. 38 is death, 43 is the crimson realms where even souls wither.
all this "which one is better for x" is nonsense, you develop a feel for whichever you grew up with. it's just that the math is less stupid with metric. that's all.
When I was out in SD recently, the temperature was reaching 100°F or above frequently, and it sucked, but it wasn't that bad. Where I live in Cali, when it gets that hot by the beach with humidity well into the 70% range, I've literally felt like I was about to die just sitting inside with a fan blowing right at me. Humidity is such a huge factor.
I'm gonna be honest. I love Celsius for the whole perfect-math reasons, with calories and water-based measurements...
But the spread of temps is a pain: all the nice temperatures require a decimal place to decide just how slightly above or below pleasant it is, while cold is basically everything from 16°C down to -30°C. And then decimals really matter again at hotter-than-pleasant temps.
Whole integers just mean vastly different things depending on how high or low you are in Celsius.
I don't know man, I've lived my entire life in a country that only uses Celsius, and I've never seen a single place or person using decimals to display temperature; we always use whole numbers.
I get your point, but the difference of 1 degree Celsius is still so insignificant that we don't really need decimals at all.
I've been all over the world. Trust me, seeing 21.6 or other decimals is not uncommon; you and others are really just pushing hard on the idea that Celsius has no flaws and no quirks.
I literally just set an air conditioner to 20.5°C. I don't get why you'd lie like this.
Huh, I can only speak from my experience. I have a couple of thermometers in my room that give decimals, but my air con doesn't give decimal options and the government meteorological service doesn't either. I certainly don't think I can tell the difference between 39 and 39.2.
That's a really weird one; in every apartment I've lived in, the air conditioner only displays the temperature in integers, and I'm 100% sure of that because every single one of them had arrows to change the temperature up or down in one-unit steps.
Meh, I'm about 50/50, 4 air conditioners in. Half degrees have not been all that uncommon, and it's up and down arrows to adjust it. I don't get the handwaving of legitimate points of contention to make Celsius seem more perfect. Everything has its flaws. It's completely fine to admit that.
That's not the point. I haven't said Celsius is perfect, not a single time here. I'm just calling your BS because you said decimals matter for us, which is not true; no one who lives in a country that uses Celsius knows the difference a decimal makes. I honestly think we only really feel a difference at like 2 degrees of variation, far from decimals.
Decimals don't matter for you, and you are pushing that onto everyone as a way to delegitimize my point.
I use Celsius and they matter to me. I can and do notice a difference between setting my thermostat to 21.5 vs 22 vs 22.5. You don't? Whatever.
I said I was using Celsius, but because I had a complaint, you decided I was outside of your accepted user group and my statement was BS.
It's bullshit.
Yeah, your point is BS, because you really don't need decimals for most things. Good for you that you can notice a decimal's worth of difference, but that's not a normal thing: most weather forecasts only give the integer, most air conditioners (all, as far as I've seen) show the temperature in integers, and if you talk to someone else about the weather, we also talk in integers.
YOU should stop pushing the idea that decimals are important onto everyone else as if it were true for everyone, because it's not.
ITT: Europeans tie their personal identity to an arbitrary scale for expressing mean molecular kinetic energy.
Fahrenheit is such a nice system. 0 is really, really cold and 100 is really, really hot. So 50 must just be perfect, right?
Way more intuitive than Celsius.
Celsius isn't all that different.
-30 is really really cold, 30 is really really hot.
0 is just about perfect.
More like 0 is really cold and 40 is really hot, so 20 must be perfect, which it is.
Eh I can barely breathe at 30. 40 is certain death. Except in a sauna, where 100 is no problem and we throw water on the rocks to make it feel hotter.
I know, it's weird. I've got pictures of me lying around in a pile of snow in a t-shirt, trying to cool down. At -15.
Might be a location thing; where I live, temperatures over 30 are the norm (humid too, shit sucks). 40° days are rare but not unheard of either. Meanwhile, my only experience with anything lower than 15 is the fridge.
Planck Temperature Units, everything else is a corollary fantasy