Wikipedia¹ has a more interesting answer to "Why?" —
After World War II, the FCC moved FM to the frequencies between 88
and 108 MHz on June 27, 1945. The change in frequency was said to
be for avoiding possible interference problems between stations in
nearby cities and to make "room" for more FM radio channels. However,
the FCC was influenced by RCA chairman David Sarnoff, who had the
covert goal of disrupting the successful FM network that Edwin
Armstrong had established on the old band.
RCA went after Armstrong. They knew that by convincing the FCC to change the frequencies they could seriously disrupt FM and make existing equipment useless. RCA broke him financially and mentally with multiple lawsuits and years of "discovery"; Armstrong later committed suicide, and his wife went on to win all of the ongoing lawsuits. Armstrong is the father of frequency modulation (FM), and some argue the father of modern radio.
It's the typical "write a lot, but don't really answer anything" style that so many politicians utilize, only substituting "talk" for "write".
The entire section could have been summed up as:
Because each broadcast band is identified by the frequency of the center of the band, the bands begin on an even number, and the bands are two "units" wide, making the center frequency an odd number.
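That summary is just arithmetic. A quick sketch (using the 88.0 MHz band start and 200 kHz channel width discussed in this thread) shows why every center frequency lands on an odd tenth:

```python
# Sketch of the arithmetic above: channels start on even tenths of a MHz,
# are 0.2 MHz wide, so their centers fall on odd tenths.
BAND_START_MHZ = 88.0      # lower edge of the US FM broadcast band
CHANNEL_WIDTH_MHZ = 0.2    # each channel is 200 kHz wide

def center_frequency(channel_index: int) -> float:
    """Center of the nth channel (0-based), counting up from 88.0 MHz."""
    lower_edge = BAND_START_MHZ + channel_index * CHANNEL_WIDTH_MHZ
    return round(lower_edge + CHANNEL_WIDTH_MHZ / 2, 1)

print([center_frequency(i) for i in range(3)])  # [88.1, 88.3, 88.5]
```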
Yes, that's exactly why, because the FCC said to do it that way. There are countless examples of "why is it this way" being answered "because that is what some arbitrary person picked."
My favorite example of this I remember reading in Andrew Tanenbaum's networking book: the first frequency-hopping protocols used 88 frequencies because that is how many keys there are on a piano.
Articles with headlines like "Why X was chosen" are expected to actually provide explanations of human motivation, rather than the nearest proximate cause. For example, an article called "Why this stretch of highway has a speed limit of 30 mph" would be expected to say something like "because there is a history of accidents there due to the narrow shoulder and tight turns" rather than "because the transportation department, which is in charge of setting speed limits, posted 30 mph signs there."
The most interesting part of that is the WHO of spread spectrum... Hedy Lamarr (movie star) and George Antheil (composer). Very fascinating that two major Hollywood players were able to innovate in hard science/math. A different era.
I think the piano keys stuff was because the idea was semi-based on a player piano (if I recall the legend from my DSP prof). The roll moves along and different bumps make the notes sound - a similar concept (visually at least) to coding a communications signal and having it jump all over frequency. 88 notes -> 88 frequencies. I wouldn't be surprised if the experimental equipment was a modified player piano. Very cool stuff, though I may be recollecting the story wrong. Wikipedia agrees about the movie star + composer bit at least.
Another example is the Fahrenheit scale. The eponymous originator calibrated his thermometer by the temperatures where he could obtain brine (0 F) and an inaccurate estimate of human body temperature (100 F). https://en.wikipedia.org/wiki/Fahrenheit
It is still nice because it is human centric, roughly 0 to 100 for temperatures is easily relatable to the human body. 90 is hot, 10 is cold. Celsius starting at the freezing point of water is not as human centric.
"Human centric" is a false advantage. And it's not even human centric: the 100F body temperature is wrong, and what does brine have to do with the human body, again?
Gah. Can we please stop with this bullshit already? Fahrenheit has nothing to do with body temperature. The human body temperature is ~100F by coincidence, not by design.
The Fahrenheit scale has 180 divisions (nice round number, evenly divisible by loads of integers) between the freezing point of water and the boiling point of water (212F-32F=180, for the math challenged). The 0F point is the equilibrium point of a frigorific mixture of equal parts ammonium chloride, ice, and water. One can easily produce two reference points for the F scale in a way that isn't sensitive to the local pressure using just ice brine (0F) and ice water (32F). The melting temperature of ice and the equilibrium temperature of the brine mixture are constant on the pressure scales and temperature scales available to metrologists before about the 19th century[1][2].
The same can't be said for the Celsius scale, which requires either a triple-point cell or a known atmospheric pressure (so that the boiling temperature of water is well-known).
Edit: So to close out the point: Fahrenheit is a unit that is much better matched to the calibration tools and computational methods available around the time it was invented.
[1] Maybe a better way to state this sentence is that the temperatures are constant "to within the experimental uncertainty achievable at the time."
[2] It's well known to anyone who has ever cooked that water boils at an appreciably different temperature in Denver than in Los Angeles.
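Incidentally, the 180 divisions mentioned above are exactly where the familiar 9/5 and 5/9 conversion factors come from. A trivial sketch:

```python
# The 180 F-degrees between freezing (32 F) and boiling (212 F) map onto
# 100 C-degrees, which is where the 9/5 and 5/9 factors come from.
def f_to_c(f: float) -> float:
    return (f - 32) * 100 / 180   # 100/180 simplifies to 5/9

def c_to_f(c: float) -> float:
    return c * 180 / 100 + 32     # 180/100 simplifies to 9/5

print(f_to_c(212))  # 100.0 (boiling point of water)
print(c_to_f(0))    # 32.0  (melting point of ice)
```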
0 is uncomfortably cold. 100 is uncomfortably hot. The temperatures that non-STEM people (“humans”) deal with on a day-to-day basis can be represented with satisfactory precision using two digits (OK, three in Australia), no negative numbers, no decimals. It’s a great system for casual temperatures.
Whereas “reasonable” temperatures in C run from what, -18ish to 38ish? Where’s the sense in that?
If you want to defend a unit of temperature, defend Kelvin. Don’t pretend that °C is significantly more reasonable than °F. They’re both bonkers in their own way (mysticism about the vitality of water is not a real justification).
If (more likely) you just want to make fun of Americans for being rubes, use distances or weights or volumes as your example.
There are a lot of humans that live places where we regularly see negative temperatures. Where I live the temperature range is roughly -40 to 85 °F.
Choosing the freezing point of water isn't really mystic. It lets you know whether it will rain or snow. It also means that any negative temperature is capable of causing frostbite.
Not really, since it depends upon temperature at the clouds, not at the ground level. Oh, and it also depends on atmosphere pressure, and on time (since a phase change requires latent heat transfer in addition to merely being at the right temperature).
And as baddox mentions, 32 is hardly harder to memorize than 0, especially if you know anything at all about computers.
Humans have developed clothing, which allows us to stay alive in cold weather. There’s very little we can do (short of going into a building with AC) to survive hot weather — 100 is tolerable (just), but heat gets deadly very quickly when you go above it. You don’t want to be outdoors for extended stretches in either condition (the analogy is imperfect; realistically my experience of 10F is subjectively similar to 100F, and 0F is more like 110F).
How does that change the fact that 0F is more than "uncomfortably cold" for most people?
I'm not disputing the 0F to 100F (or -18C to 40C) range in terms of being "regularly seen".
EDIT: I suppose the core of the misunderstanding is that I was addressing the "human centric" advantage and relation to brine, where you were just arguing (quite reasonably) the convenient representation of the range?
It doesn’t, but I’m being slightly approximate. Keep in mind that perception of (and physical effects of) temperature is hugely nonlinear. 0F is more dangerous than 100F, but when you go outside of that range, heat becomes far more deadly very quickly. I shovel my driveway in -20F, but I would never consider doing that sort of outdoor labor at 120F. Many people live in regions where -40F occurs from time to time, but if it's 140F, you will die (actually, I think the highest recorded surface temperature is 13xF).
To your edit: yes, I think that’s the real thing. I certainly don’t think the brine thing is reasonable, just that 0-100F is a nice range.
I agree that the smaller degrees, and lack of fractions, is more convenient for casual usage.
However, I'd argue that being water-based is hardly mystical: if water is boiling, or frozen, survival is more difficult.
It's also a pragmatic system for any nation that regularly drops below freezing. (Eg, could it snow today? Could there be ice to slide on?)
Yeah, you could argue that if it was an accurate 100F for the human body, but you'd still want the bottom level to be something useful - freezing point of water is as good a zero as anything (well, short of absolute zero).
My bigger question is why Metric stuck with Celsius and Kelvin instead of factoring the Gas Constant (R) into Kelvin so the math would be completely constant-free.
Of course, then you'd have water freezing at 2271 and boiling at 3102, which wouldn't be fun in conversation.
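For anyone who wants to check those numbers, multiplying the Kelvin temperatures by an approximate value of R does land right at 2271 and 3102:

```python
# Checking the parent comment's numbers: multiplying Kelvin by the gas
# constant R (approx. 8.314 J/(mol*K)) folds R into the temperature, so
# PV = nRT becomes PV = nT' with T' measured in J/mol.
R = 8.314  # J/(mol*K), approximate

freezing_K = 273.15
boiling_K = 373.15

print(round(freezing_K * R))  # 2271 -- water freezes
print(round(boiling_K * R))   # 3102 -- water boils
```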
> My bigger question is why Metric stuck with Celsius and Kelvin instead of factoring the Gas Constant (R) into Kelvin so the math would be completely constant-free
That's a really good question. Another possibility would be to fix the calories/joules mismatch in heating water
Fahrenheit is not intuitive, you just think so because you grew up with it. People who grew up with Celsius have no problem relating to how hot or cold 30 C, 100 C, or -20 C are.
Celsius is no more intuitive for everyday usage. Of all the scientific unit debates, the temperature one is by far the silliest. There are very few temperatures that people need to memorize, and they're trivial to memorize in either temperature scale.
This I disagree with. Born and raised in America, I still cannot remember what temperature water boils at in Fahrenheit, I just remember it is something over 200 degrees.
I never said intuitive, I said human centric. Everyone can relate to a number, but Celsius is not as human centric as Fahrenheit, whose 0 to 100 range maps more closely onto a human scale.
As they say, "The center frequency is located at 1/2 the bandwidth of the FM Channel, or 100 kHz (0.1 MHz) up from the lower end of the channel."
It's natural to pick a range between round numbers, so 88.0 MHz - 108.0 MHz doesn't seem so strange a range to be allocated. Now if one band starts at 88.0 MHz and needs 200 kHz of bandwidth, it's going to go from 88.0 MHz to 88.2 MHz. The carrier frequency will be in the middle of this band, so it's represented as 88.1 MHz.
>Well, it is a rather arbitrary reason but a valid explanation.
It's not exactly arbitrary, it was inspired by a player piano, and the original patent used piano rolls. Using all of the keys just makes sense in that context.
I had an Aiwa stereo/receiver several years ago that could tune the FM radio in 50kHz increments instead of 200kHz. Most stations were weaker 50kHz off, but a few were stronger. More interestingly, there were one or two stations I could only get on the even decimals, usually between two empty odds. Now I wonder if someone's transmitter was configured incorrectly, or whether it was an illegal broadcast.
This only happens in the US, probably. The fun fact is that some car manufacturers believe this is global, so they only allow odd frequencies on the car radio.
We had a radio in our old Chrysler Voyager that required a pretty insane secret code being entered (turning it off and on again 5 times in a row or something like that was part of it) in order to enable to tune to regular EU frequencies...
My 2013 VW Golf has a similarly mad method to move the windscreen wipers to a position that enables the window to be cleaned. I forget the trick - I think it's turn the car off, then on, then move the wiper stalk to one of its four or five positions. Or vice versa. I really can't remember.
Edit: switch ignition on, then off. Briefly press the down the windscreen wiper lever.
Oh, that servicing position. That's because you can't lift the wipers from their resting position, since the bonnet is too close to the windscreen. You have to turn on the ignition, turn it off, and then hold the wiper stalk in the lowest position for 2 seconds. Same goes for the 2013 Skoda Octavia. Had to learn that yesterday :)
I recently watched an episode of Top Gear where Clarkson was driving a Ford Mustang and used the odd-number ending frequencies on the radio as a part of his never-ending "American cars are weird" joke.
What's actually more interesting to me is that FM has assigned channel numbers. I don't think I've ever seen a consumer radio that used channels instead of the raw frequency. I never knew they existed until I read that.
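For reference, if I recall the FCC numbering correctly (47 CFR 73.201), channel 200 sits at 87.9 MHz and each channel steps up by 200 kHz, so the mapping is a one-liner:

```python
# FCC FM channel numbers run 200 through 300 (if I have the rules right):
# channel 200 is 87.9 MHz, and each successive channel is 200 kHz higher.
def channel_to_mhz(channel: int) -> float:
    if not 200 <= channel <= 300:
        raise ValueError("FM broadcast channels run from 200 to 300")
    return round(87.9 + 0.2 * (channel - 200), 1)

print(channel_to_mhz(200))  # 87.9
print(channel_to_mhz(300))  # 107.9
```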
Yep ... the (unmodulated) carrier frequency is in the middle of the channel (as it is with most AM radio stations). What's more interesting is to look at the old analog television broadcast channels. They were transmitted using VSB AM (Vestigial Side-Band Amplitude Modulation), which effectively meant that the luminance information (brightness) was transmitted using both side-bands (regular AM) but the chrominance information (color) was transmitted using SSB (Single Side-Band).
Before the North American conversion to digital-only TV broadcasts, the audio for all analog channel 6 television stations in the United States could be picked up on 87.7 FM.
Some channel 6 stations have expressed interest in, or have already implemented, broadcasting their audio on FM radio again after moving solely to digital broadcasts.
Never noticed that it's like that here in the US. My radio station growing up was SDR3 on 92.2, and channels are spaced 0.1 MHz apart -- apparently the channel assignment in Germany took a different path.