The reason 6Ghz was introduced with WiFi 6E and 7 was that 2.4Ghz and 5Ghz were very busy.
My question is why isn’t there anything in between? Why isn’t there a 3Ghz, 3.5Ghz, 4Ghz, etc?
Also, what if things that require very little data transmission used something lower than 2.4Ghz for longer range? (1Ghz or something?)
It's regulation. You are not allowed to just use any frequency band. 2.4 GHz is free to use for whatever, so a lot of stuff uses it. 5 GHz overlaps with radar bands, so the chips have to listen for radar and vacate the channel when they detect it.
6 GHz is even more fun. Devices must operate at low power unless a GPS receiver is attached to confirm they're not interfering with someone that has a license to operate in that location.
ah that makes sense.
2.4 GHz and 5 GHz are both “ISM bands.” These are frequencies that regulators have set aside for unlicensed use.
Fun fact: 2.4 GHz is free to use because of microwave ovens. Microwaves are really noisy around 2.45 GHz. Rather than try to regulate their radio emissions, or make people license their kitchen appliances as radio transmitters, the FCC allocated that patch of spectrum for free use. Any device that can tolerate the noise can use that bit of the radio spectrum.
Ah so the stuff in between that has things like radio stations?
I’m not sure what exactly you mean by “in between” here (between 2.4 and 5 GHz?), but commercial radio in the US is at much lower frequencies than wifi bands. AM radio is typically measured in kHz (the range is something like 600-1400, I can’t be bothered to look it up), while FM radio is in MHz (around 87-108 MHz).
Not like commercial AM/FM radio stations playing music, but radio in the more general sense. 5G cell phones and satellite-to-earth communication systems use that frequency range, for example.
makes sense, using WiFi with those frequencies would make it noisy and clogged up, esp. in crowded cities
Some parts of the spectrum in between 2.4 and 5 GHz are used by the mobile communication standards LTE and 5G (also higher frequencies). GSM, DVB-T, and FM radio operate at lower frequencies.
ah that makes sense
But the key takeaway there is that the portions of the spectrum where the mobile bands sit are restricted to those uses. 2.4 and 5 are “free bands” that any device can use (assuming compliance with FCC Part 15 in the US). If I built a device and operated it at, say, 4.3 GHz, I’d get in big trouble once found.
Indeed, the regulatory authority, e.g. the Bundesnetzagentur in Germany, will knock on your door and inspect your radio and wifi devices. That’s how, some years ago, they found a malfunctioning radio alarm clock in my area.
When we talk about 2.4, 5, or 6 GHz the devices don’t operate at exactly that frequency, but within a band more or less on that number. For example 5 GHz is actually a set of channels between 5150 and 5895 MHz.
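To make that concrete, here's a rough Python sketch of the usual channel-number-to-center-frequency formulas. The exact set of channels you're allowed to use varies by country and standard revision, so treat this as illustrative, not a regulatory table:

```python
# Rough sketch: Wi-Fi channel numbers map to center frequencies with a
# simple linear formula per band (per the usual IEEE 802.11 numbering).
def channel_center_mhz(band_ghz: float, channel: int) -> int:
    """Return the nominal channel center frequency in MHz."""
    if band_ghz == 2.4:
        # Channels 1-13 step by 5 MHz from 2412 MHz; channel 14 (Japan) is special.
        return 2484 if channel == 14 else 2407 + 5 * channel
    if band_ghz == 5:
        # e.g. channel 36 -> 5180 MHz, channel 165 -> 5825 MHz
        return 5000 + 5 * channel
    if band_ghz == 6:
        # 6 GHz (Wi-Fi 6E/7) channels are offset from 5950 MHz
        return 5950 + 5 * channel
    raise ValueError("unknown band")

for band, ch in [(2.4, 1), (2.4, 11), (5, 36), (5, 165), (6, 37)]:
    print(f"{band} GHz band, channel {ch}: {channel_center_mhz(band, ch)} MHz")
```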
Why isn’t there a 3Ghz, 3.5Ghz, 4Ghz, etc?
Technically there’s 802.11y (3.65 GHz), 802.11j (4.9-5.0 GHz), etc. It’s just that several of these bands cannot be used universally across the globe, because they may be reserved for other purposes. By and large, the bands that end up being used are the ones that don’t require licensing to operate.
Reference: https://en.wikipedia.org/wiki/List_of_WLAN_channels
IIRC Ubiquiti make a line of point-to-point ethernet bridges that operate in the 20 GHz range (because more bandwidth, and if you have line of sight you don’t care about interference as much). Responsible vendors won’t even sell you one without sighting a license, cos they can also get in trouble for selling it to you if it turns out you are operating it illegally.
The difference there though is that those devices are point to point, not broadcast/receive. Iirc there are different rules in place for direct line of sight devices, and misuse will land you a meeting with the FCC, hence why the more powerful variants require a license.
I remember that post from slazer2au… https://lemmy.world/post/19338754
Great answer getting to the point of the question.
that makes sense. regulations and such over radio waves
Also, what if things that require very little data transmission used something lower than 2.4Ghz for longer range? (1Ghz or something?)
No one seemed to touch upon this part, so I’ll chime in. The range and throughput of a transmission depends on a lot of factors, but the most prominent are: peak and avg output power, modulation (the pattern of radio waves sent) and frequency, background noise, and bandwidth (in Hz; how much spectrum width the transmission will occupy), in no particular order.
If all else were equal, changing the frequency to a lower band wouldn’t impact range or throughput. But that’s hardly ever the case, since reducing the frequency imposes limitations on the usable modulations, which means trying to send the same payload either takes longer or uses more spectral bandwidth. Those two approaches have the side-effect that slower transmissions are more easily recovered from farther away, and using more bandwidth means partial interference from noise has a lesser impact, as well as a lower risk of interception. So in practice, a lower frequency could improve range, but the other factors would have to take up the slack to keep the same throughput.
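To put rough numbers on that trade-off, here's a small Python sketch using the standard thermal-noise-floor formula. The noise figure and required SNR below are made-up placeholder values, not figures for any particular radio:

```python
import math

def sensitivity_dbm(bandwidth_hz: float, noise_figure_db: float, required_snr_db: float) -> float:
    """Rough receiver sensitivity estimate:
    thermal noise floor (-174 dBm/Hz at room temperature) scaled by bandwidth,
    plus receiver noise figure, plus the SNR the modulation needs to decode."""
    noise_floor_dbm = -174 + 10 * math.log10(bandwidth_hz)
    return noise_floor_dbm + noise_figure_db + required_snr_db

# Hypothetical comparison: same payload, but sent over a wide, fast channel
# vs. a narrow, slow one (noise figure 6 dB, required SNR 10 dB in both cases).
wide   = sensitivity_dbm(20e6,  6, 10)   # 20 MHz channel, Wi-Fi-ish
narrow = sensitivity_dbm(125e3, 6, 10)   # 125 kHz channel, LoRa-ish

print(f"20 MHz channel sensitivity:  {wide:.1f} dBm")
print(f"125 kHz channel sensitivity: {narrow:.1f} dBm")
print(f"The narrow channel can hear a signal ~{wide - narrow:.0f} dB weaker.")
```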
Indeed, actual radio systems manipulate some or all of those factors when longer distance reception is the goal. Some systems are clever with their modulation, such as FT8 used by amateur radio operators, in order to use low-power transmitters in noisy radio bands. On the flip side, sometimes raw power can overcome all obstacles. Or maybe just send very infrequent, impeccably narrow messages, using an atomic clock for frequency accuracy.
To answer the question concretely though, there are LoRa devices which prefer to use the ISM band centered on 915 MHz in The Americas, as the objective is indeed long range (a few hundred km) and small payload (maybe <100 Bytes), and that means the comparatively wider (and noisier) 2.4 GHz band is unneeded and unwanted. But this is just one example, and LoRa has many implementations that change the base parameters. Like how MeshCore and Meshtastic might use the same physical radios but the former implements actual mesh routing, while the latter floods to all nodes (a bad thing).
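For a feel of just how slow that gets, here's a sketch using the commonly cited LoRa raw bit-rate formula. The spreading factor, bandwidth, and coding rate below are typical example values, not any particular MeshCore/Meshtastic preset:

```python
def lora_bitrate_bps(spreading_factor: int, bandwidth_hz: float, coding_rate_denom: int = 5) -> float:
    """Commonly cited LoRa raw bit rate:
    SF bits per symbol, 2**SF chips per symbol spread over the bandwidth,
    scaled by the forward-error-correction coding rate 4/(4+CR)."""
    symbols_per_sec = bandwidth_hz / (2 ** spreading_factor)
    coding_rate = 4 / coding_rate_denom
    return spreading_factor * symbols_per_sec * coding_rate

# Typical long-range settings vs. a "fast" LoRa setting
print(f"SF12 @ 125 kHz: {lora_bitrate_bps(12, 125e3):6.0f} bit/s")  # ~290 bit/s
print(f"SF7  @ 250 kHz: {lora_bitrate_bps(7,  250e3):6.0f} bit/s")  # ~10.9 kbit/s
```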
But some systems like WiFi or GSM can be tuned for longer range while still using their customary frequencies, by turning those other aforementioned knobs. Custom networks could indeed be dedicated to only sending very small amounts of data, like for telemetry (see SCADA). That said, GSM does have a hard cap of 35 km, for reasons having to do with how it handles multiple devices at once.
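That ~35 km figure falls out of GSM's 6-bit timing advance field (values 0-63), where each step is one bit period of round-trip delay. Quick sketch of the arithmetic in Python:

```python
# GSM timing advance: 6-bit field (0-63), each step is one bit period
# of round-trip propagation delay, which bounds how far a handset can be.
SPEED_OF_LIGHT = 299_792_458          # m/s
GSM_BIT_PERIOD = 48 / 13 * 1e-6       # seconds (~3.69 us)

one_way_per_step = SPEED_OF_LIGHT * GSM_BIT_PERIOD / 2   # ~553 m per step
max_range_km = 63 * one_way_per_step / 1000

print(f"Distance per timing-advance step: {one_way_per_step:.0f} m")
print(f"Max standard GSM cell radius:     {max_range_km:.1f} km")  # ~34.9 km
```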
Radio engineering, like all other disciplines of engineering, centers upon balancing competing requirements and limitations in elegant ways. Radio range is the product of intensely optimizing all factors for the desired objective.
i’d also note that antennas, amplifiers and so on have bandwidth that is some % of carrier frequency, depending on design, so just going up in frequency makes bandwidth bigger. getting higher % of bandwidth requires more sophisticated, more expensive, heavier designs. LoRa is much slower, caused by narrowed bandwidth but also because it’s more noise-resistant
have bandwidth that is some % of carrier frequency,
In my limited ham radio experience, I’ve not seen any antennas nor amplifiers which specify their bandwidth as a percentage of “carrier frequency”, and I think that term wouldn’t make any sense for antennas and (analog) amplifiers, since the carrier is a property of the modulation; an antenna doesn’t care about modulation, which is why “HDTV antennas” circa 2000s in the USA were merely a marketing term.
The only antennas and amplifiers I’ve seen have given their bandwidth as fixed ranges, often accompanied with a plot of the varying gain/output across that range.
going up in frequency makes bandwidth bigger
Yes, but also no. If a 200 kHz FM commercial radio station’s signal were shifted from its customary 88-108 MHz band up to the Terahertz range of the electromagnetic spectrum (where infrared and visible light are), the bandwidth would still remain 200 kHz. Indeed, this shifting is actually done, albeit for cable television, where those signals are modulated onto fibre optic cables.
What is definitely true is that way up in the electromagnetic spectrum, there is simply more Hertz to utilize. If we include all radio/microwave bands, that would be the approximate frequencies from 30 kHz to 300 GHz. So basically 300 GHz of bandwidth. But for C band fibre optic cable, the usable band is 1530-1565 nm, which translates to roughly 191.6-195.9 THz, with about 4.4 THz of bandwidth. That’s nearly fifteen times larger! So much room for activities!
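If you want to check the conversion yourself, it's just c = f × λ. The arithmetic in Python:

```python
# Convert the C-band fibre window (in nm) to frequency and compare its
# bandwidth against the entire radio/microwave spectrum (30 kHz - 300 GHz).
C = 299_792_458  # speed of light, m/s

f_high = C / 1530e-9   # shorter wavelength -> higher frequency
f_low  = C / 1565e-9

fibre_bw_thz = (f_high - f_low) / 1e12
radio_bw_thz = (300e9 - 30e3) / 1e12

print(f"C band: {f_low/1e12:.1f}-{f_high/1e12:.1f} THz "
      f"({fibre_bw_thz:.1f} THz of bandwidth)")
print(f"All of radio/microwave: ~{radio_bw_thz:.1f} THz of bandwidth")
print(f"Ratio: ~{fibre_bw_thz / radio_bw_thz:.0f}x")
```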
For less industrial use-cases, we can look to 60 GHz technology, which is used for so-called “Wireless HDMI” devices, because the 7 GHz bandwidth of the 60 GHz band enables huge data rates.
To actually compare the modulation of different technologies irrespective of their radio band, we often look to spectral efficiency, which is how much data (bits/sec) can be sent over a given bandwidth (in Hz). Higher bits/sec/Hz means more efficient use of the radio waves, up to the Shannon-Hartley theoretical limit.
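As a toy example (the data rate, bandwidth, and SNR below are made up purely for illustration):

```python
import math

def spectral_efficiency(bits_per_sec: float, bandwidth_hz: float) -> float:
    """Spectral efficiency in bit/s/Hz: data rate divided by occupied bandwidth."""
    return bits_per_sec / bandwidth_hz

def shannon_limit_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + SNR), the hard upper bound."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical link: 150 Mbit/s squeezed into a 20 MHz channel at 30 dB SNR
rate_bps, bw_hz, snr_db = 150e6, 20e6, 30

print(f"Achieved efficiency: {spectral_efficiency(rate_bps, bw_hz):.1f} bit/s/Hz")
print(f"Shannon limit at {snr_db} dB SNR: "
      f"{shannon_limit_bps(bw_hz, snr_db) / bw_hz:.1f} bit/s/Hz "
      f"({shannon_limit_bps(bw_hz, snr_db) / 1e6:.0f} Mbit/s)")
```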
getting higher % of bandwidth requires more sophisticated, more expensive, heavier designs
Again, yes but also no. If a receiver need only receive a narrow band, then the most straightforward design is to shift the operating frequency down to something more manageable. This is the basis of superheterodyne FM radio receivers, from the era when a few MHz were considered to be very fast waves.
We can and do have examples of this design for higher microwave frequency operation, such as shifting broadcast satellite signals down to normal television bands, suitable for reusing conventional TV coax, which can only carry signals in the 0-2 GHz band at best.
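Here's a tiny numpy sketch of that mixing idea, with toy kHz-range numbers standing in for real RF:

```python
import numpy as np

# Minimal sketch of heterodyning: multiplying an incoming carrier by a local
# oscillator produces sum and difference frequencies; a low-pass filter then
# keeps only the shifted-down copy. Frequencies here are scaled-down toys.
fs = 1_000_000                      # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)      # 10 ms of samples

f_rf = 100_000                      # "incoming" carrier (toy stand-in for GHz)
f_lo = 90_000                       # local oscillator
rf = np.cos(2 * np.pi * f_rf * t)
lo = np.cos(2 * np.pi * f_lo * t)

mixed = rf * lo                     # contains f_rf - f_lo and f_rf + f_lo

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), 1 / fs)
peaks = freqs[spectrum > 0.25 * spectrum.max()]
print("Strong components near (Hz):", sorted({int(round(float(f), -3)) for f in peaks}))
# Expect 10000 (difference) and 190000 (sum); the IF stage keeps the low copy.
```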
The real challenge is when a massive chunk of bandwidth is of interest, then careful analog design is required. Well, maybe only for precision work. Software defined radio (SDR) is one realm that needs the analog firehose, since “tuning” into a specific band or transmission is done later in software. A cheap RTL-SDR can view a 2.4 MHz slice of bandwidth, which is suitable for plenty of things except broadcast TV, which needs 5-6 MHz.
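For instance, grabbing that 2.4 MHz slice with a cheap dongle looks roughly like this, using the third-party pyrtlsdr library (written from memory, so double-check the exact API against its docs):

```python
# Rough sketch of reading a 2.4 MHz slice of spectrum with an RTL-SDR dongle
# via the pyrtlsdr library; requires the dongle to be plugged in.
from rtlsdr import RtlSdr

sdr = RtlSdr()
sdr.sample_rate = 2.4e6        # the ~2.4 MHz slice mentioned above
sdr.center_freq = 100.1e6      # tune somewhere in the FM broadcast band
sdr.gain = 'auto'

samples = sdr.read_samples(256 * 1024)   # complex baseband (IQ) samples
sdr.close()

print(f"Captured {len(samples)} IQ samples covering ~2.4 MHz around 100.1 MHz")
```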
LoRa is much slower, caused by narrowed bandwidth but also because it’s more noise-resistant
I feel like this states the cause-and-effect in the wrong order. The designers of LoRa knew they wanted a narrow-band, low-symbol rate air interface, in order to be long range, and thus were prepared to trade away a faster throughput to achieve that objective. I won’t say that slowness is a “feature” of LoRa, but given the same objectives and the limitations that this universe imposes, no one has produced a competitor with blisteringly fast data rate. So slowness is simply expected under these circumstances; it’s not a “bug” that can be fixed.
In the final edit of my original comment, I added this:
Radio engineering, like all other disciplines of engineering, centers upon balancing competing requirements and limitations in elegant ways. Radio range is the product of intensely optimizing all factors for the desired objective.
sorry for being unclear, i forgot a word. what i meant is that certain antenna designs have a specific fractional bandwidth, so just scaling that design to a higher frequency makes the usable bandwidth wider in kHz terms. in order to get higher fractional bandwidth, more complex or bulkier designs would be required, like thicker conductors, added parasitics, something LPDA-shaped, or maybe an elaborate matching circuit, all of which cost money. i guess that while resonant amplifiers are a thing, probably the bigger limitation would be the bandwidth of the mixer
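to put numbers on it (the 10% figure below is purely illustrative, not any specific antenna design):

```python
def absolute_bandwidth_hz(center_freq_hz: float, fractional_bw: float) -> float:
    """If a design keeps the same fractional bandwidth (BW / f_center),
    its absolute bandwidth in Hz scales linearly with the center frequency."""
    return center_freq_hz * fractional_bw

# Hypothetical design with ~10% fractional bandwidth,
# simply scaled to different center frequencies.
for f_center in (915e6, 2.45e9, 5.5e9):
    bw = absolute_bandwidth_hz(f_center, 0.10)
    print(f"{f_center/1e9:4.2f} GHz center -> ~{bw/1e6:5.0f} MHz usable bandwidth")
```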
Radiowaves are not free real estate. Every country has their own laws on what frequencies you’re allowed to use for what.
2.4ghz frequencies are basically as unregulated as they can get in the US, so that’s why wifi used that for the longest time. I’m not sure what devices used 5ghz before, but they took that frequency for wifi. You have to fight for every mhz you can get in radio waves.
Here’s the wiki article talking a bit about this. I’ve never heard of like 3.6ghz wifi so that’s interesting. https://en.wikipedia.org/wiki/Wi-Fi#Operational_principles%3A~%3Atext=some+cases+severely.-%2CWaveband%2C-[edit]
shouldn’t the arrow be pointing down?
To be fair I was really lazy and just grabbed the first one I saw with “not” inserted into it.
Because yes you are correct.
Growing up I always learned (I think) the “insert missing text” symbol was shaped like the one in the pic, like a caret symbol.
The odd thing is I also remember the caret and inserted word being at the top like in OP’s image, but style guides I can find now show the caret at the bottom and the inserted text at the top.
https://books.byui.edu/fhgen_110_textbook_/chapter_16_deciphering_elements_of_handwritten_records#%3A~%3Atext=are+shown+here%3A-%2CInserted+Words%2Cadded+on+the+line+above (search for caret)
https://grammarist.com/editing/proofreading-editing-marks-symbols/
Wikipedia seems to indicate using a downward facing caret, or a caret with an extra upward arm https://en.m.wikipedia.org/wiki/Caret_(proofreading)
TIL!
All that to say the formatting of OP’s pic, no matter the direction of the arrow/caret, makes it hard to read. A little “don’t dead open inside” or something.
Edit to add: This one shows it the way I remember it https://dmlfnsgrade9isawesome.weebly.com/editing1.html
Is it due to cost? Does adding more frequencies increase the cost? (maybe that’s why WiFi 6E and 7 routers are so expensive)
Or is it because of compatibility? It would be more difficult to get everyone to agree on so many different wireless frequencies.
Or maybe it’s because of interference? Maybe adding things in between 2.4Ghz and 5Ghz would make the network harder to read as things like 3Ghz, 4Ghz, etc. are too close to either 2.4Ghz and 5Ghz. But if that’s the case, how is 6Ghz okay?
Or maybe things in between 2.4Ghz and 5Ghz ARE being used, but for different things that would break if WiFi started to use them, and if so, what are those things?
Or maybe things in between 2.4Ghz and 5Ghz ARE being used, but for different things that would break if WiFi started to use them, and if so, what are those things?
Warning, this is an extremely large image. You will need to open it in its own window and zoom in.
Once you have opened it in a new window, I’m sure you will be able to tell that the entire radio frequency spectrum is absolutely fucking packed to the gills. It’s all accounted for, everything from 3kHz to 300 GHz.
Also, 2.4GHz is the frequency put out by microwave ovens, so they actually do interfere with 2.4GHz Wi-Fi and Bluetooth.
Finally, this is just how the US allocates the radio spectrum. Every country actually has their own different allocation system. If I recall correctly, Wi-Fi bands in the UK are slightly different.
EDIT: Just noticed that this chart is actually 22 years out of date, since it’s from 2003 (bottom left of image). Things have been changed around and re-allocated several times since then, but you get the idea. There are a lot of groups who want access to these frequencies: government agencies, military communications, space exploration, satellite communications, cellular telephone service, television broadcasting, citizens band radio, wi-fi and so on.
what’s fixed service in this context?
Stuff that doesn’t move. Like a terrestrial radio station, they have one big tower that broadcasts the station and it doesn’t physically go anywhere. That’s distinct from mobile radios like phones, CB radios, etc. which are always moving around all over the place and potentially causing interference. Fixed radio, you generally have a license for a specific geographic area and only you are allowed to use that band in that area. But then they can license it to somebody else at a distant location where it won’t interfere.
but there’s a separate category for “broadcast”, so it’s more of a point-to-point thing?
Here’s a massive list of the UK allocations: https://static.ofcom.org.uk/static/spectrum/fat.html