> It's hard to believe that there is a standard, and devices widely deployed, that mess so much with their environment.
This is entirely a problem of our own making.
We give cellular providers hundreds of MHz of exclusive spectrum access, and then deprecate and auction off the old analog TV channels to give them hundreds more, but we expect everybody else to shove all of their traffic into a 100 MHz block at 2.4 GHz.
It's not that this standard is doing anything particularly nasty to the environment, it's just that it happens to be messing with the one tiny head-of-a-pin that we've decided everything needs to live on.
It's a computer controlled radio for crying out loud, give us 400 MHz and let the computer figure out how to hop around.
>This is entirely a problem of our own making.
>We give cellular providers hundreds of MHz of exclusive spectrum access, and then deprecate and auction off the old analog TV channels to give them hundreds more, but we expect everybody else to shove all of their traffic into a 100 MHz block at 2.4 GHz.
I don't think this is a bandwidth issue. USB is a wired standard. Using shielding is not that hard, but doing it inexpensively, while maintaining cable flexibility and low attenuation at high frequencies is very hard.
I wonder how manufacturers of those devices manage not to get caught by the FCC in the US and equivalent agencies in the EU. Perhaps they sell directly to customers overseas?
The FCC's job is not to get every last penny for every last Hz, it's to be a good steward of the commons.
There is utility in the general public being able to use more unlicensed spectrum at home and in the office.
We can argue about what the right breakdown might be, but I'll start by asserting that ~3 GHz in total between Verizon, T-Mobile, and AT&T, vs. only 250 MHz in total for the entirety of local, short-range wireless communication is absurd.
On Wi-Fi, everybody just shouts at each other. On mobile, providers buy huge swaths of spectrum, partially as a monopolistic strategy to make life harder for competitors, and partially because they're still using cell allocation strategies from the '80s. They maintain exclusive rights to blocks that they are not using in a region, because the towers two cells away are using them, and it's just easier to use a fixed checkerboard allocation.
Both Wi-Fi and 3GPP standards can and should be improved to make better use of temporarily unused spectrum.
A good start might be to prevent carriers from having exclusive rights to any spectrum. At least one layer of the cellular protocols should be standardized across carriers allowing towers and phones to dynamically request and then relinquish spectrum on an as-needed basis.
Today, if 500 people in a room use T-Mobile phones, but Verizon owns all the spectrum, then nobody gets to use anything. This is stupid. Verizon should have access to a fraction of spectrum proportional to their users in an area. More users, more spectrum, and vice versa.
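A proportional-share rule like that is easy to sketch. The carrier names and numbers below are just the example from above; a real mechanism would also need guard bands, minimum grants, and fast renegotiation:

```python
def allocate_spectrum(total_mhz, users_by_carrier):
    """Split a shared block among carriers in proportion to their
    active users in the area (hypothetical scheme, not any real
    FCC mechanism)."""
    total_users = sum(users_by_carrier.values())
    if total_users == 0:
        # Nobody present: leave the whole block unassigned.
        return {carrier: 0.0 for carrier in users_by_carrier}
    return {carrier: total_mhz * n / total_users
            for carrier, n in users_by_carrier.items()}

# The room from the example: 500 T-Mobile phones, say 100 on Verizon,
# splitting a hypothetical 100 MHz block.
shares = allocate_spectrum(100.0, {"t-mobile": 500, "verizon": 100})
# t-mobile gets ~83.3 MHz, verizon ~16.7 MHz
```

More users, more spectrum; an empty cell releases the block entirely.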
>Today, if 500 people in a room use T-Mobile phones, but Verizon owns all the spectrum, then nobody gets to use anything. This is stupid. Verizon should have access to a fraction of spectrum proportional to their users in an area. More users, more spectrum, and vice versa.
That would require a fairly radical departure from the infrastructure of the existing cellular networks, wouldn't it? Right now, each provider has a monopoly on a portion of the spectrum within a defined geographical area, and provides the base station and backhaul infrastructure to support their network.
It's not like "Verizon owns all the spectrum in a room"; it's that Verizon has better base station coverage of that room than T-Mobile does.
The providers compete on, amongst other things, coverage and network performance. Sharing spectrum would effectively require mutualization of base station infrastructure. You would effectively have a single monopoly with the networks operating as virtual operators. It's very far from clear that would be a good outcome, to me, at least.
They could share spectrum with different equipment in the same way multiple WiFi routers can share spectrum. The underlying reality is cell networks don't actually need a lot of spectrum; instead they're trading ownership of a lot of spectrum for fewer cell towers, because they aren't charged the full price of that spectrum. Add some significant property taxes on that spectrum and you can bet they would be selling or handing a lot of it back to the public domain.
> That would require a fairly radical departure from the infrastructure of the existing cellular networks, wouldn't it?
I think it can be done at the legal layer - just require roaming agreements between providers. They can already roam across borders so there's no technical barrier.
I'd rather the telephone industry spend $1 on hardware than see them spend $1 on license fees.
The story now is "$80 billion in license fees" and "$20 billion in hardware" and that seems the wrong way around -- when most folks play poker the stakes are supposed to go up, not down.
The money is made up! The FCC could sell spectrum for pennies, or give it away, or restrict access to The Right People, or wash their hands of it, or do another auction. They've done all of these things at various points. The license fees are made up.
The way it's done now puts the spectrum where there's the most value. The result, however, is effectively a tax that the consumer pays indirectly.
> The FCC's job is not to get every last penny for every last Hz, it's to be a good steward of the commons.
That’s the spin.
Yes, the FCC is the steward. But it’s also a bureaucratic, underfunded, somewhat backwards agency.
SIGINT is pretty cool, and FCC has some cool things and people that interact with them, I’m sure. But the outward face of FCC and how they appear to operate seems to have hardly changed in past years. I respect that typically, but it’s like an old bowling alley kind of respect; it’s fun, but it’s old and dirty, and the nachos aren’t bad but they aren’t good. The Lysol sprayed in those shoes is for psychological comfort. You don’t need to know the real reason.
All licenseholders should add up their transmitted joules of energy at the end of the year and pay the FCC. It will be in everyone's best interests to make most efficient use of the spectrum because then they can get their information to its destination with fewer joules.
Devices like home wifi routers should be able to buy upfront a license for X number of joules/year, and the cost of that is included in the purchase price of your router/laptop.
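Back-of-the-envelope, a joule-metered license is simple arithmetic. The price per megajoule here is invented purely for illustration:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000

def annual_joules(avg_tx_power_watts, duty_cycle):
    """Energy radiated per year at a given average transmit duty cycle."""
    return avg_tx_power_watts * duty_cycle * SECONDS_PER_YEAR

def annual_fee(joules, dollars_per_megajoule):
    """Hypothetical metered bill; the rate is made up."""
    return joules / 1e6 * dollars_per_megajoule

# A 100 mW home router transmitting 10% of the time:
j = annual_joules(0.100, 0.10)    # 315,360 J/year
fee = annual_fee(j, 1.0)          # ~$0.32/year at a made-up $1/MJ
```

The incentive works as claimed: halve your radiated energy (better coding, beamforming, shorter airtime) and you halve the bill.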
If market rate were justified for everything then we wouldn't even have space reserved for HAMs.
> There's only one industry with enough demand and revenue to justify that purchase price, which is mobile.
There's plenty of industries with enough demand. There's just no way they can compete with the money that mobile offers. That's a damn shame given that money shouldn't be the only, nor even the main, driver.
I think OP was using a rather consequentialist and capitalist economic model, where the value of a thing is by definition equal to the money that you can get for it on the open market.
This is a part of the mobile market. They're the same market. 5G providers have frequencies that can't travel through walls and don't even make sense for a long-distance network. Those frequencies should just be part of the Wifi standard.
It very much depends on which band that spectrum is in. If 400MHz was worth the same everywhere, it's doubtful that amateur radio would have >4.7GHz of it allocated on a primary basis in the EHF band, for example.
Give it time. Once these bands become more utilized, and more hardware gets pumped into the system, this bandwidth allocation for hams will shrink just like it has in every other band. They'll find any reason to reduce these allocations in favor of some industrial monopoly that may or may not even use what it takes (cf. UPS).
It "works" but it is a mess. Large chunks of the spectrum require some combination of radar detection and power reduction, and if you live near a radar station you basically cannot use 160mhz wifi.
I will also add that 60ghz works great, but it will not penetrate walls. I have found it penetrates my doors and thus makes a great replacement for 2.4ghz/5ghz mesh networking in my apartment, but not much else.
This is actually incredibly annoying, especially in a residential setting where the 2.4GHz band is extremely noisy (5GHz doesn't penetrate as well, so it ends up being less noisy).
There is a big block of spectrum around 5GHz for unlicensed use, and another huge one in the 6GHz band about to be opened up.
The trouble is that protocols like ZigBee and ZWave and Bluetooth and ANT+ are stuck in the 2.4 GHz band and practically you cannot turn off your 2.4 WiFi access points and be happy.
Thus I have to go to the woods to pair my fitness band with the heart rate sensor because at my house who knows what is going on with the hue light that is on the wrong side of my monitor from the hue hub or the smoke detector that posts the battery status to SmartThings, etc.
The big chunk around 5ghz is more restricted e.g. DFS/TPC requirements. Using 160mhz channels is impossible in my NYC area apartment because of those restrictions.
It's not just 2.4 that USB 3 kills. I've had it jam GPS from a couple feet away, and that's down at 1.3. Did some EMI tests; it was really broadband interference. I really have no idea how these chipsets got approved.
GPS is sub-thermal noise and the receivers don't have a ton of dynamic range. Your GPS may not have a good filter on it and might be saturating, rather than necessarily broad-band RFI.
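The "sub-thermal" claim checks out with the standard kTB noise-floor formula. The numbers below are typical textbook values for GPS L1 C/A, not measurements:

```python
import math

def noise_floor_dbm(bandwidth_hz, noise_figure_db=0.0):
    """Thermal noise floor at 290 K: -174 dBm/Hz + 10*log10(BW) + NF."""
    return -174.0 + 10.0 * math.log10(bandwidth_hz) + noise_figure_db

# GPS L1 C/A occupies roughly 2 MHz; received power at the antenna
# is specified around -130 dBm.
floor = noise_floor_dbm(2e6)      # about -111 dBm
margin = -130.0 - floor           # negative: signal is ~19 dB BELOW the floor
```

Despreading the C/A code provides roughly 43 dB of processing gain, which is what pulls the signal back out -- unless broadband RFI (or front-end saturation) gets there first.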
5GHz works well as long as your access point is in the same room.
In real life one of the drivers behind having so many wireless devices in the first place is avoiding the effort and cost of laying cables everywhere, 5GHz often doesn't solve that problem well.
For now I'm using powerline devices to connect rooms, which come with their own set of problems; nothing beats having well-thought-through wiring in the house, though.
My house is over 100 years old with fairly thick walls, knob and tube wiring, and a lot of neighboring 2.4Ghz access points. I ended up going under the house and running cat6, which was no small feat considering how tight the crawlspace is.
Of course, the cat6 cable I used subsequently got recalled, and so the manufacturer had to pay for a contractor to rerun it. They said that it was the type of job they wouldn't have even quoted for any price originally because it was so gnarly.
I have 2.4Ghz and 5Ghz from two APs on either end of the house. I turned off support for pre-n speeds on the 2.4Ghz to hopefully save some bandwidth on the beacon frames. I have Ethernet over Power to my garage, where I have a third AP for our inlaw unit.
The Ethernet over Power seems to be pretty good, but I had to find the right brand of equipment for it to not be flaky. WiFi still sucks, but my desktop and TV are hardwired, and it's good enough for mobile devices. I can go for about a week without losing my work VPN connection, through wifi through Ethernet over Power to PDSL.
If you care about interference ethernet over power is awful. It basically turns all of your electrical wires into a giant antenna and broadcasts that as broadband noise over multiple miles.
The only reason the things aren't outright banned is the frequencies tend to be sub 100mhz so it only has a significant impact on HF. Still, nasty things and many have been found over the legal noise limits.
If you can, I recommend point-to-point 2.4ghz wifi links. Since wifi is regulated by EIRP, they use a high-gain directional antenna that usually punches through things better than an omni AP. I've done them through multiple walls from 300-400ft away.
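The directional-antenna advantage falls out of a basic link budget. These numbers assume equal radiated power and free-space loss, and are illustrative only:

```python
import math

def eirp_dbm(tx_power_dbm, antenna_gain_dbi):
    """Effective isotropic radiated power."""
    return tx_power_dbm + antenna_gain_dbi

def fspl_db(distance_m, freq_hz):
    """Free-space path loss: 20log10(d) + 20log10(f) + 20log10(4*pi/c)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# Two 2.4 GHz links over ~120 m (roughly 400 ft), both at ~35 dBm EIRP:
path = fspl_db(120, 2.4e9)                # ~81.6 dB, walls add more on top
omni_rx = eirp_dbm(33, 2) - path + 2      # omni antennas on both ends
dish_rx = eirp_dbm(20, 15) - path + 15    # the dish also adds gain on receive
# For the same radiated power, the directional link arrives ~13 dB hotter.
```

The receive-side gain is the part an omni AP can never match, which is why a point-to-point pair punches through walls that defeat a regular AP.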
That certainly sounds reasonable. I've poked around with an SDR though, and the only egregiously noisy thing I've seen is the one 100Mbps Ethernet link from the DSL modem to the router.
Before I ran cat6, I indeed used WDS wifi. It wasn't reliable, though, and the particular hardware I had suffered from a bottleneck somewhere that seemed to limit throughput to 6Mbps or so. I could have spent another $500 on less crappy equipment and maybe made it work, but moving to wires everywhere I could was highly effective.
In apartment buildings, there's no advantage in a wireless protocol that penetrates walls; quite the opposite. I don't want my neighbour's router and mine constantly shouting over each other. I want the signal to end at the wall.
Agreed, this would be ideal. Most people don't understand that literally every access point on a given channel is part of the same broadcast domain even if they're totally separate networks. Every AP that yours can see on the same channel slows down the network for all of them, because wifi is half-duplex. Everyone needs to shut up for a second while one host on the channel transmits.
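As a rough model of that airtime sharing (this ignores collisions, management overhead, and slow clients dragging everyone down, so reality is worse):

```python
def per_network_ceiling(link_rate_mbps, n_cochannel_aps):
    """Only one station on a channel can transmit at a time, so N
    co-channel networks split the airtime roughly N ways at best --
    even when they are completely separate networks."""
    return link_rate_mbps / n_cochannel_aps

# 20 visible networks crowded onto one 2.4 GHz channel:
ceiling = per_network_ceiling(300, 20)   # 15 Mbps each, best case
```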
I see more than 20 networks in the building I'm in, so that is already happening.
Most people I know don't have cables going to different rooms, there's a router where the signal comes in and wifi from there. Often there's not even the choice which room or part of the room that is.
Unless your building is relatively new and someone thought about setting up the cables, it can be very expensive to do so later.
To lay a cat6 cable and have it snake over the ceiling in a cable gutter? The cable I grant you isn't dirt cheap but it's by far the most expensive part of that arrangement.
I live in a small house in a big city and 5ghz is the only usable spectrum. The connection is solid through walls and there is a minimum of interference from neighbors -- I can only see my immediate neighbor's wifi, whereas on 2.4ghz I get broadcasts from the entire block and can't even stream Netflix.
In fact the lack of range works so well that I can use almost the entire spectrum for various networks and not feel guilty