5G: if you build it, we will fill it (ben-evans.com)
134 points by Doubleguitars on Jan 18, 2019 | 125 comments



There's heavy political hype around "5G".[1] There are claims that it will matter for self-driving cars (despite Waymo saying they don't need more bandwidth), medical applications, and "smart energy". All of those are bullshit.

The "if only we had more detailed maps, self-driving would work" is a fake argument. Once you have enough info to know where the fixed trouble spots are, more data will not help you. Trouble comes from unexpected moving objects.

(2004 (not 2005) DARPA Grand Challenge, where everyone did badly. The CMU team tried to do it by manual pre-planning. DARPA gave out the course as a set of waypoints on a CD 2 hours before the start. CMU had a big trailer full of people at workstations to plan out the exact path in those two hours, using high-resolution aerial photos. But the USMC Colonel in charge of the event foiled their scheme. Just before the event, a few of his Marines went out in the dark and put up some obstacles the vehicle would have to go around. And, sure enough, the CMU vehicle plowed right into a sheet metal fence and got stuck.[2])

What we're likely to see is bandwidth that changes drastically as you move. You'll get great bandwidth with line of sight to a nearby base station, and then it will drop off drastically as you get further away. That's inherent in using 26GHz for high bandwidth.

The likely benefit is that it becomes possible to provide enough short-range bandwidth for many people to get video-rate bandwidth in crowded areas. So people can watch the game on their phone while in the stadium.

[1] https://thehill.com/blogs/congress-blog/technology/420509-wh...

[2] http://overbot.com/grandchallenge/images/race2004/cmucrash.j...
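
The range point above (bandwidth falling off quickly away from a 26GHz base station) follows partly from free-space path loss, which grows with frequency. A minimal sketch of that arithmetic in Python; the comparison frequencies and the 1 km distance are purely illustrative assumptions, and real links also depend on antenna gain, beamforming and obstructions:

    import math

    # Free-space path loss for a few frequencies at the same distance.
    # The frequencies and distance below are illustrative assumptions.

    def fspl_db(distance_km: float, freq_ghz: float) -> float:
        """Free-space path loss in dB (distance in km, frequency in GHz)."""
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

    for freq_ghz in (0.8, 2.6, 26.0):
        print(f"{freq_ghz:>5.1f} GHz at 1 km: {fspl_db(1.0, freq_ghz):6.1f} dB path loss")

    # Going from 2.6 GHz to 26 GHz costs about 20 dB (a factor of 100 in power)
    # over the same distance, which is part of why mmWave cells are short-range
    # and effectively line-of-sight.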


The worst thing is, if you're in the middle of nowhere with zero reception, what then? We need to rely on what's available locally above anything external; downloading all the map info and caching it should definitely be a big priority. I wonder whether, given some location information and the fact that the car can 'see' where it's driving, it could keep working even when reception is poor. I hate driving and hearing "GPS connection lost" from Google Maps. Really? You can sense certain things about my phone, why can't you just keep going? I've pre-cached a good chunk of the map too!

I'm not a GPS expert by any means, but dang.


GPS doesn't require a data connection.


But it does require a connection to GPS satellites.


GPS is receive only, so there is no "connection" per se; "reception" might be a better term here.


And you’ll have reception anywhere on the planet as long as you’re outdoors and not in a tunnel or a deep canyon or similar.


Urban canyon was a fun term from a previous phone-support gig...


Then the government will add new legislation that you are not allowed to go anywhere where there is 0 reception :)


The bandwidth isn't just for downloading maps to cars.

It's also to enable autonomous vehicles to constantly stream data and video feeds back to humans who can take over in real time if the car doesn't know what to do.


That's completely unrealistic and unsafe. Even with widespread 5G deployment, the cellular network will still lack the reliability necessary to make real-time remote driving feasible. What happens when the network drops packets during a critical maneuver?

Furthermore you can't expect a remote operator to suddenly take over with no context of preceding events and immediately control the vehicle in a safe manner. It takes a little time for anyone to understand what's actually going on.


This plays into the density of towers. 5G has a very limited range unlike 4G. It's going to take years (if not decades) to get to a place where 5G can cover the entirety of the US.


Even if we had 100% 5G coverage it still wouldn't be adequate for a safety critical system. What happens when the tower fails on a hot day because the cooling system broke, or a construction crew cuts the backhaul fiber, or the carrier has an infrastructure failure because someone spoofed their BGP routes?

If level 4+ autonomous vehicles are ever going to work then at a minimum they need to be able to operate safely with zero network connectivity.


Imo they need to BE the network


Dynamic mesh networks can be a nice supplement but no one has ever demonstrated large-scale reliable operation with mobile devices. And you can't always count on having another vehicle in radio range. So turning vehicles into the network won't make remote control viable.


I think the idea would be to have the vehicle detect that it's out of its operating spec, and park until either

a) the situation returns to within operating spec, or b) remote control comes through.

The remote control aspect would be severely limited and intended only to get the vehicle back to a situation within its operating spec. This trades the very difficult problems of 'need to immediately control the vehicle in a safe manner' and 'what happens if remote control is lost' for the difficult problems of 'how do I detect I'm out of operating spec' and 'how do I park safely when I'm out of spec, especially if I've just lost remote control'.


Park where?

I frequently drive on rural roads where there are blind curves and no real shoulder (and not much cellular coverage either). A human driver can usually steer a disabled car mostly off the road by going down into a ditch or up an embankment. But it's still not a safe place to be. And autonomous vehicle software is nowhere near being able to handle those maneuvers.


I mean yeah, that's part of the challenge. It's still an easier one to solve than the one you posited.


How is it easier to solve? It seems about as difficult to me, assuming the requirement is to maintain a high level of safety in all situations.


It could be good enough if it's a backup system only. Then again, probably the only reason the government would be enthusiastic is that these amounts of bandwidth would enable continuously streaming GPS-tracked dashcam surveillance from every citizen's car.


It's not nearly good enough for a backup system. The whole point of having a backup system is for safety and reliability. You can't build a viable backup system atop an unreliable network.


This is already happening with several autonomous vehicle companies.

It helps with municipal adoption.


That is a horrendous idea. 5G isn't going to be some magical technology that's always on all the time. It will have outages and black spots just like any other.


Is this comment serious? How could that ever work??


This is allegedly the strategy Waymo is actually using in Arizona, so presumably it is serious. (Though I agree it's a terrible idea.)


Oh I thought the main argument why 5G helps autonomous cars was the lower latency.


> It's also to enable autonomous vehicles to constantly stream data and video feeds back to humans who can take over in real time if the car doesn't know what to do

If this is required for autonomous vehicles, then I'll pass.


That's the reason very few companies are talking about selling those, and the entire discussion is about cars as a service.

Given enough time, they will probably stop requiring external help, but it's very unlikely that the first autonomous cars will run without it.


I want to be able to drive a car from the safety of my home using a VR headset. They can turn actual trucking into a trucking "simulator".


I often see latency touted as the game changer for 5G, with compute being pushed out closer to customers ("Mobile Edge Compute").

Could anyone enlighten me as to what the use cases are? Because it still seems dubious.


The most instantly recognizable one, with a clear commercial potential, is cloud gaming.

Other fuzzier ones are, uh, crunch live video or audio captured by the device via beefy compute to... display overlays? Search similar products? Live translation? Calculate evasive maneuvers? A lot of these are limited by interface concerns that introduce human or other 'latency' anyway, making connectivity latency between the base station where the 'edge' is, and the device itself less of an issue.


The latency needs for online gaming aren't as stringent as you might think.

I have a Samsung TV which, if it is not in "game mode", introduces much more latency than my DSL internet.

I couldn't win at Titanfall at all with the TV in normal mode, in game mode I was able to progress halfway up the leaderboard.

So far as I can tell, "Edge" is for people who think AWS isn't expensive enough and they'd rather pay AT&T or American Tower prices.


With the gaming scenario, some distinction between display latency and network latency is worth reviewing. Network latency is something that software can cover for to some degree (to create online experiences there's always some kind of synchronization of different wallclock timelines going on). Display latency is universal, though. A faster display will let the game give you feedback faster, so long as we are talking about feedback in the local experience (so: aiming and movement are commonly predicted on the client, while hit registration often uses a server roundtrip). This separation of responsibilities allowed people to play Counter-Strike 1.x on 200ms dial-up connections back in the 2000s.

When it's a pure video feed you get the impact of all the latency all the time, and that means the problem has to be solved with brute force reduction of bottlenecks everywhere.

So how much latency is enough for responsiveness? This is pretty easy to derive from the common target framerates: for a 30Hz display, you need 33.3ms. For 60Hz, 16.6ms; for 120Hz, 8.3ms. Since perceived latency is known to keep improving up through 144Hz, it's reasonable to say that we should be looking for only 5-6ms at most, while most broadband connections are still achieving pings in the 20-100ms range, depending on the game and the specific connection. On the very best connections, that is, we can assume a "30Hz transparent" video stream, which is fine for casual experiences but severely impacts competitive ability when tested. In many current popular titles this manifests in mechanics that are both latency sensitive and require a server roundtrip, e.g. building in Fortnite.
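
A quick sketch of the arithmetic in the comment above: the frame period is just the reciprocal of the refresh rate, and a round trip only stays "transparent" if it fits inside that period. The example ping values are assumptions for illustration, not measurements:

    # Frame budget per refresh rate, and the refresh rate a given round trip
    # could keep up with. Example pings are illustrative assumptions.

    def frame_period_ms(refresh_hz: float) -> float:
        """Time available per frame at a given refresh rate, in milliseconds."""
        return 1000.0 / refresh_hz

    for hz in (30, 60, 120, 144):
        print(f"{hz:>3} Hz -> {frame_period_ms(hz):4.1f} ms per frame")

    for name, ping_ms in (("fibre", 5), ("cable", 20), ("LTE", 50)):
        print(f"{name:>5}: a {ping_ms} ms round trip is 'transparent' only up to "
              f"~{1000.0 / ping_ms:.0f} Hz")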


Except I have 4-8ms ping on my internet, with rare peaks at 20ms.

I get between 40ms and 200ms on 4G. Average around 80, often with peaks above 300ms.


It just means that the last mile over the air will only add 1ms of latency. Compared to LTE which I think has around 5ms. The rest of the network (read Internet) is still just as fast as before.


> With compute being pushed out closer to customers "Mobile Edge Compute"

That seems implausible to me. We already have the equivalent of supercomputers in our pockets, and yet the trend is to move computing away from the end user, onto centralized servers.


In some ways but not in others. Edge Computing is actually a pretty hot area in telco but also other areas. One of the issues is that there’s so much data being collected, some of which needs real-time action, that you need distributed computing to filter and take actions based on rules. This edge computing can be fairly substantial.


Yeah I'm totally going to tell you what I'm currently working on because I like random competition so much :)

Really though, the waves aren't faster than light. Light speed in different media can depend on frequency, but I do not think that's the problem (as opposed to congestion), because the refractive index of air is close to 1.

Congestion is a typical problem, though. "Mobile Edge Compute": what about "capsule networks"?


If you think of data that's one thing, but radar is a rather different one. You wouldn't need to sweep a scanning beam around if you could simply detect changes in the static urban microwave background. You'd need many senders. Suppose those fit into lanterns, which have line of sight by definition. I'm fantasizing about MEDs emitting 26 GHz with a 1GHz duty cycle, lol.


In my view, 5G primarily benefits carriers. I'm not personally excited about it, as it won't provide anything I need that 4G doesn't currently provide. I won't be going out of my way to upgrade anything with 5G in mind (but I won't be avoiding it, either).


I never heard about more detailed maps being a requirement for self-driving.

Perhaps those people should check how the first navigation device worked (FYI: without GPS, back in 1981) as inspiration to see if there are other ways.

https://jalopnik.com/the-first-commercially-available-car-na...


Why did DARPA give out the course? Was that part of the challenge, or a way to favor a contestant?


The 2005 DARPA Grand Challenge was "follow these dirt roads through a desert": they wanted the vehicles to follow a specific set of dirt roads.

The self-driving car problem can be divided into several chunks, and the 2005 Grand Challenge intentionally tested some of those chunks and ignored others - a divide-and-conquer approach to problem-solving any developer will be familiar with.

The subsequent 2007 urban challenge added extra chunks, like other moving vehicles, high-level route planning and replanning around blocked roads, manoeuvres like parking and three-point turns, and suchlike.


Broadcast content has other solutions, like broadcast itself. It's when everyone wants different things that you need 5G.


> Mainly because of this new spectrum, mobile 5G speeds in good conditions could be well over 100 megabits/sec and potentially several hundreds megabits/sec (mobile speeds of over a gigabit/sec are technically possible but unlikely in the real world).

I already see speeds like 200 Mbit/s on 4G in good conditions, sometimes more; normal conditions are 50-100 Mbit/s. Yeah, your mileage may vary. Those 1.2 Gbit/s 4G cellular modems do deliver. I'd be disappointed if 5G didn't significantly improve on that in the real world.

> 5G is promised to have much better latency than 4G - perhaps 20-30ms in the real world, down from 50-60ms for LTE (4G). It’s not clear how visible this will be to users.

What? HSDPA "3.5G" was about 50-60 ms. 4G is mostly something like 12-20 ms when I've measured the latency. 5G will hopefully at least halve that.

Anyway, I do acknowledge 4G performance is very regional. The above just reflects my experience and what I've been measuring.


Yes, there's hype and confusion around latency and 5G. Already with LTE, most of the latency you're seeing comes from the network side, not from the radio link itself. And LTE latency can be low enough for customer applications, as you're experiencing.

"Low latency" for 5G is related to a variant called URLLC, for Ultra-Reliable Low-Latency Communication. 5G is an umbrella for 3 different variants:

1) eMBB: the massive broadband with higher speed, mostly with mmWave. This is associated with the 3GPP "NR" (New Radio) cellular standard;

2) massive IoT: this will actually be done based on LTE CatM and NB-IoT for a long while, with small improvements in release 15. It can be confusing, but the "Gs" are performance requirements defined by the ITU-T, and not a specific technology. Then a tech that meets the 5G requirements can claim (honestly, without marketing hype) to be 5G. It turns out that for massive IoT LTE meets the 5G ITU-T specs. Later on there will be a new tech based on NR, but nobody is in a hurry there, as LTE IoT is ok;

3) URLLC, the variant based on NR and optimized for low latency. URLLC is the least clear of the 3, really. The vision is to use NR for industrial control applications, and maybe VR/AR for consumers too. There are low-level optimizations down to the radio framing layer to reduce latency. Lots of fuzzy visions, but I'm not sure anyone has a clear business case. The work on this at 3GPP (the organization defining the LTE and NR specs) is progressing slowly. In particular, there's always a cost to reduced latency, so it won't be for free. Private usage for factories could make sense, but consumer use is less clear.

There's a tendency to confuse "regular 4G/5G" with URLLC, but the latter is a different beast and not solid yet.


Benedict is now in Silicon Valley and obviously his audience is US-based, so those numbers are based on real-world US carriers. In many places around the world, the latency and bandwidth improvements from 5G would be minimal because we are already getting decent 4G speed. And in your case I doubt you care about an additional 100 Mbps when you already get 100 Mbps to start with; the likely 5ms latency improvement isn't going to be very noticeable.


Anecdotally, I just ran a speed test and got 140Mbps up/35Mbps down with 31ms latency on LTE. I’m in SF.


Up here in the frozen wastelands of the north, in a city with 100K people, I get 52Mbps down/11 Mbps up/33 ms latency on 4G.


City centre York UK (the original and best York!) on 4G mid-morning, a quick single unscientific test now shows 25Mbps down, 30Mbps up, 32ms latency, which seems toward the higher end of normal throughput (the best I've seen a test show around here I think was a little over 40Mbps, 15 to 25 is what I'd usually expect) and normal for latency (that is almost always low-to-mid-30s around here).

For what most people use their phones for, extra bandwidth isn't going to make much difference, which is good because the individual is unlikely to see it: more devices are coming online as the bandwidth improves, and the demands you have increase too (usually outside your control: sites getting heavier, using higher quality video, ...).

The latency improvements will be more noticeable I expect, especially in busy cells because latency-centric problems tend to balloon much more than bandwidth-centric ones as the air gets more crowded. As the available throughput nears saturation both bandwidth and latency suffer but for most uses the latency is going to be what causes most problems because for many applications you can wait for the bulk of a block of data to come in as long as the first bits of it arrive in a short amount of time (in a web browsing example you can start reading before the rest arrives, unless the page is badly designed and doesn't render until everything is transferred).

Away from "normal" Joe Public mobile network use (web pages, games, some video, maybe maps) the next big thing is workers using VPNs, and again here latency improvements are going to be more beneficial to most people than a bandwidth boost.

Despite this the bandwidth improvement is the fact most shouted about. The reason for this is that it is a number people understand (or at least think they understand) because it is how these things are usually touted. It is not dissimilar to the back end of the GHz wars with CPU releases: at a certain point the extra clock ticks were meaningless because of many other factors, but chips kept getting sold that way because that is what the buying public "understood" and marketing didn't want to bamboozle them with other details (cache design factors, multi-core performance, ...).


27Mbps download here in North Dallas with 39ms of latency. T-Mobile Unlimited LTE.

If I’m downtown speeds are usually better.


Yeah the article is a bit of a miss. On Telstra 4G in Australia, you can regularly get >100Mbit/s and ~15ms latency. We've had this for a few years now.


As I'm reading these comments, I just did a test on my own 4G connection and I get 105 Mbps download and 17 ms ping. In my city this is the norm, and I have seen speeds higher than 150 Mbps.

Screenshot: https://www.dropbox.com/s/pw86k6tgd9rc5fd/Screenshot%202019-...


In certain locations, I too can get those kinds of speeds. But when I actually go to use it over long periods and over more normal work loads (e.g., while travelling along the CalTrain corridor in SV, particularly in the San Antonio/Palo Alto area), I'll routinely get 20 second latencies; it quickly makes 200-300ms latency feel like stuff is "working".


One of the goals of mmWave is tactile networking with <= 1ms latency[1].

One of the main issues here is that they want to make a lot of money from 5G; that's the main reason why you're always required to have a base station involved.

[1] https://ieeexplore.ieee.org/document/7876982


The primary reason for the upgrade to 5G is more capacity, not necessarily faster speeds. If mobile carriers can get rid of data caps entirely (even while tethering) they can give Comcast a run for their money even at 'only' a few hundred Mbps. There aren't really consumer applications that use more than a few dozen Mbps anyway (a 4K stream is less than 25 Mbps).
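
As a back-of-the-envelope check on that capacity point, a tiny sketch; the 25 Mbps per 4K stream figure comes from the comment above, and the 300 Mbps link speed is a made-up assumption rather than a measured 5G figure:

    # How many concurrent 4K streams fit into a given amount of throughput.
    # STREAM_4K_MBPS is from the comment above; LINK_MBPS is a hypothetical
    # 'few hundred Mbps' figure, not a measured 5G speed.

    STREAM_4K_MBPS = 25
    LINK_MBPS = 300

    print(f"{LINK_MBPS} Mbps supports about {LINK_MBPS // STREAM_4K_MBPS} "
          f"simultaneous 4K streams")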


Ahh.

Thing is: you are a subscriber, not a 5G procurer.

Telcos are concerned with how many people like you they can support at once, with how much power and opex and capex.

If 5G tech provides n*cell throughput for 1/n capex and/or opex, and you (as a subscriber) see no benefit vs LTE, then it will still please telcos greatly. However, 5G may make a big impact on your experience as well.

I don't know how you measured the latency, but remember that these numbers can be very deceptive due to network technology pretending to do things that it isn't really doing. That's fine when you run a speed test, but not if you are being treated at the side of the road.


1.2 Gbit/s 4G cellular modems? Sounds interesting. Do you perhaps have any links to such a product?


The latest Qualcomm X24 can do 2 Gbps, though of course that assumes your carrier and everything else aligned perfectly. In reality I am not aware of any carrier in the world which has this kind of spectrum ready for 2 Gbps speeds.


Cool!


https://www.google.com/search?q=1.2+gbit+4g+modem

I've had one (Cat 18, 1200/200 Mbps) in my daily driver for almost a year. The best download speed under ideal conditions I've seen has been a bit under 400 Mbps.


The query [1.2 gbit lte modem] seems to yield better results


This post, and most of the comments here, seem to miss the main selling point of 5G (at least for consumers): consistency and reliability.

With wireless data connections, you're sharing the medium (and therefore the bandwidth) with all other users of the same frequencies. This is the reason that you can have absolutely terrible performance in a densely populated city, despite having a maximum-strength 4G signal. Having the 20+GHz channels enables operators to install a large number of micro-cells in these areas. Even if they don't penetrate walls, moving everyone outdoors onto these cells frees up the lower-frequency cells for indoor users, substantially reducing contention.
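
A toy sketch of that contention argument: a cell's capacity is shared among its active users, so per-user throughput collapses in dense areas even with a strong signal, and offloading users onto micro-cells restores it. All of the capacity and user-count figures below are made-up assumptions:

    # Per-user share of a shared cell, before and after offloading users onto
    # mmWave micro-cells. All figures below are illustrative assumptions.

    def per_user_mbps(cell_capacity_mbps: float, active_users: int) -> float:
        """Even split of a cell's capacity among its active users."""
        return cell_capacity_mbps / max(active_users, 1)

    MACRO_CELL_MBPS = 150  # hypothetical macro-cell capacity

    for users in (5, 50, 500):
        print(f"{users:>3} active users -> ~{per_user_mbps(MACRO_CELL_MBPS, users):6.1f} Mbps each")

    # Offloading 400 of the 500 users onto micro-cells leaves the macro cell
    # far less contended for the indoor users who remain on it:
    print(f"after offload: ~{per_user_mbps(MACRO_CELL_MBPS, 100):.1f} Mbps each for the remaining 100")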

Another improvement over 4G that hasn't been mentioned by anyone is, supposedly, that it will significantly reduce latency and interruptions from moving between cells. This is especially apparent when travelling at speed (e.g. on a train).

The effect to consumers of both of these is that we'd get much more consistent performance.


Not saying you're wrong, but the 4G speed in many US or European cities isn't very representative.

I live in Bucharest, Romania, a city of 2 million people that's very densely populated, and the 4G data plans are cheap, on prepay or contract, so everybody with a phone has 4G bandwidth to use (not necessarily for watching Netflix, but enough for music or Facebook).

And we get good 4G coverage both indoors and outdoors. Right now I'm at my office, indoors. As I'm writing this I just did a speed test on my connection and I'm getting 105 Mbps download, 10 Mbps upload, 17 ms ping. And this is the norm, I know because I like going to coffee shops and use my mobile connection for real work.

Here's a screenshot: https://www.dropbox.com/s/pw86k6tgd9rc5fd/Screenshot%202019-...

I haven't seen such good coverage and performance in the several European or US cities I've traveled to. E.g. in my experience the Internet in Germany or France is really poor. But I don't believe that is because of the 4G technology.


At least in the U.S., the real problem is the billing model. Those kinds of incremental improvements will be nice, but that doesn't seem like enough to justify more than a gradual upgrade, since most people won't see much benefit other than in a few edge cases — how many people are going to buy a new phone so they can get online slightly faster when their subway car approaches a new tower?

Every other application, especially the cool ones like AR/VR which do seem plausible for people dropping a lot of cash, will be constrained by the enormous markup on data well before it hits the limits of LTE. Fewer milliseconds on handoff could be nice but it’s hard to think of an application which can’t buffer but is going to fit within a few GB per month.


Yeah, I don't get this either. I live in Canada and mobile data is $$$s. I don't see the point in more-better-data when we can barely afford to utilize the current infrastructure. Providers are likely to recover this new infrastructure cost through price hikes.

I don't give a crap about faster mobile data, I've got a 5GB plan and faster data just means I could chew through it faster. All of the data heavy applications I can think of I use almost exclusively on wifi where it's available.

I think the real sell for 5G is its capacity to carry thousands of concurrent connections, allowing for everything to be connected. A true IoT solution where anything you can think of can connect without congesting towers.


Yeah, I get that there are benefits to the providers, but it just seems like it's going to be a tough sell to get users to pay a premium for it.


While that's the selling point of 5G, it's difficult for the general consumer to rationalize paying more for something of only marginal benefit to them. While many of us reading HN understand how this can help computing and moving heavy amounts of data, to your grandma with a smartphone, how's it going to improve her life? She can still google a recipe for cake in a few seconds, so what's 1 second faster to her?

The potential benefit I see, though, is that you may no longer have to pay an ISP like Comcast. You would only pay one bill for 5G for both your computer and phone. Aside from that, it's a difficult marketing sell.


It doesn't enable operators to install a large number of micro-cells, it requires it. And this is where I think 5G may end up being disappointing.

I think telcos will encounter increased resistance to putting up millions of microcells, and the cost will end up being way higher than they anticipate. So coverage in the mm bands will end up being extremely spotty (way worse than 4G now), and may end up being a near-bust.

That's my .02 anyway. But I live in a suburban-ish place where 4G coverage is mostly a cruel joke, despite being just across the Golden Gate Bridge from 4G-ground-zero SF.


I'm amazed at how bad the 4G coverage is in the bay area. I barely get a signal at my house or at my office. One's in the middle of a city, the other is literally minutes from Google. Yet I get a single bar of reception. It's a sad joke.


5G seems like a pretty marginal improvement over LTE to me. The improved latency is nice, but I'm not sure it's worth the many billions of dollars in hardware alone. The high-frequency stuff would be useful in some situations, but really overlaps with WiFi in most use cases.

It certainly seems like much less of an improvement than 2G->3G or 3G->4G.


Counterintuitively, bandwidth improvements do little to help the vast majority of people, except on rare occasions like installing apps. Even though this has been the selling point for ages.

However, latency is a different story. Many (most?) can "feel" improved latency as better responsiveness, like web page load speed. There's still a lot of "ping-pong" traffic going on, where latency improvements do make a difference.


Yes, but personally speaking, I don't even see the latency improvement as being terribly compelling.


I agree! The latency improvement is the interesting one, but honestly LTE latency is already pretty good.

I wish there were more efforts behind IOT support, by way of lower-frequency bandwidth and low-energy radios, maybe even restricted to a simple messaging protocol. That would open up so many new applications!


Ever tried residential fiber optic internet? <1 ms latency. Web pages just pop on the screen — at least when the server is geographically near. It's addictive.

Compare it to cable modems with their still lowish 5-10 ms latency. Pages load visibly slower. But you don't mind it or even notice, if you haven't previously gotten used to the faster option.

So I think I'll always take all of the latency improvements available. The difference can be substantial even when the starting point is already "pretty good".
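
One way to see why a few milliseconds of last-mile latency is visible at all: a page load is usually a chain of sequential round trips (DNS, TCP, TLS, the HTML, then dependent resources), so the per-hop latency gets multiplied. A rough sketch; both the round-trip count and the example latencies are assumptions, not measurements:

    # Network wait time for a page that needs several sequential round trips.
    # Both the round-trip count and the example latencies are assumptions.

    SEQUENTIAL_ROUND_TRIPS = 8

    for name, rtt_ms in (("fibre", 1), ("cable", 8), ("LTE", 40)):
        print(f"{name:>5}: ~{SEQUENTIAL_ROUND_TRIPS * rtt_ms} ms spent waiting on the network alone")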


How do you even find sites that load fast enough for this difference to matter? I only know of maybe 1-2 websites that load instantly like that [1]. Everything else seems bottlenecked on the server side (and sometimes the client).

https://www.changedetection.com/dirsearch.html


> but really overlaps with WiFi in most use cases

The UX around free public WiFi is so terrible that I really never want to use WiFi again outside of my own home.


>but I'm not sure it's worth the many billions of dollars in hardware alone.

Fatter pipes, as stated in the article. There are still lots of people on 3G networks, and we expect that once all of them move to LTE, the total amount of data will again double. 5G is designed around capacity: the higher the total capacity, the more data they can sell you. In reality you will likely be paying the same prices but getting much more data from your plan. Surely this is a good enough reason for them to spend money, right?


I can’t wait to see what kinds of ads and bloatware 5G enables.

Something tells me the bloated ticks on this dog's underbelly will grow proportionally to the fatness of this new pipe.


Hmm. With all this stuff about the ultra high frequency 5G data not going through walls, how does it handle bad weather? My phone's already not great if it's raining heavily. I would hate for my home internet connection to flake out any time it rains.


It will fall back to 4G, so it doesn't really matter.


But I spend most of my day between walls. That will make 5G pretty much worthless to me. And who needs super high speed when outside? Certainly not people driving or cycling, who should not be focusing on their mobile. I mean, it will be great for a home internet connection with an antenna on the roof, but then it doesn't have the range where it is useful (low-density villages). I just don't get where it will be useful.


I'm not convinced. The current fallback implementation seems to not work very well - my mobile connection is slowest (if it even works decently) when it tries to change between 4G and 3G, while if I force 3G then it works decently.


I'm afraid that 4G will be deprioritized and backhaul removed to favor 5G so that only people paying more get good performance.


no, 3G will be removed. Too complex.


Some of that complexity is what allows me to make voice calls when I don't have any data. Seems worth keeping that over 4G (if we had to choose).


You can have LTE voice even if you've maxed your data cap, from a technical point of view. For 2G/3G it's true that those standards support pretty independent CS (voice) and PS (data) connections, so it's natural to decouple voice and data. For LTE, and also NR, only the PS domain exists. It's still possible to separate Internet access from voice, though, and they are separated. It's just done in a different way.

2G/3G/4G support multiple concurrent data connections, called PDNs. Each is like a different IP interface, and some are terminated on the modem itself and not exposed to the user. In the typical LTE deployment, you have an IMS PDN for voice signaling and data, an admin PDN for the remote management of the telco using OMA-DM, and an Internet access PDN for the end user, which is the only one you'll see. Traffic on those PDNs is fully segregated and counted separately. So the IMS and admin PDN traffic does NOT count as part of your data cap; it's on top.

Thanks to this, LTE IMS voice is separated from data, like on 3G, even if it's done differently.

Of course, this doesn't apply to OTT voice: that is always in the Internet PDN and in your data cap.

To get back to 3G, it'll slowly die. Anything you can do on 3G you can do better on 4G. So operators tend to use their 3G infra to the max, but they won't upgrade it. It'll be replaced by 4G over time, until they eventually pull the 3G plug.


Thanks for the technical reply, but I meant the situations where the network coverage is poor and you don't have a good enough 4G signal to place the call. I would fully expect the provider not to include LTE voice in my data allowance, in the same way they often exclude some streaming/social network services.


Ok, thanks for the clarification. Even with this new understanding, 3G will eventually go. Coverage depends on several things: 1) deployed infrastructure, 2) frequency used and 3) waveform / standard efficiency.

3G can have a temporary advantage for (1). But as the 3G infra ages, it will always tend to be replaced by cheaper / more efficient LTE infra as the existing 3G bands can always be refarmed for LTE. And LTE will make a better use of the available bandwidth. There are large differences between operators on the speed of this process, but the trend is universal.

(2) is why 3G was impaired vs 2G: 2G was deployed in low bands (900 MHz for 2G is very common), while 3G was very often only available in mid-bands (1700+ MHz). The higher the frequency, the more challenging the coverage. So 3G coverage lagged 2G for a long time, and is still lagging in places (Europe for example). The situation is different with LTE, due to the digital TV dividend. This opened low bands for LTE everywhere (700 to 800 MHz, depending on region). Those bands are even better than the 2G bands. They're not yet fully deployed everywhere (Europe is lagging), but they eventually will be. And when it's done you can kill 2G and 3G.

For (3) of course, new tech has improvements, so 4G is better than 3G and 2G.

When you combine all of this, LTE will replace all previous technologies everywhere, in time. How long it will take will however vary a lot depending on the region. Europe invested a lot in 3G, and telcos there will milk this infra as long as possible. And because 3G coverage in Europe is not as good as 2G (see above, bands), 2G will also tend to stay alongside 3G. Cheap second-hand 2G/3G infra will remain for a long time in developing countries. But in the end 2G and 3G will go away; it's just a question of time.

On the other hand, 5G will take a lot longer to replace 4G. The big change is mmWave, but it's for small hotspots. You can have 5G in low bands too, but then the gain vs 4G is small so I expect operators to take their time to replace their 4G infra. We'll have 4G for coverage + 5G for dense areas only for a long time IMHO.


If they kill 3G, they'll reuse those radio bands for LTE or 5G, and you'll get the same coverage you have on 3G now.

Around here, LTE coverage absolutely kills 3G coverage, since LTE is deployed on the old analog TV bands (700/800 MHz) which have great penetration into buildings etc. My operator doesn't even activate 3G on new iPhones - they're effectively LTE-only now.


I'm already seeing confusion around the name "5G". The issue is that new dual-band wifi routers broadcast at both 2.4GHz (the old way) and around 5GHz (the new option), and they're calling that new option "5G wifi". Adding to the confusion are some people bringing up the idea that "5G cellular data" will obviate the need for home wifi at all.

Further still, the new "5G cellular" standard can include use of the 5GHz frequency band.

Actually, in writing this, I'm realizing I myself may have gotten some of this wrong. So please correct me if you know better.


This all seems completely irrelevant when wireless ISPs have somehow trained users that they should be paying per gigabyte.

In fact: I don't want faster speeds. Give me much, much slower speeds so that instagram doesn't autoplay video and vampire all of my mobile data plan.


>much higher radio frequencies (over 20 GHz, AKA millimeter wave or ‘mmWave’)

Can anyone comment on whether this might have any effect on us? It makes me a bit anxious to see the huge list of wifi networks available where I am -- are we absolutely certain that all those signals don't have some sort of health impact?


See for example https://www.ncbi.nlm.nih.gov/pubmed/25879308. I don't know why people are so carefree about the effects of non-ionizing radiation. The current level of knowledge is, in my opinion, far too primitive to even grasp the complex effects that may be happening. How many of us know anything about, say, voltage-gated calcium channels?


Apparently the sun outputs radio waves in the THz range, sub-millimeter waves, so I don't think this is anything to worry about. Also, the strength of the radio waves coming from our devices is really weak, not at all at the sun's level.


Yeah, but you see, the sun has existed for billions of years and we as humans have yet to adapt to it fully. There is a reason why we live in the shade and use sunscreen.

Excuse me if I still feel skeptical about mmWaves..


Light is 400,000 GHz to 800,000 GHz and has always existed on this earth. Man-made pulsed microwaves are 300 MHz to 600 GHz, and have never existed on this planet before.

Both from a physics standpoint, as well as from a biological standpoint, they are not the same thing.

Both are biologically active, but for one of them we have adapted (Vitamin D production and photosynthesis, for example), for the other we have not.

From another perspective: the FCC limit for microwave radiation is 10^18 times higher than natural background. That's a difference of a quintillion times.

Background levels (green), compared to current exposure: https://pbs.twimg.com/media/Dt2NMoJU4AE2oYY.jpg

Therefore you can't really compare visible light exposure with microwave exposure. One has existed on the planet for billions of years, the other is completely artificial and man-made, and unlike visible light it penetrates tissues through the whole body, and there is no adaptation.

https://www.thelancet.com/journals/lanplh/article/PIIS2542-5...


I don't think I'll see any improvement with 5G - even at home, I see very little difference between my 150mbit connection upstairs and my ~20mbit connection downstairs (which uses some old powerline networking boxes). My LTE connection is usually around that or above.

The only time I notice a difference is when I'm downloading a big file, which in these days of streaming everything, I rarely do now.

But for normal web browsing, my browser rendering speed seems to be more of a bottleneck than my internet connection. And on my phone pretty much all I do is web browsing (either through a browser or an app).


Next-level "cord-cutting": Even your fixed location home internet is cellular!

I find that sort of appealing as a unification.


But is it desirable? If you get a choice, it doesn’t make sense to saturate a wireless frequency when you could use a wire instead. Home internet is only going to be more bandwidth intensive, with 4k streaming, playstation/xbox in the cloud, etc.


It's not appealing at all. No matter which 'G' it is, wireless will never beat wired. It's simple physics. Wired will always be higher bandwidth, lower latency, and more reliable - fewer dropouts and more consistent speed.

Wireless is already a 'good enough' wired replacement for areas where running fibre is uneconomical, though.


If nothing else, 5g for fixed locations seems desirable in areas where the local telco monopoly has failed to update local last mile infrastructure, and only offers relatively poor quality VDSL/ADSL2+ service (looking at you, ATT).


Potentially saving 75 dollars a month is very desirable to me. There are, of course, many reasons why that might not actually happen, but as long as I'm paying two separate companies for two completely different connections, it's almost guaranteed not to happen.


Having additional wireless competition will improve the quality of the wired options. They've been able to skate along in near-monopoly conditions for too long.


Or it gets wired connection providers off the hook for providing fibre to the most remote, unprofitable locations.


This blog post is negative, unimaginative, and misinformed. One major factual error is the author's understanding of fixed 5G service, which is already offered by Verizon using what they call 5G TF (Technical Forum). They will be able to upgrade all related hardware for 5G NR (New Radio) once the standard has been set in stone.

Cars being able to get new information within milliseconds about things that happened far from them, but on their course, will be useful.

AR outside the home could be incredibly useful, if provided with low latency and high bandwidth over 5G.


I don't think 5G TF can really be considered 5G.


Why?


Because it's a proprietary Verizon standard that is different from and not compatible with the standard that the 5G committee is producing.

Verizon is being a bit deceptive in using the "5G" designation for this. It reminds me of when US telecoms started calling their non-4G systems "4G" for marketing purposes.


Is there any data about the economics of 5G vs 4/4.5G?

And furthermore, with people generally not needing more data, and not willing to pay more, why would wireless carriers bother?


In Germany, carriers are lobbying heavily to get taxpayer-funded development. Even some media outlets like „Der Spiegel“ fell for it (the most popular German website that does not support https by default...)

Just recently some journalists covered the biological effects of 5+ GHz radio on humans, arguing that there are real risks compared to the sub-2 GHz bands.


> Just recently some journalists covered the biological effects of 5+ GHz radio on humans, arguing that there are real risks compared to the sub-2 GHz bands.

Can you elaborate on this?


I can't. Here's the link to the publication in German https://www.tagesspiegel.de/gesellschaft/elektrosmog-europa-... and https://www.tagesspiegel.de/gesellschaft/mobilfunk-wie-gesun...

Tagesspiegel is usually a good newspaper, but this reasoning is very light on facts. They should stick with uncovering lobbying.


>with people generally not needing more data

I don't see that anywhere. All the data has shown average data usage per user is still increasing year over year. And as a new, younger population comes of age, replacing an older generation that is less tech-savvy, data usage will only continue to grow for the foreseeable future.

We certainly do not want to pay more, but I have never heard of anyone who doesn't want more data, assuming they are not on an unlimited plan.


So who will pay for it? The customers won't directly pay more. In Germany, lobbying for subsidies is underway. I assume either that will be successful or we'll just have to wait until the UMTS investments have paid off...


The users will pay for it, without noticing. Carriers don't pay the billions in one go and then hope to recoup their investment; it hasn't worked like that for a long time. Your carrier basically signs agreements to rent a lot of things: rent for datacenters, rent for fibre if they are not an ISP themselves (which is very rare now), rent for outbound network connections, rent for all the cellular sites (that is why property pricing has an effect on mobile pricing), rent for the retail shops, rent for spectrum, and rent for the services agreement with an infrastructure provider, which today is either Huawei, Ericsson or Nokia. Along with their own staff. As you can see, these are all pretty much fixed costs to achieve a service level required by law.

You have been paying for the services agreement with the infrastructure provider, and as long as you continue to pay, under the agreed conditions Huawei, Ericsson and Nokia will continue the hardware and software investment as well as network tuning.

So if you are paying $100 per month, and $30 of it has been going to Nokia / Huawei / Ericsson, then as long as you pay the same, there will be money for future upgrades.

And this is one of the reasons why in many countries carriers have consolidated down to three: the nature of the carrier business requires a certain minimum amount of revenue to be sustainable.

Note: of course this is an oversimplified description, but it basically shows you don't need to pay a lot more to get 5G.


If the hardware companies already get their rent, what incentivizes them to invest so much in 5G R&D?


Because if they don't have something new, the rent will drop. And carriers are always asking for more capacity. There are also lots of improvements to the backend which make (or are supposed to make) things cheaper and easier to manage, most of which normal users don't get to see.


I don't think there is a killer use case for 5G yet, but I don't mind. I think the performance improvements alone are worth it. It's hard to predict the future, and while often there are points of diminishing returns in technology (which maybe Apple is experiencing right now) I'm personally not sure we are there yet in terms of speeds and availability of computing power for cellular networks.


> I don't think there is a killer use case for 5G yet

Why do you need a specific killer use case for more bandwidth and lower latency? That helps almost all applications.


Isn't that more or less what the rest of the comment says?


I thought LTE was supposed to encompass all future generations of mobile networks. Doesn't LTE stand for Long Term Evolution?


Phone makers need something new to justify as a new feature to sell those $1000 phones.

'LTE generation 4.3' just doesn't have the same ring to it.


One thing I always wonder: your typical cell plan (that I'm familiar with) with, say, 6 GB per month would be completely used up within, what, 5 minutes or less? And then you can either pay more or wait for next month? What's the point?

(Yes, and presumably they'll increase the cap, but even a 100x increase would give you 1 full day of usage per month.)
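
For what it's worth, the arithmetic behind that worry is easy to sketch; the sustained rate below is a hypothetical assumption, and the answer scales linearly with both the cap and the rate:

    # How long a monthly data cap lasts at a sustained download rate.
    # CAP_GB is from the comment above; RATE_MBPS is a hypothetical figure.

    CAP_GB = 6
    RATE_MBPS = 200

    seconds = CAP_GB * 8 * 1000 / RATE_MBPS
    print(f"{CAP_GB} GB at a sustained {RATE_MBPS} Mbps lasts about {seconds / 60:.0f} minutes")
    # At lower sustained rates, or with a 100x cap, the same formula stretches
    # the answer to hours or a day.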


But isn't 5G really short range? Isn't there a big problem with the frequency being blocked even by human skin?


I agree with this, but in a sense it's also rather underwhelming: there's no killer use-case right now that would be instantly popular and economically viable if all it took were slightly faster, even lower-latency IP connectivity emanating from the same four companies' installations everywhere. Every idea is just the industry's and pundits' wishful thinking, where modest and grand visions mingle in our imaginations before likely suffering an underwhelming end -- probably poor execution, or a nonexistent business model. Except, of course, cloud gaming.

But that's where the economics get fuzzy too. Building out all these base stations will be an enormous cost. Mainstream consumers may tolerate modest price increases for connectivity, but much fewer will bear significantly higher prices, or spring for significantly better plans. Such a market segmentation would also dampen the consumer excitement for use-cases as well.

To get around this, corporations who want to ensure connectivity for their application will push to become MVNOs and offer captive access to the corresponding product, so that the end-user doesn't have to pay the cost directly. This works best when they control the hardware too. Vertical ecosystems will proliferate, where the experience can only be consumed using the corresponding hardware.

It's not hard to imagine the likes -- and competitors -- of an always-connected successor to the Nintendo Switch, streaming games from a nearby server farm using a captive MVNO, or one of the many Amazon or Google's decidedly non-gaming, 'smart hub' devices that ensure their own connectivity without the need to put them on your Wifi. This has serious implications for privacy and business models too: it will be commonplace for devices to be connected to the home base by default in a way that's difficult to thwart, but correspondingly license and authorization servers will always be reachable, so DRM-enforced subscription business models can continue to thrive.

But the issue is, you can already do the entire latter part -- the always-connected home hub, or the always-on DRM captive media player with LTE or lower, but it's not yet done. Why? Because people willingly join them to their Wifi for free. 5G will have to fit into the holes left by existing alternatives, and do it at a price point or cost structure that makes sense.

I expressed my view before [1] that most of the hype surrounding 5G is the industry's own buzz -- likely to get investors excited -- and then amplified by tech journalism, whether intentionally or unwittingly. It greatly remains to be seen how much its deployment lives up to its big expectations.

[1] https://hn.algolia.com/?query=niftich%205G&type=comment



