I remember wiring up Cat 5 gigabit about 20 years ago in an industrial workplace. My house, of about that age, also has Cat 5 everywhere and it's aged very well. 1gig is still the standard for wired, with a few but expensive 2.5g options and very expensive 10g options for homes and small businesses.


I live in a house nearing 20 years old, and I was incredibly pleased when I moved in and realized that all the phone jacks in this house were backed by Cat 5. If I was willing to invest the time (which I am), I could have at least one Ethernet jack in each room, and a pair channeled up to the attic as well. My only regret was that they stripped way more than needed and didn't leave a lot of wire available, but enough that I could terminate and add a keystone jack that will last past my needs. Or so I thought, until I learned that my ISP offers 2.5Gbps to the house.


Implicit QoS - each device has a max 1Gbps share of the 2.5Gbps connection. Most hardware can't handle more anyway.


That's exactly how >1Gbit/s connections get sold. Hardware with multiple high-speed ports is very expensive, so typically you get one 2.5G/10G LAN port and all the other ports are 1G. So they say you can have multiple concurrent 1Gbit/s streams.


> Implicit QoS

Hahaha. Love it.


Even now 1gbps is plenty for virtually every end user and I suspect it will be good enough for a long time (maybe VR changes things??).


Neural compression is an emerging field and already shows some striking compression abilities, especially if the compressor/decompressor includes a large model which amounts to something like a huge pre-existing dictionary on both sides.

Stable Diffusion XL is only about 8 gigabytes and can render a shocking array of different images from very short prompts with very high fidelity.

1gbps might be enough for more than we think.


Deterministically?


Sure. The only reason image generators aren't deterministic is that you inject randomness. Set the same random seed, get the same image. Download Stable Diffusion XL and run it locally and try it.

There are models that can be run in both directions. Take a source image and generate a token stream prompt for it. That's your compressed image. Now run it forward to decompress.

It's CPU intensive, but massive compression ratios can be achieved... orders of magnitude better than JPEG.

It's lossy compression, so we're not violating fundamental mathematical limits. Those bounds apply to lossless compression.
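
A minimal sketch of the determinism point, using the Hugging Face diffusers library (the model ID and prompt are just examples, and bit-exact output is only expected on the same model/driver/GPU stack):

    # pip install torch diffusers
    import torch
    from diffusers import StableDiffusionXLPipeline

    pipe = StableDiffusionXLPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",
        torch_dtype=torch.float16,
    ).to("cuda")

    def render(seed):
        # Fixing the seed fixes the injected noise, so the same
        # prompt + seed reproduces the same image.
        g = torch.Generator("cuda").manual_seed(seed)
        return pipe("a red bicycle on a beach", generator=g).images[0]

    img1, img2 = render(42), render(42)
    # img1 and img2 come out pixel-identical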


Well, the value proposition of image formats is 1. transmission, which requires both sender and receiver to have the exact same model, which in turn requires us to standardize on some model and ship it with everything until the end of time, and 2. archival, which would require storing the model alongside the file (which might more than counteract any data saved from improved compression) and would be highly fraught because, unlike existing decompression algorithms, it cannot be described in simple text (and therefore reimplemented at will), which risks making the file inaccessible and defeating the point of archival.

It's a cool idea, especially for high-bandwidth, low-value contexts like streaming video calls, but I don't think it's going to wholesale replace ordinary lossy formats for images or prerecorded video delivery (and this is without considering the coding/decoding performance implications).


Most people using streaming video don't require archival support. The source, the Internet Archive, etc. can manage the archival part.


From my puttering, image->tokens->image yields something that may or may not vaguely resemble the original, but is never anywhere near identical.


My personal experience with VR is that 1gbps is plenty; the issues with VR boil down more to things like latency (for instance, streaming a Quest wirelessly with Virtual Desktop basically requires Ethernet; with regular wifi the experience is just awful).


Do you have WiFi 6? I found that to be adequate for me and my Quest needs.


There's only so much data the human brain can pay attention to. And big data you're running in data centres anyway, while you send control data back and forth from home.


> very expensive 10g

For me it's just 2€ more expensive than the standard 1g plan. It's really unfortunate to see how bad internet prices are, especially in the US and other countries with ISP monopolies. The only reason my internet is so relatively cheap is because early on there was a lot of competition in my country.


That isn't what is expensive.

If you get 10 Gbit, the question is: then what? I have bidirectional 1 Gbit plugged into a $200 EdgeSwitch, which then feeds Cat 6 throughout the home ($100/500 feet). This then filters down to $20-30 unmanaged 1 Gbit switches elsewhere. The whole thing is under $500.

If I wanted to go up to 10 Gbit I wouldn't just need to change to a ~$2K EdgeSwitch, I'd also need to run fiber/Cat 6A to be able to deliver more than 2.5 Gbit to any endpoint, then invest in expensive switching infrastructure elsewhere in the home to turn the incoming 10 Gbit signal into something more devices can accept (e.g. 1 Gbit or 2.5 Gbit).

A safe estimate is that going from 1 Gbit bidirectional to 10 Gbit bidirectional would be $5K in equipment and pulling new cable.

For $100/month I can get 10 Gbit, but I won't, because of the equipment cost and diminishing returns rather than the ISP cost difference. If network equipment comes a LONG way and I can do it for under $1K, I'd consider it.


Correction: you can usually run 10Gbit/s over CAT5e if the cables are not too long, so you probably won't need to replace your cables. But the hardware is indeed expensive.


That isn't a correction, that is pedantry. Nobody is going to spend $100/month and thousands on equipment and then run their 10 Gbit/s network on Cat 5e. The lengths that remain stable won't even bridge floors of a home, let alone run end-to-end.

If it were free in terms of equipment you might have a point, since then 10 Gbit would just be a "bonus". But it isn't free, or even close. So you'd be cheaping out on the final 10% of the cost.


I'm not sure what you mean, good quality CAT5e cables should easily give you 10Gbit/s under 30m. If it works, why replace it?


> The lengths to remain stable won't even bridge floors of a home let alone from end-to-end.

This isn't true. I've run 10gbit over cat5e many times in 10-30m lengths. That might not be enough for end-to-end in your house (although it is for many people's), but it's certainly fine between floors.

10gbit switches are becoming significantly more common. Vendors like fs.com and even Netgear offer some reasonably priced options. Mikrotik and other vendors offer better (pricier) options, but presumably if you want 10gbit (or even 1gbit) you're an enthusiast or a business anyway.


There's much cheaper equipment than $2K. Try something from Mikrotik. They have offerings for 100G for less than half of that.


Can you say what country?


With Digi in Spain 10G is €5 more than 1G - https://www.digimobil.es/fibra-optica/


I just got on Digi 10G and I measure 8 Gbit/s symmetric, for… €25. Who can complain about that?

Anyway, I hadn't paid attention, and yes, it turns out consumer hardware support is the bottleneck, not the actual ISP, which is nuts. Seems like someone fell asleep at the wheel. I suspect people are going to have a wake-up call once >1G becomes widespread: not even their fancy new motherboard will support it. For ages it felt like the opposite: you bought hardware that was prepared for the future, or at least the present. At least that's how it's been with SATA etc.

For instance, the cheapest 10G NIC is like $75, and the router can only do 10G on one port. Wifi 6 doesn't get near those speeds either, and even then laptops have to be very new (even my M1 doesn't have full support for 802.11ax; my partner's M2 was at least able to pull ~1.6 Gbit/s). Switches and routers are premium priced.

However, I was very impressed with the silent progression of Cat cables; they just get better and better every iteration, backwards compatible and dirt cheap. Got a Cat 8 cable, which is way, way more than I need, for like €1/meter. And they're flat and easy to pull long distances. Beautiful tech!


It's not so gloomy: 2.5G ports are becoming standard on consumer desktop chipsets, and switches are not that expensive. For 10G you can get copper easily, but SFP+ is more common; I guess once chipsets get faster and don't consume as much power/run as hot, copper might return there as well.


> 1gig is still the standard for wired

A lot of people already get more than that from their ISP. I've had at least 2.5g on every consumer product from the past 5 years. Small businesses use at least SFP at the floor level. Yada yada yada. Point is, it's probably a regional thing.


Yeah, no. You live in a bubble. Most consumers are not above 1g even on fiber. It's probably not even regional, it's very location specific.


Where I live (the Netherlands) gigabit is available pretty much country-wide. Not everyone subscribes to a gigabit plan, but it's available to them if they want it.

ISPs are now just starting to roll out multi-gig; a few are already offering 2.5 or 4gbit plans. Even the ones that don't offer multi-gbit plans yet are already installing 10gbit-capable CPEs. I suspect 10gbit service will become available nationwide within a few years.


Nah, you're in a very small minority. Most households have a few hundred megs at most.


You're probably in the US. Meanwhile, in Switzerland, people are enjoying 25gbps internet: https://www.init7.net/en/internet/fiber7/


I'm in the UK. What I stated applies to most countries. Switzerland is a niche.


You also need 2.5G-capable hardware in all of your home devices, though.


No, you don't. You only need it if you want to pull 2.5 gigs on one device.


"One device" could be your router, and if your router can't do 2.5gb then nothing will be doing 2.5


A lot of people?

2.5G is rapidly approaching, if not already past, the point for a lot of people where a single machine will never use all of that capacity, and the advantage of higher total bandwidth is to support multiple people doing high-bandwidth tasks.

In this scenario a 2.5G (or 10G) router is all that's really required to get the benefit, while using the existing 20-year-old wiring.


>rapidly approaching, if not already past, the point for a lot of people where a single machine will never use all of that capacity

Now where have I heard that before...


Back in college we had 100Mbit internet connections in our dorm rooms when most people had 10Mbit cable or DSL at most. At the time it was considered ridiculously fast and certainly not something an average consumer would ever need.


> Now where have I heard that before...

Ok, sorry: in another 30 years' time people might want more than 1G to do brain dumps to their robo-shrink.

In 2023, there are very few uses for home users that will exceed what a 1G connection can provide.

But please enlighten me about where you think you've "heard this before"?


> In 2023, there are very few uses for home users that will exceed what a 1G connection can provide

Video games are getting bigger all the time; the latest Call of Duty is apparently 200GB. On 1gbit you are limited to 125MB/s downloads (assuming zero overhead), so that's almost half an hour to download. PCIe 4 SSDs are capable of write speeds of about 7GB/s, and PCIe 5 SSDs are just hitting the market with even faster speeds. At 10Gbit you can download that game in less than 3 minutes. In neither case are you even approaching the speed at which your PC can store that data.

When PCIe 5 SSDs go mainstream, a home PC user would even be able to saturate a 100Gbit connection.
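
For anyone checking the arithmetic, a quick back-of-the-envelope sketch in Python (numbers from above, protocol overhead ignored):

    game_gb = 200                                # Call of Duty-sized download

    for link_gbps in (1, 10, 100):
        mb_per_sec = link_gbps * 1000 / 8        # 1 Gbit/s = 125 MB/s
        minutes = game_gb * 1000 / mb_per_sec / 60
        print(f"{link_gbps:>3} Gbit/s: {minutes:.1f} min")

    # 1 Gbit/s: 26.7 min, 10 Gbit/s: 2.7 min, 100 Gbit/s: 0.3 min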


Please, I implore you: read the line you quoted again, and then perhaps pull out the old Oxford English and look up what "very few" means.

I'll be generous and give you a hint: it doesn't mean none.

But your example also has great relevance to the "familiar" sentence in my original comment, which was:

> 2.5G is rapidly approaching, if not already past, the point *for a lot of people* where a single machine will never use all of that capacity, and the advantage of higher total bandwidth is to support multiple people doing high-bandwidth tasks.

I italicised the part that I knew people would somehow ignore in my original comment and I've done it again, because obviously once wasn't enough.

Here, let me pull out the important words yet again just to make it really clear:

> for a lot of people


Not the commenter but I’ve heard people make statements like that time and time again, only for those limits to be obliterated a few years later.

The thing is, the moment a new upper bound becomes available, developers find a way to use it. It's like the freeway problem, where adding more roads ironically adds to congestion.

Take storage: as the storage capacity of media increased, game assets grew larger. As CPUs and system memory got faster, our operating systems and desktop software got heavier.

Likewise, the faster our internet becomes, the more dependent we will become on streaming high-fidelity content. 4k on a lot of streaming services is compressed to hell and back to work with people on slower internet connections. And much as Google Stadia was shut down, video game streaming services aren't a failed experiment. Plus, even with more traditional services, how many of us roll our eyes at multi-hour download times for new games?

Once gigabit internet becomes the norm (it's commonplace in a lot of countries already, but it's not quite the norm yet), you'll see more and more services scale up to support it, and those on the cutting edge of the tech curve will find that gigabit internet isn't quite fast enough any more. And that will happen sooner than you think.


> 4k on a lot of streaming services is compressed to hell and back to work with people on slower internet connections.

A 4K UltraHD Bluray (that's 100GB for one movie) has a maximum bitrate of "just" 144Mbps. If you're suggesting online streaming services have some swathe of content that's (checks notes) in excess of 7x the bitrate used for 4K Bluray discs, I'd love to hear about it.

> video game streaming services aren’t a failed experiment

I'd have thought latency was a far bigger concern here, but even if not: it's still just sending you a 4K video stream... it just happens to be a stream that's reacting to your input.
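
(For anyone checking the "7x" figure above, it's just the ratio of a 1 Gbit/s link to the Bluray ceiling; a trivial sketch:)

    link_mbps = 1000          # 1 Gbit/s connection
    bluray_max_mbps = 144     # UHD Bluray maximum video bitrate
    print(link_mbps / bluray_max_mbps)   # ~6.9, i.e. roughly 7x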


> A 4K UltraHD Bluray (that's 100GB for one movie) has a maximum bitrate of "just" 144Mbps. If you're suggesting online streaming services have some swathe of content that's (checks notes) in excess of 7x the bitrate used for 4K Bluray discs, I'd love to hear about it.

We are still a long way off parity with what our eyes can process, so there's plenty of room for bitrates to grow.

Plus the average internet connection isn't just streaming a video. It's kids watching online videos while adults are video conferencing and music is being streamed in the background. Probably with games being downloaded and software getting updated too.

A few hundred Mbps here, another few there. Quickly you exceed 1 gigabit.

> I'd have thought latency was a far bigger concern here, but even if not: it's still just sending you a 4K video stream... it just happens to be a stream that's reacting to your input.

Latency and jitter matter too. But they're not mutually exclusive properties.

Plus, if you're streaming VR content, that's multiple 4k streams per device. And that's on top of all the other concurrent network operations (as mentioned above).

You're also still thinking purely about current tech. My point was that developers create new tech to take advantage of higher specs. It's easy to scoff at comments like this, but I've seen this happen many times in my lifetime -- the history of tech speaks for itself.


> Plus the average internet connection isn't just streaming a video. It's kids watching online videos while adults are video conferencing and music is being streamed in the background. Probably with games being downloaded and software getting updated too.

That's exactly the scenario I gave where 2.5G WAN would be useful, but a 1G LAN to each machine is likely enough for most tasks, for most people: multiple users' simultaneous use.


> That's exactly the scenario I gave where 2.5G WAN would be useful, but a 1G LAN to each machine...

You're moving the goalposts now, because your original comment, the one that sparked this discussion, neither mentioned 2.5G WAN nor made clear that your 1G comment was specific to each machine rather than internet connectivity as a whole.

> but a 1G LAN to each machine is likely enough for most tasks, for most people: multiple users' simultaneous use.

For today, yes. But you're demonstrating a massive failure of imagination by assuming those needs are going to be the same in a few years' time. For example, the 4k figures you're quoting are fine and dandy until you take into account that TV manufacturers are going to want to sell newer screens. Which means more emphasis on content with higher colour depths, refresh rates and resolutions. This isn't even a theoretical point: there are already 8k @ 120FPS videos on YouTube.

Heck, I've already hit the 1GbE limit for a few specific bits of hardware in my home setup: mainly my home server and some "backbone" wiring between two core switches which join two separate buildings on my property. But if I'm hitting that limit today, then it's not going to be many more years before other people start hitting it for far less esoteric reasons than mine.

You're also overlooking the fact that if you have a router providing GbE to desktops and WiFi 6 to other devices, it's very unlikely to be powerful enough to switch all of those devices at gigabit speeds, let alone route at 2.5G to the WAN. And that's with regular IPv4 packets, never mind the additional overhead that IPv6 adds. Underpowered consumer networking equipment is already impacting home users right now. So again, we are seeing limits being hit already.

---

Let's also not forget all the other noise being introduced into homes. Smart speakers uploading voice recordings for speech-to-text analysis. Smart doorbells and other security devices uploading video. Smart lights, fridges, plugs, plant pots and whatever else phoning home. Set-top TV boxes and other multimedia devices phoning home, downloading software updates and streaming adverts. In fact, have you ever run Wireshark on your average home network recently? There is a lot of noise these days, and that's only set to grow exponentially.


Not moving the goalposts at all, mate.

My original comment, in reply to someone saying "a lot of people already get more than 1G from their ISP" and implying that it's therefore worthwhile to have 2.5G Ethernet on all local devices, ends with:

> In this scenario a 2.5G (or 10G) router is all that's really required to get the benefit, while using the existing 20-year-old wiring.

I'm sorry if the correlation between having a 2.5G router and having greater than 1G WAN wasn't obvious to you.

Complaining that a quasi-backbone link saturates gigabit Ethernet, when my entire point was that single computers are unlikely to need more, rather misses the whole point I was making, just for an excuse to complain.

I never said no one needs more than gig for anything.


AFAIK we're still very far below the dynamic range human eyes are capable of seeing, so there's plenty of room to need higher bit depth (and rate) for video as displays improve. Our color gamuts also do not cover all of human vision.


So I had to use a calculator to help me here, and I used https://toolstud.io/video/bitrate.php, but apparently the raw bitrate for 4K@25fps/24bit is 4.98Gbps, which then obviously gets compressed by various codecs.

Taking the above 4K@25fps/24bit and pumping it up to 60fps and 36bit colour (i.e. 12 bits per channel, or 68 billion colours, 4096x as many colours as 24bit, and 64x as many colours as 30bit) the resulting raw video bitrate is 17.92Gbps... so it's an increase of <checks notes> about 3.6x.

It seems quite unlikely that we'll have every other aspect of 36bit/60fps video sorted out but somehow be stuck with codecs that perform worse than the ones already available today.
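
The raw-bitrate arithmetic behind those numbers, as a quick sketch (assuming 4:4:4 sampling, so bits per pixel equals the full colour depth):

    # Raw (uncompressed) video bitrate: width * height * fps * bits_per_pixel
    def raw_gbps(w, h, fps, bpp):
        return w * h * fps * bpp / 1e9

    print(raw_gbps(3840, 2160, 25, 24))   # ~4.98  (4K @ 25fps, 24-bit)
    print(raw_gbps(3840, 2160, 60, 36))   # ~17.92 (4K @ 60fps, 36-bit)
    # ratio: ~3.6x, matching the figure above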


My understanding is that today's HDR sensors and displays can do ~13 stops of dynamic range, while humans can see at least ~20, though I'm not sure how to translate that into how much additional bit depth ought to be needed (naively, I might guess at 48 bits being enough).

I don't see why we'd stop at 60fps when 120 or even 240 Hz displays are already available. Also 8k displays already exist. The codecs also have tunable quality, and obviously no one is sending lossless video. So we can always increase the quality level when encoding.

So it's true in 2023 (especially since no one will stream that high a quality to you), but one can easily imagine boring incremental technology improvements that would demand more. There's plenty of room for video quality to increase before we reach the limits of human eyes.


I believe they are referring to the quote attributed to Bill Gates, "640K ought to be enough for anyone", in reference to RAM.


I guess I shouldn't be surprised that someone who believes the debunked Gates quote is real also can't comprehend the difference between "for anyone" and "for a lot of people".


I just got 5gbps symmetrical FTTH installed. I'm in Michigan, so hardly some connectivity utopia. I'm going through a round of upgrading my network devices to be able to actually handle it.


You've never visited Germany, have you?


I'm in the Nuremberg area. We have no fibre for our company, in an old building in downtown Nuremberg.

I live in a very small town on the outskirts, and we will have no fibre for years to come; only the main town will get fibre in the next few years.


My home was redone at some point in the late 1990s and I also lucked out in this regard with Cat 5 used for telephony, but easily converted to proper Ethernet.

I ended up purchasing a "lifetime" spool of Cat 6 to fill in some blanks, and now it's the optimal networking setup for me.


Most cat5e structured cabling is completely fine for home-length runs at 2.5 or 10 gigabit/s - I am using existing cables for the 10G run from my fiber drop to my router, and for the 2.5G runs from my router to my wireless access points.


hah. Drilling holes to run Ethernet through centuries-old castle walls was my favorite.



