I wanted wired Ethernet in my office but didn't want to punch holes in the wall. Luckily I have an old "central vacuum system" that, as far as I can tell, has never even been powered on; it was installed many years before I bought the house.
I tied a ping pong ball to a piece of string and used a vacuum cleaner to suck it through the pipes, then used the string to pull a stronger rope and then the Ethernet cable. So now my Ethernet goes from my office to the wireless router via the vacuum ducting.
My wired ethernet triumph was in my current rental. There's an old under-floor evaporative air-conditioning ducted system which is no longer used. I bought a $40 remote control car with large rubber wheels and a low CoG, strapped a GoPro and a head torch to it so I could stream video to a phone, and tied on a piece of string. The mission was to get from my study to the kitchen which is about 10m all up.
It was more of a challenge than expected. The ducting had ramps, multiple corners, debris/rubbish, and a mouse skeleton to navigate around. All of this wasn't helped by the terrible latency and needing to move the phone to different places to reduce distance and attenuation through the floor. I got stuck quite a few times but could always use the string to pull back a bit when I got wedged on a rock. Sometimes I just needed to gun it to get over an obstacle.
In the end, the car and ethernet cable were cheaper than a good wireless card or POE kit, and I got to give away the car to a kid.
> It was more of a challenge than expected. The ducting had ramps, multiple corners, debris/rubbish, and a mouse skeleton to navigate around.
My uncle worked for a company that designed and installed commercial ductwork. The joke he liked to tell while they were going in: "Make sure Bruce Willis can get around in there."
I visited an old Victorian National Trust house in the UK where the owner had used his rabbit-hunting ferrets to help install electric wiring under the floors rather than rip up the expensive wooden floors and panelling.
Ferrets were often used to drag cables to the right place. It is apparently an urban myth that Boeing did this in jet manufacture, but they were used in other similar contexts.
I mean, he was running a string under the floor connected to the RC car. He could probably have piped the video (and maybe power) over a thin enough wire to go completely wired.
For those who do not have old pipes to use: most of us have coax, and there is something called MoCA; simply go onto Amazon or wherever and search. It allows you to run your Ethernet over the cable TV coax in your house, "over the top" of any cable TV you have running on it. Cable uses the first 800 MHz of spectrum on the coax and MoCA rides above 1 GHz, so they can coexist. You can extend Ethernet from any coax outlet to any other coax outlet. No hole punching, no central vacuum or whatever.
If you do already have telephone and/or coax cabling in your home, it might be easier in the long run to just replace it to be honest. Ethernet sockets and cables are cheap, and don't consume power.
I was looking into MOCA, but eventually I decided against it due to pricing and latency concerns. In the end, I opened up the coax socket at my router, tied an ethernet cable to the coax cable, went to my office (room above the room where the router is), opened up the coax socket, pulled out the coax cable, and voila, I now had an ethernet cable in the duct.
It's not that simple everywhere. We have brick walls; replacing cables means a lot of dust and repairing those opened-up walls... provided you don't damage anything already there.
So here I am, feeling adventurous for pushing 10 GbE over a run of Cat5e for the next decade or more.
True, although if you are lucky your cables run through cable ducts. My house was built in 1935 and has brick walls as well, but all cables run through plastic cable ducts embedded in the walls. Where I'm from this is very common, as virtually all housing has brick (or concrete) walls.
Unfortunately, I bet a lot of this kind of cable is stapled down along the run. And with enough twist and turns I'm not confident the existing wire wouldn't break while being pulled, even without the staples.
And you don't even have to lose the coax cabling. If you tie two ethernet cables to the coax cable and pull them both through the duct, you can then use the spare one to pull the coax cable back into place.
Ya, well, MoCA 3 is pretty new BTW, and it's not your mommy's Ethernet over coax. You're thinking of old-timey IEEE 802.3; this is full-on OFDM modulation over coax, RF-ey stuff like DOCSIS. FiOS uses this to get your signal in over your house cabling.
I use it to get my mm-wave 5G 2Gbps signal from my antenna on my garage, over the same coax my cable TV runs on, to my lab in my basement, and have a wifi AP for the rest of the place.
Ya, sorry, I stream but I still like my cable, shrug :)
Make sure you put a filter on the outside of your house if it is hooked to the cable system, so you don't create feedback and screw up everyone else around you who's using the cable system.
Interesting. I'll look into it and see whether it's possible.
While my highly-tuned Wifi 6 setup at home is honestly quite excellent (even for my competitive FPS gaming needs, and even wireless VR has been brilliant), it would be nice to ditch it for higher reliability.
Problem is, I'm pretty sure my rental has a single coax socket for cable TV (which is used for my NBN internet connection now). I'll have a look though!
Also, bridging Ethernet to MoCA does add about 3-5 ms of latency round trip. Whether this is of any concern is another thing, it probably isn't for most - but it is something to know.
As I mentioned in the top post, it runs over the top of DOCSIS, which is what the cablecos use and which sits at 800 MHz and below; MoCA is above 1 GHz, so they coexist.
Yeah fair enough, though I think the bigger challenge for me is that most places I've lived in, in Australia, have a single cable connection point in the lounge room and that's it (for Foxtel). Though I'm going to check!
For what it's worth, I saw this comment and bought MoCA hardware (ScreenBeam Bonded). After an Amazon delivery and ten minutes of setup, I'm happier with Netflix / Prime / Hulu / etc. There is literally no comparison with the performance I'm now getting.
I have Fiber to the house and then an Eero first generation to provide Wifi. I also have an AmpliFi HD. And we still had regular (3x per week) problems with streaming video.
I learned about MoCA from this thread too. Placed an order the next day and set it up this morning, and now my desk downstairs has a direct wired connection running at full speed, all the way across the house. This was an incredible solution for my needs!
There is also Ethernet over powerline via powerline adapters. I run these in my apartment. From the router you plug a “host” adapter into an outlet and plug in the ethernet cable. Now you have a source. Throughout the rest of the apartment or house you plug in a receiving adapter (there’s a little sync button to sync it to the host) and plug in an ethernet cable as an output. Plug that into a small switch if you need more connections. Repeat throughout your home. It’s just as cool as MoCA.
Powerline sucks. It's slow (in most houses you're lucky to get 100mbps throughput even with high end "2000mbps" adapters), and it tends to suffer from bad jitter and latency spikes to the point where it's more like bad wifi, while MOCA is generally pretty close to ethernet in terms of having basically no additional jitter or ping spikes (only real downside is the cost of the MOCA adapters, and that it's a shared link between all the adapters so you're limited to eg. 2.5Gbps combined on all MOCA links, which isn't a problem for most people).
Also, for the cost of either powerline or MOCA adapters you can buy a big roll of CAT6 CMR and a fish bit and run ethernet through the walls (just do a good job on the patches if you're renting).
I have to begrudgingly agree with you. A couple years ago I plopped a couple powerline adapters on either end of my house. They're on different circuits, but the same phase of the panel.
It worked great. I got a consistent 80Mbps through them. I was happy and probably commented here or elsewhere about how they can work, or, "don't dismiss it, it might just be the trick."
Well, now they're pretty flaky. They still work just enough for me to keep using them, but other WiFi issues I'm having are convincing me that I gotta stop being lazy and run some cable to all my rooms. (I even have a crawlspace! It will be relatively easy if I can just get cracking on it.)
I don't know what changed between then and now. Phone chargers on the circuit? That random smart plug I bought last year? My A/C cycling in the summer? Doesn't matter. Powerlines are a hostile environment for high-speed data.
If you have accessible spaces and don't have high ceilings, this is true. If you have neither basement nor attic (or they are not accessible easily), or you have ceilings high enough to require crossbraces, and especially if you only need to make one or two runs, MoCA is very cost-effective.
I've gotten a pretty reliable 400-500mbps over a "2gbps" powerline adapter. Granted this was in small 1-2br apartments, but you're probably dealing with only one coax jack in the whole place and it's not feasible/worth it to run your own cabling.
Imo, in a small 1-2br apartment it's easy enough to run ethernet and hide it along the walls, either by tucking it under the moulding if you have carpet, or using cable hiders.
I'm also just laser focused on good consistent latency though after years of suffering awful cable internet and trying to minimize all latency that I could control (https://i.imgur.com/i84mIsD.png was what it was like during the day in 2020 in the first wfh period. Luckily I've only been getting consistent spikes up to ~200ms throughout 2021 /s).
edit: That ping was on hardwired ethernet with no load as well, and Rogers (realistically the only ISP where I live in Canada because my available Bell DSL connection is too slow) was throttling uploads to 4mbps.
My go-to solution for routing ethernet while renting is 3M Command Hooks. They are remarkably strong while being easy to remove – albeit a tad on the expensive side.
They basically turn all the electrical wires in your house into a giant antenna and radiate broadband noise.
And they do it very poorly, I've never been able to get powerline networking to work well. I had trouble reaching the Wifi node in my office from the bedroom on the far side of the house, I thought powerline networking was the answer, but the link was not that fast (around 10mbit), with very variable latency, anywhere from 5msec to 100msec, and around 2% packet drops. Both outlets were on the same leg of my home power panel, I verified it at the breaker box.
I finally ended up putting up a couple of Ubiquiti NanoStation M2s (the bigger ones with the 10 dBi antennas) aimed at each other through the walls to act as a point-to-point network, and it was faster and more reliable. Still not super fast, I get like 30mbit, but latency is a nice constant 7msec, with little packet loss.
But in a previous house, the walls were real plaster with metal lath behind it acting as a Faraday cage, and I had no choice but to use powerline networking (it was a rental house, so I couldn't easily run ethernet).
It very much depends what band/frequency you are listening on. An Icom TRX could be HF, VHF, or UHF. And of course it will depend where your antenna is.
I would choose MoCA over powerline. Powerline trips AFCI breakers, is really sensitive to other devices and wiring quality, and even in the best conditions seems to perform worse (in my experience).
I use those and have a generally positive view on them, but from personal experience, the adapters tend to de-synchronize, and re-synchronizing them is a pain (first of all because you need to actually access them, but since they are ugly they are generally kept hidden in hard to reach places).
They also have the other issues people reported.
IMO, they are fine in a rental where you don't have a lot of options and don't expect too much, but if you have a choice, use a better alternative.
Powerline networking is a terrible idea. Trying to feed RF over unshielded and electrically noisy wires is doomed to failure, as the signals can leak both in and out. P/L will cause interference to other services and also suffer from interference.
The big problem is that it might work just fine one day, then fail catastrophically if anything in the environment changes.
Oh for sure there’s no way I’m getting those speeds. I don’t have that need though. A simple 1Gbps is enough for me. While advertised as 1Gbps adapters, I get at max 800Mbps which is fine. I don’t suffer from stuttering or packet loss as reported by others. I also don’t have a bunch of RF interference as reported by others. So I’m happy with them and will continue to use them.
oh man this is why i read HN; i had no idea that this existed-- somehow-- and am the kind of person that runs ethernet cables all over the house rather than compromise in any way (powerline SUCKS ASS); and also i happen to currently live in a house with 1.2 trillion coax jacks, but no cat5/6 wiring to speak of, so this seems like it completely solves my problem in the best possible way. i don't know if i could possibly be more excited than this (i wish i were kidding more than i am).
yeah, sure, if i got a bunch of really fancy wireless equipment, that could probably swing it too, but would also require getting new WiFi cards to actually be able to take advantage.
I learned about it in the last house I had, which was in a FiOS (fiber) available area. I of course ordered it, and when they were installing it I asked the guy a bunch of questions; one was "how are you getting the signal across my cable TV coax", and he told me they were using MoCA, so I googled it later and thought wow, that's interesting.
Then later I moved to this place and there is no FiOS, but I have mm-wave 5G (2Gbps). There was no Ethernet cable in the house, so I thought, hey, I know, I will use that MoCA thing, and it worked like a charm: got my signal from the CPE antenna unit mounted on my garage (best signal) down to my lab in my basement.
I think it is a pretty decent review of the pros and cons of using Ethernet over coax vs. Ethernet cable. Perhaps others not reading this thread could also benefit from the knowledge.
The various cable solutions people have replied to you with are interesting. Mine seems downright uncouth by comparison.
I just ran mine up near where the wall meets the ceiling. When inside I look down a lot more than I look up so the ugliness of the cable doesn't bother me.
Here's part of my Ethernet cable run [1] through the living room. The other cable in that picture is a speaker cable for one of the home theater rear speakers.
You can also just own it, cables are not inherently ugly. I’m not saying go full on Centre Pompidou, but a couple of dirt-cheap adhesive clips in an appropriate colour can go a long way (though not if your walls are plastered). The setup in 8-Bit Guy[1]’s old studio is a good inspiration.
(Sometimes the clips can be a bit too cheap, in which case tearing off the original crappy adhesive and using some of that insanely-strong double-sided 3M tape helps.)
They are sharing their stories because they are more clever than the rest of us.
I had a cable going diagonally across the wall for ages before the SO put their foot down and demanded something slightly more aesthetically pleasing. Now it follows the trim at least.
And if you don't need it to look especially great, surface-mounted trunking is decent enough and affordable. Actually it is very common here: a small rectangular channel mounted either where the wall meets the ceiling or down at the floor.
There's cable gutter kits you can probably get at your local DIY store to make that look a bit neater if you're looking to improve. They're usually stick-on so no drilling or anything required.
There's also decorative ones that run along your floor, I forgot the English name, but they're a bit bigger than the usual ones and can hold a number of cables out of sight.
You mean baseboards? Checking them is a good idea; some do have a space for running a cable. So if there's no need to go over a corner, it might be an option.
If you are thinking about doing this make sure you use plenum-rated cable, otherwise in addition to packets your Ethernet cables will also be an excellent transport for fire.
I’m under the impression the concern is more about smoke being emitted during a fire which would then filter into a room and obscure escape paths.
This is more of an issue in a commercial building where you have lots of people who need to move through potentially narrow escape routes than it is in a home where every major room will have multiple means of egress.
Off-topic but since you brought it up, does anybody ever use builtin central vacuum systems? I have several friends who have houses built during the era when these things were popular but none of them ever use the systems.
Yeah I definitely do, the hose is stupidly clunky but as someone with allergies the big advantage is there's no pass through of fine particles into the room you're vacuuming.
There are systems these days where the hose retracts up into the vacuum tubing in the wall, so you just pull out however much you need and then let it feed back into the wall when you're done.
This is what I have and it's great. I also have a toe kick adapter, so I can use a regular broom and sweep the dust to a little port under a kitchen cabinet and it automatically sucks it up.
I don't have allergies, I just never liked the way it smells so dusty after using a vacuum cleaner inside.
> Off-topic but since you brought it up, does anybody ever use builtin central vacuum systems? I have several friends who have houses built during the era when these things were popular but none of them ever use the systems.
I grew up in a house with one. I haven't lived there since 1998, but, yes, we used it all the time. As my sibling commenter lreeves says (https://news.ycombinator.com/item?id=31361950), it's good if you have allergies.
I also once heard a vacuum cleaner salesman say (and yes, that used to be a thing) -- we know we can make vacuum cleaners less noisy (drop the universal motor IIRC), but people buy vacuum cleaners assuming the noise == power.
My house has this. I've only used it when something was wrong with the alternative. I love the idea, but the hose is huge and ours has a minor defect where it turns itself off if you plug the hose in completely. A Roomba and then a handheld Dyson for details is much more convenient.
Yes! Use mine all the time. Strong sustained suction and much quieter. Less frequent bag emptying. Air quality seems better. Hose is a pain to deal with, but not a big deal as long as you have somewhere to store it.
Our house, built in 2004, has one. We’ve never used it. The closet that has the vacuum in it also has some other equipment and is a bit of an IDF for my home network. Strongly considering removing the vacuum unit entirely and using the ducting as conduit per the above comment!
> I wanted wired Ethernet in my office but didn't want to punch holes in the wall. Luckily I have an old "central vacuum system" that, as far as I can tell, has never even been powered on; it was installed many years before I bought the house.
Lucky you, you had a preinstalled system of cabling conduit :)
Before I had my basement finished, I installed a bunch of PVC conduit for the network wiring, so it would be possible to upgrade or add to it in the future.
I once wired up an apartment for ethernet using the old rj11 (telephone) wires. The phones were all run with cat5e so it was very easy. If you're only doing 100bt you only need two pairs for full duplex, leaving two other pairs to run two phone lines. There are a lot of homes built using cat5 for the phone systems with 2-3 unused pairs in each cable. All I did was swap out the faceplates for rj11/rj45 combo plates and rewire.
Of course these days you might just give up the land line entirely to get gigE over all 4 pair and just remove the old rj11 jacks.
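Roughly, the split looks like this (assuming T568B colours; double-check how your phone pairs are actually punched down before rewiring anything):

    pins 1,2 (orange pair) -> 100BASE-TX transmit
    pins 3,6 (green pair)  -> 100BASE-TX receive
    pins 4,5 (blue pair)   -> phone line 1
    pins 7,8 (brown pair)  -> phone line 2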
Around 20 years ago the Internet suddenly became a thing, but you had big hotels that were not pre-wired with RJ45, and the owners did not want to spend lots of money running new cables.
One solution was to use the existing phone line in each room and put an ADSL modem on it. They got Internet (and kept analog phones) in every room by just upgrading the endpoints.
Yeah, 1 pair for signal, one for shielding. Cat3 and Cat5/5e/6 aren't the same - but I think you can practically get up to 1gbps over Cat3 - 10gbps is unlikely.
The "shielding" is done by measuring the difference in signal between two wires in a pair -- that's why twisted pairs are used. There isn't a dedicated pair for this purpose.
You need only one pair to run half duplex 10/100bt (contrary to what's stated above), and two pair to run full duplex 10/100bt. In FD, there's one pair each for tx/rx. 10/100bt are very similar in this regard.
The different cable categories mostly relate to the wire quality and the number of twists in each pair.
> You need only one pair to run half duplex 10/100bt (contrary to what's stated above)
How do you wire your one pair for half duplex so it works with commercially available Ethernet equipment? I'll grant you that only one pair will be active at a time, and that 10base2 works with two wires, but you need two pair for 10baseT-HD. I'm very happy to be wrong, please point me in the right direction.
There's also 10Base-T1S, but it's a new specification (2020), and seems limited to automotive and industrial applications, not generally available.
10base2 is totally different. 10base2 is thinwire coax - as opposed to 10base5, thick ethernet.
With 10baseT, the T stands for "twisted pair," as opposed to coaxial cables. Everything we're talking about here is going to be baseT, for twisted pair.
In 10baseT full duplex there are only 4 of the 8 rj45 pins used. In 10baseT half duplex there are only 2 of the 8 rj45 pins used. I don't remember which pins off the top of my head, but you could easily experiment to figure it out. I'm sure it's using one of the two pair used in full duplex.
You don't need to wire anything differently. When you select half duplex mode in your driver it will start communicating over only 2 pins. This sometimes happens naturally with faulty or broken cables, when it looks like the other pins aren't connected properly.
100baseT is more or less the same as 10baseT in terms of pinouts, but at a higher frequency.
The pinout for full and half duplex is the same, the behavior is just different.
If you get traffic on the RX pins when you're sending on the TX pins, in full duplex, that's great. In half duplex, it's a collision, and you stop sending your current packet and send a jam signal instead for the minimum packet time.
In 10base2, the RX check is more nuanced, because it's reading from the same bus that TX happens on, collision is only signalled if the RX differs from TX. In 10baseT, I think (but again, would love to be corrected), the expectation is that an ethernet hub will isolate a sending port from receiving its own transmission to the shared bus inside the hub, and of course switches are switches.
For the simple case of two Ethernet devices connected with a crossover cable, there is no possibility of an actual collision, but if they run in half-duplex mode, receiving a packet while sending will be processed as a collision anyway.
If you connect only one pair, you can connect RX to RX and TX doesn't go anywhere, TX to TX and TX goes (subject to collisions) but nobody listens, or Peer A TX to Peer B RX and get a one-way connection; peer B may detect link up because it gets link pulses, but peer A won't. I'm not sure what Auto MDI-X does in that situation, but it still won't work.
Autonegotiation usually doesn't do anything smart about failing pins or poor connections either. If you've got two pairs connected, you can autonegotiate (at 1 Mbps, with the 10baseT link pulses) to the best connection offered by both ends, which will then fail to work if you negotiated to GigE and only have two pairs, or if you negotiated to 100M and the cable is really terrible. It's getting somewhat common for OS drivers to detect this "negotiation succeeds / connection doesn't" behavior and restart the negotiation with fewer options though.
If I have some time today, I'll make a single pair cable and confirm, but if it were as easy as you said, there would probably be a web page telling people how to do it.
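(If anyone else tries the same experiment on a Linux box, the negotiated result is easy to script-check from sysfs; a quick sketch, with "eth0" as a placeholder for whatever your interface is called:)

    from pathlib import Path

    def link_status(iface="eth0"):
        """Read what the NIC actually negotiated from Linux sysfs."""
        base = Path("/sys/class/net") / iface
        status = {}
        for attr in ("carrier", "speed", "duplex"):
            try:
                status[attr] = (base / attr).read_text().strip()
            except OSError:
                # speed/duplex raise an error when there is no link
                status[attr] = "unknown"
        return status

    print(link_status("eth0"))  # e.g. {'carrier': '1', 'speed': '100', 'duplex': 'full'}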
I could swear I remember running over a single pair, but it's been a few decades so I might be wrong about this!
If 10bT really does always use 4 pins then I wonder what would happen if tx+/rx+ and tx-/rx- were shorted? I wouldn't be surprised if there's no page illustrating it because why would you ever want to?
If you've got a buried run of two-pair wire that you're running ethernet on, but something happens and one pair goes bad, it might be nice to use the working pair, because trenching is a pain. Or maybe you have two pairs, but you really want a phone line and ethernet (and ethernet+VoIP won't work). Etc.; lots of cases where rewiring is hard, but you don't have enough wires.
Right you are! Twists for shielding, two pairs for duplex - which means that since the 90s on, you practically always want two pairs per connector to enable duplex (be it 100mbit or faster). Ed: but probably 4 for GigE.
I'm so glad that in my country we build houses with ducts for cables, with two separate duct networks (one for AC power, and one for "telephony"). When it comes time to change technology we just push those spring-tipped cable guides through the ducts and then pull the new wire (or even just use the old, outdated wire as a guide to pull the new one). I've installed Ethernet cables and even replaced AC wire (to a wider gauge or more phases) on rentals with no need for any MacGyver shenanigans.
I think in most (West-) European countries there are strict regulations/requirements that require ducts/pipes for any kind of power line. When my house was built (35 years ago) there were already regulations in place that required separate ducts for each group, separate groups for wet rooms (kitchen, bathroom) with grounding, etc.
Overall I found the standards of electrical work in Europe are significantly higher than their US counterparts. In modern buildings every outlet and line has mandatory grounding, and GFCI is built into the fuse box on every power line for any building that's less than 30 years old. That really came in handy once when I accidentally hit a power line while drilling through the ceiling. I can't imagine doing any kind of renovation work without GFCI on everything.
I've seen none of the problems all these (presumably N American) commenters are reporting, but our power systems are famously much better in Europe. I'm from England but live in Czechia and apart from different plugs everything works the same.
I bought a rice cooker from a Taiwanese woman a couple of years ago and was shocked (fortunately, not literally) to discover that apparently Taiwan runs on American sockets and American (crappy) voltages. Luckily I had an adaptor plug and it seems to cope with 220V mains without releasing the magic smoke.
Other alternatives I've found to work are Ethernet over power (varies depending on how well your house is wired though) and ethernet over coax using MoCA adapters (if your house was previously wired with coax for cable TV these work great)
I've used Ethernet over Power, and it sucks except under ideal conditions. The pre-G.hn versions are quite slow.
G.hn is fast, but all of the G.hn devices I've found seem to use the same chipset (and presumably similar firmware, since the configuration pages have identical options, but different branding), and the chipset seems to buffer a massive amount, which means a single hiccup while using a high-bandwidth application makes the network unusable for a while.
Example: while streaming video over a G.hn adapter, I was running ping. There was a brief hiccup, and then ping times exceeded 20 seconds (!!), finally recovering after a few minutes. I eventually ran a program that ran on devices connected to each side and would reboot the adapter if ping times exceeded 500ms. This made the network far more reliable.
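Roughly, the watchdog looked like this (a from-memory sketch; the reboot step is a placeholder for however your adapter exposes a restart, e.g. a smart plug or its web interface):

    import re, subprocess, time

    PEER = "192.168.1.10"   # device on the far side of the G.hn link (placeholder)
    THRESHOLD_MS = 500
    STRIKES = 3             # consecutive bad/lost pings before acting

    def ping_ms(host):
        """One round-trip time in ms, or None on timeout/error (Linux ping)."""
        out = subprocess.run(["ping", "-c", "1", "-W", "1", host],
                             capture_output=True, text=True)
        m = re.search(r"time=([\d.]+) ms", out.stdout)
        return float(m.group(1)) if m else None

    def reboot_adapter():
        # placeholder: power-cycle via smart plug or hit the adapter's reboot URL
        print("rebooting adapter...")

    bad = 0
    while True:
        rtt = ping_ms(PEER)
        bad = bad + 1 if (rtt is None or rtt > THRESHOLD_MS) else 0
        if bad >= STRIKES:
            reboot_adapter()
            bad = 0
            time.sleep(60)  # give the adapter time to come back
        time.sleep(5)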
This hasn’t been my experience at all. I use TP-Link adapters and don’t see any ping interference or latency. I think it also has a lot to do with how in phase your power is.
I've used the TP-Link AV2000's for a few years, they're OK for me, sure I'd rather have proper hard wired ethernet, but I also prefer not punching holes in walls/ceilings all over my house. It's ridiculous they get marketed as 2000Mbit though, best I've ever got is close to 200Mbit and I'm in a fairly ideal set up for it. Also, never mix lower adapters like 500's, as this seems to cap the whole network at the lowest common denominator, even when they're not in use.
I use the TP-links too, but sometimes, every few weeks, they fall over and one of them needs a power cycle. I put them on the same circuit so they don't have to traverse the consumer unit to talk to each other.
I could have put Ethernet in, but adding a wire under the floor was vetoed in the interests of not making a mess.
I had the same issue, except it was usually every couple of days they'd need replugging to resync. Only regret from running ethernet recently (yes, it was messy) is that I didn't do it 10 years ago.
It’s the AV1000s. I don’t seem to have the issues you all are reporting with them. No packet loss. No latency beyond a reasonable 1ms. My apartment’s power lines are clean, properly grounded, and perfectly in phase I guess.
I ran a set of TP-Link powerline at my last 2 places: in both, I rented a room in shared flats (apartments).
In the first I got a very poor intermittent wifi over 3 floors from my landlord's wifi: 2-5 Mb/s if that. With powerline I got a consistent 60-70 Mb/s.
Yes, they needed rebooting regularly, but it was totally worth it.
2nd place, here in Prague, I bought new units, because the tech had updated. My old landlord still uses my old setup as a wifi repeater, over 4Y later.
New place, my flatmates had put Ethernet across the floor, so it was in very poor shape: walked on, trapped in doors, insulation frayed, etc. Connection was very intermittent.
In that place, powerline allowed me to remove the messy dangerous ugly cable and gave me a steady reliable 200-250 Mb/s connection.
I think the complaints here may be from Americans who are famous for having mains power like wet string, running at half the voltage of the rest of the world. Across Europe it's all 220V, with mandatory earth (ground) pins, and every dwelling is on a separate ring main, so nothing your neighbours do affects your home and there is zero leakage.
Also, electric kettles work, and as I run on tea, that's important. :-D
Really the G.hn PL modulation seems more than adequate, but the behavior when conditions change is incredibly poor; if I had time, I'd find a "sufficiently noisy" appliance, some romex and run them all together. TCP and many video streaming sites are quite good at handling variable bandwidth, but the G.hn seems to turn that into variable latency, which is terrible. The people who design these are presumably not idiots, so I'd like to know why treating noise like congestion isn't sufficient...
Note that Ethernet over power should also work on coax, or really any other cable. (connect adapters to the wire and feed it with eg. 48V power supply)
Not really; the newer ones use the 3 wires (phase, neutral and earth). I have one and reach 400-600Mbps in a really old building with really long and shitty cabling. Ping is 5ms more than pure Ethernet though.
Hard to compare what "crazy" expensive is but I bought a pair of these for $90 https://amazon.com/dp/B01MRV4WA1. For the 3 places in my house I wanted to wire with Ethernet, I was done under $300. Patching drywall, paint touch-ups, new cables, and the overall time and headache would have cost a lot more. I was done in a matter of minutes using old coax and these adapters.
In my case, I didn't build a MoCA network, just used these devices point-to-point. If you go full MoCA, think of it like a distributed hub - no switching, no routing, no security. In fact you should put filters in to prevent neighbors from listening in using their own MoCA adapters. For me, point-to-point worked. YMMV, but Ethernet over coax is definitely worth looking into for up to 1 Gbps speeds, which is plenty for wifi, Netflix etc.
> Who knew that inbound cable internet has outbound data capability?
The physical medium isn't directional, so there's no reason to assume it would not transmit and receive. A coax cable is equivalent to two wires with better isolation from the environment (with one "wire", the shield, usually used for ground).
There's an asymmetry in commercial broadband over coax, but I'm not sure why that is precisely. Perhaps someone familiar with DOCSIS etc. could say.
It's true but you might not need that many. Our house was built in 2000 and every room is wired with coax but no ethernet. With some strategic placement I was able to get a hardwired switch in my office + two hardwired wifi APs with just two MoCA adapters - $300 well spent.
It would be pretty expensive to turn every coax run in my house into a hardwired ethernet switch but well placed hardwired APs obviates the need to do that.
In my case I used just a single pair of them to get wired Gigabit up to my second floor from the basement. Based on my layout and needs that got me everything I couldn't easily handle with Ethernet runs.
Ethernet runs are cheaper in parts, but MoCA adapters are crazy effective at avoiding large drywalling jobs.
Reminds me of my dad who connected to ISP through antenna made out of Pringles can. Surely he found a recipe somewhere but everybody was in awe regardless.
Around here, the cable installers nail the cables around the outside of the cheap houses and just punch holes through the walls to access the interior.
I have a fairly modern house (built mid 2000's) but for some reason wasn't built with ethernet; what was there was phone and coax cable lines, going through these yellow PVC tubes embedded in the wall.
In theory it would be a matter of tying an ethernet cable to an existing coax cable and pull through, but in practice, eek. I don't believe those lines are continuous, or there may be debris or kinks in it somewhere in the wall. In the end it took an electrician half an hour and an improvised cable pump to get it through.
I recently did some renovation work and the house had unused telephone (and COAX cable) wiring running to most rooms. Decided they would probably remain unused forever, so I replaced everything with cat6. You can usually use the old cables to pull through the new one. It's a bit of a chore and may require some specialised tooling for certain outlet brands/types, but it's so nice to just have ethernet outlets everywhere.
The great thing is you can upgrade the cable in future for newer technologies (eg. over a decade ago I bought used Mellanox equipment cheaper than 10G and ran fiber drops).
Depending on your wiring, the performance can be terrible.
I have one of these, advertised as 2000Mb/s but actually doing about 80, and in a small apartment. Every 6 months or so it connects to my neighbour's system without me doing anything.
Yes, in principle. In practice I found all the TP-Link ethernet-over-power equipment I used to use to be even less reliable than the WiFi I was trying to supplement. It would work fine for a while, then just lose sync, and I'd have to go round unplugging and replugging everything to get it to sync again. In the end I ran cat-6, but it took me a long time to conclude it was worth the effort. Now my only regret is I didn't decide to do it 10 years earlier.
Yeah, I've never really seen the appeal. Wires are just so reliable. I am always in meetings with "sorry, my headphones died, can you repeat that" or people mysteriously vanishing because their laptop can't charge as fast as Zoom draws power and their computer just dies. I have a wired mic, headphones, and power, and it just never breaks.
Wireless is just temporary, if you have wireless headphones, you have to plug in a wire to charge them. If your computer has a battery built in, you have to plug it in every day to charge. So it's just an extra chore that makes your life worse. Not that exciting.
Maybe everyone fantasizes about being a digital nomad. I do not. I have a great desk, chair, keyboard, and 3 monitors. I couldn't work without them. If I want to see the world, I'll go do that without bringing my computer. A vacation with your computer is just work.
A question I'm asked is "what if there's a power outage" but we have had about 10 total minutes of power outage here in Brooklyn in the 10 years I've lived here (during Hurricane Sandy). My thought is I just won't do any work if there isn't any power. We waste an hour a day dealing with wireless not working, 10 minutes every 10 years is no big deal. ("But what if it's the big one!?" Then my ISP's backup batteries will also be drained and I still can't work.)
>I am always in meetings with "sorry, my headphones died, can you repeat that" or people mysteriously vanishing because their laptop can't charge as fast as Zoom draws power and their computer just dies.
You're missing the point. Wireless technology failing is a fantastic excuse to get out of Zoom meetings!
I like spending time around my house, sometimes in the garden, sometimes at a coffee shop; some work, some socializing, some free time.
That isn't possible (at least practically) with a wired setup. I've almost never run out of battery on my laptop or headphones and never have issues with Wi-Fi (worst case, I have a wireless hotspot on my phone anyway).
I don't see any problem with a wireless setup, only convenience; the extra capacity/reliability of wires isn't needed 99.9% of the time anyway.
I know the potential "problems" in the article but I've never encountered them.
Literally never.
I'm actually quite surprised that people still have problems with wireless devices in this age.
Even if I had a desk, I'd probably only plug my laptop in to prevent battery cycles (and to keep it ready for when I wanted to move), but would never bother with Ethernet, or a wired mouse, or a headphone jack, for example. Any additional latency/performance problem is imperceptible for daily tasks (unless I am a professional esports gamer or trying to download a huge blob over a gigabit connection as fast as possible, which is probably not what 99.9% of people do).
You basically say that you don't care about latency/performance loss, but are surprised that people still have problems. Maybe because these persons do care?
While downloading/uploading something there will be no practical difference between Wi-Fi and Ethernet unless you are on a superfast Gigabit fiber.
While on a Zoom meeting, the latency introduced by Bluetooth, while slightly perceptible, isn't an issue compared to the latency of being connected over the web itself.
Any wireless system will be somewhat less performant than a wired equivalent. What I'm saying is that for 99% of people in 99% of daily use cases, that difference doesn't matter.
I have about a 300 Mbps connection with my ISP. The gateway/AP provided by the ISP is located on the second floor of the house. On the first floor, the connection is degraded to the point streaming will occasionally stutter.
On the second floor the difference in speed tests going through cable vs 10 ft. away through one thin wall on my gaming pc is 250 Mbps -> 24 Mbps. There can absolutely be a practical difference in daily use.
Most people do not have multiple APs, most people are not getting APs other than what is provided by the ISP.
Well the problem is your access point, not a vague 'wireless'. I can max out my internet connection through a couple walls. And while I had to get my own AP, I would have had to get my own cable if I wasn't doing wireless. They didn't include any that can reach between rooms.
Multi-storey houses have that problem (probably because of how the antennas are designed); it can be easily solved with a gigabit ethernet cable and another AP.
While definitely not ideal, it's a solve-once-and-forget problem.
Wireless headsets are something I understand because I'm someone who never wants to go back to wired headsets. I always hated being tethered to my device by my head. And when they were off my head I would always end up snagging on the cord in some way and pulling them to the floor. And I am often in calls with people where I want to get up and grab something or even do something else. It's very nice to be able to just stand up and walk away from my PC while still being able to hear and talk.
The fact that they need to be charged doesn't matter. My current ones have a swappable battery, with the battery that's not in use able to be charged by slotting it into the wireless receiver that sits on the desk. But there are also newer wireless headsets that have 30-40 hours of battery life. So you only need to plug them in overnight (or every few days depending on usage).
> Wireless headsets are something I understand because I'm someone who never wants to go back to wired headsets.
That's also true for me. Ultimately, I think it's a matter of using good judgment. All my major devices - even laptops - run on cables by default, tablets/phones wireless of course. For keyboards and mice (OK, not WiFi in a strict sense) it depends.
Not in my experience. Ethernet cables have been reliable, but most have connector latches that self-destruct.
HDMI cables are a lottery, and the prize is in no way related to the ticket price.
Mini/micro USB connectors go out in about a year, depending on use.
All 3.5 mm audio cables seem to have a half-life of around 4 months, regardless of price. The only survivor is a cable that came as an extension with old Sennheiser earbuds.
In comparison, I have yet to have bad bluetooth earbuds just stop working on me.
[edit] not to mention all the equipment with soldered-in cables that are either too long or too short and cannot be replaced. Hi, homepod mini!
I've got ethernet cables that I've been using since 2011 without them breaking. My headphones are in fact from 2008, all I've got to do is replace the leather pads every 5 years.
I'm still using the USB-C cables from my Nexus 5X today, no problem with them.
With HDMI cables I just buy them, connect them to my GPU and wait for the AMD driver to tell me the maximum bandwidth they support. Super easy way to measure what they'll support and what not. So now I've got a dozen that work at 4K 60Hz HDR just fine.
On the other hand, I've never been able to get a single bluetooth device to sync reliably for more than 5 minutes.
> I've got ethernet cables that I've been using since 2011 without them breaking.
Me too. My problem is going to a store and finding such cables, ones that will last 10 years.
> My headphones are in fact from 2008
That reminds me. When I was complaining about 3.5 mm cables in one music store, how my only good cable is from earphones I bought ~20 years ago, the store person went, like, "of course it won't break if it was made back then! why do you expect to buy something like that now, they ain't now making any"
> On the other hand, I've never been able to get a single bluetooth device to sync reliably for more than 5 minutes.
That sounds pretty unbelievable to me - even the cheapest, crappiest Bluetooth devices can pair with the cheapest, crappiest USB Bluetooth adapter reliably for more than 5 minutes.
Headphones snag, the jacks get wobbly, and they're fixed lengths (standing desks mean they need to be long enough to move now, so they're messy unless you do a bunch of extra work). I've got problems with interference with my wired mouse that seems to cause stuttering if I use a specific USB port, and maybe once every 2-3 weeks or so the port I do use just doesn't recognize the device, so I have to unplug it and plug it back in. Bluetooth may not be perfect but it's tidy, easy to use, well supported, reasonably reliable, and good enough quality for 98% of my use cases.
I’m not sure what’s causing it, but I just can’t get a DualShock 4 to sync reliably. Or the WH-1000XM3 from my partner. And with the Jabra headphones I had for work, I always had to use their dongle because it would never sync reliably.
In the end I just gave up and now I use the dualshock 4 with cables. And for headphones, I’ll continue using my wired sennheisers from 2008 (I checked and updated the year in the previous post as well).
The Dualshock 4 gets about a minute at best before I get suddenly random input from it and it stops working. That’s with a genuine Sony DS4, second version.
I actually still use wireless mice, but only the ones with custom dongles.
This post makes me wonder what I do differently. I've some cables wired up here that are easily 5+ years old and I've had micro USB charging ports survive daily plug/unplug cycles for 4 years. My number one cause for cable-related malfunction is when I don't plug something in correctly.
I think the mouse that feasted on cable insulation in my backpack may actually have made it to second place. That's an entirely different story...
Crimping ethernet is a mandatory tech skill for every geeky homeowner. It's not that hard, but there's a few things it takes a while to figure out are really important. Once you do that, you can do it really consistently.
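For what it's worth, the part that trips most people up at first is just the conductor order. The usual T568B sequence (the same at both ends for a normal straight-through cable) is:

    1 white/orange    5 white/blue
    2 orange          6 green
    3 white/green     7 white/brown
    4 blue            8 brown

The other big one: untwist as little of each pair as possible before it enters the plug.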
Wireless is nice and all. Powerline ethernet is great when it works out of the box, and a Kafkaesque nightmare when it does not.
You don't even need to crimp unless you really want to. Everything should go to punchdown blocks, either on the room jacks or a patch panel. Crimping is for the patch cables, and if you can find manufactured cables for less, there's no need to make your own.
I suspect the biggest barrier to most homeowners running Ethernet, besides not knowing that it's not that difficult, is insulated walls. Running some cables through walls without insulation to the attic and making some drops is no problem, but if you have insulation in the way the job is a lot more difficult.
I do acknowledge that there are many small (big) barriers.
I am on my second home, the first was a condo where I had no access to the "above" or the "below", only the "within".
Phone lines had been run, so I thought I could attach Cat5 to the phone lines and pull them through -- but the previous owner had decided to wall up the phone jacks. No idea where they were, and no access to the "above" part.
Let's just say - I tried a lot of things to figure out where the walled up phone lines were, but never could get to the level of confidence you would want before you start knocking random holes in the wall in a property you own.
Powerline Ethernet was a godsend - for a while. Then one day (7ish years) it started struggling, and eventually couldn't stay connected. To untangle that was going to be hundreds of dollars of hardware and research and/or contractors (side note: have you tried asking an electrician to assess Powerline Ethernet issues?) -- I went wireless at that point.
New house maybe a year back after getting married (yay!). Ran ethernet. Used these guys:
Yeah, I drilled some holes in the wood floor. I am not that rich, nor is my neighborhood, it's a non-issue. Game consoles, NAS devices and work computers have ethernet. Everything else less hungry (or important) like a tablet or mobile has wireless. Life is good.
> people mysteriously vanishing because their laptop can't charge as fast as Zoom draws power and their computer just dies
This doesn’t seem to have anything to do with your “wires are more reliable” thesis, does it? A laptop with onboard devices drawing wall/battery power would be drawing more power than if those devices each have their own separate batteries, right?
>A laptop with onboard devices drawing wall/battery power would be drawing more power than if those devices each have their own separate batteries, right?
Not necessarily, because the laptop has to do additional work transmitting Bluetooth/whatever for the wireless devices. In any case, his thesis appears to include preferring a permanently wired desktop computer to a laptop, as well as preferring wired peripherals to wireless.
Personally I use mostly wired peripherals, but I am a sucker for a wireless headset. Being able to stay on a call while I walk over to the other side of the room to grab something is a great freedom, and plugging the headset in at night is not a big deal.
The "additional work" required to transmit a Bluetooth signal is insignificant compared to the amount of power needed just for the display being on.
My current setup is two USB-C laptops, one for work, one for personal use. When I switch from work mode to My Time Mode, I just swap one USB-C cable from one to the other - all the peripherals follow. Trackpad is wired, keyboard is wired. Headphones wireless - I broke way too many wired headphones by snagging them on crap while moving around.
> The "additional work" required to transmit a Bluetooth signal is insignificant compared to the amount of power needed just for the display being on.
The relevant comparison would be the power required to transmit and receive Bluetooth vs the power required to power a wired peripheral (e.g. mouse, keyboard, headphones) where the wireless version would have its own batteries.
> I am a sucker for a wireless headset. Being able to stay on a call while I walk over to the other side of the room to grab something is a great freedom, and plugging the headset in at night is not a big deal.
Agreed. My last job had a bimonthly all-hands meeting at 4pm on a Thursday; I used to just stick my headphones on and make dinner during it.
I don't have too many issues with WiFi, but my house is sort of isolated. I don't have much if any interference from neighbors. If I lived in an apartment complex with 10 or more different WiFi networks in the same airspace it might be different.
Bluetooth on the other hand is an absolute no-go. I have had uniformly bad experiences with it, and refuse to use any device that depends on a bluetooth connection.
I live in a high rise and it still isn’t an issue. The floors and walls block out almost all wireless signals. I can see a handful of networks but they are all on 1 bar strength. On speed tests I can reasonably get 600mbps on wifi.
Maybe.. interestingly I was sitting on the balcony and noticed the video call cutting out. Did a speed test, 6Mbps, I open up the glass sliding door and suddenly it’s full speed again. Kinda shocked a pane of glass was enough to completely cripple wifi.
For me the appeal is that I don't need to figure out how to run wires all over my house, or have them stapled to the walls or something. Wifi works well enough that I would never get to a net positive after considering the ethernet wrangling project inconvenience.
Wireless makes cleaning and vacuuming so much easier. Also means I can walk around while in calls. And it just works well enough that the times it doesn't are more than made up for the times it does work.
WiFi is fine for most consumers, it’s only really a problem if you have a big enough of a house to need multiple APs with wired backhaul, but the cabling isn’t there so you go with wireless backhaul / meshing. Wired backhaul always works better in my experience.
If my house were that big, I would seriously consider only having connectivity in half the house. But fortunately (?) that's a first world problem I don't have.
I absolutely love the idea of working for a month from one place and a month from another. But the problems you call out are very real, and I'm honestly shocked that nobody has solved them to take advantage of the fact that relatively wealthy techies can now work from anywhere. Airbnbs will advertise Wi-Fi and maybe a tiny desk with a kitchen chair. I'd pay a premium for a place with Ethernet, a real desk, a real desk chair and a proper external screen. I worked for a few weeks from my parents' house in Germany, but even that required shipping them a long Ethernet cable, a screen and a keyboard ahead of time. I don't want to buy all this every time I want to work a few weeks remotely.
Reliable, yes. Easy to implement? Not necessarily. If you don't own the property, you might not be allowed to run your own wires. Even if you do own it, some houses make it more difficult/expensive than others to run wires everywhere they need to be (e.g. consider formerly-external walls full of obstructions in an older house). Let's not make the classic HN mistake of assuming everyone's overpaid, able/willing to pay for the privilege of owning the right kind of property. My family does pretty well with a combination of Ethernet, G.hn powerline, and wireless. Then again, we don't subscribe to the Apple groupthink either so MacOS's misbehavior is not an issue for us.
Please. It's not a privilege to run Ethernet any more than having Internet and a laptop is a privilege. A spool is like $100, and you don't need to drill gigantic holes and run 60 wires back to a central drop to a 10 gig switch. I have run Ethernet cables in all my rentals (even when I was working minimum wage), usually only need 1 or 2 at the most. It's not rocket science either. You can go around the outside of your house, just like how a cable TV or phone installer will do it.
Landlords never notice or care. Just don't make it look shitty, they will just think it's a phone jack anyways.
So, your argument is that you casually violated your rental agreements? And you don't see how that reeks of entitlement? It might play well to this particular audience, but people who are less technical and less libertarian than the average HNer might find it less convincing.
idk, man, I rented for 15 years and just used thumb tacks to hold the cable in the corner of the ceiling, wall, or floor/wall to route it around to where I needed it to be. The hole that leaves is quite difficult to see; you just pull the pins, coil the cable and walk away. Renters have been doing stuff like that with cable TV since before the internet too. Living in an apartment means you probably can't/don't want to run cables in the walls for someone else, but people have been running cables around apartments for years because you just have to do your thing.
I still find wired Ethernet to be a better experience all around though I do have a pretty decent wireless setup and I do roam quite a bit as well.
He didn't say he violated rental agreements, in fact he explicitly said the landlord doesn't care. It seems it's you who has a chip on your shoulder about HN that you have to cram entitlement into the situation.
You say chip on the shoulder. I say low tolerance for entirely predictable BS.
When I was poorer, even the "minor" cost of cables would have meant even more missed meals. (Fortunately that wasn't an issue because residential internet wasn't a thing back then, but you get the idea.) Losing a security deposit would have been an absolute disaster. I went to court twice to avoid exactly that, and could well have ended up homeless had I not prevailed. Not every landlord is forgiving, and not everyone has the option to prioritize freedom to make modifications over things like location, price, square footage, etc. Being able to shrug off costs and risks that loom larger for others is practically the definition of privilege.
Even among the more affluent, not everyone would put up with visible wires in every room and on every side of the exterior, or with the effort/expense of "doing it right" by fishing through walls with multiple obstructions. They might well prefer the tradeoffs represented by wireless, even if it's not perfect. Do you use landline phones because you never have to worry about being in a dead zone? Different people, different choices. That's all I'm saying, but somehow people who can't conceptualize that other people aren't exactly like them think that any choice other than their own deserves an inquisition.
If you're really paranoid you can use that fancy 3M tape that pulls off cleanly to install some clean cord covers[0] wherever you want. Most come with their own adhesive, which might leave some marks, but it'll come off with a slight scrub.
When you move, just remove them and install them in the next place - or have a chat with your landlord and leave them for the next tenant.
This is one reason why I'm still down on Apple's decision to remove 1/8" audio jacks from iPhones: they took away a decades-old "just works" standard that wasn't subject to interference, battery, or latency issues and replaced it with something that fundamentally is. Their efforts at filing down the sharp edges here with the AirPods (and they have done as well as anyone could expect, and more) don't change the fundamentals.
"Get a dongle" -- maybe I will when I finally give up my original iPhone SE, but I'm not going to be cheerful about paying an extra $30 for the privilege along with the overhead of keeping track of an additional thing.
Yeah, for me it's the "additional thing". I have a few different pairs of headphones depending on location and purpose. They're cheap, effective, and easily replaced if they fail. Having another doodad is just not appealing, whether I try to keep one on the phone at all times or buy a bunch that I add and remove from the headphones depending on what I'm using them for.
My solution was to just leave the Lightning-to-3.5mm adapter attached to my headphones rather than my phone. But I was lucky enough to have two spare dongles and only two sets of wired headphones I cared about. Works well though.
They do. At least mine on a MacBook Pro as well as a Lenovo Yoga and a Dell XPS 13.
I bought a totally cheap no-name one when I got a company phone without an audio jack. Since then I have used it on many different machines as well as (Android) phones. The adapter stays with the headphones, as I would just lose it otherwise.
The "dongle" is apparently a pretty good audio DAC for the price. I'm generalising with my lack of knowledge, but a semi-decent DAC can improve audio quality and boost volume.
I usually just go wireless, although it has its problems especially with Spotify.
But if I decided to go wired, I would have no qualms about going wired with the Apple USB-C audio jack dongle.
It's active. I use them for my android devices. They'll also work as a soundcard on any laptop/desktop with usb-c ports. Durability sucks but if it lasts 6 months I'm ok with picking up a new one from Walmart for $7. Both my phones and my IEM's cost around 100x more. When measured against that, it's a good value.
There's a quirk that requires a script to uncap the volume on Android but it's the best IEM DAC I've heard under $200. I don't think they subsidize it. It's just economies of scale at work. Here's a Head-Fi review and analysis of it: https://www.head-fi.org/showcase/apple-usb-c-to-3-5-mm-headp...
Edit: More context on 'fixing' the Android volume capping issue for it (quirks are common for devs to add at usb-audio driver level in kernel, but this is something different with Android's defaults): https://forum.xda-developers.com/t/guide-magisk-apple-usb-c-...
I have an anecdotal possible reason they were motivated: I broke four screens due to my wired headphones pulling the phone out of my hand/off a surface.
I now use AirPods, and it's, mentally, very different, since my head is no longer part of my phone.
I've had wired headphones save my phone from a drop
¯\_(ツ)_/¯
I also use AirPods now, and I'm fairly certain the water fountain I walk past daily has a short causing some sort of arc that interferes with Bluetooth, as they always cut out when I walk past it.
I got the dang dongle. Now, say I want to listen to a podcast while going to bed. Of course, because cellphone batteries only last like 24 hours, I have to charge overnight.
So now I've got some big fancy inductive charger, so I can leave my charging port open, to attach a dongle. But hey at least Apple saved $1 and got to remove the only non-proprietary port on the thing.
I had some adapters with the plug built in like that, they were notably lower quality than the one without the plug, but it could be that the ones I got were just some no-name junk brand.
Based on my experiences with my iPad, I'd really like an iPhone, but whether it's for reasons of practicality or just principle at this point, I refuse to cave to "just use a dongle" or any of that. The few times I've used wireless audio it's presented me with a load of problems (having to charge something I'm not used to having to charge, connectivity issues, poor audio quality, you know the drill), with the only upside being that I can move further away from the thing that fits in my pocket.
I'm willing to use the equivalent of a dongle with my PC (an audio interface), but obviously that's a very different use case. I'm just glad laptops haven't decided to take the "brave" action yet.
there was an obvious trend with other phone manufacturers as well. as soon as they made their own brand of wireless headphones/earbuds, the headphone jack was removed from their phone.
If you need to move around, wired headphones suck. IME, interference and latency are non-issues with BT. Otherwise, the Lightning to 3.5mm dongle works perfectly.
At my desk, I do have wired headphones, but they are plugged into my TB dock.
Yeah, at my desk I use a decent pair of cans (HD6XX or K550 depending on situation) hooked up to a nice DAC/amp since there, cords aren't much of an annoyance. Especially the type on larger headphones which is thicker and far less prone to tangling.
Away from the desk though? Bluetooth all the way. Even if I'm just lounging on my bed bluetooth is better because there's no wires for the cat that will most assuredly be in my lap to chew.
Just out of curiosity, what bluetooth do you use? There is no bluetooth options on the market that don't actively physically hurt my ears in a way that wired headphones never ever have.
Prior to the pandemic it was mostly AirPods Pro, but now that I spend most of my time at home it's a pair of AirPods Max. In the past I've also used a pair of Sony WH-1000XM3s, which were pretty comfortable.
AptX low latency apparently makes it better (35ms is the "guarantee"), but I've honestly not tried it. I'm not going to make my 400FPS on a 240hz monitor low latency gaming in Valorant worse by going wireless for audio.
Though I do use a wireless mouse (G Pro X Superlight) and it's brilliant; its wireless adapter is attached to a cord that lets me plug it in to keep going if I forgot to charge it, so it's the best of both worlds.
Really, it's only my mouse that I move around enough for wireless to be worth it in competitive gaming.
I played competitive FPS games for years at high ranks with a Bluetooth headset before learning of the delay. I did not notice it. However, for a while I switched to a Bluetooth keyboard, until I realized the input delay was severely impacting my performance. Wireless mice (non-Bluetooth), on the other hand, have low enough latency that most pros are fine using them.
Some advice for Bluetooth headset gamers: Bluetooth completely breaks the audio in some games, like Battlefield 1. The only solution is to go into Device Manager and disable Handsfree Telephony for your device.
Music might be fine for listening, but it is emphatically not fine for playing. It's completely unusable. It's like trying to talk while hearing yourself delayed by 100ms. Probably fine for competitive gaming too, if you're just spectating.
While off-topic, somehow Apple has screwed this up in the M1 MacBook Air's built-in audio hardware too. I wanted some monitoring while playing acoustic the other day, so I opened Logic and, rather than connecting my expensive audio interface, I just wanted to use the built-in mic and headphone output to monitor myself playing and singing in real time. For some reason there was still an audible delay even though I wasn't using plugins and had the buffer set very low. I'm pretty sure it was in low-latency mode… you shouldn't have to use an external audio interface for real-time monitoring; it's such a simple use case.
I've been under the impression this whole time that you need a specialty audio interface if you ever want to do near-real-time (<10ms) monitoring or recording. I'd been led to believe years ago that default consumer audio hardware just cannot do it.
Yup. That's why my spare phone is a 6S+ and I am considering fitting its 3rd battery and getting it refurbed.
I have some fancy Sony Bluetooth noise-cancelling phones I bought just before my kid was born. They are _amazing_ but they should be for over £300. The noise cancelling is superb... so I use them when flying. Otherwise, too much faff.
My £20 Sony wired bass-boost earbuds sound nearly as good, live in my pocket, and get used far more because of the convenience factor. They never need charging, they pair 100% instantly the second I plug them in, the latency is a couple of nanoseconds (a metre of copper at the speed of light?), and the quality is superb.
They also are excellent for voice calls, while the fancy Bluetooth cans are rubbish at picking up my voice.
I have a phone from 3 years ago with both an IP68 rating and a headphone jack. There's zero reason that Apple couldn't do the same, they just wanted to convince a lot of people to buy $150 headphones.
The first comment was comparable in performance (IP68), but the one I replied to, with the battery cover, was not (IP67). The point of my comment was that they are not comparable, and to hint that the features of the IP67 phone (the battery cover in this case) may be what keeps it at IP67.
I think the problem is that you included feelings in something that was clearly technical. The comparison they were making was 1/6th the spec. It was a technically inappropriate comparison. It could only support the claim that these features were removed to increase water resistance.
The comment I made doesn't require that I push, or be for or against, some higher narrative that you seem to think I have in my head. It's perfectly valid to criticize the technical merits of an individual comment. If you see that as weaselly, then tribalism has infected your mind.
I'm still using a number of Motorola Defy phones from 2010-2011 which are IP67+ rated (30 minutes at 1.5m water depth). Needless to say they have 3.5mm headphone as well as micro-USB connectors. The solution to this problem lies in a simple rubber plug which closes the hole when the connector is not in use.
Does it work? Well, one of the phones used to be my daughter's. She forgot to take it out of her pocket when she put her trousers in the washing machine so it ended up going through a full wash and spin cycle. It still worked, there was no water ingress in the phone itself. The earpiece did seem to have gotten damaged so I replaced it (5 new earpieces for $2.50 incl. shipping, I still have 4 left...). This phone is now used as a media player, running mpd which can be remotely controlled from other phones in the network. It connects to my jobsite radio (an oversized and overweight boombox I made around an old car MP3/CD/Radio with 2x40W speakers with bass reflex and a SLA starting helper/power pack) using that 3.5mm jack.
Yes, there are plenty of phones that have IP68 & 3.5mm without any rubber flaps etc., not to start a flamewar but it's strange to me that a lot of people seem to think they would be a necessary tradeoff... Apple had other reasons to get rid of it.
1) Wireless degrades by degree and is hard to debug.
2) It fails silently, or just in non-obvious ways.
3) People generally have no idea how well it is supposed to work, until they experience it working properly.

I think there are many things, in life generally but especially in tech, where these qualities conspire to give people a rotten time. People will complain when things fail, or take action. "Graceful degradation" seems like a feature, but it can keep you on a slippery slope of increasing tolerance toward ever worse performance.

The complexity, and the fuzzy "connection to everything", mean that no one has to take responsibility. Maybe it's your neighbour? Or the trees? Or the alignment of Venus and swamp gas on a full moon? Unless you have a full-spectrum RF field meter and other very expensive test gear, microwaves are "black magic" even to great engineers.

I'm sure when Apple met to discuss remotely degrading iPhone performance the words "They'll never notice" were spoken. Companies that sell complex services and oversell resources love this combination of fuzzy, hard-to-measure, diffuse, naturally erratic behaviour and broad tolerance, mixed with vague customer expectations. They prevent you from knowing, and comparing, what you should be getting and what you are getting.

Many recent development directions, such as 5G, secure enclaves, and encrypted updates and exfiltration (telemetry), are set to increase the "random magic factor" and make gear even more precarious and mysterious to the end user.
See, the alternative to what Apple did is having phones that randomly shut down at peak load. I have had this happen to me on a degraded battery in a cold Rochester winter. It’s not fun! It means I can’t even trust my phone to be a phone anymore. I think that is more of a “random magic factor” than things getting slower.
> Unless you have a full spectrum RF field meter and other very expensive test gear
You actually have that, at least for the frequency range that matters: the wireless hardware in your computer.
Your wireless card absolutely knows how good/bad the RF environment is - it wouldn't work without it. The problem is that none of this information is exposed to the user.
The Wi-Fi "signal strength" meter is consistently useless on basically every device I've tried. In fact, I wonder what it would take to actually make it go down, since it's happy sitting at max level even when you can't get a single packet through. I guess it must be measuring RSSI, when the number that matters to the user is "how many packets am I actually missing", which could be obtained by keeping stats on TCP flows or pinging the default gateway.
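For what it's worth, that "how many packets am I actually missing" number is easy to approximate yourself. A minimal sketch, assuming a Linux-style ping (the -W timeout flag behaves differently on macOS) and a hypothetical gateway address:

    # Rough link-quality probe: ping the default gateway and report loss and RTT.
    import re
    import subprocess

    GATEWAY = "192.168.1.1"   # hypothetical default gateway address
    PROBES = 50

    lost = 0
    rtts = []
    for _ in range(PROBES):
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "1", GATEWAY],
            capture_output=True, text=True,
        )
        match = re.search(r"time[=<]([\d.]+)", result.stdout)
        if result.returncode != 0 or match is None:
            lost += 1
        else:
            rtts.append(float(match.group(1)))

    loss_pct = 100.0 * lost / PROBES
    if rtts:
        print(f"loss: {loss_pct:.0f}%  avg RTT: {sum(rtts) / len(rtts):.1f} ms")
    else:
        print(f"loss: {loss_pct:.0f}%  (no replies at all)")

A few percent loss here, while the signal meter still shows full bars, is usually enough to wreck a video call.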
"I'm sure when Apple met to discuss remotely degrading iPhone performance the words "They'll never notice" were spoken. Companies that sell complex services and oversell resources love this combo of fuzzy, hard to measure, diffuse, naturally erratic behaviour, broad tolerance, mixed with vague customer expectations."
I suspect Apple would prefer their products to be reliable rather than erratic.
They quite deliberately decided to make them degraded (to prolong battery life). And they were fined for it [1]. Nobody mentioned erratic in that case.
Not just to prolong battery life, to prevent the device from overdrawing the battery and shutting down when it couldn't provide the same peak current as it could when new.
This was likely a compromise from that design decision.
Since the slowdown was based on battery health and not absolute age of the device it was (and is) still possible to replace the battery.
But “the market” (ugh) seems to have decided that it wants phones with integrated batteries, and touch screens, and no physical keyboard, that are too big to use with one hand.
I think your posting of this comment may have had a "graceful degradation" for an improved user experience. Unfortunately it has appeared twice with people commenting different things on each thread ;)
edit: disregard this, I don't see the other thread anymore. Maybe I was the one experiencing magic fuzz factors.
A little pro-tip for those about to run wires through your walls: use shielded cables. Also, probably don't run them anywhere near AC wiring. Some people recommend a full stud between them, whereas the NEC allows them to be run next to each other as long as there is at least 50mm of clearance.
I think this is usually recommended for safety reasons, to prevent mistakes from causing high voltage to get into your low voltage runs and become a shock hazard or potentially start a fire. But,
I have noticed empirically that running a bit too close with the wrong cable does absolutely result in some serious interference issues. So… your wired connections are not free of signal issues either, especially if you want to neatly run them in-wall.
edit: also, worth noting that this is a highly U.S. centric piece of advice. I mean I’m sure it applies elsewhere, but both electrical systems and construction differences probably make some of the details very different, and I’m not cultured enough to know exactly which and how.
Ethernet on copper cabling doesn't seem to be keeping up, shielded or not: previous and current generation Ethernet (10G/100G) over copper is either really power hungry or nonexistent.
Consumer devices hardly even have 10Gbit, and often don’t even have WiFi 6. Even if they did, my NAS running on spinning rust is not bottlenecked on network bandwidth even slightly, and affording the amount of switching fabric necessary to sustain 10Gbit across all my devices is not economical right now.
I mean, this might be different for fiber in some regards, but I doubt that is terribly economical either.
My M1 Mac Mini only supports 1Gbit. My Synology NAS only has 1Gbit ports, and while I do have the 10Gbit interface card for it, using that prevents you from using NVMe cache, and with SATA even a good SSD will struggle, for the exact reasons NVMe exists. My phone is too old for Wi-Fi 6. My Surface Laptop 4 has Wi-Fi 6, but no Thunderbolt port; just a single USB 3.1 port. Technically enough for 10GbE, assuming you didn't need absolutely anything else on that port.
Then you need 10Gbit network equipment. A 10Gbit switch with more than 8 ports is going to be expensive and consume a pretty ridiculous amount of energy. It would be difficult to argue that there is any truly "consumer grade" 10GbE equipment; almost all of it comes in sizes designed to fit in racks. Wi-Fi 6 APs exist, but my AP-HD isn't one, because it wasn't purchased that long ago, still works, and I have had no good reason to replace it. Not to mention the actual speeds I get on Wi-Fi are never even close to the theoretical speeds, making it seem unlikely it could even saturate my 1GbE setup in practice.
Although I do have some 10GbE equipment, even directly connecting devices together I have found that many of my devices, of which many are consumer-grade, already bottleneck elsewhere and can’t even come close to saturation. So as of today, 10GbE does not seem to be a huge win for most users, even if they do own some equipment that can do it.
How is your nas not bottlenecked by 1gbps? I'd expect pretty much any modern single hdd to have a sequential read speed over 120 MBps, let alone an array.
- It is true that a modern HDD has over 120 MB/s sequential read, but not by much: 4 TB WD Reds are probably somewhere around 150-160 MB/s.
- Sequential speeds aren’t really that practically useful for me. While I do have some large files, most workloads are going to involve smaller files and a lot more seeking.
- When using a filesystem with parity, like btrfs or zfs, you definitely have some overhead for writing and checking parity. I’m sure it’s still very fast, but it is overhead, and NASes don’t tend to be the most cutting edge compute boxes.
- Even if some RAID configurations might improve performance, the configuration I am using has a fair bit of overhead. I suspect you need something like RAID 0 or 10 for maximum performance, but those setups are inflexible compared to RAIDZ or Synology SHR.
- As it is now, my NAS is unable to saturate the network bandwidth using Samba alone. It just never reaches a gigabit. Part of this is probably related to Samba or the CIFS protocol, but honestly I don’t know if it has the raw CPU muscle.
All in all, I am not too concerned about it. It is possible that I could yield better performance with 10 GbE, but I don’t think it would matter for most of my use cases. Plus, disk resources are definitely being consumed by background tasks as it is.
I think those transfer rates are typical for spinning HDDs from many years ago. See eg this review of 4 TB drives from when that size was trending: https://www.tomshardware.com/reviews/desktop-hdd.15-st4000dm... - the middle drive of the 2013 comparison has the avg throughput that matches the 120 MB/s max theoretical throughput of 1 G ethernet.
The transfer rates go up roughly hand in hand with capacity.
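As a rough sanity check on these numbers: once you subtract protocol overhead, a gigabit link and a single modern drive's sequential read are in the same ballpark. A back-of-the-envelope sketch (the overhead percentages are assumptions, not measurements):

    # Back-of-the-envelope: usable 1 GbE throughput vs. HDD sequential read.
    LINE_RATE_MBPS = 1000 / 8          # 125 MB/s raw on the wire
    ETH_IP_TCP_OVERHEAD = 0.06         # assumed ~6% for framing + IP/TCP headers
    SMB_OVERHEAD = 0.05                # assumed extra ~5% for SMB/CIFS chatter

    effective = LINE_RATE_MBPS * (1 - ETH_IP_TCP_OVERHEAD) * (1 - SMB_OVERHEAD)
    hdd_sequential = 160               # MB/s, large modern drive, best case

    print(f"usable over SMB on 1 GbE : ~{effective:.0f} MB/s")
    print(f"single HDD, sequential   : ~{hdd_sequential} MB/s")
    # Sequential reads from even one drive can outrun the link, but seek-heavy,
    # small-file workloads usually fall well below either number.

So whether 1 GbE is a bottleneck really does come down to workload, as the parent comment says.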
I would not call it a trap, I would call it a trade off. But I am quibbling over semantics. I would reference Shannon here
https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theore...
which essentially says:
C = B * log2(1 + SINR)
where C is the capacity of the link (think bps), B is the amount of spectrum (how many MHz I can get across the medium), and SINR is the Signal to Interference and Noise Ratio.
If you think about this, of course wireless will be spectrum-limited; there are only so many MHz made available for whatever wireless standard you are thinking about: WiFi, 4G, 5G, Bluetooth, whatever.
And, obviously, wireless will struggle with Noise and Interference.
Wired solutions, whether shielded twisted pair (e.g. CAT6) or Coax, or Fiber or Waveguide, whatever, will do much better on SINR, you have thermal noise sure, but generally, it would be a much quieter medium.
Also, you would not have the FCC saying you only get X Mhz, whatever you can get across that medium you get to use.
Thinking about all of these factors, then, of course wired will generally win out, all things being equal. The trade-off is that wireless is more convenient.
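To make that concrete, here is a minimal sketch plugging assumed, purely illustrative numbers into the formula above (real channels vary enormously, and neither link would reach its Shannon bound in practice):

    # Shannon-Hartley upper bound: C = B * log2(1 + SINR)
    import math

    def capacity_bps(bandwidth_hz: float, sinr_db: float) -> float:
        sinr_linear = 10 ** (sinr_db / 10)
        return bandwidth_hz * math.log2(1 + sinr_linear)

    # Assumed figures: an 80 MHz Wi-Fi channel at 25 dB SINR vs. a quieter
    # wired medium with more usable spectrum and 40 dB SNR.
    wifi = capacity_bps(80e6, 25)
    wired = capacity_bps(400e6, 40)

    print(f"Wi-Fi upper bound : {wifi / 1e6:,.0f} Mbit/s")
    print(f"Wired upper bound : {wired / 1e6:,.0f} Mbit/s")

More spectrum and a quieter medium both push C up, which is the whole argument in two lines.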
When you’re using 100% of the spectrum on your wires, you can just put down a second shielded cable. When you’re using 100% of the spectrum of the air, that’s it.
I ventured under my 120 year old house to make a couple Ethernet runs. It took a lot of meditation on my part to bear doing it. On one run, I needed my wife to pull on a fish line from above, but then she had to feed the baby, and so I watched an episode of star trek on my phone for half an hour while wedged between wood stilts.
6 months later, I got a notice that the cable I used was recalled for a bad UL rating. Monoprice had to pay a contractor to come out and replace it. When coordinating the work, I warned the company that it was a very dirty job that would require two people. They sent an older guy out all alone, and he poked around for half an hour then left, saying he would come back with PPE and a helper. A couple weeks later the same guy came back, alone and unprepared again. Finally, the third time, he came back with a much younger guy and bunny suits, and they redid the runs.
I had stapled the cables up every couple of meters and followed a Manhattan path based on where I could crawl. They just fished it in a straight line and left it resting in the dirt. They presumably billed Monoprice several thousand dollars...
What is a "bunny suit" in this context? The web results make me wonder if there was miscommunication about hiring plumbers for a "very dirty job" if you know what I mean.
Free but uninstallable CAT6 has some uses. I've seen quite a few street artists use the thin copper wires to make art, usually jewelry. But with 400 ft, I could see some art or construction projects that make use of it as a physical not electrical product.
It would be perfectly safe to run in a conduit or to bury. It's just not rated to run between floors, but was stamped saying it was. My home is single story, and I mentioned that during the recall process, but the process was rather rigid...
What does "the process was rather rigid" mean? Couldn't you just decline? (Although maybe it wasn't even safe for your use.) But then, I don't even understand why it would be safe to run except between floors.
Also, did you try pushing back on the replacement worker's work? It's a bitch to do, but I've gotten companies to either redo their handyman's bad work or just let me choose a vendor (which I would ask for after the 2nd time he showed up.) Stapled Manhattan distance wiring is not the same as diagonal through the dirt.
I have another problem with Bluetooth - sharing. With a cable, you always know the device is connected to you and you only. With BT, another person's device can take over your audio even if they don't mean to. They were paired once, and now a simple notification sound can take over your Bluetooth speaker. These invisible connections from multiple devices are hard to track and often cause problems.
I find bluetooth sharing to be a feature. I can make my headphones enumerate which devices they are connected to. They can also connect to both my computer and my phone simultaneously, so I can be using the computer audio mostly and have my phone override the audio input/output when a call comes in.
Most of my problems are with JBL Charge 4, which I often share with friends. Multiple phones get paired and connected during a day. Notifications definitely can stop currently played music from another phone in this case.
The LATENCY is noticeably better. If you are on a zoom call with both parties on headsets and on wired connections (so no echo cancellation needed) or other audio connections I find it noticeably better (less stepping on each other etc).
I have AT&T's fiber product (2 Gbps) and the office has their biz product (300 Mbps) and latency between is good and to google et al is great (< 10 ms in almost all situations).
Wired Ethernet resolves most of my own Teams/Zoom/WebEx anxiety: any glitches in anyone's meeting video or audio are probably NOT caused by my own connection...
At home, my WiFi vs Ethernet ping is just a few ms different. For data, my WiFi 6 devices (most of them now) are faster over WiFi.
But, at work, there's hundreds of ms difference between ethernet and WiFi. I assume it because most offices use mesh networks now, and none here are WiFi 6.
So, network setup and router quality definitely play a part. My AP was ~$200, used for streaming VR.
My broadband access has been wireless ethernet, over a roughly 5 mile hop from the nearest tower, for roughly 20 years. I'd still be using 14.4 kbps modems over copper telephone wire without it. From the post in the south pasture where we pick that up, it's wired to the two dwellings on the place, and pure wireless after that, including extension APs in two more outbuildings and an LR outdoor AP. Rarely have any trouble with any of it. Of course, I don't have to worry about my neighbor's router causing interference - the nearest neighbor with broadband is half a mile away, most of which, from a line-of-sight point of view, is through dirt. Before I retired, I was CTO for a hospital where we ran 14,000+ APs on roughly 100 sites, serving 100,000 or so employees, patients and visitors daily. I regularly used it at a restaurant across the street from one of our buildings. The service was highly reliable and available nearly everywhere. (And, believe me, if you took away a surgeon's wifi that she or he used to do quick email and other updates between surgeries, you'd know it in short order.)
So, yeah, wifi can require some management, but the thought of running all those laptops, phones (you couldn't rely on cell service - which of course is another form of wireless - inside most of the buildings - worked some places, not others), and portable medical devices only on wired ethernet is a non starter.
As for other forms of wireless - Bluetooth and BLE have their issues, but you get an awful lot in return.
This is a true techy topic :) But the author probably mostly has experience with bad environments (like apartment buildings with lots of competing WiFis) and/or cheap wireless devices, or ones that have to make severe compromises (e.g. style, size, or battery runtime over RF performance).
These were the issues I had:
1. SO's cheap wireless mouse produced WiFi interference -> massive packet loss
2. Small 2.4 GHz USB dongles plugged into a USB 3.0 port (e.g. Logitech's wireless dongle or USB WiFi for an RPi) -> USB 3.0 interference renders the dongle unusable
3. The HTC 10 has a metal casing -> incredibly poor BT RF performance, causing massive drop outs (especially bad with my FiiO Q5 DAC/headphone amp, which has a massive aluminum case)
4. Client stays on an AP with very poor signal (barely in range) instead of jumping to a different AP -> retransmissions cause massive degradation for all other clients
5. Lots of APs running at full power in an apartment complex, paired with a router/AP that had low WiFi performance anyway -> bad everything
All but the last can be solved relatively easily (okay, I didn't swap the HTC 10 due to BT, though that really annoyed me; its time eventually came when the battery lasted half a workday without using the phone for anything). Coordinated WiFi APs especially (like Unifi) are great. And even the last point was a non-issue both in our last apartment (one well-placed UAP for testing) and our new old house (four UAPs).
> Any program on your computer can ask your wireless card to enumerate the nearby networks. This causes it to go into “polling mode,” where it spends less time transmitting data and more time listening for routers advertising their network info
Except for the Wi-Fi widget I click on when I want to choose a Wi-Fi network, what software needs this access? I can think of two use cases:
- A Wi-Fi diagnostics application.
- Data collection to correlate Wi-Fi networks and locations.
I don't see why the OS doesn't simply refuse to go into polling except for whitelisted applications, because out of the box there is only one application that needs to ask for polling (my Wi-Fi widget), and I don't care if Google Maps wants to geo-tag the WiFi in my surroundings, so just give it the list from the last whitelisted poll. And if I need a Wi-Fi diagnostic tool, then I am an advanced enough user to figure out how to whitelist it.
Yup. Also, most wireless equipment is just super low quality. It's easy to see this when living downtown in any larger city. You'll easily be competing with 100s of other APs for the 2.4 GHz wireless band. With the 5 and 6 GHz bands this is getting better, but there still aren't nearly enough channels.
> For instance: most people, when their video call stutters, blame their Internet service provider. That’s understandable, since most ISPs are overpriced oligopolists with barely-usable software and horrible customer service. However, every time I can remember helping someone track down the source of their connection problems, the culprit has turned out to be their wifi.
Can confirm. I work on videoconferencing infrastructure, and easily 90+% of the end-user quality problems we investigate are caused by the end-user's WiFi. It's very difficult from a customer support perspective too, because it's common for WiFi to have latency and loss problems that are severe for real-time video or audio but not so noticeable for bulk TCP transfers (like most speed tests) or web browsing. So communicating the issue to users is hard.
USB-C hubs or monitors with power delivery make this really easy these days. One cable gets you everything. Power, display(s), mouse/keyboard, Ethernet, and any other peripherals you need. It's absolutely worth getting a USB Ethernet adapter. They work way better than WiFi.
A wireless mouse is still worth it though. Logitech G305 is essentially perfect IMO. Extremely reliable, extremely low latency, months of battery life, reasonable price.
I have a wireless gaming mouse. It's never been anything but completely reliable. The dongle is integrated into the charging dock, which sits about two feet from the mouse.
I've never heard a single person complain about a wireless mouse. The author is just a crank.
I would advise people to not get a Logitech mouse, however. Their button switches are poorly made and fail much sooner than any other manufacturer.
Also, from what I've read, USB-C has a lot of interoperability problems because the connector is a separate standard from all the different communications standards it can be used with.
My biggest gripe with wired mice is that the wire never seems to cooperate --- it drags on the table, is too stiff, tangles, etc. Plus, latency, at least for the dongled mice (not the Bluetooth ones), is too low for me to notice.
I tried Bluetooth headphones a decade or two ago, and at the time they were unusable. Then recently I thought to myself: it has been almost two decades, wireless audio is probably a "solved problem" by now, especially with its newfound prominence after Apple ditched audio cables. So I bought some new Bluetooth headphones, and I'm thoroughly underwhelmed; it is nearly as bad now as it was then: high latency, audio drops (even some failures to connect at all, with no way to debug). Bluetooth headphones are the new printers: they fail (partially or completely) at their job half the time, and yet people just accept that as a fact of life and go on with their days.
my use case for Bluetooth headphones is noise cancellation on public transport/airplanes. for listening to music latency doesn't matter, although for me it's enough that I can't use wireless headphones to play games on my Switch
I bought a Homepod Mini on sale some time ago, and it's one of the worst experiences I've ever had. It continuously stutters. It stops working. I realize just how magical it is that it works at all. But the realization that it needs WiFi to stream music to, despite having bluetooth, has been quite depressing. I don't think the product will ever work without jitter given the amount of overlapping WiFi signals in my area.
It's a bit like the butterfly keyboard of speakers. A wonderful idea in theory. But it makes you wonder if anyone at Apple has ever tried using it in an apartment building, a dusty environment, or some place that isn't a pristine lab or upper-middle-class suburbia.
I set up a second wifi hotspot in the same room as the HomePods and they still struggle to connect to wifi. At this point I’ve given up and assume it’s like the Apple Watch - the wifi works when it wants to. It’s annoying that Apple makes both products that work great with wifi and products that don’t work great - and the ones that don’t work great with wifi often lack Ethernet connectivity options or I would use that instead. At least my watch can connect to cellular. It’s disappointing when you find yourself wanting 5G home speakers to ensure proper internet connectivity…
I have a stereo pair of Homepods and they work perfectly. Maybe the wifi channel of your AP is congested? Surprisingly, the Homepod also works when there is no wifi available.
It still boggles the mind why there is no wired input though.
On my home network, the HomePod Mini's demand to use both IPv4 and IPv6 is what I'm thinking is contributing to the unreliability of my stereo pair. My ISP doesn't provide me an IPv6 address, so my home router and pihole both seem to struggle with the HomePod Minis. My original HomePod doesn't have this issue.
What I'd really love is if my HomePod Minis stopped assuming the primary HomeKit Hub role and instead let my Apple TV connected via Ethernet handle all of the requests. There's no way to select a device to be the primary hub as far as I can tell.
Good wireless is expensive, and not maintenance free. This is at least true in any business setting. At home, wireless is simply fantastic for mobile devices, but wiring in fixed items (PC, consoles, TVs) is 100% the way to go.
Also, being conscious of what's in the airspace is as crucial as the quality of your wireless equipment. It's all for nought if your microwave, AV sender, Bluetooth headset, or crappy 2.4 GHz wireless mouse is shitting into the same airspace as your network connection.
I say all of this as someone who pushed HARD to switch our office from wired to wireless when we moved. Aside from the direct cost saving (~200k in wiring alone), the side benefits, like not having to use a crappy 'presentation PC' in each meeting room, were a complete revolution. You just bring your laptop, and it's not even going to shit itself roaming from wired to wireless, because boom, you were on wireless the entire time anyway. The same principle about wiring in fixed items applied though: we put fancy iPad meeting-room control stuff in, and dropped extra on Ethernet adapters for them because they're fixed devices; wireless is not a good choice for that despite iPads being primarily wireless devices.
Lack of visibility is probably the biggest shortcoming for enthusiast-level home wireless. I've had good outcomes buying ex-business Aruba gear and repurposing it for home use, but it's power hungry and you need to know what you're doing to get it off the ground. But being able to actually see what's sitting in the spectrum and watch the impact of things like turning on the microwave for ten seconds is a game changer.
I have everything wired except my headset (Technics EAH-A800), which works well, I haven't had any problems with it. Network cable vs Wifi is a REALLY different experience in terms of performance and ping, which I care more about for gaming than for work.
This is really noticeable with video chat. I never use Bluetooth when talking with my family... between the Texas - Australia lag and the Bluetooth headphones - Android phone lag, it's enough to impair conversation.
Now the problem is just how to manage all the cables that come from having a gaming PC and work laptop that use the same keyboard and monitor. I have a KVM switch but I can't hide it under the desk if I want to press it.
Most KVMs have a keyboard shortcut that will let you switch, commonly double-tapping scroll-lock and then hitting 1 or 2 for the first or second input.
> I have a KVM switch but I can't hide it under the desk if I want to press it.
Why not? I bolted mine to the underside of the table roughly 20 cm (about 8 inches) back from the edge; it's both invisible and the button is trivial to find without fumbling.
I dunno my wireless just works. Two people in the house running meetings on wireless. TV is on wireless. Everything seems fine.
My wireless keyboard acts up sometimes, but I blame the keyboard not wireless.
I do run wires when I can, but that is mostly just the equipment near the modem
Perhaps it's because we are in the suburbs and the house is fairly open. Or maybe I just have my wireless set up properly.
I convinced my 8yo offspring that crawling like an Alcatraz escapee over the under-house dust, asbestos, DDT, and probably decaying possum was "fun", and we did it together.
OTOH he got the most fun using the power tools drilling the holes.
AX has BSS Coloring, which really helps improve performance when multiple APs are on the same channel. You still have the saturated-channel limitation, but that rarely happens on 5 GHz.
Anecdotal personal experience, but I don't think it would take 802.11ax to improve available bandwidth, just better AP hardware.
I replaced a high-performance n/ac router with a years-old leftover enterprise AP. Price-wise, the enterprise AP would've cost about as much second-hand.
It improved the situation in a very very crowded airspace in every measurable metric (~10-50%). Better latency, average, minimum and maximum speeds, connection stability, packet loss and even security (PMF support). All that on the same standards and other conditions.
My suspicion is that consumers are being sold bottom-of-the-barrel hardware, and new protocols just raise the bar on what s*t they can get working.
Wireless network links should be exclusively the domain of nomadic, temporary or sacrificial endpoints.
If the device is expected to be used in a fixed spatial or temporal manner, or you care about the data conveyed across the link, constrain its path as much as possible with a waveguide.
I prefer all wired connections. Many things are wireless but I use wired as much as possible. I have a wired internet connection, keyboard, mouse, display, telephone, etc. We can clearly know what it is connected to without needing configuration, there is less wireless interference, spying, etc.
The radio is wireless, but that is public broadcast and is not a connection to a specific device. Sometimes it is unreliable, but that is not much of a problem in this case.
> For instance, on my Mac’s built-in bluetooth, my mouse (a Logitech MX Master) displayed noticeable jank—stopping, then jumping, instead of moving smoothly.
Anecdata. On my Mac the MX Master 3 works flawlessly via Bluetooth. However, it has janks when I use it on my recently purchased ThinkPad running Linux.
> Similarly, when connected to Mac bluetooth, my Jabra Evolve 75 headset would frequently have the mic or sound drop. It (mostly) worked fine on its own dongle.
Again, all my Bluetooth headphones work fine on my Mac, but on some (not all) of my Linux computers they sometimes work and sometimes don't. I don't know if it's Linux or the hardware.
Also, on the aforementioned ThinkPad, when I connect a new device via bluetooth the wifi disconnects and immediately reconnects 40% of the time. Don't know what's up with that.
I've been using Linux long enough to take things like these in my stride :)
(Btw, it was after I was issued a Mac by my employer that I started using Bluetooth devices regularly; before that in Linux it was mouse via IR-usb-thing, wired headphones etc.)
I agree with the general sentiment that wired devices are inherently more reliable. But I don't know if we should collectively move towards wired, or improve the reliability of wireless.
> Again, all my Bluetooth headphones work fine on my Mac, but on some (not all) of my Linux computers they sometimes work and sometimes don't.
The issue I'm currently dealing with is that my Bluetooth ZMK keyboard works flawlessly on Linux but won't pair with a Mac. I guess devices are nuanced.
Since the migration of most employees to working from home I've run into this on a nearly weekly basis in IT where wireless works until it doesn't. Corporate networks can be challenging enough to tune in an environment where you have a lot more access and control than someone at their home. Often this is in a dense urban setting where their setup consists of connecting to the cheapest wireless equipment their ISP could buy.
Once the usual quick checklist items are exhausted, suggesting that maybe we ought to just run a cable before getting into troubleshooting buggy wireless drivers, old hardware that doesn't support modern 802.11 amendments, or a full spectrum analysis is often dismissed; and even when they are open to it, no one seems to own an Ethernet cable these days, or their gateway is in a closet, basement, or some other inconvenient location.
Someday I hope that installing Ethernet jacks throughout the house becomes as commonplace as it once was to do the same for phone lines, but to many it is an afterthought as they are used to their crappy wireless.
> You might think this could be solved by having routers automatically figure out the least interference-prone channel to use, but many of them seem to be quite bad at this.
Partly because of the physics involved. The AP can't magically sense the best channel for the clients, since it can't sense the spectrum at the client's end.
- I can move houses and just put all my WiFi mesh nodes anywhere and I’m online again.
- I can play music from my phone while cooking, and use the phone beside my stove to view the recipe simultaneously
- I can browse photos and create a photo book online sitting next to my wife on the couch, instead of sitting at an uninspiring desk in the working from home room
I feel the exact opposite. When I was younger, I thought wireless anything was terrible. Slow, inefficient, finicky, requiring constant charging.
But since then I've realized things have changed. It's incredible how much wireless and battery technology have improved in the last 10+ years.
If you want to perform at the top level in most games, you need a wireless mouse. It allows for significantly easier and freer movement than a wired mouse, even with a bungee. Charges can last for weeks, and due to faster charging can charge in an hour or two.
And as my list of equipment and peripherals grows, I find myself wishing more of them were wireless so I don't need to deal with them getting tangled, needing to re-route them, them being too short if I raise a desk up, etc.
However, there are some exceptions:
* Internet. The reliability problem has largely been solved if you have decent quality equipment. (I have a Ubiquiti WiFi 6 Lite, and it works great.) However, the delays are still there, and some house/apartment layouts and materials just aren't workable. I think people overall underestimate WiFi, though, as most setups are done very poorly. The crappy WAP in your Comcast router/modem/switch/WAP combo box is terrible and will never perform well no matter what. But a good dedicated setup, with WAPs spread out and placed properly, will do amazingly well.
* IEMs. There is no standard for transferring high-resolution uncompressed audio wirelessly right now (Bluetooth cannot do it), so the only way to get the best experience is a wired IEM. This isn't true for all digital audio transfer, like streaming; that can be done wirelessly just fine over WiFi (e.g., wirelessly sending audio to a Raspberry Pi, which then sends it to a DAC, etc.).
The moment such a standard exists for IEMs, I will gladly swap over. The wires constantly get tangled, are too short, get damaged, or rub against things while moving leading to undesirable sound coming into the ear.
I always use Ethernet at the office because Wi-Fi sucks when everyone is connected to it at the same time. Even though they use commercial grade routers.
At home, switching to Wi-Fi 6 has solved some of the dead-zone problems. I notice a stark contrast between the performance of Wi-Fi 5 and Wi-Fi 6 devices.
I work in the "Wireless and Digital Systems" division of my company. Nobody distrusts wireless products like wireless engineers do.
That said, I adore wireless headphones and would never go back. Whether it's my earbuds for listening to podcasts while doing the washing up, or my BT headset that I use during the day for work. My BT headset (a Bose noise-cancelling one) is definitely the most frequent cause of call issues (now that I've fixed a motherboard problem causing wifi dropouts), but it's still worth it to be able to move around and still hear what's going on in the call while I get a drink, etc.
Wi-Fi used to be bad around the 802.11b era. After 802.11n I barely even see a need to upgrade. A cheap consumer router easily covers my 2-floor house with a half dozen clients without any dead spots.
I imagine concrete houses or congestion might be different but I can’t even find a reason to get a mesh system in my house.
Wireless mice and keyboards also work great. I use them even for gaming where many swear by wired.
You can't hide a mouse cable the way you can hide an internet cable. It's literally on your desk no matter what you do, so the upside of wireless there beats the downside. Same with a wired internet connection to a smartphone…
What's a rough ratio of power consumption by wifi compared to cable? I'm guessing the wifi transmitters are like radio towers but shorter, emitting radiation radially from the "tower", unless orientation doesn't matter and there are two other wires so coverage is spherical? Our mesh network units each have an obvious base, though I haven't looked inside yet.
I don't care much about heat from electronics when I'm trying to warm the space anyway, but I'd rather minimize it in the summertime.
> It was the epithet of King Harald Bluetooth, who united the disparate Danish tribes into a single kingdom; Kardach chose the name to imply that Bluetooth similarly unites communication protocols.
[https://en.m.wikipedia.org/wiki/Bluetooth#Name_and_logo]
Another thing, not mentioned in the article, is that microwave ovens tend to interfere with Bluetooth in my experience.
Not just your experience. I worked for a company that made a lot of wireless devices. We had a pretty open-plan office and learned not to test anything around lunchtime.
Seriously. There were many times that I was working on something that suddenly had a spike in the receive failure rate and I'd look at the time and realize that somebody was heating up their lunch in the microwave.
I haven't had problems with Bluetooth in a while - love my TWS earbuds, RF mouse, BT mouse and BT/RF keyboard. Those devices last for months/years on two AA batteries (other than the headphones of course). I will concede that the audio delay degrades the call experience a bit.
Wifi on the other hand... I have no idea if it's interference or the modem. It worked fine for months, then all of a sudden my connection went to 2Mbps. And my Chromecast can no longer connect to it, etc.
> That means that if I’m chatting with a friend in New York, the audio data will take about 50ms to get from them to my computer, and, say, 200ms—4x as long—to get from my computer to my ears. Since high latency ruins the natural flow of conversations, I’d like to eliminate as much of it as I can.
For whatever reason this isn't a problem for me. I work with a laptop connected to my WiFi, use bluetooth headphones, and have 2-3h of meetings per day. Not a problem whatsoever.
Maybe because you just talk, or just listen. Having a conversation where you go back and forth is when it's the worst, especially with multiple people.
The further problem with the Logitech device can be explained by driver or firmware updates: the driver may upload a firmware blob when the device is powered up, and if the combination of the two is not well tested you might get some unwelcome surprises. Another, more mundane, reason for jankiness could be I/O saturation: your mouse will not necessarily be prioritised when a variety of workloads are being handled by your OS.
> Qt included a component which would poll for networks every 30 seconds whenever a “network access manager” was instantiated, causing pretty much any Qt app using the network to degrade your wifi for ~5 out of every 30 seconds.
This intrigues me and makes me wonder:
1. Why should it need to do this?
2. Why should this degrade the wireless network performance? (I don't recall the details of wi-fi but shouldn't it be able to do this passively without disrupting anything?)
What I vaguely remember from the time (~10 years ago) is that this was a requirement for phones (e.g., when Qt is used on a Nokia phone): the network request had to go to the interface preferred by the user (so as not to use the cell network when wifi is available).
Anyway, the whole thing was implemented in a "bearer manager" plugin, and that thing was quite buggy. https://doc.qt.io/qt-5/bearer-management.html
I think that apart from people running Qt on phones, this had virtually no use. And yet it was enabled by default on all platforms, and caused lots of trouble. What we ended up doing for the application we were shipping was to make sure this plugin was not included in the final package.
2. If it needs to send/receive at the lowest bitrate (used for discovery/broadcasting), that takes up a ton of wall-clock time, which is equivalent to a lot of bandwidth at the highest bitrate.
1. If you want the machine to switch to stronger / more preferred networks when available it needs to know about them somehow.
2. The radio in most (all?) WiFi radios can only tune to one channel at a time, scanning all channels requires retuning to each one, which takes a little time.
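Some rough arithmetic shows why that retuning hurts; the channel counts and dwell time below are assumptions (they vary by regulatory domain, band, and driver), but the order of magnitude lines up with the "~5 out of every 30 seconds" figure quoted earlier:

    # Estimate how long a full scan keeps the radio off its serving channel.
    CHANNELS_2G = 13      # assumed: 2.4 GHz channels scanned in many regions
    CHANNELS_5G = 25      # assumed: commonly scanned 5 GHz channels
    DWELL_MS = 100        # assumed per-channel dwell time (probe + listen)

    scan_seconds = (CHANNELS_2G + CHANNELS_5G) * DWELL_MS / 1000
    print(f"~{scan_seconds:.1f} s per scan with the radio unable to pass traffic")
    # Trigger that every 30 seconds and a noticeable slice of airtime is gone.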
> Most Bluetooth headsets introduce around 150-300ms of latency (the time between my computer receiving the audio from the Internet, and the sound coming out of the headphones)
I must have an incredible $10 BT speaker (used for quiet TV listening at night). I don't see any delay versus the regular audio (plain old 1/8" audio jack out of the Android box, to the stereo). You would notice 150-300 ms as a serious lip-sync issue.
Modern Bluetooth devices can communicate their internal latency, and modern AV stacks use that figure plus known physical layer latencies to delay the video just enough to compensate for that.
For interactive audio like games or video conferencing, this doesn't work of course.
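As a toy illustration of that compensation (all the latency figures here are made up):

    # Toy A/V sync: delay video playout to match the audio path's total latency.
    sink_reported_ms = 180   # assumed: latency the Bluetooth sink reports to the host
    local_pipeline_ms = 20   # assumed: host-side decode/mixing latency

    video_delay_ms = sink_reported_ms + local_pipeline_ms
    print(f"delay video by {video_delay_ms} ms so lips stay in sync")
    # Games and calls can't use this trick: delaying the video doesn't remove
    # the delay you experience when you have to react in real time.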
Working in Logic (just a hobbyist here) and gaming are the only two activities for which I connect my wireless headset via USB-C. The latency (because of the Bluetooth protocol itself) is too much.
If you're watching non-interactive content, the device can just play the audio a little ahead of the video to keep things synchronized.
I've noticed this with YouTube where sometimes the audio gets out of sync when listening on bluetooth headphones and I have to pause and re-play to force it to sync again.
I built my house with this guy's new mindset. There's 2 Ethernet jacks in every room. Even the closets, bathrooms, and garage. I have 3 wifi routers with matching SSIDs as well, because we're not animals... but everything can be wired if I want it to be.
There's a tradeoff with wired headphones. Sometimes they act like their own antenna, and if you live in an area with very poor reception, then that becomes a huge problem. In that scenario, bluetooth earbuds are actually preferable, until they run out of battery mid-call.
Not saying this is impossible, but it doesn't make sense to me.
The phone antenna isn't a straight piece of wire entwined with your cable (probably not even parallel with it), and your cable isn't connected to the antenna port.
Maybe they mean the noise you can hear from the phone as it communicates. I wonder if it is more pronounced when reception is bad as it searches for other towers? I had wired headphones + mic with an amplifier for a while and if I put my phone anywhere near it you'd hear the classic GSM buzz.
So they do share ground, which is where the interference comes from. You might find sometimes even stereo sound speakers pick up a ticking noise right when you receive a phone call. This is pretty common with speaker systems, wired headphones being a type of speaker system.
Also, a lot of SoCs with FM antennas use the 3.5mm ground as the antenna, so they are wired into the SoC in such a way that they can receive undesired interference.
I agree very much with what Ben has said.
In my experience, I elect to use my (nominally Bluetooth) noise-cancelling over-ear headphones with a 3.5mm cable plugged in, and the webcam for voice. Life with Zoom is much improved compared to using them as a wireless headset.
My covid project was wiring my house with Cat-6. Getting wires through existing walls is an art form, and pretty rewarding when you get a run working! I'm by no means a pro now, but I do own a 6-foot drill bit.
I ran a sixty foot HDMI cable and a powered USB extension cable through my crawlspace from my office desktop to my living room television. Relaxing on the couch is so much better than sitting at a desk.
Yet we're moving quickly to millimeter-wave 5G tech... which apparently doesn't have the same congestion issues that WiFi or older cell technologies do.
Wireless (and LTE hotspots) gives me the freedom to take my laptop and work wherever I like.
I don't need cabled connections of any kind 99.9% of the time, Wi-Fi, Bluetooth headset, and my laptop "just work". Any stability and performance gain from using a wired setup is negligible in daily life (apart from very specific tasks) and convenience of wireless is far outweighing any gains of wires.
> I yearn for the day that I will get to work at a company that gives me a wired connection at my desk.
It boggles my mind that any company wouldn't do that. Wifi sucks so badly and offices are so dense with people. The cubes at my office have several network drops each, across many different cubicle systems. I figured they were standard equipment.
Wired would be so much better if it were just a replacement for wireless, where you just plug a wire between two points, and they start communicating without any configuration necessary if the two points already had a wireless communication going.
It... usually does? For example, I can plug in an ethernet cable to my router and computer, and it works perfectly fine. And if I unplug, it's still connected, just by wifi.
I blame the wifi. I can ssh from one wired machine to another just fine, but when I unplug the cable the wifi doesn't seamlessly take over. This is clearly the fault of the wireless.
I think it's a problem with the TCP/IP stack somehow. If I try the same thing when downloading something from chrome, the download fails. If I'm loading a website I get a "connection interrupted" error. Really there's no reason that my device shouldn't have its own IP and treat wifi and ethernet as two different routes to my device. Remember "IP routes around damage"?
Nothing to do with Wi-Fi, the same would happen if you had 2 Ethernet connections.
Your Wi-Fi and Ethernet are both network interfaces with their own IP addresses - those addresses don't move when you switch between interfaces. If you established an SSH session while on Ethernet and then unplugged it, that session will not carry over to Wi-Fi unless you change your Wi-Fi interface's IP address to the wired one's and both networks are on the same L2 segment (typical for home networks, but not always the case for enterprise).
A potential solution is either MPTCP (which will establish multiple connections over all interfaces and can tolerate the loss of all but one of them) or to VPN into your router and use the VPN link as your default route - this means that the VPN connection might drop and reconnect but your IP address to the other hosts (which is actually the one of your VPN gateway) will remain constant, so traffic flow will be restored as soon as your VPN reconnects even if your local IP changes.
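To illustrate why the session is pinned to one interface in the first place, here is a minimal sketch; the addresses are hypothetical:

    # A TCP connection is identified by (source IP, source port, dest IP, dest port).
    # Binding to the Ethernet interface's address fixes the source IP for the
    # connection's lifetime, so the Wi-Fi interface's different address cannot
    # silently take over when the cable is unplugged.
    import socket

    WIRED_IP = "192.168.1.50"        # hypothetical Ethernet interface address
    SERVER = ("192.168.1.10", 22)    # hypothetical SSH server

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.bind((WIRED_IP, 0))         # source address chosen here; port picked by the OS
    sock.connect(SERVER)
    # If the wired link drops, packets to and from 192.168.1.50 stop flowing and
    # the session eventually times out. MPTCP, a VPN as described above, or moving
    # the same IP onto the Wi-Fi interface are ways around that.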
If you configure things so that the wifi adapter gets the same IP address as the wired connection you should be able to continue communicating with those machines. SSH uses TCP which is connection-oriented so as long as the connection can be made it should work (bar problems with MTU etc.). You might experience a stall before the connection becomes alive again but it should work.
Sun/Solaris machines used to have a machine-specific MAC address. Then DHCP, bootp, etc. would give it the same IP independent of interface. I wonder what would happen on a current *BSD/Linux system if you switched the MAC address over.
It's been literal decades since I configured this (and carried it forward with the rest of my dotfiles), but there's a configuration setting that makes SSH reconnections of existing sessions seamless. No, I don't know what it is off the top of my head, let's call it an exercise left to the reader.