Oh FFS. So they approve 9.6Gbps Li-Fi aka IEEE 802.11bb, and the devices shown in the photo of the prototype are only capable of 1Gbps, yet every article, including one submitted last week, said "100x Faster" [1] than WiFi at 224GB/s, i.e. 1792Gbps.
And just like with 802.11be and 802.11ax, I had to read up on the actual spec again to understand WTF is going on instead of relying on tech reporting. And this isn't just a problem with reporting; the actual PR of Li-Fi can't even do half its job.
IrDA above 115kbps is basically dead, and I don't think anything beyond 4Mbps is feasible (i.e. months spent on an FPGA to handle all the issues).
I'm a big fan of using simplified IrDA in 2023 in hobby electronics. But I don't see a smooth path to 9.6Gbps.
1Mbps to 10Mbps is already exhausting for me to think about.
--------------
Is there any industry chip with planned development to handle this standard? Finding a damn LED and photodiode that is guaranteed faster than 20ns to feasibly handle 4Mbit is horrifying. Let alone the signal processing circuits (I think you need 20MHz op-amps, which isn't a 6001 or LM358, if you know what I mean) or the FPGAs needed to handle these relatively low speeds.
Or are we doing some kind of multi-frequency, multi-LED thing? Are there any LEDs or photodiodes that can emit and detect such a narrow band? What are the sources of IR noise that would mess with those frequencies?
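To put rough numbers on the component-speed point above, here's a back-of-envelope sketch (my assumptions: plain NRZ on-off keying, one bit per symbol, edges kept to ~10% of a bit period, and the classic bandwidth ~= 0.35 / rise-time rule of thumb):

    # Back-of-envelope: link bit rate vs. required LED/photodiode speed.
    # Assumptions (mine, not from the spec): NRZ on-off keying, one bit per
    # symbol, edges held to ~10% of a bit period, BW ~= 0.35 / rise_time.
    def rise_time_budget_ns(bit_rate_bps, edge_fraction=0.1):
        bit_period_ns = 1e9 / bit_rate_bps
        return bit_period_ns * edge_fraction

    def component_bandwidth_mhz(rise_time_ns):
        return 0.35 / (rise_time_ns * 1e-9) / 1e6

    for rate in (115.2e3, 4e6, 100e6, 9.6e9):
        tr = rise_time_budget_ns(rate)
        print(f"{rate / 1e6:9.3f} Mbit/s -> edges under ~{tr:9.3f} ns"
              f" -> ~{component_bandwidth_mhz(tr):9.1f} MHz parts")

So 4Mbit already wants roughly 20ns-class parts, and 9.6Gbps (if done this naively) lands you in tens-of-GHz territory.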
-------
Oh, but 115kbps is plenty useful for a hobbyist, and I highly recommend people play with it.
There are much faster IR links for industrial applications; they're used, for example, in automated crane systems. I don't know the data rate offhand, but we've had a crane with such a Profibus link in our factory for over 20 years.
Well... I think I can visualize a 100 Mbit IR link in theory.
If we use Wi-Fi-like processing (256-QAM and the like), we'd only need 12.5 MHz-class LEDs and photodiodes to communicate at 100 Mbit (assuming 8 bits per 80 nanoseconds, or thereabouts). I'm pretty sure that cheap LEDs with 50-nanosecond aka 20MHz speeds, as well as analog components with high gain at 20MHz (I dunno, a 1GHz gain-bandwidth product, giving x50 gain at 20MHz), are all possible.
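Sketching that trade-off (assuming roughly one symbol per Hz of bandwidth, i.e. ideal Nyquist signalling with no coding overhead, which is optimistic but fine for an estimate):

    # Higher-order QAM lowers the symbol rate (and thus the analog bandwidth)
    # needed for a given bit rate. Assumes ~1 symbol per Hz, no coding overhead.
    import math

    def needed_bandwidth_mhz(bit_rate_bps, constellation_size):
        bits_per_symbol = math.log2(constellation_size)
        symbol_rate = bit_rate_bps / bits_per_symbol
        return symbol_rate / 1e6

    for m in (2, 16, 64, 256, 1024):
        print(f"100 Mbit/s with {m:4d}-QAM -> "
              f"~{needed_bandwidth_mhz(100e6, m):6.1f} MHz analog bandwidth")

The catch is that each extra bit per symbol costs roughly another 3 dB of SNR, so the analog chain has to get quieter as well as merely fast enough.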
I think the article's claims of 9.6 Gbit are so absurd, however, that I'm having a great amount of difficulty seeing how it gets there.
I admit that I wasn't aware of the automation IR links that you posted. I'm impressed that IR has gotten that good today, but it's not so good that it's become unbelievable. The 9.6 Gbit headline on this article, however, is just absurd at face value. I really need to see a physical demo before I'll believe something like that.
-------
I know that IrDA is a 90s standard made for far slower LEDs and components. But it's a good baseline for thinking about the issues that occur in the space of light-based communications.
I'm sure once the demand is there, the silicon will be mass produced.
I like 38kHz-modulated IR more than 115kbps: you can get really good range at low cost and power without a high parts count, and if you have large amounts of data, rather than just stuff that takes advantage of directionality, you can do it over Bluetooth for cheap.
Never got a chance to play with 115kbps though, seems like it could be pretty cool, but I'm not sure what I'd do with it.
Edit: I re-read your comment and I think I see what you might have meant now. I think you were asking if there would be a solution for low speed devices to use this standard.
I'm asking if the LEDs that support 9.6Gbps (what is that? A 1ns or faster LED?) even exist in any substantial way.
Isn't that like, laser speeds? Not overhead LED speeds. And over the air? Not inside a fiber line?
This is basically a 10Gbps standard for fiber lines except over the air. All those issues need to be solved without the guarantees or isolation that fiber optic lines provide.
That's just transmit. Receive has to deal with the same issues: amplifiers and other electronics that move at the requested speeds.
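A rough sketch of why the receive side is hard (the 10 pF junction capacitance and the load resistances below are my own illustrative assumptions, not figures from the standard): a photodiode feeding a load resistor rolls off at f_3dB = 1 / (2*pi*R*C), which is why fast optical links end up needing tiny photodiodes and low-impedance transimpedance front ends rather than a simple resistor load.

    # Photodiode RC bandwidth sketch: f_3dB = 1 / (2 * pi * R * C).
    # The 10 pF junction capacitance and load values are illustrative guesses.
    import math

    C_junction = 10e-12  # ~10 pF, plausible for a small PIN photodiode

    for r_load in (1e6, 10e3, 50):
        f_3db_mhz = 1 / (2 * math.pi * r_load * C_junction) / 1e6
        print(f"R_load = {r_load:9.0f} ohm -> f_3dB ~ {f_3db_mhz:8.2f} MHz")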
I don't know; I was pointing out the data rate that seems to be part of some COTS devices, not the data rate you can negotiate with research-lab hardware.
Single quantum well LEDs are well-commercialized at 150 Mbps for MOST150 (which, running over POF, was intended to be cheaper than an Ethernet PHY and copper); I'm not sure exactly where SQW LEDs drop off and laser diodes become the only option, but it's substantially faster than traditional LEDs.
Could one PWM such a light to display high-bit-depth color content? By my estimates: to send 9.6Gbps you need (at least) 9.6 GHz, and that's 160,000,000x faster than 60Hz. log2(160 million) = 27.3 bits.
I'm sure that this idea is more of a fun hackaday-style gimmick (i.e. not productionizable for various reasons), but would be fun to see.
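For what it's worth, the arithmetic above checks out (idealized: it assumes the whole bit rate is spent dithering a single brightness channel):

    # How many brightness levels could 9.6 GHz worth of on/off slots give
    # within one 60 Hz frame?
    import math

    slots_per_frame = 9.6e9 / 60
    print(f"{slots_per_frame:.0f} on/off slots per frame -> "
          f"~{math.log2(slots_per_frame):.1f} bits of PWM depth")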
I'm curious who has a problem that this solves. You need to have line of sight to something very nearby. So you'd need to wire 10Gb Ethernet to the LiFi points.
This would be fantastic in my current apartment. My immediate surroundings are completely saturated by WiFi access points across both 2.4 and 5.x GHz making all wireless connections unreliable at best, and the layout of my apartment is such that the only way to have a wired connection in the living room where my desktop sits is to run an ethernet cable across the floor.
So I run an ethernet cable across the floor, and it's not great.
I also think VR/AR use cases may benefit from this eventually.
I wonder, does this pollute the IR in your apartment so that your TV remote would no longer work? Although with streaming set-top boxes, those are using RF most of the time these days.
I’ve actually tried this too! Had great luck at an apartment in the past, but the wiring here is also really old, and I had issues getting reasonable speed.
I need to look into newer generation Ethernet over power to see if it’d work better than the aging pair of plugs I purchased a number of years ago.
My guess is that if the power cables suck, the newer generation will have bad speeds too. If you have the option, buy a pair with the right to return them and try it. I was very happy with this solution.
I have the same problem here and I'm considering buying a 6GHz tri-band (2.4 + 5 + 6GHz) Wi-Fi 6E router. Problem is: the only ones I can find are "gamer" routers bloated with features I will never use and priced 5x to 10x a regular Wi-Fi 6 (without the "E") dual band (2.4 + 5GHz) router.
Both my phone and my computer support 6GHz, but it's still hard to find a reasonably priced router.
I was going to suggest the Banana Pi BPI-R3, but that still doesn't have 6 GHz. The successor to that one is going to have Wifi 7, but might not get supported by mainline OpenWRT.
It is virtually impossible you’re getting enough distortion from other 5GHz networks to matter, unless your walls are made out of tissue paper. I would recommend analyzing your own devices for faults.
5GHz is significantly better than 2.4, but I still get latency spikes that make it unacceptable for gaming.
And my range is atrociously bad.
It’s an old building, and I live on the top floor just below a roof with assorted antennas. I think there’s some microwave gear up there that would explain my issues.
I’ve wanted to do a site survey for awhile to figure out what’s up, but have just lived with the Ethernet cable across the floor.
But that means your distortion issue is not 5GHz networks.
Way back when I was still in university, if I would walk out of my room and close the door (which would put a few cm thick wooden door, some plaster wall and about 14m between me and my router) I would already have a significant reduction in connection strength.
Even with the ridiculous walls Americans use for their homes, there is just no way a neighbor's 5GHz signal is strong enough to cause significant distortion. And if there somehow is, there is enough room on the spectrum to choose a different band.
Thinking on it, almost all civilian microwaves also only operate at around 2.4GHz. And since you mention sudden spikes, I would check for radar interference and whether you are on a band that's radar-assigned.
> Thinking on it, almost all civilian microwaves also only operate at around 2.4GHz.
I spent a number of years as a network engineer working with consumer microwave gear, most of which was in the 5.2-5.7 GHz range. There is plenty of this stuff around.
And I’ve seen what a poorly aligned backhaul can do to the wireless networks in the vicinity.
That said, I’m not committed to a specific reason it’s not working well, it’s just not. The point was that wifi is not a good option for me for some use cases given the apparent spectrum use here.
I think it’s probably a combination of factors since I live in a dense city. Radar is certainly a possible factor.
I own my condo but have this exact issue. There are 1Gb Cat5 cables inside the cement walls and I cannot replace them if they ever break, nor upgrade them.
Already this is a problem as I have 1.Gbit internet.
> There are 1Gb Cat5 cables inside the cement walls
I believe you when you say you can't replace them, but I'm curious why? Every wire I've ever seen run through concrete was in a conduit that could have replacements pulled through it. Did they just run the wires and pour over them?
If you need speed and money is of limited concern, consider fiber. There are some nearly invisible configurations that stick onto the walls, and baseboards or crown molding can be a channel as well. It's possible you could sneak it under a door, or through a small gap in the frame behind the trim. You might even be able to smuggle it through electrical conduit for short distances, since it's non-conductive (but don't trust me on the safety or compliance issues of that; I wouldn't trust myself, though I've seen zanier from the previous homeowners here...)
The obvious downside being the extra expense of media conversion.
Is MoCA reliable? Just bought a new house and wiring Ethernet into some of the rooms will be tricky, but a previous owner already did coaxial in the whole house.
I think I'm running into some of those gotchas with older wiring. I've got MoCA adapters set up between a few rooms and have periodic dropouts on one of them, which unfortunately is the room with my wifi AP.
Traced the cables, and the only splitter I can find is a MoCA 2-aware one, and the coax cable that would have been used for cable internet is unplugged, so that's not interfering.
Nice when it works, real pain to troubleshoot when it doesn't.
Two things solved my problems with MoCA: buying a MoCA adapter to filter the signal out so it doesn't leave/enter the house (an adapter on the primary incoming coax), and then another adapter on the coax plugging into your cable modem.
This fixed some (probably modem-specific) problems where the internet would randomly drop in the presence of MoCA.
There is no such beast as 1Mb Cat5. Do you mean 1Gb? Assuming the cables are in conduit, you may be able to use the existing cables to pull new ones through. This usually works.
Oh no, maybe, but I would have no way to fix it. The run is literally from one side of the condo to the other, probably through the ceiling or floor. Or it's wired wrong (and I lost my LED blinker somewhere :( )
because it's like 5Gb/s down and 1Gb/s up. Which is odd...
CAT5 is in theory only rated for Fast Ethernet (100M), while CAT5e is rated for Gigabit Ethernet. In practice it's very hard to actually find a cable that meets only CAT5 and not CAT5e.
Just a vaguely related anecdote, I consulted for a company that had two offices on a river, in old converted houses, separated by a marina. Running a cable proved problematic for various reasons, both bureaucratic and relating to the nature of the site.
To connect the networks between the two buildings, they had a laser connection set up in the widow's walk on the buildings. It worked great except when it was foggy or raining very heavily, or when the marina moved a sailboat so that its mast blocked the beam.
Point-to-point WiFi links are pretty accessible nowadays. I bought a €60 no-name set from Amazon to link two buildings 250m apart that don't even have line of sight (other buildings and trees are in the way) and it works well enough for my needs (CCTV).
The same reason people use Bluetooth audio. It's more convenient because there's no cord. (But there's a battery to manage, interference, pairing, and inferior audio quality).
Two university buildings separated by a privately owned house or two come immediately to mind. Not sure if they still do, but my local university used to have IR links between buildings.
Once it's widespread in consumer gear, I'm sure it's only a matter of time before people start hacking it to operate over longer distances, similar to what's been done with custom WiFi antennas, parabolic dishes, etc.
I'm imagining offices that put Li-Fi devices on the ceiling and laptops made with Li-Fi in the edge of the screen lid so that you can just open your laptop and it will start using the Li-Fi for faster speeds.
Someone mentioned VR headsets (if they can actually make the advertised speeds work).
Office spaces are going to eat that shit up. They are already running Ethernet everywhere, and they already have many wifi access points. This is probably lower-crosstalk, faster and more localized than wifi.
Seems like the fact that it's a "tightbeam" vs. WiFi's broad coverage could result in some fun niche hacks, like tiny autotracking IR transceiver turrets that follow devices around a room or (working together to hand off the connection) a building, or larger versions that keep two transceivers pointed at each other over a longer point-to-point link. Who will be the first hobbyists to establish a LiFi link bounced off one of the old retroreflectors on the moon?
If the IR is bright enough, it might even make sense to put a NIR camera in every access point so it can double as an unobtrusive security camera, and/or track devices (for beam-forming, forensic information for physical security staff, etc.).
Do you have a reference for any of those? I can only find papers about lab setups. Since the wavelength of IR is much shorter, phase control of the different emitters is much harder to accomplish than in radio. Moreover, thermal deformation of the device would affect the phases much more significantly.
Perhaps it would even be useful to do networking using audio (like dial-up sounds or cassette tapes). One could carefully pick sounds to be the least annoying and have, say, a printer announce to the world that it is still online. One could probably transfer well over 50kb/s, but it hardly seems necessary if one rarely prints anything. I already scream at rarely used printers; it would be great if they would listen for print jobs. They could take dictation too. Perhaps it would be funny to have the entire print order in human speech.
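For what it's worth, a quick Shannon-capacity estimate suggests "well over 50kb/s" is at least plausible in principle (the 20 kHz bandwidth and the SNR values below are my rough assumptions for a speaker/mic pair, not measurements; real acoustic modems do far worse because of echoes and speaker rolloff):

    # Shannon capacity C = B * log2(1 + SNR) for an acoustic channel.
    # Bandwidth and SNR figures are assumptions, not measurements.
    import math

    bandwidth_hz = 20_000  # roughly the audible band
    for snr_db in (20, 30, 40):
        snr = 10 ** (snr_db / 10)
        capacity_kbps = bandwidth_hz * math.log2(1 + snr) / 1000
        print(f"SNR {snr_db} dB -> theoretical ceiling ~{capacity_kbps:.0f} kb/s")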
iPhones have some weird feature to listen for coughing or water(?). We already have doorbells and door knockers sending out audio signals. (Video doorbells are great, but knockers are more reliable.) Lots of devices in the home make identifiable sounds. Combined with the security footage, you could generate a film of yourself using your wireless drill, every time you look in the fridge, open the front door, when you shoot someone, etc.: "vacuum cleaning the movie".
The vacuum cleaner could be asked for a status report (its serial, build date, how full its bag is, etc.) using a human-friendly, machine-readable audio signal.
It seems much more secure than evil men silently communicating with your devices over the internet.
Am I correct that this would require direct line-of-sight for any device using an antenna[0] enabling this? It seems a bit impractical for your typical consumer use-cases if so, but maybe beneficial for some more commercial scenarios.
Would love to know if I am wrong here though! It seems very interesting.
I think the idea is you have normal wifi as well, so you should never lose the connection but certain preferred positions will go much faster using light.
I’d be curious to know how it impacts things like FaceID. Or if bathing a room with rapidly modulated IR would affect thermography (as used in, say, energy efficiency audits).
This is pretty cool - being able to identify/segregate traffic by its physical room seems like it would be a useful tool for anyone in security, university professors giving exams, etc.
I can't really see it totally replacing WiFi though. It's hard enough to get a TV remote to work outside in the shade - getting higher-bandwidth IR-based WiFi to work outside in the sun seems... ambitious?
Where I believe LiFi options would be great is eliminating HDMI / DisplayPort cables to monitors or TVs. Just one less wire to run and figure out how to hide. Well, at least until wireless energy systems get better. ;)
Or to have a desktop computer across the room from a display and be able to switch displays.
Interesting, and agreed. Is it more secure (or less detectable) than Ethernet? I suppose Ethernet emits some form of electromagnetic radiation. I wonder what all this infrared light bouncing around will do to us monkeys.
I wonder how it offers bi-directional communication?
Do mobile devices have to have a small light bulb (like an infrared one) to support bi-directional communication? It would be too big for a modern smartphone, though.
I remember my old Nokia brick having the ability to send files and apps through an infrared thingy on its side. You could simply hold two phones side by side and transfer data, but I remember hardly anyone using it (I guess it was kind of awkward to use, but probably still simpler than getting the devices paired over Bluetooth).
It wasn't nearly as fast as the advertised speeds mentioned in the article.
Sure let me attach an ethernet cable to my VR headset, and then magically pull another from the wall to attach to my laptop. Can't forget the mystical third wire which I need for my phone!
Hopefully it doesn't take much to grasp the benefits of ethernet without a cable directly to your device at all times, right?
But those devices, particularly the VR headset and mobile phone, are not good candidates for an IR-based networking solution as their use-case inherently has a lot of movement. What good is a VR headset that loses connection with the host computer when you accidentally block the receiver while moving around?
Li-Fi works with light bounce. Just like a light bulb illuminates the whole room, the access point should be able to see any device in the room. You might need to make sure the "antenna" doesn't get blocked completely, so it probably needs to be on top of the device. It would be perfect for a VR headset: a cheap version of 60GHz WiGig.
Wouldn't that require a pretty hefty amount of redundancy and error control, heavily lowering the throughput?
The actual device speeds are already way lower than what is touted in the title, so at the end of the day it would be less practical than plain Wi-Fi from an antenna in your room (except if you're really affected by other longer-range signals that reach even the 5GHz band).
Light is going to bounce off surfaces, so they need redundancy and error control anyway. LiFi is built on 802.11 protocols, so the higher levels are the same. At the lower level it's pretty easy to add redundancy to the signal.
The advantage of LiFi over WiFi is that you don't have to deal with interference. The light is confined to the room, whereas WiFi can reach neighboring apartments. The downside is that you need devices in each room. If they can get the speed higher (within the published ranges, not the crazy one), they can do the same job as WiGig.
Another advantage is that light diodes and receivers are simple and cheap. My guess is that it's easier to scale LiFi to higher speeds or make it super cheap. But that may only work for special cases.
Probably harder in a practical sense - its signal doesn't travel through walls or most commercial windows so the intercepting sensor would have to be located in the same room (if indoors) or have a heroic signal/noise ratio to avoid getting washed out by sunlight (if not).
50 or 60Hz is low enough that even people who aren't sensitive can notice it. Cheap LED lighting more often flickers at 100 or 120Hz, using a bridge rectifier with a series capacitor to limit the current.
If I have a room full of these can they communicate with each other to provide a connection? This reminds me of base stations for my VR setup where my feet trackers will sometimes lose connection due to a physical obstacle like a desk. The solution is to add more base stations typically or make your space more sparse.
A wireless laptop dock is one application. A 1080p monitor signal is 3.2 Gbps. Dual monitors are 6.4 Gbps. One "4K" display is 12.5 Gbps.
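As a rough check on those figures (raw pixel data only; blanking intervals and line coding push the real link rate somewhat higher, which is roughly where the numbers above come from):

    # Uncompressed video bandwidth: width * height * bits-per-pixel * refresh.
    # Blanking and line coding add more on top of these raw figures.
    def raw_gbps(width, height, bpp=24, hz=60):
        return width * height * bpp * hz / 1e9

    print(f"1080p60 raw pixels:   ~{raw_gbps(1920, 1080):.1f} Gbps")
    print(f"two 1080p60 displays: ~{2 * raw_gbps(1920, 1080):.1f} Gbps")
    print(f"4K60 raw pixels:      ~{raw_gbps(3840, 2160):.1f} Gbps")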
Wireless is notorious for quoting numbers that include protocol overhead and assume a near-zero-noise, congestion-free environment. The typical layer-7 throughput of 802.11 is about half of the link rate. Of course it goes down from there without line of sight, even in the same room.
Perhaps so, but this new standard reminds me of the born-dead USB1 standard, which at the time of its launch was so slow as to be almost useless.
This new optical standard is only about 50% faster than the now-ancient copper SATA standard of 6Gbps. It has stuff-all reserve for future development!
Every time we have one of these new standards, Wi-Fi whatever, they are just a bit above the previous standard and never at the limits of the tech. There is never enough reserve to give them decent longevity. It's as if those manufacturers involved in setting standards were making a standard that would be quickly obsolescent—that is, they'd guarantee ongoing production/sales without much effort.
It's been such a consistent problem for so long that it has shades of the Phoebus cartel about it. As always, the user ends up paying more because of premature (planned) obsolescence.
> This new optical standard is only about 50% faster than the now-ancient copper SATA standard of 6Gbps
you are surprised that free-space optical transfer over many(?) meters is "only" 50% faster than data transfer over <1m carefully constructed differential coax-pairs?
Electronics is an extremely cost-conscious industry. There's not a viable market for massive jumps forward if the laptop would cost 10x. Approaching the Shannon limit is also at odds with maximizing battery life. Recent MacBooks only have 2x2 MIMO; presumably 3x3 MIMO used too much power, space, or BOM.
> It's been such a consistent problem for so long that it has shades of the Phoebus cartel about it.
Which wasn't what the general public thinks it is and, much like equating short-run in-device transports with lossy, room-distance ones, is a good signal that one is comparing apples and trailer trucks.
OK, at the end of my WiFi I have a SATA or much faster drive (or any other data source) attached.
So the WiFi becomes the transport layer in the OSI model. It's the throughput data rate that matters not the type of devices that are connected at either end of the link (nor the protocol by which they're connected).
Right, I think conditioning is definitely part of it, but my point was (as I hinted elsewhere) that the move to optical would have allowed for vastly better performance with little additional overhead.
That 224GB/s figure is some serious bullshitting. For comparison, the fastest optical Ethernet transfer over fiber is 800Gbps, or 100GB/s, and even that is over 8 fiber pairs and still in-development tech. Claiming to do over twice that in free space with "common household LED light bulbs" is pretty ridiculous.
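Putting the headline figure next to 800G Ethernet (taking the 8 optical lanes mentioned above):

    # Headline Li-Fi claim vs. 800G Ethernet over 8 fiber lanes.
    claim_gbps = 224 * 8           # 224 GB/s in gigabits per second
    ethernet_total_gbps = 800
    per_lane_gbps = ethernet_total_gbps / 8

    print(f"headline Li-Fi claim: {claim_gbps} Gbps over free space")
    print(f"800G Ethernet: {ethernet_total_gbps} Gbps total,"
          f" ~{per_lane_gbps:.0f} Gbps per fiber lane")
    print(f"ratio of claim to 800G: {claim_gbps / ethernet_total_gbps:.2f}x")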
224GB/s is unrealistically fast, but 9Gb/s is unacceptably slow. A compromise of 9GB/s would perhaps be reasonable; 1/25th of the claimed speed would be manageable.
"common household LED light bulbs" is pretty ridiculous." Likely so, but all we have to do for 9GB/s is to use the transmitters and receivers used in standard commercial fiber optics (or a suitable adaptation thereof) and we'd romp home.
Well, the speed is the speed of light, so ideally this should have low latency. The data rate will be affected by how quickly you can turn the light on and off, and that will have some limits.
Latency in WiFi comes from interference and contention (waiting until other devices have finished sending; this has improved with Wi-Fi 6, AFAIK), and some other, probably less significant, factors. The travel time of the signal through air should be the same.
Light and radio waves are the same physical phenomenon; the difference is the frequency. Light is much, much higher frequency than radio. Nuclear radiation is even higher frequency. Maximum throughput with maximum damage!
[1] https://news.ycombinator.com/item?id=36707789