Radios, how do they work? (lcamtuf.substack.com)
641 points by todsacerdoti on March 26, 2024 | hide | past | favorite | 109 comments


Their primer article [1] is also really nice.

> Today, I’d like to close this gap with a couple of crisp definitions that stay clear of flawed hydraulic analogies, but also don’t get bogged down by differential equations or complex number algebra.

Related: many, many years ago, when Facebook didn't exist yet, Google still passed as a "good" company, and hobbyist electronic geeks had almost only PICs to choose from, I found online a very long and complete electronic course that went from 0 to basic R/C concepts, to transistors, up to pretty advanced topics like magnets/transformers and IIRC radio too.

It was made of pretty raw HTML pages and images, and what was most peculiar about it was that it managed to explain a lot of concepts up to an applicable level (as in, actually designing analog circuits) without (any?) calculus at all.

Some of those may be false memories, but if I remember correctly:

* Its HTML style had a yellowy background
* It was taken from an old-ish (US?) Navy electrical-engineering-focused applied electronics course for training naval engineers
* It was more focused on analog circuits

I remember I downloaded it all but after all those years who knows where it could be. Maybe in some 1GB disk of my first Pentium PC, so it's basically lost.

Does anyone on HN know what I'm talking about? I was never able to find it again.

[1] https://lcamtuf.substack.com/p/primer-core-concepts-in-elect...


I guess the original "NEETS" content is this:

http://compatt.com/Tutorials/NEETS/NEETS.html

Content updated in 2011.


THAT'S IT!!!

Thank you so much! I've been looking for this for at least fifteen years!

And there are even links to previous HTML versions (this one is PDF)... amazing!


Here's another trove of documentation: https://maritime.org/doc/


Amazing, thanks for remembering a great resource you saw years ago. Seems very comprehensive. Bookmarking this.


Hmm... Unfortunately the site is infested with spam links - visible with ad blockers turned on.


FYI your ISP might be inserting those spam links since this is unencrypted http. I’m not seeing anything even with blockers turned off.


Replying to sibling comments: The links are there, but are hidden by Javascript. Since I'm on my phone, I didn't look into how.


What's a spam link? I don't see anything unusual. Can you post a screenshot (using imgur or something)?


(replying to myself since I can't edit) In retrospect, I wonder if [the person I'm replying to] was referring to a different link in this thread, such as the rimstar one, which is pretty bad.


Absolutely no spam links visible here. Tried Safari and Brave without ad blockers.


>flawed hydraulic analogies

I want to say that's cool (avoiding common pitfalls in explanations), but I want to point out that all analogies fall short; otherwise they would be the same thing, and not an analogy.

That is, if the hydraulic analogy were perfect, then that would mean that electronics would just behave as a fluid and we could teach it as a part of fluid dynamics.

But instead it is an analogy, electronics is not a part of fluid dynamics, there’s just a few similarities that can be used for teaching.

It’s not unusual to teach an imperfect simplistic model at first that you intend to supplement later with more details that break the analogy.


Water analogies for electronic circuits can go a long way, particularly if you don't need to model inductance, or not fully.

We can use water to explain why capacitors in series have less capacitance!

I'm now thinking whether we could build a water pressure multiplier using the cascade voltage multiplier topology.

https://en.wikipedia.org/wiki/Voltage_multiplier

Use some rubber dams or flexible pipes for the capacitors, one-way backstop valves for the diodes, and then some back-and-forth pumping mechanism to generate water AC. It should work.


This is an excellent article, thank you for submitting it! I love how effortlessly this article delivered an intuition for why an ideal antenna length would be half of the wavelength of the signal you want to receive. I was also delighted by the point about how all methods of modulating a wave can be recontextualized as frequency modulation!
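For a concrete feel of that half-wavelength point, here's a quick back-of-the-envelope calculation (100 MHz is just an illustrative pick near the FM broadcast band; in practice dipoles are trimmed a few percent shorter than the ideal figure):

```python
# Half-wave dipole length for a 100 MHz signal (illustrative numbers).
c = 299_792_458.0                # speed of light in vacuum, m/s
f = 100e6                        # 100 MHz, roughly the FM broadcast band
wavelength = c / f               # ~3 m
dipole_length = wavelength / 2   # ~1.5 m tip to tip
```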


> I was also delighted by the point about how all methods of modulating a wave can be recontextualized as frequency modulation!

That's the classic way to think about it. Another way is to view the input as simply a sequence of voltage readings. Extracting a useful signal from that is an exercise in exploiting redundancy in noisy data. [1] Software defined receivers work that way.

Analog radio (AM, FM, etc.) is a hulking big carrier weakly modulated by the signal. Analog TV, which was AM video with FM audio, had 80% of the power in the carrier. Analog UHF TV stations often had multi-megawatt transmitters to overpower noise by sheer RF output. Digital broadcast TV transmitters output maybe 150 kW, because the modulation is more efficient.

Modern modulation techniques are insanely efficient. It's amazing that mobile phones work.

[1] https://ocw.mit.edu/courses/6-450-principles-of-digital-comm...
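The "mixing (multiplying)" operation that both superhets and software-defined receivers rely on is easy to see numerically: the product of two sine waves contains only the sum and difference frequencies. A toy sketch (all frequencies are arbitrary illustrative values, not from the comment above):

```python
import numpy as np

fs = 1_000_000                         # sample rate, Hz
n = 10_000                             # 10 ms of samples
t = np.arange(n) / fs

rf = np.sin(2 * np.pi * 100_000 * t)   # "antenna" signal at 100 kHz
lo = np.sin(2 * np.pi * 90_000 * t)    # local oscillator at 90 kHz
mixed = rf * lo                        # the mixer: just a multiply

# sin(a)sin(b) = [cos(a-b) - cos(a+b)] / 2, so we expect spectral lines
# at the difference (10 kHz) and the sum (190 kHz) only.
spec = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(n, 1 / fs)
peaks = sorted(freqs[np.argsort(spec)[-2:]])
```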


I may not be understanding what part of the operation you are talking about, but radiated power? No transmitter had megawatts of radiated power. 50 kW is a common number passed around for an FM broadcast antenna. The huge "Voice of America" shortwave station was 300 kW.

I have heard of military radars having megawatts of radiated power, but even then it was in the low megawatts.


The goat testicle doctor did get permission for running his border blaster at a million watts.

https://en.wikipedia.org/wiki/John_R._Brinkley#Brinkley_and_...


For UHF television stations, the effective radiated power (ERP) is typically 1 Megawatt. That is accomplished (for example) with a 57 kilowatt transmitter and an antenna with 12.44 dB gain.
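A quick arithmetic check of those figures (nothing here beyond the numbers already given in the comment):

```python
# ERP = transmitter power output x antenna gain (in linear terms).
tpo_kw = 57.0                          # 57 kW transmitter
gain_db = 12.44                        # antenna gain in dB
gain_linear = 10 ** (gain_db / 10)     # ~17.5x
erp_kw = tpo_kw * gain_linear          # ~1000 kW, i.e. ~1 MW
```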


Up in the UHF TV bands, huge transmitter power was required. The FCC allowed 5 megawatts.[1] Few stations actually used that much power, but 1 MW was not uncommon.

Amusingly, over-the-air digital TV is making a comeback. The cable industry pushed prices up too high. But the "comeback" is to only 14% of the viewer base.

[1] https://en.wikipedia.org/wiki/UHF_television_broadcasting


> Modern modulation techniques are insanely efficient. It's amazing that mobile phones work.

There are old modulation techniques that are crazy efficient too, the most obvious one being GPS. A small antenna, just 2 cm in size [1], is all you need to pick up, and distinguish from each other, the signals of up to a dozen GPS satellites and dozens of other (GLONASS, Galileo, BeiDou) navigation satellites. This is mind-blowingly nuts.

For modern high-bandwidth communication, microwave point-to-point links can achieve 65 km range with barely 4 watts of RF power - e.g. German Telekom's Skylink system connecting an aviation radio transmitter on the island of Helgoland with the continental backbone at Cuxhaven [2]. And actually that is way more power than required: half a watt is enough, but they chose higher power to achieve perfect reliability even in worst-case conditions [3] (which makes sense, given how safety-critical flight radio communication is).

If all you need is satellite communication for a few bytes of information, a cellphone can send an emergency message to satellites anywhere on this planet (at least, where regulatory clearances allow for that) [4].

[1] https://electronics.stackexchange.com/questions/586900/how-c...

[2] https://www.youtube.com/watch?v=q_Ja6QtWwpQ

[3] https://www.telekom.com/de/blog/netz/artikel/skylink-richtfu...

[4] https://support.apple.com/de-de/101573


> If all you need is satellite communication for a few bytes of information, a cellphone can send an emergency message to satellites anywhere on this planet

This month, an unmodified Samsung S21 Ultra achieved 17 Mbit/s download speeds from a Starlink satellite, which is, well, quite something. I do wonder what the upload speed would be.

https://www.androidheadlines.com/2024/03/starlink-achieves-s...


I love the fact that GPS signals are below the noise floor; it's only through correlating the complex modulation that you can find the signal. At least, that's how I understood it when I did my degree!
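That processing-gain trick is easy to reproduce numerically. Below is a toy sketch (the ±1 code is a made-up stand-in for a real GPS C/A code, and all the numbers are illustrative): the per-sample SNR is -20 dB, yet correlating against the known code over ~20,000 samples pulls the signal well clear of the noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical +/-1 spreading code, loosely in the spirit of a 1023-chip
# GPS C/A code, repeated for 20 code periods.
code = rng.choice([-1.0, 1.0], size=1023)
chips = np.tile(code, 20)

amp = 0.1                                  # signal amplitude
noise = rng.normal(0.0, 1.0, chips.size)   # unit-variance noise
received = amp * chips + noise             # per-sample SNR = -20 dB

# Correlating with the known code integrates coherently over all samples:
correlation = np.dot(received, chips) / chips.size   # ~0.1 (the signal)
noise_floor = np.dot(noise, chips) / chips.size      # ~0 (noise averages out)
```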


""…all methods of modulating a wave can be recontextualized as frequency modulation!"

That's the classic way to think about it. Another way is to view the input as simply a sequence of voltage readings."

Right. And modulation of any type produces sidebands, as per Fourier! Do anything whatsoever to disturb a pure sine wave, and math and physics dictate that sidebands appear.


I went to my friend's EECS graduation a long time ago at UCLA, and the founder of Qualcomm talked about how what drove him to get his PhD was his curiosity and determination to truly understand how radios worked.

He said that he got his PhD because that's pretty much how long it took before he felt like he really understood how his radio worked, and even then he sometimes wasn't sure.

Was a good speech that this article reminded me of.


An old and very accessible classic for the "general audience" to understand the theory behind "Radio Science" is Jim Sinclair's "How Radio Signals Work" (all the basics, plus where to find out more).


Tim Hunkin has posted a remastered version of his "The Secret Life of the Radio" TV program (from 1987) which recreates some of Hertz and Marconi's experiments with spark gaps and coherers.

https://www.youtube.com/watch?v=LMxate9gegg


I can't recommend this entire series enough, Hunkin's work is a masterpiece.


I think it is also worth mentioning the role of the ionosphere - which is the (charged) part of the atmosphere that will reflect radio/EM waves, and make it possible to communicate with someone on the other side of the globe. The ionosphere has different layers, and is quite dynamic - depending on the sun and its activity.

Basically, imagine a charged shell around the earth that reflects electromagnetic waves back, and that the properties of said shell are constantly fluctuating. Solar storms (and the northern lights that follow them) are bad news for radio communication.

That's the very, very ELI5 version.


What's worth mentioning is that shortwave offers much lower transfer latency than optical fibre (radio travels at the speed of light in air, versus about two-thirds of that in glass), so it's possible to establish faster cross-continental communication over radio than over trans-oceanic fibre cables.


> Solar storms (and following northern lights) are bad news for radio communication.

Although it can allow for auroral backscatter propagation on VHF, which is fascinating. https://www.electronics-notes.com/articles/ham_radio/amateur...

Definitely not good for HF though. The Sun wilding out the past few days has at least given me the opportunity to work on some stuff in the evenings that isn’t screwing around on FT8, lol.


I truly enjoyed the article. When I played the Vimeo video of the ½ λ dipole antenna's electric field propagation, I reached for my headphones hoping to hear Dark Side of the Moon. No dice. I get antennas and their physical characteristics, but I am always intimidated by the math behind digital signal processing (DSP). Again, great article.


> In today’s article, I’m hoping to provide an introduction to radio that’s free of ham jargon and advanced math.

Sounds great! Let’s dig in.

> … the fundamental mirroring behavior is still present, but it’s usually managed pretty well. Accidental mirror images of unrelated transmissions can be mitigated choosing the IF wisely, by designing the antenna to have a narrow frequency response, or by putting an RF lowpass filter in front of the mixer if needs be

Mission failed. Ah well.


Not really, unless you mean the use of 'IF' and 'RF'. Maybe it would have been better if they had written these out as 'IF (intermediate frequency)' and 'RF (radio frequency)', with a link explaining the context in which IF is used, but otherwise that sentence looks OK to me.


The meanings of "mirroring behavior", "narrow frequency response", "lowpass filter", "mixer", "IF", and "RF" are all unexplained, both in this article and in the listed prerequisite articles.

The meaning of "mixer" might be inferred from the earlier "the basic operation of almost every radio receiver boils down to mixing (multiplying) the amplified antenna signal with a sine wave of a chosen frequency." And the meaning of "mirroring behavior" might be inferred from the earlier "we can see that every peak of the driving signal reaches the ends of the antenna perfectly in-phase with the bounce-back from the previous oscillation". But these are still explanations that rely on a good amount of past expertise and other jargon not covered in this or the author's other pages, which probably has to involve quite a lot of what most people would consider "advanced math", though as always "advanced" is relative, or alternately a lot of direct experience playing with analog signals.


If anyone enjoyed this article, then I'd recommend reading this one as well [1]; it's an interesting article with a focus on the relationship between radio and probabilistic reasoning in the early 1900s. [1] https://www.argmin.net/p/the-spirit-of-radio


Would it be possible to construct a rudimentary FM radio receiver with only the most basic parts ala Masters of the Air?


AM is quite easy (a diode and a capacitor can be enough); an FM receiver needs a local oscillator, which requires some active elements (transistors) and a more complex circuit.


Under what circumstances is a diode and a capacitor enough to make a radio receiver?


A Foxhole radio was often made from a coil of wire (inductor), a razor blade and pencil lead (diode):

https://en.wikipedia.org/wiki/Foxhole_radio

> The aerial is connected to the grounded inductor. The coil has an internal parasitic capacitance which, along with the capacitance of the antenna forms a resonant circuit (tuned circuit) with the inductance of the coil, resonating at a specific resonant frequency. The coil has a high impedance at its resonant frequency, and passes radio signals from the antenna at that frequency along to the detector, while conducting signals at all other frequencies to ground. By varying the inductance with a sliding contact arm, a commercial crystal radio can be tuned to receive different frequencies. Most of these wartime sets did not have a sliding contact and were only built to receive one frequency, the frequency of the nearest broadcast station. The detector and earphones were connected in series across the coil, which applied the radio signal of the received radio station. The detector acted as a rectifier, allowing current to flow through it in only one direction. It rectified the oscillating radio carrier wave, extracting the audio modulation, which passed through the earphones. The earphones converted the audio signal to sound waves.


If you are building an AM crystal radio [1], you will also need a high-impedance earpiece [2] if you want to operate it without a power supply; otherwise you will need an amplifier. You can avoid using a commercial diode by making your own point-contact diode, as done in Foxhole radios [3], and you can make your own piezoelectric earpiece from Rochelle salt [4]. Here [5] is one personal projects site touching on all those topics.

In conclusion, you should be able to build a simple radio from copper wire, aluminium foil, a pencil, a razor blade, and baking powder.

[1] https://en.wikipedia.org/wiki/Crystal_radio

[2] https://en.wikipedia.org/wiki/Crystal_earpiece

[3] https://en.wikipedia.org/wiki/Foxhole_radio

[4] https://en.wikipedia.org/wiki/Potassium_sodium_tartrate

[5] https://rimstar.org/science_electronics_projects/index.htm#S...


An AM peak detector is probably the easiest and most primitive AM demodulator: it's basically made of a diode, a capacitor, and a resistor. I implemented one when I was in high school, doing my first physics experiments.

The idea behind this demodulator is quite simple: the diode filters out the negative part of the signal, then the positive half-cycles charge the capacitor, which releases its energy at a roughly constant rate during the negative "holes" in the signal (R*C must be several orders of magnitude larger than 1/f, where f is the carrier frequency).
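That circuit is easy to simulate. Here's a rough sketch of the diode + RC peak detector on a synthetic AM signal (the carrier and audio frequencies and the 20 µs time constant are just illustrative picks that satisfy the R*C condition above):

```python
import numpy as np

fs = 1_000_000                      # sample rate, Hz
t = np.arange(10_000) / fs          # 10 ms
fc, fm = 100_000, 1_000             # carrier and audio frequencies

audio = 0.5 * np.sin(2 * np.pi * fm * t)
am = (1 + audio) * np.sin(2 * np.pi * fc * t)   # AM, modulation index 0.5

rectified = np.maximum(am, 0.0)     # the diode: drop the negative half

# The RC lowpass as a leaky integrator; RC = 20 us sits between
# 1/fc = 10 us and 1/fm = 1 ms.
rc = 20e-6
alpha = (1 / fs) / (rc + 1 / fs)
env = np.zeros_like(rectified)
for i in range(1, len(rectified)):
    env[i] = env[i - 1] + alpha * (rectified[i] - env[i - 1])

# The detected envelope should track the original audio closely.
corr = np.corrcoef(env - env.mean(), audio)[0, 1]
```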


I was trying to say that a capacitor and diode (detector) is not a complete receiver.


You will also need a resistor, otherwise the capacitor is not going to discharge, but the resistor is the easiest component :)


I read "AM is quite easy" as "AM demodulation is quite easy".


Very high impedance transducer, and very low forward voltage diode.


By coiling wires separately to form an inductor


In theory. FM reception works much like AM, except you need an FM detector before turning the signal into audio: a circuit a little fancier than the diode (or rusty razor blade and wire) used for AM, but not that much fancier. There are many ways to detect FM. Most methods actually use the phase (the instantaneous relative change in the frequency) rather than the absolute frequency. While the modern technique is a phase-locked loop, it can be much simpler.

A classic passive RC, LC, or RL filter will cause a phase shift, and taking the difference between two such phases effectively decodes FM. No active elements required. With only RL filters it would be very inefficient, so you may need to be standing next to the transmitter if you don't have at least one transistor or tube. But one such tube or transistor would be plenty for a strong local broadcast.

If you have some capacitors, it gets easier: maybe paper and tinfoil rolled tightly, or flakes of glass or mica between thin slices of wood. Check out the slope detector circuits here [1] (fig. 3, 4); the article is a good introduction to FM detection in general.

[1] https://wiki.analog.com/university/courses/electronics/elect...
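The phase-difference idea can be sanity-checked numerically. This is a discrete-time sketch of a quadrature discriminator rather than the analog circuits above (all parameters are illustrative): the audio is recovered purely from the sample-to-sample phase change of the signal.

```python
import numpy as np

fs = 1_000_000                         # sample rate, Hz
t = np.arange(10_000) / fs
fc, fm, dev = 100_000, 1_000, 10_000   # carrier, audio, peak deviation (Hz)

audio = np.sin(2 * np.pi * fm * t)
# FM: the instantaneous frequency is fc + dev*audio; phase is its integral.
phase = 2 * np.pi * np.cumsum(fc + dev * audio) / fs
signal = np.exp(1j * phase)            # analytic (I/Q) representation

# "Difference between two phases": the phase step per sample is
# proportional to the instantaneous frequency.
dphi = np.angle(signal[1:] * np.conj(signal[:-1]))
inst_freq = dphi * fs / (2 * np.pi)
recovered = (inst_freq - fc) / dev     # should reproduce the audio

corr = np.corrcoef(recovered, audio[1:])[0, 1]
```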


Yes, we did this in middle school. Was great fun :D


This has potential to be an interactive topic like one of https://ciechanow.ski 's topics.


Also, I’m not sure if people are aware of the number of radio systems that enable their smartphones.

NFC (eg. Apple Pay) is a radio, range a few cm. Bluetooth is a radio, a few meters. WiFi is several radio systems, range tens of meters. Cell phone is several radio systems, range up to kilometers. GPS (and rival systems) range up to thousands of kilometers.


NFC is not really a radio. Basically it uses a loosely coupled transformer: it works at distances much shorter than one wavelength, and only the magnetic field matters.


Good point, NFC is described as operating in a particular RF band but not through radio waves:

> As with proximity card technology, NFC uses inductive coupling between two nearby loop antennas effectively forming an air-core transformer. Because the distances involved are tiny compared to the wavelength of electromagnetic radiation (radio waves) of that frequency (about 22 metres), the interaction is described as near field. An alternating magnetic field is the main coupling factor and almost no power is radiated in the form of radio waves (which are electromagnetic waves, also involving an oscillating electric field); that minimises interference between such devices and any radio communications at the same frequency or with other NFC devices much beyond its intended range. NFC operates within the globally available and unlicensed radio frequency ISM band of 13.56 MHz. Most of the RF energy is concentrated in the ±7 kHz bandwidth allocated for that band, but the emission's spectral width can be as wide as 1.8 MHz[57] in order to support high data rates.

https://en.m.wikipedia.org/wiki/Near-field_communication


100%. Every time I read the term "antenna" when referring to the coil used for NFC/RFID I suffer inside...


...and yet, efficiently transferring a 1kb file between two physically adjacent smartphones remains an apparently unsolved problem.


AirDrop it?


I'm sure that works beautifully, unless one or both parties happen to be running the most popular mobile OS in the world, Android. That 'AirDrop' even exists as a proprietary technology really proves the point.


On Android, use Android Quick Share which uses Google Nearby. It works very similarly to Air Drop including support for UWB.

There are lots of things where Apple and Google have solutions that don't work together.


Another trick, which I didn't really appreciate for a long time, is that it's VERY dark in the radio frequencies. Black bodies radiate barely any energy there. It's quiet, so if you shout even moderately loudly you can be heard halfway across the globe. It's permanently night, and even a small lamp shines quite far.


The essence of radio is very simple: a radio station broadcasts a high-power RF signal into which some human-listenable audio (your favorite singer's song) is mixed, so the station propagates this RF signal modulated by the song being played. Across town at your house, your radio synthesizes an RF signal identical to the one broadcast by the station, then amplifies the result of combining its local RF signal with the received one. This combining both adds and subtracts the two RF signals; when it subtracts (wave interference), the result is the difference between the station's modulated RF signal and your radio's RF signal (both at identical frequencies), which leaves you with just the song.


That's not what the article is about... Many people lack details on how antenna designs are optimized and on the application of Faraday's and Lenz's laws to signal propagation. This is what the article addresses.


>A perfectly uniform waveform is still not useful for communications...

It is if you encode information by switching it on and off in standard patterns. These uniform waveforms, or "continuous wave" (CW), allow very simple devices with very little RF power to communicate with Morse code.


One could argue that technically it's no longer uniform if it's switched on and off, though.


It is no longer uniform. It's counter-intuitive (unless you've really internalised the Fourier transform and/or the Shannon-Hartley theorem) but a pure sine wave stops being a pure sine wave if you key it on and off and occupies progressively more bandwidth as the keying rate increases.

An even less intuitive result is that you can decode a signal that is weaker than the noise floor if the data rate is sufficiently low and/or the bandwidth is sufficiently high. This has practical applications in amateur modes like JT65, ultra-wideband communications and even GPS.
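You can watch the bandwidth grow in a quick numerical experiment (a toy of my own, with arbitrary frequencies): compare how many hertz it takes to hold 99% of the power of a pure tone versus the same tone hard-keyed on and off.

```python
import numpy as np

fs = 8_000
t = np.arange(fs) / fs              # exactly 1 second of samples
tone = np.sin(2 * np.pi * 1_000 * t)

# Hard on/off keying: a 50 Hz square-wave gate with abrupt edges.
gate = (np.floor(100 * t) % 2).astype(float)
keyed = tone * gate

def occupied_bw(x, fs, frac=0.99):
    """Bandwidth (Hz) of the smallest set of bins holding `frac` of the power."""
    power = np.abs(np.fft.rfft(x)) ** 2
    order = np.argsort(power)[::-1]           # strongest bins first
    cum = np.cumsum(power[order])
    nbins = int(np.searchsorted(cum, frac * power.sum())) + 1
    return nbins * fs / len(x)                # bins * bin width

bw_pure = occupied_bw(tone, fs)     # essentially a single 1 Hz line
bw_keyed = occupied_bw(keyed, fs)   # spreads over tens of hertz
```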


You can see it happening in practice! [1] is a waterfall display (time on the vertical axis, frequency on the horizontal) of a few CW signals; compare the harsh broadband clicks on the right to the nice dotted lines on the left. That kind of broadband noise happens when your signal goes from on to off too fast (or something else, like not generating a clean sine wave). If your radio can shape your keying with a little ramp-up/ramp-down, you get a much cleaner-looking signal like those on the left.

The noise is effectively AM, since you are modulating the signal from 0 to full amplitude, and with the very fast amplitude change you get what looks like a characteristic AM signal with a center carrier and symmetric sidebands.

[1] https://imgur.com/BnagQzb.jpg


This reminds me of a really cool video on superheterodyne receivers that Technology Connections did. https://www.youtube.com/watch?v=hz_mMLhUinw


For sure they do not work the way the "Path Loss Equation" would have you believe they do. The path loss equation violates conservation of energy, i.e. the frequency (or wavelength, depending on how it's structured) term cannot be in the equation. And the receiving antenna does not have any 'gain' other than physically getting bigger or smaller, though the transmitting antenna can have gain depending on shape and size. That is, the transmitting antenna and the receiving antenna work very differently. Yes, end to end the path loss equation gives the right answer, but in between it's scientifically illiterate.


Yes and no. I emphatically agree that the way the Path Loss Equation (Friis) is taught is misleading. I much prefer the way you interpret it, with the transmit antenna represented with gain and the receiving antenna having only an effective receive area. It's much more intuitive because I can visualize a spherical shell of power radiating outward.

That said, a receive antenna does absolutely have "gain", which is evident by the antenna receiving a stronger or weaker signal depending on its orientation with respect to the transmit antenna. The key is this: for an arbitrary antenna, the (transmit, if you like) gain has a one-to-one relationship to the "effective receive area" at a given frequency, so talking about area and gain are equivalent, if not intuitive. We usually assume for point-to-point links that the antennas are oriented at each other, and in such cases (for good aperture antennas), you are absolutely right that the physical area and effective area are approximately equal. For ideal wire antennas, however, the physical area of the antenna is 0, but the effective area is nonzero (because of magic).

Now, I disagree that the path loss equation violates conservation of energy. The link to the effective area and gain depends on the wavelength. When I increase the frequency of operation but I keep the gain of the antennas constant, the areas decrease, so my receive antenna is physically smaller and the power goes down. Not breaking physics. A lot of people will say "path loss gets worse as you go up in frequency", and this is extremely misleading if not "scientifically illiterate" as you pointed out. Sure, there are molecular absorption bands from oxygen/water that literally dissipate power in the atmosphere, but generally speaking, the path loss didn't get worse, your receive antenna just got smaller.

Now wait a minute, what if I just made my receive antenna larger? Well, you can do that! The problem is that because gain and area are linked, efficiently receiving power in a given LARGE area (with respect to the wavelength) implies high gain. High gain implies a very narrow beam (more like a laser pointer than a normal dipole spilling energy everywhere). So it becomes really important that I "point" my receive antenna perfectly at the transmitter. Satellite dishes are really big, and they absolutely have to be pointed accurately at the satellite.
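To make the gain/area equivalence concrete, here's a tiny numeric sketch (made-up link parameters) showing that the Friis form and the "TX gain times RX capture area over an expanding sphere" form give the same answer, with the wavelength hiding inside the effective area:

```python
import math

def friis_ratio(g_tx, g_rx, lam, d):
    """Pr/Pt per the Friis transmission equation (linear gains)."""
    return g_tx * g_rx * (lam / (4 * math.pi * d)) ** 2

def effective_area(gain, lam):
    """Effective aperture of an antenna with the given linear gain."""
    return gain * lam ** 2 / (4 * math.pi)

def spherical_ratio(g_tx, area_rx, d):
    """Same link viewed as TX gain times RX capture area over a sphere."""
    return g_tx * area_rx / (4 * math.pi * d ** 2)

lam, d = 0.1, 1_000.0          # 3 GHz-ish wavelength, 1 km link (arbitrary)
g_tx, g_rx = 10.0, 10.0        # 10 dBi each, in linear terms

a = friis_ratio(g_tx, g_rx, lam, d)
b = spherical_ratio(g_tx, effective_area(g_rx, lam), d)
# a == b: the frequency term only sets how big the receive aperture is.
```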


How can an equation that does not represent a balance of energy violate energy conservation?

By "path loss equation" I assume you mean the Friis equation, which is just the ratio of power received at an antenna to power given to the transmitter. It is correct and does not violate conservation of energy, since it says nothing about the power not received at the receiver.


What they're saying is that the geometrical interpretation of an outwardly expanding spherical shell of power shouldn't depend on frequency. In this respect they are correct and they have a good intuition for the problem.

Now here's the catch: If the receive area were not changing as a function of frequency when the receive antenna gain is kept constant (it does), this would break physics (it doesn't). However, the effective area of an antenna with fixed gain varies as 1/lambda^2. In effect the geometric interpretation is still correct, but the variation of antenna area with gain resolves the seeming paradox and saves physics.


> the geometrical interpretation of an outwardly expanding spherical shell of power shouldn't depend on frequency

I think nobody says that it does. I believe the problem is calling the Friis transmission equation "free-space loss". Actually the Friis formula is composed of 3 terms: the receiving and transmitting antenna gains, and the actual free-space loss, which has the 1/R^2 dependency (which actually isn't a "loss" in energy-balance terms, since it's not lost energy, just energy not received at a certain point, so we could argue about that term too...).


Yep! Fully agreed with all your points, I was just trying to get at the original poster's line of thinking.


"And the receiving antenna does not have any 'gain' other than physically getting bigger or smaller..."

Well, it depends on one's definition of gain! If you were to say to the designers of the ELT (the Extremely Large Telescope) that it had no gain over isotropic then they'd fall about laughing (remember, its method of operation also relies on collecting and concentrating incoming EM radiation, as do RF antennae). An antenna's effective gathering aperture and directivity, for both RX and TX, are just about everything; the coupling efficiency from the antenna to the feeder and RX/detector (and vice versa for the TX) just about covers the rest.

"...though the transmitting antenna can have gain depending on shape and size."

Uh? How? What's the difference? Physics says the law of reciprocity applies, a good transmitting antenna also makes just as good a receiving antenna. The only proviso being that a transmitting antenna has to be designed to withstand high RF power levels (even then, this only applies to TX power levels where I²R losses can cause enough heating to damage the antenna and feed lines, similarly, high power TX levels can lead to very high voltages which can arc over; TX antennae are designed to handle this.)

I used to work with microwave transmitters and receivers and my microwave dishes and other types of antennae were directly interchangeable—in fact, they were identical.

Re the Path Loss Equation, it works in the practical sense and is used everywhere. Fighting over technicalities here is akin to arguing the difference between laws of motion under Newton and when they're subject to the rules of Einstein's Relativity. It's damn obvious when one's applicable and the other is not.


> The path loss equation violates conservation of energy ie the frequency or wavelength term depending on how it's structured cannot be in the equation.

Why is that?


It's because energy created by the transmitter must fall off as one over R squared in the far field. The frequency (or wavelength, take your pick) has nothing to do with the energy transmitted, because energy must be conserved. Putting in the frequency term then violates conservation of energy between the antennas. Then, at the receiving antenna, the conservation-of-energy error is patched up by assigning a bogus 'gain' at the receiver. The transmitter and receiver are asymmetric, but the path loss equation pretends that they are symmetric because that's easier for most people to understand and it works out 'end to end'.


Absolutely I agree that the geometry of the problem dictates 1/R^2 dependence, regardless of frequency. The gain, which I agree is a misleading way to think about the area, is related to the area of receive through the frequency terms. If you don't like that form of the path loss equation, I understand (I don't either!), but physics is not broken.

Where the "bogus" gain really shines, though: I can take my original receive antenna, operate it as a transmitter (so gain is now relevant), receive with my original transmit antenna (where I now care about area) and get the exact same result in terms of loss!


The formula on wiki has a distance squared term in the denominator tho?


Yes it does but it's the lambda symbol that's the impostor. An analogy of how it really works is to think of the transmitter as a water hose that is spraying water into a bucket, which is the receiver.


Short answer: it doesn't, though I understand why it's misleading. Read my response above.


Transmitting and receiving antennas work the same way. Flip the sign of time in Maxwell’s equations, and radio waves will run perfectly backwards.


Antennas have always been black magic for me, and this article blew my mind with the "capacitor you pull apart". Thank you for posting this article, this is fantastic.


Not an RF engineer. But I mess with radios for a living.

Most of the time when people try to explain antennas they start talking resonance, which really describes a 'good' antenna.

What an antenna does is create an alternating magnetic field with an alternating electric field 90 degrees out of phase with each other. Blah blah blah quantum electrodynamics blah blah blah radiates photons.

Resonance means the antenna stores energy, and that resonance increases the electric and magnetic fields, making the antenna radiate more efficiently. Some antennas are very wide band and 'flat', and are used for cough cough military cough cough and other applications.


I work in the RF world pretty regularly, and I still consider the superheterodyne receiver to be tantamount to magic.

Edwin Armstrong was a brilliant brilliant man.


Not that it matters much, but it seems to be somewhat unclear who came up with the idea for the superheterodyne receiver first. Could be Armstrong, or Lévy, or even Schottky. The patent in the US was eventually awarded to Lévy.

Armstrong definitely was a genius though. Before the superheterodyne receiver he also invented the regenerative receiver.

And you're right, the superheterodyne is such a marvelous technology. The principles it's based on aren't super complex in itself, but the combination of them is genius.

https://en.wikipedia.org/wiki/Regenerative_circuit https://en.wikipedia.org/wiki/Superheterodyne_receiver
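
For anyone who wants to see the core trick in a few lines, here's a sketch in Python of the mixing step; the frequencies are illustrative (455 kHz just happens to be a classic AM intermediate frequency):

```python
import math

f_rf = 1.000e6       # hypothetical station we want to receive (Hz)
f_lo = 0.545e6       # local oscillator, tuned so f_rf - f_lo stays fixed
f_if = f_rf - f_lo   # intermediate frequency: 455 kHz

# A mixer multiplies the incoming RF by the LO. The product-to-sum
# identity says the output is exactly two tones: one at f_rf - f_lo
# (the IF, kept) and one at f_rf + f_lo (filtered away). The rest of
# the receiver only ever deals with the fixed IF, whatever station
# the LO is tuned to bring down.
for n in range(200):
    t = n * 1e-8
    mixed = math.cos(2 * math.pi * f_rf * t) * math.cos(2 * math.pi * f_lo * t)
    two_tones = 0.5 * (math.cos(2 * math.pi * f_if * t)
                       + math.cos(2 * math.pi * (f_rf + f_lo) * t))
    assert abs(mixed - two_tones) < 1e-9
```

That's the whole "heterodyne" part; the brilliance of the design is building the sharp filtering and most of the gain at one fixed frequency instead of across the whole tuning range.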


Ha, not magic, but conceptually the superheterodyne is an absolutely brilliant design, and it still hasn't lost its 'magic' even after a hundred years, and likely never will despite newer digital concepts (they being more complex to implement).

"Edwin Armstrong was a brilliant brilliant man."

Right! ...And as you'd likely know, Armstrong's tormentor and nemesis was an arrogant, despicable bastard of the first order!

(Believe it or not, but decades ago I worked in a prototype lab at RCA and actually met David Sarnoff albeit briefly. That never changed my opinion of him.)


It was in part a reference to "Any sufficiently advanced technology..."

So, you worked for RCA... here's a question that there's no good book on: what killed RCA, and what was it like working there?

I don't know what I think of Sarnoff. Not sure I'd use 'evil'; he was a "no niceties", fiercely competitive capitalist for sure, and how you feel about that may vary. Armstrong was also an extremely hard-headed man, and that's not something that results in successful litigation. Even if I'm normally biased towards the underdog, he isn't always the most sympathetic underdog.


"...what killed RCA, and what was it like working there?"

Two good questions. Haven't time to reply in detail now but I'll do so shortly with what I know, and how I met Sarnoff, and about working there (in short, it was an excellent job).


Sorry, I nearly forgot about this and the deadline for replies is close. Easter got in the way, so did other matters. ;-)

Come to think of it, I reckon you're right, I've never seen a book on the history of RCA either, and there ought to be a detailed one given the worldwide importance of the company.

It's strange that I've not given much thought about RCA's history from a broader perspective because in recent years I'd thought that some of the work in the prototype lab where I worked was pioneering enough to be documented for posterity (as far as I'm aware it's never been documented). Given RCA's enormous size and its huge product range what we did was small scale (RCA was the largest electronics company in the world at the time), but the broadcast and telecommunications products we developed were leading edge and were not only copied by other RCA divisions but also later by competitors.

Let me explain. Although I've been there, I didn't work in Camden, NJ but rather for RCA Australia, and I'll use that fact to segue into the little I know about RCA's demise. Unfortunately, RCA closed down its Australian operation in the 1970s as part of its reorganization/rationalization under Sarnoff's son Robert. I'm not privy to inside information as to why RCA closed down its operation here, but I've heard it was a combination of reasons, including that at the time the US Government was seeking to rein in foreign investment abroad in the wake of the Vietnam War and it put pressure on US companies; this neatly dovetailed into Sarnoff Jr's rationalization plans for the company.

The argument about reining in foreign investment never made sense to me, so I assume the closure of the local operation was primarily to satisfy Sarnoff Jr's reorganization plans for RCA in the light of falling profits. By the time RCA Australia closed I'd already left, so I wasn't around during its closure. Also, for a time I'd lost contact with most of those with whom I'd worked, although over the years I caught up with a few of the key players, including the chief engineer (my immediate boss), at broadcasting engineering exhibitions such as SMPTE. Nevertheless, I never thought to ask them about the division's demise, which I now much regret (most are now dead, including my old boss, whose funeral I attended several years ago).

Thus, for me, the local closure is still a matter of conjecture (it's also another reason why a well researched history of RCA is needed), but I suppose it's not surprising when we examine RCA from when Robert Sarnoff became president in 1965. It's public knowledge that by the late '60s under his leadership RCA's fortunes slowly started to change for the worse, and unsurprisingly the decline accelerated after Sarnoff Sr was ousted as chairman in 1970. One would assume that when Sarnoff Sr died in '71 any constraints imposed on Jr by his father would have lifted and he'd have been freer to pursue his reorganization. Nevertheless, it didn't work out and the company never fully returned to the good times of the '50s and '60s. Robert Sarnoff wasn't a patch on his father; he was disliked and ousted around '75 in a boardroom coup. Subsequent presidents were mediocre, which led to the company's further decline.

Whilst no doubt RCA suffered from bad management under the leadership of Robert Sarnoff, I believe there were also other mitigating factors that led to its decline. From my outside perspective here in Australia I was able to watch the growth of the Japanese electronics industry with perhaps a little more objectivity, and I believe that not only was RCA caught out by its spectacular growth from the 1960s through to the mid '80s but so too were many other American companies such as Ampex. The rise of Sony, Ikegami and many other Japanese manufacturers had a huge negative impact on the manufacture of both domestic television sets and professional television equipment, cameras etc., both here and in the US, and elsewhere, Philips for instance (I have a cousin who once worked for Philips in Eindhoven and he recounted the considerable financial difficulties Philips had with Asian competition, including the piracy of some of its television designs).

That's the short version; there's more. For instance, Western manufacturers suffered from complacency and didn't easily adapt to Asian competition. There was arrogance too. I recall an incident around 1982 when a group of us was touring Silicon Valley companies; at one point we were sitting at the boardroom table of computer manufacturer Sirius Systems quizzing its CEO about the company. When asked if he was concerned about competition from Japanese computer manufacturers such as NEC, he absolutely pooh-poohed the very notion that there was any threat from Asian manufacturers; his reaction was such that he seemed almost insulted by the very question. I was immediately shocked at his naivety, and about two years later Sirius went belly-up. Perhaps a bad example, but you'll get the gist.

It's been a while, but if I recall correctly it was 1968 when I met David Sarnoff. He came to Australia to lay the 'foundation stone' for a new record plant that was being built in the Sydney suburb of North Ryde, which is about a dozen or so miles to the west of the RCA Australia factory at Artarmon. Most of us traveled over to the site for the afternoon, we were fed lunch, and Sarnoff gave us a pep-talk about RCA, after which he met us individually to say a few words and to shake our hands (I can clearly remember my short conversation with him). Incidentally, RCA's new record factory was just down the road from Australia's once top electronics company AWA (Amalgamated Wireless (Australasia) Ltd) [1], which made world-class telecommunications equipment, military electronics, broadcast equipment and measuring instrumentation. Unfortunately, AWA also suffered an ignominious end, though not so much from competition but rather by its own hand.

At the time I met David Sarnoff I was young and not overly familiar with his background. My dislike of him grew only slowly—after I read about his ruthless business practices, unfair fights over patents—how he hounded Armstrong (in the book Man of High Fidelity: Edwin Howard Armstrong) [2] (a copy of which I still own), and Philo Farnsworth, and his looseness with the truth such as the 'Titanic incident', also his exaggeration of RCA's early involvement in the development of television. Listening to the 'Dynamic Duo'—Zworykin and Sarnoff—praising their involvement in the development of television in the 1956 documentary The Story Of Television [3] is rather sickening, as one would never learn of the many other important people involved in the development of television and of their pioneering inventions. Zworykin and Sarnoff are on a self-aggrandisement kick and history, as always, has caught them out.

Sure, Zworykin played an early role, but likely not all of his ideas about the iconoscope were original; many were likely those of Rosing, his teacher. Watching both the 1956 doco and the earlier 1939 World's Fair one [4], one would never learn of the importance of people like Willoughby Smith (of photoelectric fame), Tihanyi, Rosing, Nipkow, Farnsworth, J. J. Thomson (of CRT fame), and Isaac Shoenberg and his famous Marconi-EMI team including Alan Blumlein—those who developed most of the essential electronic circuits for modern television systems (they essentially developed the electronic 405-line television system from scratch; all RCA had to do was adapt it to the 525-line US system). Moreover, the BBC went live with television in 1936, three years before RCA even demonstrated its television. Even the Nazis broadcast the 1936 Berlin Olympic Games on television!

And that's just the short list: many contributed, even now-obscure people like William Eccles and Wilfred Jordan who invented the flip-flop (multivibrator) circuit around 1918 are an essential part of the development of television, their Eccles-Jordan circuit as it's known today was an essential part of mid-to-late 20th Century television electronics, it was used in the critical scanning and sync lock circuits essential for modern TV systems, and in many other electronic circuits (oscilloscope triggers for instance).

What's so annoying about Sarnoff is both his arrogance and his attempt to rewrite history. Sure, RCA did develop great technologies such as the image orthicon (a truly remarkable device, but nevertheless an evolution of earlier tech) and the shadow mask color tube, and much, much more, but most of it came after the early pioneering work had already been done. Essentially, at no time did Sarnoff give credit to any of these earlier television pioneers; rather, he effectively went out of his way to write them out of history.

Whilst I am very glad to have worked for RCA and I am forever thankful for the valuable experience I gained whilst working there, I nevertheless consider Sarnoff dishonest and his behavior impertinent, insulting and offensive.

This post is close to its text limit, if I've time and I can beat the timeout then I'll post some info about the products that we produced at RCA Australia and why they are an important, albeit small, part of RCA's history.

__

[1] https://en.wikipedia.org/wiki/AWA_Technology_Services

[2] https://www.amazon.com/Man-High-Fidelity-Howard-Armstrong/dp...

[3] https://www.youtube.com/watch?v=ombUhZHddho

[4] https://www.youtube.com/watch?v=3y32ZxGfiXs


Re RCA Australia.

At the time I had applied for two positions in the electronics industry, both in television, and as both my applications had been successful I had to decide which one to take. The other position was closer to home, but I chose RCA because I thought I would gain more experience working there. This, I reckon, was one of the best decisions I've ever made; working for RCA turned out to be an extremely rewarding experience, as it cemented much of the type of work that I would be involved in through a significant part of my career. From RCA I went on to work in the engineering side of television broadcasting, and much later in my career I was involved in setting standards for industrial television systems for use in surveillance applications for a well-known international organization. The requirements were such that standards had to incorporate both encryption and authentication (the stakes were high and authentication a critical aspect of the tech). I was also involved in the early days of FM broadcasting in Australia; this work not only involved FM engineering but also politics (convincing government to introduce a new broadcasting service wasn't easy and not for the fainthearted, but the dedicated work of many individuals made it possible and Australia's FM service first commenced in December 1974—this being the 50th anniversary year). My background working with RCA helped equip me for this task.

RCA Australia manufactured a range of electronic products including domestic stuff such as Hi-Fi amplifiers, but its primary focus was on professional broadcasting and telecommunications products of the type broadly classifiable as line, terminal and distribution equipment as used by telcos, broadcasting stations and program production companies. For example, audio line equipment such as sound distribution amplifiers (SDAs), and video processing equipment—video clamping and video distribution amplifiers (VDAs) and the like. Other work included setting up and maintaining imported film equipment (for telecine), studio television cameras, and broadcast quality monitors (all usually of RCA manufacture). RCA Australia even manufactured high powered television broadcast antennae for local consumption (those were the days when no one gave a moment's thought to them being plated in bright cadmium).

One of the huge advantages of working for RCA Australia was that I got to work with a very broad range of electronics and related equipment. One of the statements that Sarnoff makes in The Story Of Television with which I wholeheartedly agree is that television brings together a very broad range of technologies, to quote "...I marvel at their accomplishment [referring to RCA researchers] bringing into focus the principles of radio, optics, electronics, photography and chemistry so that they might all work together to make color television practical..." To give him credit where credit's due, it's a damn good summary of the technologies involved in television engineering, although I'd have added physics (given that a good understanding of fundamental physics is necessary to design, say, image orthicon and vidicon camera tubes or shadow-mask color picture tubes); one also had to have a good understanding of advanced color theory such as CIE 1931 color space parameters (a brief look at the RCA Electro-optics Handbook will attest to this https://www.amazon.com/RCA-Electro-Optics-Handbook-Author/dp...).

I think perhaps the best way to illustrate why I believe RCA Australia ought to rate in any detailed history of RCA is the high performance of its professional products. Let me cite an example. The VDA (video distribution amplifier) type TA-100B was a solid state video amplifier with both bridging and matching input options and five outputs. It offered adjustable gain together with a plug-in cable equalizer module (which could be changed for optimal performance depending on the type of coaxial cable used), it was made in very large quantities at the Artarmon plant in Sydney and was used by both the common-carrier telco and by television stations, primarily the national broadcaster the ABC (then known as the Australian Broadcasting Commission). I knew one of the principal engineers at the ABC, Neville Thiele, and he kindly sent me copies of manuals written by him on pulse-and-bar video testing. Neville was a brilliant engineer, he was just as at home when designing television IF amplifiers (which he did at EMI) or when designing loudspeaker enclosures. He was also a very nice and helpful fellow (and it was through RCA that I'd gotten to know him).

Even by today's standards, the baseband performance of the TA-100B video amplifier for PAL, NTSC etc. would be considered very good but for its time of 50+ years ago its performance was exceptional. It had a differential phase and gain performance on 10 MHz T pulse-and-bar, modulated saw-tooth and other test signals of less than 0.1% which was around the limit of measurement of the differential phase and gain test equipment used at that time (today, a top baseband VDA should reach around 0.03%).

I recall seeing a test setup where a TA-100B launched a high speed 10 MHz T pulse-and-bar test signal from the equipment test bench at the Artarmon plant into the telco's coaxial feed which was received by Sydney's Test Room (the telco's main switching hub), which was about 10 miles away, the test signal was then rerouted to Melbourne via the then new intercity coaxial cable—a distance of about 545 miles (877kms), in Melbourne's Test Room it was then looped and rerouted back to Sydney's Test Room via the coax thence back to us at Artarmon—a total distance of well over 1000 miles. (Incidentally, back then the only telco in Australia was the Government's Postmaster General's Department which was responsible for all communications (telephone lines, spectrum management etc.))

I have a figure in my head for the number of VDAs used in the loop, but I cannot guarantee its accuracy so I won't mention it; it was many hundreds. The outgoing feed and the received loop signal were fed into a Tektronix 465 dual channel oscilloscope on the test bench (which incidentally used RCA-designed Nuvistors in its input amplifiers), and the two signals were then superimposed over each other using delayed sweep. What I saw almost defied belief: the T pulses, when expanded and superimposed over each other, appeared almost as one single trace, albeit with a slightly fuzzy appearance caused by noise (the noise only increased the trace thickness by about half as much again, perhaps just a whisker more—I wish I'd had a photo of the 465's screen that would have documented the event). Keep in mind the pulse-and-bar test signal wasn't the lower performance 5 MHz 2T pulse but the full T/10 MHz one. As far as I am aware, this exceptional performance was never published in journals and I only wish I had more information about the test (I wasn't responsible for setting up the test; after all, back then I was the youngest and most inexperienced member of the prototype lab team).

Incidentally, the TA-100B VDA was designed by an engineer who left to work in the US just before I joined RCA but the chief engineer to whom I reported was quite brilliant especially in the area of filter design. BTW, the TA-100B used a very low impedance totem pole-type output stage which fed into an array of 75-ohm resistors to define its output impedance, this was quite novel for the time. I could say much more about this device and the work of the prototype lab [1] in general but unfortunately there's not the time nor space.

One final point: I was informed that after RCA Australia closed, RCA Switzerland continued to make the TA-100B. Sometime later a Canadian company whose name I've temporarily forgotten made a copy of the VDA (I've no idea whether it was made under license or just reverse-engineered). About a decade later I was visiting AVE (Australian Video Engineering), a manufacturer based in Sydney that began operation after RCA Australia had closed, and the company's chief engineer showed me their VDA together with a copy of its circuit diagram; they also informed me that they'd based their design on one that originated in Canada. I instantly recognized it for what it was—an almost identical copy of RCA's original TA-100B, albeit with a different circuit board design and mechanical construction! He was somewhat taken aback that I already knew so much about its operation and of its background. Right, the design that originated in the RCA prototype lab in Sydney had returned home after traveling full circle.

Finally, I can only repeat that I was very privileged to work at RCA Australia but it was just sheer good luck and happenstance that I landed the job when I did.

It would be good if anyone knows more about the history of RCA Australia and of, say, the test results of the equipment that was designed there, or the engineering details of the Sydney-Melbourne coaxial cable. I again failed to ask my former boss for these details before he died; that is now a matter of great regret. Incidentally, I still possess a handbook for the TA-100B; it includes a description of its operation together with a circuit diagram and complete parts list.

__

[1] There was one important aspect about RCA's prototype lab that I forgot to mention. Its windows looked out and across the nearby railway line directly towards Channel 9's television tower. From the windows we had an excellent view of the antenna's radiating panels which meant the RF levels from the TV channel were exceptionally strong within the lab. Thus, breadboard and bird's nest type mock-ups of prototypes suffered considerable interference to the extent of it being common to see directly-demodulated TV video complete with TV sync block appearing within these circuits.

This was both a curse and a blessing. We became quite expert at suppressing and/or eliminating the RF interference altogether from our designs. Ferrite beads and various other RF suppression methods came to the rescue. Our lab environment not only forced us to harden our designs against the direct injection of RF, but we also paid particular attention to mains-borne interference, including mains hum. It was commonplace to see ferrite beads on transistor leads and such; the beads also eliminated high frequency parasitics that were occasionally troublesome. In essence, the lab's poor and inadequate RF environment forced RCA Australia to build better products.


Learning radio can help improve your skills in understanding the analog underpinnings of networking, CPUs, electronics and circuits.

So many issues come about from taking these for granted. EMF interference, a short circuit or a noisy power supply can cause non-deterministic issues that will drive you mad unless you are aware of the root cause.


I'm intrigued by things like this that used to be high technology but now are mature and pushed way down into the infrastructure. No one is going to make much money being really good at radio, any more than they will be really good at machining steel, but it's still necessary for higher levels of the tech stack to function.


The US (and Chinese, and Russian, and European...) government spends billions a year on companies that are good at radio. Radar, satellite communications, 5G, etc, etc. are all critical parts of modern technology stacks, that are "high technology", and key for forward innovation. If you think it's a solved problem, why doesn't every telecom company have nationwide 5G deployed yet?

There is A LOT of money to be made in the space, if you're good.

But, it's not AdTech, so HN isn't familiar with the field I guess :^)


I promise people still make piles of money being really good at radio and really good at machining steel. The complexity of the deliverables has increased, yes, but the expertise and technical skill to do modern radio and machining is very much rewarded in the marketplace.


"No one is going to make much money being really good at radio, any more than they will be really good at machining steel,"

How do you know? For instance, I'd suggest that not every method of modulation has been invented or even yet implemented. Also, we've hardly begun to design and implement meta materials into antennae and RF filters—the field's still wide open for innovation and invention.

And new methods of 'machining' steel have recently been invented and are just coming into use (if I owned the patents I'd be sitting pretty for life).


The point I was trying to make was that these things are no longer what determines success or failure. Incremental improvements are possible, but do not dominate.

Consider the fight between a company that's great at RF design, and a company that's on top on software. Which do you think will win in the market for cell phones? (This is a trick question.)

Mature technologies tend to be things that can be outsourced. So one can make some money (if not a lot) as a supplier of these things in a horizontally integrated industry.


In some instances perhaps that's so. And I'd mostly agree with you re mature industries and outsourcing. But in the instances I've given (and they were just off-the-cuff, but there are many more) I'd argue that they are new and developing tech that's nowhere near full maturity. New modulation methods combined with newer polarization techniques have the potential to revolutionize spectrum management (that's where many more channels can be made available in already-overcrowded spectrum space). With new meta materials the sky's the limit—from new types of specialized antennae to optical lenses that are an order of magnitude smaller whilst improving sharpness and reducing aberrations, and that's just for starters.

Re cell phones, I'll just say this. RF engineering went through a revolution about 10 to 15 years before cell phones became commonplace. The electronics shrunk in size by an order of magnitude, the RF front ends including its transistors increased their FT (transition/cutoff frequency) by over an order of magnitude, and the noise figure of RF front ends was reduced to well below 1dB.

The net/combined result of these three improvements in RF engineering was a truly remarkable achievement. Fact is, there'd be no smartphones or integrated GPS without these developments all having happened. I'd also add that by and large programmers simply haven't a clue about this very special aspect of RF engineering. They just take for granted that the hardware does what it says, to them it's just a black box.

Programming, which can't really be called engineering because, many decades on, it's still an undisciplined way of working that's more akin to an art form than true professional engineering, can however provide the glitz factor that sells phones, but it's completely lost without the state-of-the-art RF hardware substrate upon which it sits. Moreover, the next revolution in phone technology, such as direct satellite services, will only be practical after a further revolution in RF hardware (what passes for showy demos today is nowhere near the mark if the tech is to be widely adopted).

If you think I've been too hard on programming, then I'd suggest you read this near 30-year-old SciAm article as a starting point: https://www.researchgate.net/publication/247573088_Software'....

Thirty years on, programming has adopted some new techniques for developing software, but the fundamental underlying problem (the lack of rigorous standards together with proven methods for solving problems by computer) remains unresolved, and in some instances the software industry has gotten demonstrably worse.

I haven't the time to include current references that support my claim, but a search will soon find them, written by authoritative people both within the programming industry and elsewhere, such as in computer science research.


You can make a lot of money if you are able to implement the WiFi and cellular protocols in hardware, which are a complicated mess, and riddled with patent traps.



I think this undersells the trick behind radio.

Say we have the technology to broadcast a signal from an antenna to receivers, with some bandwidth B. Without getting clever, we can only send or receive one signal, since any others would interfere with each other.

The trick is: can we shift a signal of bandwidth B up to some carrier frequency F, so that it occupies the band F to F + B instead of 0 to B? And, for the Nth channel, up to N·F? If we can do that, and then downshift a chosen band back down to baseband, it means we can broadcast multiple channels, and receivers can tune to F and downshift the decoded signal back to 0 to receive it at the original bandwidth B.

A cheap way to do this is amplitude modulation, where multiplying a signal of bandwidth B by a carrier of frequency F shifts it up to the range F +/- B, and we can space channels 2B apart to get however many channels our antennas allow for.
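
A toy numerical check of that claim, in pure Python with a naive DFT (the tone and carrier frequencies are arbitrary bin numbers, chosen just for illustration):

```python
import cmath, math

N = 256
f_m, f_c = 5, 40   # baseband tone and carrier, in DFT bins

# Multiply a baseband tone by the carrier: the product-to-sum identity
# puts the energy at f_c - f_m and f_c + f_m, i.e. two sidebands
# straddling the carrier, which is why an AM channel occupies ~2B.
am = [math.cos(2*math.pi*f_m*n/N) * math.cos(2*math.pi*f_c*n/N) for n in range(N)]

def dft_mag(x, k):
    # Magnitude of the k-th DFT bin (O(N) per bin; fine for a demo).
    return abs(sum(s * cmath.exp(-2j*math.pi*k*n/len(x)) for n, s in enumerate(x)))

# The two strongest positive-frequency bins are the two sidebands.
peaks = sorted(sorted(range(N//2), key=lambda k: dft_mag(am, k))[-2:])
assert peaks == [f_c - f_m, f_c + f_m]   # sidebands at bins 35 and 45
```

Nothing survives at the original baseband bin: the multiplication has moved all of the tone's energy up around the carrier.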

The real question is: why is it 2B and not B? The answer lies in some Fourier analysis, where the spectrum of a real signal extends into negative frequencies. Nevertheless, there is another trick, called single-sideband modulation (SSB), where we can shift a signal into just the range F to F + B instead of F +/- B, and demodulate it back down to -B, B to get the original.

And that gets us to the 1950s in terms of radio technology.

The trick behind FM is to use bandwidth differently: instead of confining each signal's information to amplitude variations around a carrier as AM does, we spread the information across a wider, non-overlapping band of spectrum. To do this we don't modulate the amplitude of the carrier, but its frequency. This distributes the signal's energy across a wider range of frequencies, and it's how FM radio works today.

These concepts create the foundation for modern radio communication. We can modulate data signals to different bands and receive them, provided we know where to tune. And these bands can either be contiguous chunks of spectrum (AM) or interleaved (FM). The next step is to think in terms of time, which is to say that receivers can negotiate not only which frequency ranges they care about, but which time slots they want to listen in before waiting for their next turn.

For those interested in the theory, the fundamental problem is that we can design antennas that transmit or receive at some fixed maximum bandwidth, bounded by physics. The engineering problem is to share that bandwidth so as to maximize the number of senders and receivers. Amplitude modulation is excellent, but it divides the bandwidth up into a fixed number of channels of fixed individual bandwidth. FM is a bit more efficient in how it allows many broadcasters to reach even more receivers, each choosing which channels to receive. But for modern communications, where we need high bandwidth for distinct transmitter/receiver connections, we need protocols to figure out how to share the bandwidth over the air, and the two tricks are to divide that bandwidth by frequency (like AM and FM) or by time (sharing the same frequency channels, but only picking the frames that we care about), or both.


And for those even deeper into the theory, one question you might ask is, if we can divide spectrum and time to get some bandwidth B per channel, how many bits can we send/receive over a distinct channel?

The answer is C = B * log2(1 + S/N), where B is the bandwidth and S/N is the signal-to-noise ratio determined by the environment (how much noise is present relative to the signal being transmitted). The crazy thing is this was proven in the 1940s, and everyone interested should go read The Mathematical Theory of Communication by Claude Shannon. This is the Shannon-Hartley theorem, and it gives the channel capacity C (in bits/second) of any communication channel in the presence of noise.
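The formula is short enough to try directly. A minimal sketch, using a classic example (a telephone line with roughly 3 kHz of bandwidth at 30 dB SNR, i.e. S/N = 1000):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity, C = B * log2(1 + S/N), in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# ~3 kHz of bandwidth at 30 dB SNR (linear S/N = 1000):
c = shannon_capacity(3_000, 1_000)
print(round(c))  # ≈ 29902 bits/s
```

That lands in the same ballpark as dial-up modem speeds, which is no accident: modem designers were pushing right up against this limit.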

The math concepts might seem heady, but it's actually fairly approachable and available online. It's fascinating that the fundamentals were proven out in one work nearly 80 years ago by a handful of people, and the math is not that bad.

The thing that makes this nuts is that if an engineer picks a target bitrate for a device, say a cellphone streaming video, they can work backwards to the channel capacity they need, run some experiments to measure the noise, and then determine the target their modem protocol needs to hit. And this is how we get 5G and fiber or whatever comes next.
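Working backwards just means inverting the Shannon-Hartley formula: for a target bitrate C over bandwidth B, the minimum SNR is 2^(C/B) - 1. A sketch with hypothetical numbers (10 Mbit/s over a 5 MHz channel):

```python
import math

def required_snr_db(target_bps, bandwidth_hz):
    """Invert Shannon-Hartley: minimum SNR (in dB) to reach a target bitrate."""
    snr_linear = 2 ** (target_bps / bandwidth_hz) - 1
    return 10 * math.log10(snr_linear)

# 10 Mbit/s of video over a 5 MHz channel needs at least:
print(round(required_snr_db(10e6, 5e6), 1), "dB")  # 4.8 dB
```

Real link budgets then add margin on top of this theoretical floor, since no practical modem hits capacity exactly.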


Shannon was pretty ridiculous. He basically invented information theory, proved all the major theorems involved, and applied it to communications and error-correction codes. If you work in RF you can't do much without encountering his work. (It did take a while before anyone figured out how to get close in practice to the limits he proved, though)


And before his work on information theory, his master's thesis showed that Boolean algebra could be used to design digital circuits and invented logic gates:

https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_a...

https://spectrum.ieee.org/claude-shannon-information-theory

One of the all time greats.


> divide that bandwidth by frequency (like AM and FM) or time

Ah. The real magic is when we separate by space (beyond just frequency or time). The ability to do this was discovered relatively recently, in 1996, by a guy called Foschini, though radio astronomers will say "Meh". By adding multiple antennas and doing space-time coding engineers found they could pump an order of magnitude more data through a radio channel. The maths involved is high school level (linear simultaneous equations), and it's magic to understand Foschini's work and think "Why didn't we do that before?"
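The "high school maths" claim holds up: a 2x2 MIMO link really is just simultaneous linear equations. A toy sketch with made-up channel gains (in practice the receiver estimates H from pilot symbols):

```python
# Two antennas transmit x1, x2 at the same time, on the same frequency;
# each receive antenna hears a different weighted mix:
#   y1 = h11*x1 + h12*x2
#   y2 = h21*x1 + h22*x2
# Knowing the gains h_ij, the receiver just solves for x1 and x2.
import numpy as np

H = np.array([[0.9, 0.3],
              [0.2, 0.8]])     # channel matrix (assumed known)
x = np.array([1.0, -1.0])      # two symbols sent simultaneously
y = H @ x                      # what the two receive antennas see

x_hat = np.linalg.solve(H, y)  # both streams recovered: same band,
                               # same time slot, double the throughput
```

The catch, and the reason it wasn't done before Foschini, is that this only works when the paths are sufficiently different (H is well-conditioned), which rich multipath environments conveniently provide.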

The other bit of radio magic is error control coding. This is the stuff that lets us reliably talk to Voyagers I and II.
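The simplest possible illustration of error control coding is a 3x repetition code with majority-vote decoding. (The Voyagers use far stronger convolutional and Reed-Solomon codes, but the principle, trading redundancy for reliability, is the same.)

```python
def encode(bits):
    """Send every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority vote over each group of three received bits."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1]
tx = encode(msg)
tx[1] ^= 1   # channel noise flips one bit in transit...
tx[9] ^= 1   # ...and another, in a different 3-bit group
assert decode(tx) == msg  # both single-bit errors corrected
```

The repetition code pays a 3x rate penalty for modest protection; the codes used in deep space get vastly more correction per redundant bit, which is what makes talking to a 20-watt transmitter beyond the heliopause possible.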


Fascinating how we keep being inspired by fundamental physics and astronomy to keep cramming more information into our channels. I'm still trying to understand Orbital Angular Momentum multiplexing https://en.m.wikipedia.org/wiki/Orbital_angular_momentum_mul...


I'd agree with the Wikipedia article, that it sounds like MIMO, in that it requires the beam to have a spatial extent.

From the Wikipedia article:

> can thus access a potentially unbounded set of states

That's what people originally thought about MIMO. MIMO's not unbounded. The limit to the number of states is related to the surface area of the volume enclosing the antenna, with the unit of distance being the wavelength. A result radio astronomers already knew when the comms people derived it. With absolutely no evidence to back it up, I'd guess that the same limit applies to OAM multiplexing.

As an aside, when one expresses physics in terms of information theory, my understanding is that the maximum number of bits that can be stored in a volume of space (also the number of bits required to completely describe that volume) is related to the surface area of the volume, with the linear unit being Planck lengths [1]. Is MIMO capacity in some way a fundamental limit in communications?

[1] https://physics.stackexchange.com/questions/497475/can-anyon...


"Radio communications play a key role in modern electronics, but to a hobbyist, the underlying theory is hard to parse."

I don't believe radio communications and the electronics of radio are hard to understand, at least at a level where a hobbyist can gain enjoyment from the subject.

I say that as someone who obtained a radio amateur's license in junior high school at age 15.

Yes, radio engineering and its physics does get very complicated at the high end, and for a good understanding one requires advanced math including partial differential equations such as Maxwell's equations and their SR/Special Relativity extensions. Beyond that, one needs to understand the physics of electrodynamics, and that requires knowledge of quantum mechanics including QFT (Quantum Field Theory), which is top-echelon physics and about as complex as physics gets.

However, the hobbyist doesn't need to know an iota of that advanced complex stuff to enjoy radio as a hobby. Absolutely none of it.

All that he/she needs to know are very basic principles such as how antennas receive and radiate signals, how radio signals are amplified and detected, and later on how signals are mixed, multiplied and heterodyned, and how radio transmitters and receivers work. Even the principles behind the common superheterodyne receiver are pretty standard knowledge for a radio hobbyist.

Back when I was learning about radio, I doubt very much that an article would have been written in the tone of this story, especially one implying that the subject could be difficult to understand even at a hobby level. Why, you may ask? Well, back then, if anyone had a hobby interest in electricity and electronics, essentially the only outlet for that interest was radio and perhaps television, as the other branches of electronics were not as readily accessible to hobbyists.

Nowadays that's changed: there's much more to hold a hobbyist's interest, such as programming, computers, computer games, and other electronics not based on radio technology (digital electronics, for instance), so knowledge of radio tech and radio communications theory has become much less commonplace, diluted amongst all these competing interests. Obviously the knowledge is still out there, but it's more widely dissipated and not as easily accessible in a practical sense, especially for hobbyists of a young age.

When radio was essentially all there was around, many more elementary books on radio were available for younger readers, and these increased in complexity as the hobbyist gained practical experience. For instance, when I first became interested in radio, my first introduction to the subject, like most others', was building crystal set radios, and from there we advanced to incorporating tubes and transistors into our more advanced designs. For beginners, hands-on practical books, such as how to build crystal sets in many different designs, were commonly available.

(Back then, a well known author of books on crystal sets and basic radio was Bernard B. Babani, an unforgettable name if ever there was one. His books are still available but you'd never know to look for them unless told about them.)

Today, many have never heard of crystal sets, let alone their 'cat's whisker' detectors, so when they become interested in the subject they're thrown in at the deep end. And not having the basics already under their belts, the more advanced radio theory comes as a bit of a shock.


I am looking for books that start with complete beginners but go on to an intermediate level: hobbyist, practical books where I can try out the concepts. Can you please recommend some for me?


Magnets


Came here to see this. Thank you.


Same here, glad I'm not the only one.



