I'm a math dropout, now self-taught software engineer who has been spending the summer filling in gaps in my CS education. Programming language implementation finally started to click for me -- bytecode interpreters and compilers are much less intimidating for me than they were several months ago. And now having implemented my own (toy) interpreters, I'm interested for the first time in understanding some of the nitty gritty details of how hardware components work. So I started working through the nand2tetris projects and tinkering with breadboards, and wow. This stuff is so much fun! I'm bummed I didn't start sooner
I never really imagined I would be interested in this stuff, much less find it kind of beautiful. Pure math brain, if you aren't careful, can train you to view anything applied with disdain, and something as earthy as hardware even moreso. But it's kind of beautiful how there are deep and inevitable connections between some of the most abstract theory computer science has to offer (plt, pl design and implementation) and the way physical objects are harnessed for computation. I guess if you consider that von Neumann was a mathematician first and foremost then it starts to make more sense
Either way, I'm having a great time -- definitely recommend anyone in a similar position learn this stuff whenever you're ready for it
Nand2tetris[0] is hands down the best course I took in my CS BSc. Nisan and Schocken did such a wonderful job making this great content accessible. It was a fun and rewarding experience in which I built a toy computer bottom-up from scratch.
And for an online interactive take on the same material, try NandGame[0]!
I've gone through both and they're both excellent. Nand2tetris requires a bit more setup, but goes deeper into things like language development.
For people who have previously looked at NandGame, there has been quite a bit of extra content added over the last year, well worth taking another look.
Thanks for the link, just what I was looking for the other day. I was listening to a Steve Wozniak interview about his early tinkering with Pong and the like.
Share more about your journey. These days I feel an interest in such things rising. I picked up the book Code one night and got really hooked.
I opened this thinking I was the target audience - a longtime developer who has always wanted to get into electronics. However, this first paragraph...
> In order to present our paradigm in learning electronics, we can take an example of a full adder chip that enables binary addition. Let us use a top-down approach, and look at the datasheet of the TTL (Transistor-Transistor Logic) chip 7483. We will immediately notice that the chip performs the addition of two 4-bit binary words, A and B.
...and the associated diagram, just immediately left me feeling like I was standing in a room full of people who were smarter than me. Is this really where one starts while learning basic electronics?
I think this is mostly poor writing on the part of the author. Your being able to "immediately notice" is contingent upon having read the data sheet, which was neither linked nor reproduced, but the way it was presented made it seem like the diagram alone was sufficient. If you had read the data sheet, it would have stated up front that the circuit performs four-bit addition. But the page did not facilitate that...
I had to do a lot of navigation to get to the first page of content, which dropped me straight into a 4-bit full adder. To my mind, that's logic circuits, not electronics.
To be fair, most intro-to-electronics pages start with Ohm's Law and so on, which to my mind isn't electronics at all - it's basic electricity. This site starts with diodes, which is indeed the entry point to electronics proper. But jumping in with gates is arse-over-tit; gates are a special case of amplifiers, and I think explaining transistors as amplifiers should precede explaining them as switches.
Regarding navigation: The whole thing is set in something like Courier, and links don't look any different from body text, even when you hover. And I generally expect no more than one click from the homepage to get me to page 1 of the content. A contents page that links to contents-of-the-contents pages seems like a rabbit-warren.
If you look up a data sheet for any 7483 variant, it is immediately obvious: the first lines say something like "4-bit binary full adder" and the description says "…accept two 4-bit binary words". If you skip to the next paragraph, you'll see they are onto transistors. Ideally the author would show a data sheet, but there may be reasons they can't.
I found the material that followed to be a clear exposition of the fundamentals – and this is coming from somebody who has tried to learn electronics over the years without anything ever sticking. This did a better job than anything else, at a pace that is about right for me.
But YMMV and I personally relish being in a room full of people who are smarter than me.
In my experience in a university ECE program, you'd start with understanding the high level properties of transistors, then combining transistors to make AND and OR gates, then XOR and other gates, then MUXes and half/full adders, then flip-flops and eventually into synchronous (clocked) logic.
The lab component of such coursework did start with TTL chips but the timing of the coursework was such that you'd have most of the asynchronous logic theory taught by the time the chips came out.
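To make that gate-to-adder progression concrete, here is a minimal sketch (in Python, not anything from the actual coursework) of a full adder built from XOR/AND/OR, chained four times the way a 7483-style 4-bit adder is:

    # Hypothetical sketch: a full adder from gate-level primitives.
    def full_adder(a, b, cin):
        s = (a ^ b) ^ cin                    # two XORs produce the sum bit
        cout = (a & b) | (cin & (a ^ b))     # AND/OR produce the carry-out
        return s, cout

    def add_4bit(a, b):
        # Chain four full adders, carry rippling from bit 0 to bit 3,
        # roughly what a 7483-style 4-bit adder does in silicon.
        result, carry = 0, 0
        for i in range(4):
            s, carry = full_adder((a >> i) & 1, (b >> i) & 1, carry)
            result |= s << i
        return result, carry

    print(add_4bit(0b1011, 0b0110))  # -> (1, 1): sum 0b0001 with carry-out 1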
Was it an Electrical Engineering, Electronics Engineering or Electrical Engineering Technology program? My digital course skipped over the transistor level and spent that time on basic FPGAs instead.
Not OP, but I did an Electrical and Electronic Engineering undergrad and we started with diodes at the materials level, then BJT and FET transistors, then logic gates, flip-flops, timers and ALUs, eventually working up to building a Motorola 68K microcontroller from mid-level components. There was some VHDL and FPGA work in the later stages as well, from memory.
In my experience, all serious electronics/electrical engineering learning material is written like this, as if the student knows everything about electronics except the one topic the author is explaining. It's probably an artifact of being written for industry users. You can get used to it, though: put boxes around certain circuits and just look at the behavior without asking how it works, until you need to understand, and then do a deep dive.
This is the equivalent of introducing programming by giving a piece of ARM assembly and stating that it is immediately obvious that we are dealing with a merge sort implementation.
I’m not sure why, but I see this often in other domains as well. Math in particular. I guess it’s the curse of knowledge.
Except that when I was in high school, programming in assembly was not in the curriculum, but designing simple logic circuits was. I am pretty sure we practiced designing a half adder in physics class as part of the electronics chapters.
The bad part of the writing is the assumption that all of that technical language and knowledge is still at the top of your memory when you've just picked up this book.
I no longer have the textbooks, obviously, but it was just a regular gymnasium in a provincial town (Leeuwarden), and we were using the standard government-recommended VWO textbooks, just under 20 years ago. I'm pretty sure it was in the textbook, because the teacher was not the type to go off book and teach us something not needed for the tests.
We had those electronics circuit practice boards and you could hook multiple up together to make a half adder.
A group of students who were acing the tests and finishing their homework spent a couple of weeks at the back of the class assembling a full adder out of literally all of the practice boards the school owned. I wasn't acing any tests and wasn't finishing my homework on time, but I joined them anyway because it was more fun. My contribution was realizing that we could use the relays as AND gates, doubling our resources and enabling the completion of the full adder.
Dear kaishiro, thank you very much for the observation. Really makes sense! I changed it a little bit to be less 'discouraging', because you really are a target audience. It is my "poor writing", as noted, that sometimes takes over :) Please let me know if this makes more sense.
I once took a computer architecture course from the designer of the Burroughs 6700, who had us do a similar exercise. But that was back when people actually built things out of 74xx TTL. Few people do that any more. It would be very unusual to use a 4-bit adder chip today, unless you're deliberately doing retro stuff. And even more unusual to start there. Also, a 4-bit adder, a stateless device, is only useful when surrounded by latches and clocks so that something useful happens.
Here's a real beginner level presentation, from Adafruit.[1] This may be too simplified for some.
The Art of Electronics by Horowitz and Hill is highly recommended, but the original audience was physics grad students who needed to build instrumentation for physics experiments.
The order of presentation is good, but it's a big book. Because it mentions current components by part number, the book ages rapidly.
> Is this really where one starts while learning basic electronics?
Hmm, did you miss the introductory chapter that starts with the diodes, transistors and basic logic gates they told you vaguely about in high school physics?
There's quite a bit of info before getting to using the TTL chips.
Having played a bit with redstone circuits in Minecraft helps too - or knowing about diodes and transistors when doing redstone in Minecraft helps :)
The real start is on the next pages, which go into a _real_ deep intro to diodes and transistors. I think it was just a matter of giving an intro but overlooking that people don't know what these symbols mean.
For me, an electronics engineer by degree (not by trade), it was so basic I could see the author forgetting this.
I would say no, unless that is the kind of stuff you are specifically interested in.
You will probably never see a full adder chip outside of retro and tinkerer stuff. If there are commercial uses left, they might be gone and replaced by FPGAs soon.
If someone asked me to teach them the more modern way, more appropriate for someone who wants to make props or puzzles and stuff as opposed to the more retro experimental stuff, I'd advise a bit differently.
I would probably tell them to start with an ESP32 powered Arduino module, and some op amps, because op amps are both easy to understand and practical.
And at the same time, brush up on your pure mathematics, if you don't want to be like me, not really able to do any of the super high-end stuff.
At the basic level you won't even need algebra, as you do more advanced work the math gets heavier.
Before even touching a real circuit, go immediately to the Falstad Simulator, and check out their example circuits. No, it's not perfectly accurate, but it is an amazing tool and runs right in a browser.
I think there's a lot of stuff that is kind of cool to try out for educational purposes but might be best left in the simulator, at least at first, so you don't have to buy a bunch of parts just to try something out and then not know what to do with them.
Probably don't buy a name brand soldering iron, a Pinecil V2 is probably about what you want, or a random T12 station.
The current good cheap multimeter changes every week it seems, the no name stuff is always changing, IIRC right now it's the HT118 or something that a lot of people like, but I could be wrong.
Oh, and don't invent a brand new standard and decide you're going to make all your stuff compatible with it, make up a bunch of custom cables, etc. There's a high risk of being bored, or finding some new universal standard for everything, and winding up with piles of useless junk.
Just buy the parts you need for one project at a time. USB-C and barrel jacks are good. Don't use too many weird connector types, because cables take up so much space and are a really annoying kind of junk to have around.
Wagos and premade pigtails are your friend for making up adapters.
Learn the E3 resistor and capacitor values, see if you can keep to mostly just those, if you want to reduce the number of different parts you need.
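A tiny sketch of what that looks like in practice (the decade range here is just an example):

    # E3 preferred values per decade: 1.0, 2.2, 4.7 (scaled x10 here to keep integers).
    E3 = [10, 22, 47]
    drawer_ohms = [v * 10 ** exp for exp in range(6) for v in E3]
    # -> 10, 22, 47, 100, 220, 470, 1k, 2.2k, 4.7k, ... up to 4.7M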
Most intro tutorials don't cover a lot of this stuff I wish I had known....
It might have to do with what "longtime" in "longtime developer" means.
I took a look at TFA because of this. My experience caps out at doing a few heathkits in the early 80s and one single soldering of a resistor on my Synology to repair an issue a few years ago. I *mostly* understood the diagram, most of which was due to seeing it during CS adjacent classes in the early 90s.
Unfortunately it's often the case that these ELI5 type articles assume baseline knowledge that's less than baseline.
I'd recommend the book "Practical Electronics for Inventors". For me it had the right balance between not making too many assumptions but still teaching interesting circuits.
To the folks I've seen in this thread despairing about how this seems really, really hard at first brush: it is. This is not really a page of intro level electronics concepts. Doesn't mention a whole lot about voltage, current, time vs frequency response, or how to get a grasp on any of those concepts.
If you want a gentler intro that walks you through some of the foundational concepts, Analog Devices has some great free courses on Circuit Theory:
It goes without saying that Ben Eater's 8-bit computer kits [1] are like Legos for wannabe electronics nerds. The kits themselves were a response to the outpouring of requests from viewers of his amazing YouTube channel, where he first rose to prominence for his breadboard CPU and computer.
I am currently assembling his 6502 breadboard computer (with the 16 x 2 LCD character display).
Someone put together an awesome "1-100 Transistor Projects" as a PDF [2] for learning how transistor circuits work. The PDF + breadboard + a dozen or so transistors and small parts will keep you busy.
There's a sequel "101-200 Transistor Circuits" [3], one on IC circuits [4] and one on the venerable 555 timer chip [5].
The above should keep you busy for the rest of the year. If not, be sure to skim through some of the electronics hobbyist magazines [6].
I agree that learning with an Arduino/Pi Pico is personally more fun and gets you to the stage of making something practical, or at least interesting, much more quickly. That said, doing things fully at the resistor and diode level still has its place, since someone has to understand stuff well enough at that level to design the Arduinos and Picos that the rest of us play with.
If that's the route you want to take then that's absolutely fine.
That area of electronics is "modular digital electronics" and doesn't take you through the principles of basic passive components (resistors, capacitors, inductors) and then into analogue circuit theory and semiconductors, digital logic and so on.
The only thing that bugs me is when that approach is suggested as a response to 'how do I learn electronics' without any qualification or elaboration.
Idk, I needed to understand resistors just to power an LED. As soon as you have any raw signal, you basically need a cap to smooth the signal.
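For what it's worth, the LED case is a one-liner with Ohm's law; the numbers below are just example values, not from any specific part:

    # Sizing an LED's series resistor: R = (Vsupply - Vforward) / Iled
    v_supply, v_forward, i_led = 5.0, 2.0, 0.010   # assumed example values
    r = (v_supply - v_forward) / i_led
    print(f"series resistor ~ {r:.0f} ohms")        # ~300, so a 330 ohm part works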
I haven't had a reason to use an inductor myself, but obviously plenty of components have them.
It definitely has digital logic; registers are a clear example, especially if you are going to use an ESP8266, which has only a few GPIOs. There have also been cases where I needed to buy AND/OR/NOT gates, but they come as ICs with the caps inside for you.
I feel like making your own logic gates is reinventing the wheel (as the link we are commenting on suggests). Even in school, doing EE, we only spent a blip on assembly. I spent way more time doing higher-level C/C++ stuff.
Analog electronics [0] uses a continuously variable signal while digital electronics interprets the signal with thresholds that define states like 0 and 1.
Here is a simple example: using a few discrete parts, like two transistors (a Darlington pair), an LED and a resistor, you can create a simple circuit that shows varying brightness of the LED depending on how close you move your hand or an object to an antenna connected to one of the transistors (forming a sort of proximity sensor). No microcontroller, SBC or even a hint of a digital signal involved at all.
> Here is a simple example: using a few discrete parts, like two transistors (a Darlington pair), an LED and a resistor, you can create a simple circuit that shows varying brightness of the LED depending on how close you move your hand or an object to an antenna connected to one of the transistors (forming a sort of proximity sensor)
Do you have some book/video/etc. recommendations for this "type" of Electrical/Electronics circuit engineering? I only know how to program a MCU :-(
"Digital electronics" communicate using discrete values, 1s and 0s.
"Analog electronics" communicate using voltage/current/temperature/etc levels.
One of the simplest examples is a voltage divider: if you put two resistors across a DC voltage source, like this:
V+ --[R1]--(out)--[R2]-- GND
The voltage at the (out) point will be:
Vout = V+ * (R2 / (R1 + R2))
There are infinite possible values for that voltage, depending on the voltage source and the two resistors. It cannot necessarily be expressed exactly in a digital circuit, and it will fluctuate over time as the environment changes in temperature, humidity, EM noise, and so on.
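A quick sketch of that formula with made-up example values:

    # Voltage divider: Vout = Vin * R2 / (R1 + R2)
    def divider_out(v_in, r1, r2):
        return v_in * r2 / (r1 + r2)

    print(divider_out(5.0, 10_000, 4_700))  # 5 V across 10k/4.7k -> ~1.6 V at the tap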
I usually recommend The Art of Electronics as a well-written, beginner-friendly textbook which covers the basic concepts.
Classifications are messy, but in addition to the other items mentioned already, I would say that some people would break out "power electronics"[1] as its own field.
Probably the wrong way to look at it. Digital electronics doesn't really exist outside of theoretical spaces. It's all analogue underneath, and any experienced digital designer will know that, and what the consequences are for things like signal integrity, noise immunity and latency.
It's better to describe analog and digital electronics as a subset of electronics. For the most part, they look at different domains. Even though they are based upon the same underlying principles, the simplifying assumptions are different. A more dramatic example is with RF electronics. While it may look like you are dealing with the same sort of things as the more common low frequency analog electronics, you are going to have a difficult time coaxing an analog circuit to work in the RF domain.
Contrast that with web developers. They are dealing with very different principles from web browser developers, who are mostly working with different principles than operating system developers, who are working with entirely different principles from those who design hardware. It's not that they are all working with different subsets of the same thing: each layer of abstraction sits directly on top of the one below it, and (ideally) the layers below completely hide how they work from the layers above.
I mean... Analog electronics doesn't exist either, or for that matter, electronics in general.
All of electronics assumes Kirchhoff's Current Law and Kirchhoff's Voltage Law, which do not truly hold in reality. Electrons often escape a circuit (see antennas, which turn the voltage/current into a wave that is emitted out of your design). All wires are antennas, so even the most basic circuit doesn't have all of its current return in a loop.
The assumptions of KVL and KCL are just over-simplifications of the true physics, Maxwell's equations, because working with Maxwell's equations directly is too much effort in practice.
--------------
Electronics itself is a huge abstraction upon physics. You could, in theory, calculate all the voltages and currents using Maxwell's equations, except this isn't useful at all.
Similarly: most of "Analog Electronics" uses simplifications as well: OpAmps are often assumed to be ideal (aka: infinite gain), which is good enough in most cases.
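A rough sketch of how good that "infinite gain" assumption is for a non-inverting amplifier (the resistor values and open-loop gains below are just illustrative):

    # Ideal op-amp: gain = 1 + Rf/Rg. A finite open-loop gain A gives A / (1 + A*beta),
    # where beta = Rg / (Rg + Rf) is the feedback fraction.
    rf, rg = 99_000, 1_000                 # example network aiming for a gain of 100
    beta = rg / (rg + rf)
    ideal = 1 + rf / rg
    for a_ol in (1e3, 1e5, 1e7):           # assumed open-loop gains
        actual = a_ol / (1 + a_ol * beta)
        print(f"A_ol={a_ol:.0e}: ideal {ideal:.0f}, actual {actual:.2f}")
    # At A_ol = 1e5 the error is already ~0.1%, which is why the ideal model usually suffices.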
Digital systems, using relays, predate analog electronics.
There were relays used for railroad signaling in the 19th century. Union Switch and Signal was formed in 1881. The first active electronic device, the de Forest Audion, was developed in 1906.
Please check out this book on electronics fundamentals (analog and digital) by Ed Lipiansky, an ex-Google, Cisco and Sun engineer and adjunct professor at UC Berkeley:
Electrical, Electronics, and Digital Hardware Essentials for Scientists and Engineers:
Thanks for sharing!
Does this teach one how to design circuits too?
Also, someone suggested 'Basic Electronics: Theory and Practice' by Westcott & Westcott [0] for learning hobby electronics. Could someone familiar with both explain how they compare?
The former is more abstract on individual component theory and dives more deeply into the maths as it goes along. It is best suited to study for an electronics certification (or to supplement a main cert such as electrical engineering) if you want to understand the whole gamut of basic principles because, for example, you will be designing complete analogue and digital circuits from scratch.
The Westcotts' book is more practical and gets stuck into building things more quickly. It's not a course-syllabus type of book and is better for the hobbyist/tinkerer working with tried-and-tested, classic chips (timers, amplifiers) and platforms (Raspberry Pi, Arduino).
I miss electronics retail, even with its issues. Radio Shack had the Forrest Mims engineer's notebooks for learning electronics, and it was great to be able to drive down to a Fry's to pick up a component that you needed in a hurry on a weekend.
Shout out to Anchor Electronics in Santa Clara, San Mateo Electronics Supply, and of course Jameco, which are still alive.
I'm also lamenting the loss of SF Bay area electronics surplus: Weird Stuff, Halted/HSC, and the latest casualty, Excess Solutions. Is there any place left around here to find used/surplus electronics?
Someone rewrote a bunch of textbooks from the 1970s and put them on the internet. Unless you work in IC development you don't need to know how comparators work internally. The logic gate examples are slightly less obsolete (early 1980s). This is yet another joke introduction written by a hobbyist who wants to mess with individual transistors for no purpose. There are no practical applications for these examples and there is no learning value in them.
Learn Basic Electronics... Build a legendary pong game.... that escalated quickly!
Looks like a really cool resource if you want to do the more old fashioned analog/discrete stuff.
Definitely a lot different than what I would probably show someone if they wanted to do hobby level more modern digital work, but it seems perfect for someone who wants to do stuff with transistors and logic gates.
Is what this website teaches useful in the real world? What I mean is - is there a demand for people who know digital electronics or is it mainly for people who are curious how chips work?