Hacker News
Ultimate Electronics (ultimateelectronicsbook.com)
624 points by tosh on Feb 13, 2020 | hide | past | favorite | 85 comments



I don’t want to be a jerk with this comment.. this looks like a great resource, and it obviously represents a lot of work by a lot of talented people. But the premise seems to be the same premise that kept me away from a fun and practical hobby for decades.

I got a Radio Shack 50-in-1 electronics kit for Christmas when I was a little kid. I thought it was the coolest thing ever, but when I tried to go further with my interest, all I got was equations and theory. For years I just wanted to understand how a transistor, with only 3 connections, could be like a relay, which had two pairs of connections and made logical sense to me. Keep in mind this was pre-Internet, so resources were somewhat limited. In any case, I eventually gave up on hardware and focused my attention on programming.

It wasn’t until the advent of the Arduino that the spark was rekindled and I found a new on ramp to electronics. Start with a breadboard and cookie-cutter circuits that you don’t really have to understand at first, and control everything with software, because you know software. Gradually phrases like “current-limiting resistor” and “bias voltage” start making sense. Watch people like Dave Jones and Big Clive take things apart on YouTube and reverse engineer them. The circuits get bigger and more complex. The next project needs an op-amp, and to your surprise, you can understand the theory now. The data sheet parameters make sense. You can pick out which 4 of the 5 answers on the electronics Stack Exchange are wrong, and adapt the right one to your needs.

I’m having a lot of fun making things, and to answer the question in the book’s introduction, “have you considered how electron collisions lead to Ohm’s Law‘s linearity”, the answer is a resounding “no”. And if people hadn’t told me I needed to consider that to play with electronics, I might not have missed out on many more years of fun.


(Ultimate Electronics Book author here.)

Thanks for the feedback. I had a similar experience starting with one of those 50-in-1 kits! And for me it didn't entirely click until I studied more of the theory and then looped back to the practical.

I've decided to mix both the theory and the practical here, and if "conventional academic textbook" is on one end of the spectrum and "Arduino" is on the other, then I'm aiming somewhere in the middle.

For a recent anecdote: this week, a coworker asked me at lunch, "How does the wall receptacle know to send more power to a 1500W space heater than to my MacBook charger?" And that's a great question. And in 15 minutes we ended up talking about fields and forces and resistive materials and the microscopic origins of Ohm's Law and hydraulic analogies... and somewhere in there, it seemed to start to click for him. And that foundation opened the door to his next question, "So why can't we hear the electrons colliding into the wire?" And that opened the door to a new topic: of course we can "hear" them if we just "listen" at the right frequencies, as analog electronic noise.

I'd love to be able to communicate that sort of intuition on a wider scale than the lunch table!
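The "listening to electrons" idea is thermal (Johnson-Nyquist) noise, and the back-of-envelope number is easy to compute. The resistor value, temperature, and bandwidth below are illustrative choices, not anything from the book:

```python
import math

# Johnson-Nyquist thermal noise: v_rms = sqrt(4 * k_B * T * R * B).
# All values illustrative: a 1 kOhm resistor at room temperature,
# "listened to" over roughly the audio bandwidth.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # temperature, kelvin
R = 1e3             # resistance, ohms
B = 20e3            # bandwidth, hertz

v_rms = math.sqrt(4 * k_B * T * R * B)
print(f"{v_rms * 1e6:.2f} uV RMS")  # ~0.58 uV: tiny but real
```

Well under a microvolt for an ordinary resistor, which is why you need a quiet amplifier before you can "hear" it.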


I appreciate the response. I hope my story didn’t give the wrong impression. There are a lot of paths to enlightenment, and who knows what 8-year-old me could have done with an interactive book.

I could only view it through a mobile device earlier, and now that I’ve had a more detailed look, I see how the simulation and experimentation aspect adds another dimension to the theory. I intend to work my way through the whole thing and see how knowing more of the math will influence my future tinkering.


This is a really cool project! I dig the clear explanations. Although I already have a base of this knowledge, I look forward to when the more advanced topics are covered. I'd love to see more advanced, real-world circuits broken down by subsystem and explained. Cheers


This is really good. Please keep it up.


I'd draw a distinction here between electronic engineering and the art and craft of electronic design.

This is clearly an engineering textbook, starting with theory and moving on to analysis, understanding a system through mathematical modeling. If you really want to know everything (or at least most things) quantitatively about a circuit before you build it, these are the tools you need.

The parallel skill set is electronic design, sticking components and circuits together to make things. As you build bigger and more complex things, you learn more rules of thumb about different components and how to combine them into something useful.

Both of these skills together are needed in some measure in order to really be good at electronics, but everyone is more suited to one approach or the other, and learns best when they lead with that approach and fill in with the other one. For myself, I tried to learn electronics with those 50-in-1 kits but just 'didn't get it' because I didn't know what the different components did. With first year EE under my belt it all made sense and from then on I was able to learn the tinkering side.


Maybe this is a problem of semantics, or something that I simply can't quite understand any longer because I am an electrical engineer (or at least I have a diploma that says I am :-) ). But this distinction is very strange, and I'm not sure why/how you're making it. Electronic engineering is nothing like that.

Knowing rules of thumb and how to "combine components" into something useful is not a parallel skill set to electronic engineering, it's a part of electronic engineering. Rules of thumb, models and reference schematics (from manufacturers or whatever) are pretty much how you come up with the first draft of a circuit.

Also, electronic engineering consists precisely of designing circuits. Quantitative analysis is a big part of it because without it you can't always know whether the circuit you've designed works according to specs (and in practice, yes, experimentation and testing plays a big role, but there are only so many prototypes that you can blow up before you run out of time, and only so many things that you can test for in a regular lab). The whole point of one's activity as an engineer is to combine components into something useful.

Sure, you don't sit in a lab drawing schematics and building things all day, because there are a lot of other activities that go into building a good product. There's a lot of validation work and a lot of analysis and a lot of planning. Some engineers focus their time on only one of these things, especially because each of them really is so complex that you can spend a lifetime studying just one and still be left with a lot to learn.

But at the end of the day, designing electronic gizmos is pretty much what you do, regardless of what role you're playing there.

You can certainly make things by sticking components and circuits together without a thorough understanding of how they work. But the idea that engineering is somehow mostly about understanding a system through mathematical modelling before building it, while design is mostly about making things by sticking components together by intuition, is very much absurd.


In my time this was basically the distinction between an engineer and a technician - technicians generally having a lot more practical experience but lacking theory, while engineers fresh out of uni know the theory well but lack the practical skills for solving real-world problems. With time a good engineer picks up those skills too.


Good summary. I'd also add that 80% of the effort goes into optimizing the last 20% of performance (i.e. the last dB), which is what separates the hobbyists from the pros, or the art from the quantitative.


I guess the mathematical introduction is what I was always looking for. Do you have any good beginners reference to learn Electronics Design?


taneq was contrasting "electronic design" with the mathematical approach, so I'm not sure which one you want.

The ur-book for EE is Horowitz and Hill's The Art of Electronics. It can be found online with a little scrounging, or for $100+ on Amazon. H&H is extremely dense, fairly comprehensive, and more like a handbook than an introduction. If you are very math-oriented, maybe it will work for you, but be warned that this is over a thousand pages of formulas, Greek letters, graphs, subscripts, and superscripts. It's... taxing.

As for a higher-level, friendly introduction like taneq was talking about, there are a ton of resources that each cover small but essential parts. Unfortunately that means a ton of repetition and difficulty in bringing everything together. Arduino resources are great, the Adafruit/SparkFun articles/blogs are great, whitepapers from TI and others are great. I don't know of a single atlas that brings these together in one place, which sucks. I may try giving it a shot - I'm certainly math-dumb enough to understand how to translate.

The EEVblog and SparkFun YouTube channels are excellent, particularly for PCB design. IMO PCB design is essential to transition from tinkering to a true hobby. Most sophisticated components only come in PCB-only packages. PCBs are far cheaper than breadboards, and mandatory for any project with more than a dozen parts. PCBs make debugging far easier. They're required for anything operating over a few MHz, and most digital stuff. Unfortunately the software still kind of blows - KiCad is the best, but still a huge pain.

I can't recommend electronics enough as a hobby! It's more intense than brewing beer, but the scene has blown up in the past two decades and is incredibly accessible. Electronics is more affordable than any other engineering discipline: PCBs are simple and incredibly cheap to order in single lots, and 80% of components can be ordered in single units. Compare e.g. metal prices, where bulk is easily 20% of the small-unit price; electronic components bought in bulk are only 50-75% of the single-unit cost. Entire industries are dedicated to making cheap, simple modules that handle incredibly sophisticated tasks like location tracking, video, wireless communication, or battery power. You can do anything you can think of.


Lots of great points, but perhaps a bit unfair to AoE. As madengr says, the math gets way worse than what's in AoE.

AoE isn't really supposed to be a textbook in itself. It was written for physics students, typically at the graduate level, who need to design experimental apparatus without a formal engineering background. It's not an ideal introduction for newbies, and unfortunately it's recommended for that role way too often IMO. But it's a great sophomore resource, so to speak.

What's needed is something between the Forrest Mims "cookbook" level and AoE... something that gives you the theoretical underpinnings needed to know what chapter in AoE to turn to. People who are interested in the RF and communications side have always had the ARRL Handbook as a resource, but that book has limited appeal to those who are more interested in microcontrollers and other electronics topics. This page looks like it might make a useful contribution there.


That's exactly what I mean by math-oriented- it's good for people who understand things in terms of equations. If that's all you need to be comfortable, it's great and you can very quickly find what you want to know. That's what makes it good as a reference handbook.


Ha ha, "If you are very math-oriented maybe it will work for you". AoE is definitely not math-oriented, and not what is typically used for EE courses. If AoE were used for undergrad EE, then 2/3 wouldn't flunk out. Don't get me wrong, it's a good book, but it's definitely not for the math-oriented.


I found All About Circuits to be a great resource when I was a teenager. After skimming this site, I'd say it's too focused on math fundamentals, and not enough on "yeah, but how does it ACTUALLY work?". I feel like All About Circuits does a good job of skimming over details that aren't going to be important until you've built up some more knowledge.

Anyway, it's a great resource, and was a big part of why I decided to do EE instead of CS. It's also got pictures; lots of pictures!

https://www.allaboutcircuits.com/textbook/

and for the specific example you mentioned, their section on the bipolar junction transistor is simple and accessible:

https://www.allaboutcircuits.com/textbook/semiconductors/chp...


From the intro:

> The motto of this book is “one level deeper.”

Really we need both. Lots of people do learn by experimentation, and there are (now) a lot of tutorials for them. But on the other hand there are people who really aren't satisfied until they've pinned down all the details - what an electron really is, etc. - and this is more suitable for them.

The more the merrier. We just need a bit of signposting too.


This mirrors my experience even beyond the hobby stage. I was quite a bit luckier because I had access to the internet and various dev boards as a kid and could work with them as a low level programmer but trying to move into schematic and PCB design was downright impossible. I couldn't wrap my head around any of the online resources and like you stuck to programming.

Fast forward to my 20s when I could afford to take a short sabbatical to apprentice for a practicing electrical engineer (self-taught as well). Within four months I went from basic breadboarding and soldering skills to designing high speed digital PCBs from start to finish and began contracting as an EE immediately. There was a little math but mostly just a bunch of specialized calculators for impedance matching high speed traces, capacitance, and length matching high speed buses.

I went back to software for the pay but when I designed a small control board last year, the whole exercise from inception to sending off to fab and assembly took four days for an 8 layer PCB. Between all the online parts databases (with footprints and schematic components!), reference and open source designs, and software like Altium and TopoR, electrical engineering has never been easier and involved so little actual theory taught in schools.

It hurts my heart that I haven't yet found a resource that just dumps people into the deep end with a proper focus on the engineering instead of the theory.


Can I ask how you found someone to apprentice for, and what you did?

I think it's exactly what I need. I feel really lost about where to go with electronics learning, and I think a mentor could help me a lot.


The big breakthrough for me was the realization that a transistor is similar to a 4 pin relay, but with a shared ground between the main circuit and the control, resulting in 3 pins.

The next step is to realize that solid-state devices usually control current instead of voltage, but that we can add voltages, currents and resistances to the circuit that let us work with voltage in a linear region of a device's response curve. So our normalized 0-1 control corresponds linearly to 0-1 on the main circuit. From there, it's straightforward to build amplifiers where the control voltage or current is thousands or even millions of times smaller than the main.

After that it gets... complicated. It took me 4 years of math and physics in my degree to finally understand solid-state theory and be able to analyze large circuits by subdividing them into simpler linear sub-circuits. Then the really interesting stuff happens when we abandon all of that and convert to the frequency domain using the Fourier, Laplace, or Z transform: discrete signals have periodic frequencies, and periodic signals have discrete frequencies. That lets us analyze the transient and steady-state portions of a signal separately and gain valuable insights about what a circuit will do.
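The "periodic signals have discrete frequencies" half of that duality is easy to poke at numerically; a quick sketch (the sample rate and tone frequency below are arbitrary choices):

```python
import numpy as np

# "Periodic signals have discrete frequencies": a pure tone's energy
# lands in a single frequency bin of the DFT.
fs = 8000                          # samples per second
t = np.arange(fs) / fs             # one second of samples -> 1 Hz bins
x = np.sin(2 * np.pi * 1000 * t)   # a 1 kHz sine wave

spectrum = np.abs(np.fft.rfft(x))
peak_bin = int(np.argmax(spectrum))
print(peak_bin)  # 1000 -- all the energy at one discrete frequency
```

Because the tone fits an exact whole number of cycles into the window, the spectrum is a single spike rather than a smear.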

Of course I've mostly forgotten all of that. One word of advice: keep your college textbooks. I still find concise explanations there that haven't made it onto the internet over 20 years later.

Edit: IMHO, the above way of translating between abstraction and application is the primary advantage of getting a degree from a university. For anyone young reading this, I highly recommend applying to the best schools you can. If you just settle, then you might miss out on the underlying theory behind each discipline. You'll want the theory later when you've forgotten everything like everyone else, because you'll be able to re-derive everything you've learned from first principles when you need it.


I think for me some of the difficulty in understanding electronics comes from the way components are often used in ways that don't make any sense unless you happen to know the 'trick' involved. For example, using a reverse biased diode as a constant voltage reference, using an inductor to create a high voltage pulse etc.

Also, for what it's worth, "Practical Electronics for Inventors" is one of my favourite books on electronics. It's well balanced between theory and practical applications.


I guess there are at least two ways to understand an inductor:

1. If you disconnect it from power, you'll get a huge voltage spike.
2. V = -L (d/dt) I

1 seems like a practical explanation; 2 seems like a theoretical explanation. But an expert can play with both, and (under the assumption of first-order linear ODEs for these primitive components, an assumption you can only make after the fact, so bear with me for the following loose statement) they are equivalent. That is, I can't think of another relationship that achieves 1.

Your idea of a "trick" has mostly to do with 1, I think. And it's an essential part of understanding an inductor, and it's also most likely how humans discovered inductors.
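As a toy illustration of how 2 produces 1, plug in some numbers: the same change in current over a much shorter time gives a much larger voltage. The inductance and timings below are made-up illustrative values:

```python
# Toy illustration of |v| = L * dI/dt (magnitudes only; values made up).
L_henries = 0.01  # a 10 mH inductor

# Ramp the current up gently: 0 -> 1 A over 10 ms.
v_ramp = L_henries * 1.0 / 10e-3    # ~1 V across the inductor

# Now interrupt the circuit: the same 1 A collapses in ~10 us.
v_spike = L_henries * 1.0 / 10e-6   # ~1000 V: the "huge voltage spike"

print(v_ramp, "V while ramping;", v_spike, "V at switch-off")
```

Same component, same equation; only dt changed, which is why flyback diodes exist.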


Sure, but neither of those explanations immediately screams "you can use this to filter high-frequency noise" or "you can use it to cause a phase shift in an AC signal" (like in a mechanical electrical meter or a start winding in a single-phase induction motor).

Maybe 'trick' is the wrong way of looking at it, I guess the real issue is that from seeing a single component in a circuit diagram you can't know which of the effects of a component is being utilised.


I think I better understand your point. There's behavior, and there's intent. Seems related to the distinction between syntax and semantics.


The way I learned how an inductor works: it doesn't like changing current. When you turn the power on, it sluggishly and reluctantly lets the current flow. Then if you try to stop the current flow, it uses the energy it grabbed earlier to crank up the voltage in an attempt to keep the current flowing.


An NPN transistor, when saturated, is like a SPST relay where one side of the coil and one of the contacts are internally connected to ground. You don't however end up with isolation between the input signal and output signal in either case. Of course, the way a transistor actually works is dramatically different than a relay - there are plenty of useful things to do with a transistor when it's not saturated (plus it's a lot easier to blow the transistor up).


I wouldn't invoke a notion of "ground" when describing the operation of a component. That's really a notion that comes out of looking at a circuit (in the sense of a circuit diagram), I would say. It might also confuse a beginner.

Otherwise I think your explanation is great. I can't claim to understand transistors super well, but your explanation touches on a key point for someone who is trying to understand the relationship between relays and transistors. Transistors cannot achieve the impossible: with 4 pins, as in a relay, you can achieve electrical isolation between the switcher and the switchee (and share a "ground" if you want to). A transistor cannot do that.

This might also be a high-level way to see why we need two "kinds" of transistors. Because suppose the electromagnet in the relay has some polarity, and only activates if current flows "north to south". Well, there are two possible choices of what side of the coil you tie to a pin on the switch, and what side of the coil you expose as the base/gate.


True ... ground in this instance could be a confusing word. I guess choosing an NPN transistor led me to choose that word but it's really more about the emitter being negative with respect to both the base and collector (kind of wordy). And the transistor becomes even more interesting because if the emitter becomes more positive than the other two connections, the base-emitter junction becomes a reverse-biased diode and blocks current flow completely.


I went through the same progression as you did. Dave Jones' OpAmp video was a godsend, I wish he'd make more videos like that. I think there's a great deal of value in "slow learning" when you have the time; let complex ideas percolate through your subconscious through repeated exposure over time.

It's been said many times before but I'd like to recommend The Art of Electronics to anyone who's graduated from the YouTube School of Electrical engineering and has some motivation to keep going. It has some mathematical theory (which it insists is optional) but I've found it useful for filling in the holes in my education and has loads of practical advice like the EE mentor I've never had.


Or just buy The Art of Electronics :)


Great book, but IMO too advanced for a beginner. The risk of being discouraged by the level of its topics is high. I would rather suggest that a complete newbie start experimenting with analog electronics by following well-known old magazines and soldering kits, then attempt to modify them, destroy components by applying the wrong voltage/current/polarity (or sometimes just by touching them with ungrounded hands), and learn why and how to avoid that in the future; in other words, learn by failure and success. I would also initially stay away from Arduinos, Raspberries etc. They're great, but they also hide the inner workings of a circuit behind the code (and often the code behind a linkable library). Building a single-transistor wah guitar pedal, for example, will teach a lot more about electronics (and some mechanics too) than any small computer module out there; those modules will be a wonderful addition later, for making great things with the knowledge gained through simpler projects.


So you want to be a plumber? Here take this graduate course in fluid dynamics.


I’m an electronics tech and I feel similarly about music. My hobby is making electronic music, but whenever I wanted to grow my compositions I encountered lots of theory and math, and I realized that without the theory I was limiting myself to just following instructions. So if you want creative freedom in whatever you do, from electronics to baking, you need to understand why you’re doing what you’re doing.


Bret Victor is really big on this. Many people (including myself) learn a lot by experimenting and learning the dynamics of some system. One just needs a system to play with, control and feedback.


I was the other way: I wanted to know how it all worked in great detail.

Take magnets and EMF: the force is explained by showing lines of force, but what are they? From what I understand, the force comes from the exchange of virtual photons between the magnetic poles.

I've also read that EMF is partly a relativistic effect: the densities of moving and stationary charges differ because of length contraction of the moving charges, and since like charges repel, that difference creates a force.

Take all that with a huge grain of salt.


That's right. Here is a pretty good video explaining magnetism as a relativistic effect of electrostatics - https://www.youtube.com/watch?v=1TKSfAkWWN0


The Radio Shack 50-in-1 electronics kit was also my onramp to technology, and I too landed in programming rather than, e.g. electrical engineering. I recognize the loss you describe... and I'm really grateful, all the same, for the door it opened into the future. Glad to see you found an alternate on-ramp.


Avoiding “equations” when you want to understand theory is a mistake IMO in the same category as avoiding “code” when you want to understand software: it’s possible but it’s definitely the harder way to do it.


Not joking or being sarcastic here, Electronic Circuits for Dummies is great for what you describe.


Are you my twin brother from a parallel universe? My experience has mirrored yours.


I was wondering about the authors of this book. It's in the Introduction:

> Michael F. Robbins holds the S.B. in Electrical Science and Engineering and the M.Eng. in Electrical Engineering and Computer Science degrees both from the Massachusetts Institute of Technology. Mike is the co-founder of CircuitLab, Inc. and developer of the CircuitLab circuit simulation software used by universities, hobbyists, and practicing engineers in 196 countries.

It looks like an introductory book to electrical circuits with interactive simulation exercises built into the book. This should give you an intuitive understanding of how the math that was presented actually works at the circuit level, which, I think, is very cool.


Certainly makes it clear that this is an attempt to commoditize CircuitLab's complements.

More circuit knowledge = more need for simulation resources. Also serves as a pretty good SEO resource for CircuitLab.

There are plenty of great analog electronics tutorials out there already that mostly differ in that they don't point to CircuitLab's material. My personal favorite is Analog Devices' electronics course.

https://wiki.analog.com/university/courses/electronics/text/...


(Ultimate Electronics Book author here.)

I'll add that it's also great documentation / example material for how to use CircuitLab. In fact, that's how it started.

In many ways, circuit simulators are a power user tool, and so I started this project by just making a larger library of example circuits. It then became clear that I needed to put the circuits together in context, and the idea for the textbook popped up, since the context many of our customers share (at least the hobbyist and education segments) is learning electronics.

Now, I'm inspired by communicating in this part-theory, part-practical style that seems hard to find in conventional resources.


It's a good idea on many levels. Power to you!


I was obsessed with electronics from about age 3 or so... I was introduced to Ohm's law around age 10 or so but having no Algebra knowledge it made practically no sense. The way current flowed (or didn't flow) was quite mysterious.

It wasn't until I learned current and voltage laws that the way current flowed actually made sense.

Analogies were super helpful too. A capacitor is like a rubber membrane that pressure can build up on. Voltage is pressure. Current is rate of flow. An inductor is a wheel in the pipe with inertia. A transformer is two mechanically linked inductors. A Mosfet is a pressure-activated switch. A BJT is a current-activated switch. A diode is a check valve.

It wasn't until I took Physics in high school/college though, that it all finally "clicked". All of electronics could be described quite well by those finicky electrons that repel each other. How a capacitor actually works. How a charged pointy surface draws the electrons to the point, sometimes enough to escape. How magnetic fields are actually produced.

Digital electronics was actually much easier especially coming from a background in computing. It's all logic, you don't really need algebra or calculus.

Anyways I would like to offer this as a counterpoint and say that learning the fundamentals and physics made the world of electronics way more satisfying than to always be tinkering but not really understanding.

I do think, though, that these topics are really quite difficult to understand and internalize. Even most practicing EEs, I think, struggle to translate the physics and Maxwell's equations into their day-to-day work. It is just quite abstract. I know I still struggle with it.

The truth is most EE work is not so different from software where you can get by most of the time by reading datasheets, basic laws, following working circuit examples, and relying on what's worked in the past. You don't have to derive everything from first principles all the time. There is just a wealth of well understood designs out there to pick from. Additionally a lot of the work is plumbing and being careful, such as basic PCB design.


Recently picked up an interest in electronics, almost finishing up with Ben Eater videos, this looks amazing!


I recently got interested in the IoT world, home made.

Unfortunately I could not go too far because I am missing an "electronics cookbook".

As an example, I wanted to build an internet radio off a Raspberry Pi, hooking up a small amplifier "chip" (a pre-made circuit with an IN from the RPi and an OUT to small loudspeakers).

There was a buzz from the loudspeakers, and I am sure an electronician (if there is such a word) would immediately say "you need a 1 pF capacitor here and a 200 ohm resistor there because this is a basic Schmidt-Landau-Trump bridge, obvious in loudspeakers".

I will never understand why a parallel resistor and a capacitor in series is the way to go, but I would love to have a cookbook for such circuits, with some explanation such as "if it buzzes, increase the resistance to between 100 and 1000 ohm" - for the typical circuits one would make in IoT (say, a plant humidity detector where I have the sensor and the NodeMCU and need to know what electronic parts to use to hook them up).
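For what it's worth, the usual first-order tool behind that kind of cookbook rule is the RC low-pass cutoff, f_c = 1/(2*pi*R*C): components below the cutoff pass, high-frequency hash gets attenuated. A tiny calculator (the component values are illustrative only, not a fix for this particular radio):

```python
import math

# First-order RC low-pass cutoff frequency: f_c = 1 / (2 * pi * R * C).
# Values are illustrative choices, not a prescription for any board.
R = 1e3      # 1 kOhm
C = 100e-9   # 100 nF

f_c = 1 / (2 * math.pi * R * C)
print(f"cutoff ~ {f_c:.0f} Hz")  # ~1592 Hz
```

Swapping in different R and C values shows why the cookbook advice is always a range: the cutoff scales inversely with both.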


There are different ways to approach electronics. You can spend a few years getting formal background in math and electrical engineering, but I don't think that will necessarily answer the hypothetical question of "why does this buzz". Anecdotally, I struggled to even get digital circuits working after several years of electrical engineering university education until someone who ran our labs told us about bypass capacitors.

Which brings me to another way to learn, which is by hands-on tinkering and picking up theory where it is needed. I am pretty sure the guy that ran our labs didn't have a degree in electrical engineering, but he was amazing at building things and troubleshooting them. Building up intuitive knowledge and experience also takes years though.


Absolutely. This reminds me of a similar difference between a ‘computer scientist’ and a ‘real programmer.’ You do not need any knowledge of the so-called computer science to be an excellent software developer, and I have seen more than a few self-professed computer scientists who couldn’t write a simple program.


I found a healthy proportion of designing your own projects, debugging/studying existing electronics, and spending time understanding the theory behind what you are working on to be the best way for me.

I could never go far learning just theory because I seem to forget soon after I think I have understood it.

Studying/tinkering with existing electronics is fantastic -- there is so much knowledge in most products that I feel drunk with excitement seeing how something can be implemented way better and more efficiently than I thought. It is very interesting to see how different designers approach their problems, and it builds my repository of solutions I can implement.

This is no proxy for actually trying to solve the problems yourself, though. I think only after you have really tried to solve a problem can you actually appreciate alternative designs.

Now, rinse, lather, repeat.


"Anecdotally, I struggled to even get digital circuits working after several years of electrical engineering university education until someone who ran our labs told us about bypass capacitors"

You're kidding, right? Electrical engineering from University and never heard of bypass capacitors?


Not kidding. I was a few years in having taken a few analog electrical engineering courses with labs, signals and systems, a course and a lab on power before we got to digital circuits. The first course in digital circuits didn't have a lab. It wasn't until we got to the microprocessors course where we had to work with a Motorola 68HC11 on a breadboard that the concept of bypass capacitors was explained to us in a lab. It was 20+ years ago. Maybe things are different now.


Bypass capacitors are used in analog circuits as well. Not sure what was wrong with that uni. Anyway, I am a former physicist and do not have any formal education in electronics. I learned electronics on my own, and bypass capacitors were definitely not the last thing I came across.


I think your experience reinforces my point that you can get to be proficient in electronics from either end: formal theoretical education leading to hands-on, or hands-on with theoretical learned as needed. Your case seems like the latter, and you learned about bypass capacitors early on. My case was the former, and I learned about them later.


> Not sure what is wrong about that uni.

Same experience in my university. I think the situation is analogous to the CS/SW dichotomy: CS programs are not meant to teach you programming. Similarly, EE programs are not meant to teach you how to build stuff.

In fact, the electronics for physicists course often had more practical material than you'd get from an EE course.


Engineering programs in the US are required by accreditation boards to have lots of lab courses. One of the primary reasons is the industry advisors on these boards want graduates to be able to build stuff. And from what I've seen, it's a lot easier to get an engineering degree by being able to build stuff but struggling with the math, versus learning a bunch of math but not being able to build anything. Assessment courses in your final year of school are required to test you at real-world-level tasks.

Similarly CS is required to have labs which force students to be able to program. If someone gets a CS degree but can't write a program, they either cheated their entire way through every programming assignment, or got that degree at some international university that would never be accredited here.


This is pretty much the same in EU, modulo some controversy about the exact point during the course of higher education when you should be expected to be able to design things on your own.

I have no idea how an EE curriculum doesn't cover bypass capacitors. They're in virtually every real-world circuit. Even if it's not an item that's specifically covered in a course, lab or seminar, there's no way you can put a real-world schematic on the projector and not run into one.

I can't point at a specific course I took where they were covered but I am sure everyone who made it to the third year knew what they were. I definitely remember talking about them extensively in at least three courses (Circuit Theory, Digital Circuits and Digital Instrumentation).


> Engineering programs in the US are required by accreditation boards to have lots of lab courses. One of the primary reasons is the industry advisors on these boards want graduates to be able to build stuff.

Let me summarize almost all my engineering lab courses:

"Design and build X" where X could be some kind of amplifier, etc.

The "design" part is identical to a HW problem. You already know in advance the circuit (one of the standard ones in the textbook), and you just need to figure out R/C/L values to get the desired output. Then you build it and show it to the lab TA who'll check it is actually behaving as desired.

This isn't a good lab assignment: It's just a theory problem masquerading as a lab exercise. After the first semester, any idiot can put the circuit together on the breadboard if they already know the circuit topology.

And yes, while I was there, the accreditation board actually reaccredited the program. And yes, they looked at the lab assignments we were getting.

After graduating, I visited my department a number of times, and I did give them the feedback that "your lab assignments are useless".

> And from what I've seen, it's a lot easier to get an engineering degree by being able to build stuff but struggling with the math, versus learning a bunch of math but not being able to build anything.

Definitely not the case at my university. If you struggled with the math, you'd get really poor grades. And we had a mandatory requirement to get a B in second semester circuits (and pass the final with a score of 8/12 or better). Until you did that, you were not allowed to take junior level EE courses. The labs were trivial, but the exams were tough.

The only time when not being able to build anything was a barrier was for a Senior Design assignment we all had. And lo and behold everyone either got an electronics book (notably not a textbook), or searched the Internet.

> If someone gets a CS degree but can't write a program, they either cheated their entire way through every programming assignment, or got that degree at some international university that would never be accredited here.

Eh. It's not so much that they couldn't write a program, but that they would forget the stuff fairly quickly. In my university there definitely was a fair amount of nontrivial programming required in some CS courses. But on the EE side most of the lab assignments were just trivial.

I would quibble a little with you on the EE/CS comparison (even though I'm the one who introduced it to this thread). IMO, EE is a lot broader a discipline than CS. Stuff that is considered part of EE: electromagnetics, control theory, acoustics (believe it or not), information theory, semiconductors, signals (e.g. Fourier transforms, etc.), power, electronics, and others. There are quite a few professions that fall within the aegis of electrical engineering but have little to no circuit aspect. This is less true of CS. So it is a tad more understandable that someone gets an EE degree but sucks at electronics.


What's funnier is to see circuit simulations with bypass capacitors in parallel with ideal voltage sources.


I agree, but I am still missing (naively) the equivalent of the Python or Node module that will generate UUIDs. Sure, I could code it myself knowing the theory (the algorithm), but I just want it to work because I am not that interested in the how.

I think this is not really possible in electronics as there are maybe too many cases (though some are probably quite common)


The word would be electrical engineer :)

There are at least two O'Reilly "Cookbooks" for electronics.

The best way to do this as a hobbyist, if you don't want to read textbooks, is just to mess around and soak up information from blogs. There is an awful lot of "do this because it works" in electronics. Most of it is backed up by theory somewhere - e.g. why is 100nF so often used as a bypass cap value? - but really you don't care as long as you follow the recommended application circuit in the datasheet.
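To put a rough number on that 100nF rule of thumb: the usual back-of-the-envelope check is just the capacitor's impedance magnitude |Z| = 1/(2*pi*f*C) at the noise frequencies you care about. A quick sketch (the values are only illustrative, and this ignores the parasitic inductance that real capacitors have):

```python
import math

def cap_impedance(c_farads, f_hz):
    """Magnitude of an ideal capacitor's impedance: |Z| = 1/(2*pi*f*C)."""
    return 1.0 / (2 * math.pi * f_hz * c_farads)

# A 100 nF cap presents an ohm or (much) less across typical digital noise frequencies:
for f_mhz in (1, 10, 100):
    z = cap_impedance(100e-9, f_mhz * 1e6)
    print(f"{f_mhz:>3} MHz: |Z| = {z:.3f} ohm")
```

That low impedance across a wide band is why the same 100nF value shows up next to almost every logic IC.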


I thought about "electrical engineer", but it does not have to be an engineer. A driver does not need to be a BMW engineer and know everything about the car.

I think your second example summarizes it quite nicely.

I am a physicist by education and an IT guy by career, so to speak. When my son asks me about basic physics, I usually give him a simplified, somewhat erroneous answer -- good enough for him to go ahead. This is what I would love to have in electronics.

But as you say (and with which I agree), years of tinkering is probably what builds a cookbook in your head.


I think there are three main levels in the approach to hobby electronics: 1: do it as the book says, so it will work; 2: build experience by making modifications, so you learn in a purely empirical way why something does or doesn't work (for example, using an electrolytic cap for RF decoupling); and 3: learn the theory behind parts and how they work, in order to design from scratch or make heavy adaptations. 1 and 2 require time and will; 3 also requires math knowledge, and can be slow to conquer. I have personally been stuck too long at 2, and am now still somewhere around 5% of 3, which all things considered is not bad at all.


It's a myth that electronics design is math-heavy, because you can do it with pretty much just +-/ and some ² and roots. Even for e.g. filter design you don't actually need to be able to do (or even understand) any of the calculations yourself, because tools automate it away. Analog design with "basic blocks" is pretty straightforward, "free form design" not following basic blocks (usually resulting in part count reductions or performance enhancements) is much harder. The latter is what often produces "trick circuits" which most people cannot analyze on their own and possibly cannot analyze using SPICE.
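To make that concrete: designing a first-order RC filter really is just that handful of operations -- f_c = 1/(2*pi*R*C) and one division to solve for the missing part. A small sketch (the component values are arbitrary, just to show the arithmetic):

```python
import math

def rc_lowpass_cutoff(r_ohms, c_farads):
    """Cutoff (-3 dB) frequency of a first-order RC low-pass: f_c = 1/(2*pi*R*C)."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads)

def c_for_cutoff(r_ohms, f_c_hz):
    """Solve the same formula for the capacitor, given a chosen resistor."""
    return 1.0 / (2 * math.pi * r_ohms * f_c_hz)

# Example: pick a 1 kohm resistor and aim for roughly a 1.6 kHz cutoff
c = c_for_cutoff(1e3, 1.6e3)
print(f"C = {c * 1e9:.1f} nF, actual f_c = {rc_lowpass_cutoff(1e3, c):.0f} Hz")
```

Higher-order filters add more of the same algebra (or a lookup table of normalized coefficients), which is exactly the part the design tools automate away.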


The Art of Electronics is pretty good at combining theoretical concepts with practical circuit tips and tricks -- definitely worth having on the shelf (non-negotiable!).


This looks great.

I tinkered with circuits as a kid in the 60's and subscribed to Popular Electronics back then -- it was my favorite magazine when I was 16 years old.

I always viewed electronics as only a hobby, kind of like being a Ham Radio enthusiast and went to college to be a mathematician instead. However, I had a great elective that covered how to build electronic apparatus for scientists. After all the years of Math, it awakened my old interest in electronics and I stayed in college for a second degree in EE.

Popular Electronics was a hobby magazine and it was a lot of fun. On the other hand, a book like this one is more like what one learns when studying for an engineering degree. It teaches at a level where you can really understand the circuits you see and gives you the ability to put together your own, not just wire up projects someone else designed. I wish the author good luck with it and look forward to future chapters. It could be a nice complement to Paul Horowitz's The Art of Electronics.

By the way, I know that many HN readers are computer scientists. That’s what I ended up being, but having a deeper understanding of electronics than most of the other software guys has opened many doors for me.


I love interactive books, but will the links to CircuitLab still be alive 40 years from now? I can pick up a book from 1860 and still read it with no loss in fidelity or information.

I think interactive books are destined for the graveyard. I admire the effort and the medium in every other way -- except for the aforementioned issue.


Browsing randomly through the content, I've noticed that you're discussing impedance, capacitive/inductive loads, and transient states in the DC section (and fairly early in it, too). I think it would be much easier for someone new to this to grasp if it were a section of its own, placed after AC circuits are already well explained. To tackle this properly mathematically and understand it well, you need a solid grasp of some very abstract things like impedance, phase shifts, and complex numbers, and IMHO those are much easier to explain on steady-state AC circuits, and then apply that knowledge to the on/off transitions too.


> Studying electronics will enhance your understanding of calculus

This is absolutely true for me


Studying [X heavy field] will enhance your understanding of X.

Studying electronics did not enhance my understanding of calculus, but maybe I didn't study it enough. On a related note, I only grokked logarithms when I needed them to implement a graphing module.


You can tell they're not messing around because there are 7 MIT math and physics courses listed as corequisites. Looks like a fantastic resource. On the other hand, I have no idea when I'd find the time to do this until I retire. Maybe I'll spend my retirement years fixing up vintage amplifiers and designing circuits.


This looks great, but while one is playing with it (or, alternatively, while one has classes on electronics), it may be good to know that you do not need your browser to simulate electr(on)ic circuits: try ngspice (it should be available on all operating systems where one could expect such software: Unices, Windows, Mac). It is an implementation of the SPICE simulator, but with many modifications, including scripting support (it has its own language built in, or one can use Tcl or C to control ngspice). The main difference while getting started is that one does not lay out schematics while working with ngspice; rather, one just describes the circuits as graphs where the edges are electronic components. It is more abstract in that sense, but I think it is closer to how programmers work, and it provides more power to the (programming) user. (Note that one would probably draw the graphs on paper before describing them for SPICE/ngspice.)
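As a rough illustration of that "circuit as a graph" description (node names and values are my own, not from any particular tutorial), a first-order RC low-pass netlist for ngspice might look like:

```spice
* RC low-pass: lines are edges, i.e. components connecting named nodes (0 is ground)
V1 in 0 SIN(0 1 1k)   ; 1 V, 1 kHz sine source between node "in" and ground
R1 in out 1k          ; 1 kohm resistor from "in" to "out"
C1 out 0 100n         ; 100 nF capacitor from "out" to ground
.tran 10u 5m          ; transient analysis: 10 us step, 5 ms total
.end
```

Loading this into ngspice and then running something like `run` followed by `plot v(out)` should show the filtered output at the "out" node.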

EDIT: see https://wiki.archlinux.org/index.php/List_of_applications/Sc... and, especially, https://wiki.archlinux.org/index.php/List_of_applications/Sc... for other simulators.

Side question: I would like to get to the point of being able to design PCBs which I could then have produced and populated in very small quantities, just as a hobby. What fields of knowledge should one approach before designing PCBs oneself? Any good books? After that, I would need to choose between gEDA and KiCad. While the latter seems to be quite a bit more popular, gEDA's supposed design in the UNIX philosophy (as a suite of tools that the user uses through programming/scripting) appeals to me, so does somebody have any experience with gEDA to share (pros, cons, etc.)?

Another question, more on-topic: could somebody clarify in what cases we can simulate electronic circuits while representing values with phasors/complex numbers? As far as I understand there are some limitations: it is always a correct identification if all components are linear, or if the AC parts of the signals are "small" compared to the DC parts. Am I mixing stuff up? What are some real-world situations in which simulating with phasors breaks down?
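For the linear case, the phasor method boils down to the complex-arithmetic trick below: replace each element by its impedance at a single frequency and solve the resulting linear network. A sketch for a series-R/shunt-C divider (my own example, values illustrative); the method is exact for linear time-invariant circuits in sinusoidal steady state, and becomes a small-signal approximation once nonlinear elements like transistors are linearized around a DC operating point -- which is roughly the limitation you describe:

```python
import cmath
import math

def rc_divider_phasor(r_ohms, c_farads, f_hz):
    """Voltage transfer Vout/Vin of a series-R, shunt-C divider at one frequency,
    computed with complex impedances (phasor analysis; valid because the
    circuit is linear and time-invariant). Returns (magnitude, phase in degrees)."""
    zc = 1.0 / (1j * 2 * math.pi * f_hz * c_farads)  # capacitor impedance
    h = zc / (r_ohms + zc)                           # complex gain Vout/Vin
    return abs(h), math.degrees(cmath.phase(h))

mag, phase = rc_divider_phasor(1e3, 100e-9, 1.6e3)
print(f"|H| = {mag:.3f}, phase = {phase:.1f} deg")
```

At roughly the -3 dB frequency of this divider, the printed magnitude comes out near 0.707 with a phase near -45 degrees, matching the textbook steady-state result. Where it breaks down is exactly where superposition breaks down: large-signal behavior of diodes and transistors, clipping, mixing, and anything else nonlinear.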


Back in the day I used TKGate. Not so UNIXy, but it's TCL/TK.


Somewhat related:

I want to build my own electric car, well, convert an ICE car to an electric one. I have the fabrication skills but I want to learn more about of the _electric_ stuff (the motor, AC vs DC, the controller, differences in batteries, etc). Does anyone have a good resource for that?



Thanks!


Very nice. The interactive examples really make it very appealing for people learning.

I hope it gets completed someday.


https://www.allaboutcircuits.com/textbook/direct-current/ this is the one I used, very comprehensive and useful.


Just read through a couple chapters and this is fantastic. Does anyone have a resource like this (theory-based, but still very hands-on and practical) for other subjects related to engineering, like say building engines?


I have bookmarked it and will certainly refer to it in the future. Thanks!


Cool, waiting for the more fun parts to come up


What a lovely book. Thank you for sharing.


Looks like a neat book. I’d stress early on, once you get to inductance and capacitance, that the energy in a circuit is contained in the fields, which are the consequences of charges (or vice versa). It’s the fields that matter.

When you flip that switch, the drift velocity of charges in (on) the bare copper is only a few cm/s, but the field propagates at the speed of light, between the wire and its return (the mathematically convenient ground). The energy is around the wires, in the fields. The only true ground is at infinity.

If you can get that drilled into your head from early on, it will be beneficial to high speed circuits and RF. Introductory books never do that.


I am an EE with a degree from a big ten university. Now turning 66. What amazes me about electronics/computer science is how much I do not know and how much I am still learning.


(Ultimate Electronics Book author here.)

Thanks for the feedback. I agree so much with you about the importance of fields and keeping track of energy! And so much confusion about ground... "The only true ground is at infinity." is perfect.

When I get to inductance and capacitance, I'll definitely be looking at them from a field and energy perspective.


Maybe I'm not the right audience for this as I came from an EE background, but I really do not see this as a good resource. I feel like the material from allaboutcircuits.com covers most of the material in this site up until chapter 11 or so, and this site only has two chapters written out of 19 planned.



