Ask HN: How to Self-Study Integrated Circuit Design?
178 points by hsikka on May 13, 2019 | 39 comments
Hey HN,

I am a graduate student doing ML research, and lately I've been thinking a lot about designing learning systems from the hardware through the software layers.

I have no experience with what is going on at the processor level, and I was wondering what prerequisite subjects or general curricula I should follow to learn and reason at these lower levels of abstraction.

To be clear, I'm doing this to build intuitions about new computational systems and how different chips, from ASIC to neuromorphic, may be designed.

Any resources or advice telling me I'm a fool is welcome!




If you're looking at this from an architectural perspective, consider grabbing a copy of Hennessy & Patterson's "Computer Architecture: A Quantitative Approach". It covers topics like branch prediction, instruction set design, memory/caching, and so on. A classic.

If you want to get really deep into the physics of IC design, one of my favorites is "CMOS: Circuit Design, Layout, and Simulation" (Baker). It covers SPICE modeling, physical transistor construction, and a variety of digital/analog/memory circuit concepts.

Finally: the link to this article was literally right underneath the link to yours when I opened HN this morning: https://medium.com/@veedrac/to-reinvent-the-processor-671139...


Edit: Apparently this has replaced Weste and Eshraghian: "CMOS VLSI Design: A Circuits and Systems Perspective (4th Edition)" by Weste and Harris https://www.amazon.com/CMOS-VLSI-Design-Circuits-Perspective...

Weste and Eshraghian was the Bible for a very long time (and may still be): https://www.amazon.com/Principles-CMOS-VLSI-Design-Perspecti...

It's a lot newer (and nicer) than most other references.

I would avoid Mead and Conway because it's really dated. It's not wrong, but a beginner won't know which parts to skip.

If you're looking for something about VLSI layout design, "The Art of Analog Layout (2nd Edition)" by Hastings would be the choice: https://www.amazon.com/Art-Analog-Layout-2nd/dp/0131464108/r...


The complete beginner should start with Harris & Harris "Digital Design and Computer Architecture" (David Harris is one of the "CMOS VLSI Design" authors). It provides a gentle introduction to the underlying physics but focuses more on the logic of the whole endeavor, i.e. how to get from transistors to CPUs.


Looks like there's a 2nd edition (MIPS-based?) and an "ARM Edition" (more recent?) of the same title by the same authors. Online reviews are sparse; I wonder which is better now?


A Quantitative Approach is the more advanced text, IIRC. You should probably start with the same authors' introductory text.


Brilliant, thank you!


Unfortunately most of the tools used in the EDA industry are proprietary and stuck in the 90s, so it is both expensive and pretty painful to use them, although after a while a kind of Stockholm syndrome sets in. In other words, you really need Cadence and access to a design kit from a foundry in a relatively recent process (65nm TSMC, 22nm SOI GlobalFoundries, etc.) to implement something serious. Similar things apply to digital design, although with tools like Chisel and Verilator (a SystemVerilog-to-C++ compiler) you can do more with freely available software. Once you want to tape out your design, a lot of very expensive proprietary software is still involved. It is also pretty unrealistic to do everything yourself: setting up and understanding a new analog design kit is a multi-year, single-person project in itself.

So my best advice is to find a group that has already set up most of that support infrastructure and obtained licenses for all the tools and design kits (we have one person dedicated solely to handling licenses, software installation, and the ASIC cluster).

Ideally the group has permanent staff with plenty of hardware experience and a professor with a proven track record of successful chip tapeouts. The number of groups that are capable of this in the field you are interested in can be counted on two hands at best.

I can't talk much about analog design, because I've only witnessed people doing it and haven't done any design with them myself, but you need to be a very particular kind of person to enjoy it, especially if you have no-one to do the layout for you.

Oh, and you should think long and hard about whether you really want to go in this direction; it involves a lot of really hard and tedious work and is not glamorous at all. A tapeout might fail because someone accidentally checked a box in the last step before sending the design to the manufacturer, removing all blackbox pins on the edge of the design. Or someone might have thought it was a good idea to put a latch in the PLL reset path, which causes the clock to never turn on, which you only discover in the back-annotated simulation 5 days before tapeout.


Oh, oh, oh. I can answer this — I’m a software engineer on sabbatical and I’m doing the same.

First off, everyone says Hennessy & Patterson and Patterson & Hennessy, and while I think those books are worth reading at some point, for me they concentrate on things that I don't care about (like what the cutting edge is) and gloss over the finer details of how the processor actually works.

I’ve found that Mano’s Computer System Architecture is great for the higher level stuff. I have the 2nd edition from 1982.

I like Hauck and DeHon’s Reconfigurable Computing from 2007 for FPGAs.

I like Hill and Peterson's Digital Systems: Hardware Organization and Design from 1973 for describing the design programmatically using an RTL; in this case they use an APL derivative rather than the more recent (and horrible) VHDL or Verilog. This is a dense text and I'm still working my way through it, but it's super awesome in my opinion.

I’ve been working through the Visual ARM1 animated transistor-level simulator and have made my way to Bryant’s 1984 paper A Switch-Level Model and Simulator for MOS Digital Systems and recursively reading what I need to understand it thoroughly. I have Mead & Conway on the way.

I also have a copy of Sedra & Smith (2nd Edition) for understanding the actual electrical circuits but I rarely reference it.


Also, these two series of blog posts describe what's going on in the Visual ARM1 simulator, and they are both really well written:

- http://www.righto.com/2015/12/reverse-engineering-arm1-ances...
- http://daveshacks.blogspot.com/2015/12/inside-armv1-register...


Have a look at the Wikipedia pages for Carver Mead (https://en.wikipedia.org/wiki/Carver_Mead) and Lynn Conway (https://en.wikipedia.org/wiki/Lynn_Conway). Their book "Introduction to VLSI Systems" and other links on those two articles should be useful jumping off points for your research.

Most university libraries should have copies of "Introduction to VLSI Systems" or similar on their shelves.


Given that Carver Mead is a father of neuromorphic computing, a more relevant book would be https://www.amazon.com/Analog-VLSI-Neural-Systems-Carver/dp/...


Thanks for sharing. I'd never heard of Lynn Conway; one would have assumed a very zeitgeist character with reference to Chelsea Manning's recent media presence.


I'm not sure what they have in common other than being transwomen? ARM's Sophie Wilson comes to mind, too.


Another notable trans neuromorphic engineer is Jennifer (Paul) Hasler: http://hasler.ece.gatech.edu


Hey, I realise you probably didn't mean to do this, but using a previous name of someone who's transitioned is not a good thing to do.


Why? An overwhelming majority of this academic's papers [1] are published as Paul Hasler.

[1] http://hasler.ece.gatech.edu/Published_papers/index.html


I know a professional accountant who kept using her maiden name in business contexts after marriage because otherwise people had a hard time finding her skillset and work history etc., but for all other purposes she’s Mrs $NEW_NAME.

As regards trans names, I’m told quite a lot of people (not 100%) find their old names an uncomfortable reminder of an uncomfortable past.

Plus it’s a sign of respect to use someone’s chosen name, and a sign of disrespect to use any other name — for non-trans examples, what does it say about someone’s respect for the other party if they call David Tennant “David McDonald” or Boris Johnson “Alexander Boris de Pfeffel Johnson”? (At least, without the brackets that the previous poster used; with is useful for the same reason as my friend using her maiden name, but it’s still often frowned on).


If you have no experience in digital electronics at all, I would actually start with a book like Code by Charles Petzold. It's a pop-science book about electronics and goes from light bulbs and switches to simple logic blocks like adders. It covers the basic building blocks of virtually every CPU in existence: how to turn transistors into calculators.

It's well written and provides some context to otherwise pretty dry textbooks.
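
To give a flavor of where that path ends up, here's a minimal sketch of my own (not from the book) of those adder building blocks written in Verilog, the HDL mentioned elsewhere in this thread:

    // A 1-bit full adder: the "calculator building block" the book builds up to.
    // Purely combinational: the outputs are just Boolean functions of the inputs.
    module full_adder (
        input  wire a,
        input  wire b,
        input  wire cin,   // carry in from the previous bit
        output wire sum,
        output wire cout   // carry out to the next bit
    );
        assign sum  = a ^ b ^ cin;
        assign cout = (a & b) | (cin & (a ^ b));
    endmodule

    // Chaining four of them gives a 4-bit ripple-carry adder.
    module adder4 (
        input  wire [3:0] a,
        input  wire [3:0] b,
        output wire [4:0] y
    );
        wire [3:0] carry;
        full_adder fa0 (a[0], b[0], 1'b0,     y[0], carry[0]);
        full_adder fa1 (a[1], b[1], carry[0], y[1], carry[1]);
        full_adder fa2 (a[2], b[2], carry[1], y[2], carry[2]);
        full_adder fa3 (a[3], b[3], carry[2], y[3], carry[3]);
        assign y[4] = carry[3];
    endmodule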

There is quite a bit of difference between how an FPGA works and is programmed versus a totally custom chip. You could also take a look at reverse engineering walkthroughs, e.g. Ken Shirriff has a few on his blog.

Nand2tetris is another good thing to look at.

Finally, reverse engineering the MOS6502: https://m.youtube.com/watch?v=fWqBmmPQP40


Not sure how experienced you are with digital logic, but I would start there. Start with combinational logic, then sequential logic. Then move on to timing analysis and basic pipelining. Then get an FPGA dev board and design some basic things like a simple processor. You could also use the FPGA as a more concrete way to test what you have learned in the aforementioned topics.
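
As a rough illustration of that combinational-then-sequential progression, here is a small sketch of mine (not tied to any particular board or course): a combinational next-state expression feeding a clocked register, which is the pattern nearly everything else builds on.

    // Sequential logic in a nutshell: combinational logic computes the next
    // state, and a clocked register holds the current state.
    module counter #(
        parameter WIDTH = 8
    ) (
        input  wire             clk,
        input  wire             rst,     // synchronous reset
        input  wire             enable,
        output reg  [WIDTH-1:0] count
    );
        // Combinational: what the next value should be.
        wire [WIDTH-1:0] next_count = enable ? count + 1'b1 : count;

        // Sequential: update only on the rising clock edge.
        always @(posedge clk) begin
            if (rst)
                count <= {WIDTH{1'b0}};
            else
                count <= next_count;
        end
    endmodule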

Once you have that down, I would recommend looking at the papers and literature regarding the topics of interest and common architectures and structures. See what optimizations have been tried. The optimization targets are generally performance, power, and area.

Also, with ASICs the standard cell libraries make things much easier, since they are generally characterized pretty well; that is, if you have the funds to make an ASIC. Even so, you will spend most of your time just verifying and testing your designs.

-EDIT-

If you're thinking custom analog circuits, you're going to have to learn a lot more, starting with circuit fundamentals, but you're also going to need a solid understanding of semiconductor physics. There is also the problem that, outside of simulation, testing ideas/designs is quite expensive, unless what you have is something that could be constructed out of discrete components. So unless you've got a company or university backing you, it's unlikely you would be able to make any physical prototypes, although prices have been getting cheaper for older processes, and there are services like MOSIS.


For a solid foundation in microelectronics, Sedra and Smith's "Microelectronic Circuits" is the classic go-to. Back when I worked at TI, they actually gave every new engineer a copy.

This is a very entry-level book that covers a huge number of topics well and will serve as a launching point to more specific topics. It starts with basic transistors and works up through op-amp design and digital VLSI, touching on filter theory and clocks along the way.


I don't know what your current level is, but for complete beginners I recommend this game: http://nandgame.com/ It will walk you through the basic components of a very simplified CPU.
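
The game's core trick, building every other gate out of NAND alone, looks roughly like this in Verilog (my own sketch, not part of the game):

    // NAND is universal: NOT, AND, and OR built from nothing but NAND.
    module nand_gate (input wire a, input wire b, output wire y);
        assign y = ~(a & b);
    endmodule

    module not_from_nand (input wire a, output wire y);
        nand_gate n0 (a, a, y);              // NAND(a, a) = NOT a
    endmodule

    module and_from_nand (input wire a, input wire b, output wire y);
        wire t;
        nand_gate n0 (a, b, t);              // t = NAND(a, b)
        nand_gate n1 (t, t, y);              // invert it to get AND
    endmodule

    module or_from_nand (input wire a, input wire b, output wire y);
        wire na, nb;
        nand_gate n0 (a, a, na);             // NOT a
        nand_gate n1 (b, b, nb);             // NOT b
        nand_gate n2 (na, nb, y);            // NAND(NOT a, NOT b) = a OR b
    endmodule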

Alternatively, if you are interested in designing custom circuits which are merely used as accelerators, then you could start off with an FPGA and a RISC-V core to which you add your own designs.


Integrated circuit design is a vast field with many different areas of expertise.

Given your goal, your best bet is to buy an FPGA development board, learn an HDL (Verilog or VHDL), and start experimenting.

When you go deeper into the field and move over to ASICs, you will have to learn the details of synthesis, placement, routing, DFT (design for test) and a whole bunch of other details, which usually exceeds the willpower of a hobbyist :-)

And then if you want to really go all the way into the latest and fastest transistor nodes, then you go into custom circuit design and custom gate layout, which is not fun at all.


Learn Verilog and practice implementing logic on an FPGA. A MicroZed board (Xilinx Zynq) is a great choice. You can even implement your own CPU.
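
If it helps to see what a first exercise usually looks like, here is a hedged sketch of the classic LED blinker; the clock frequency and pin mapping are assumptions that would come from your board's constraint file:

    // Classic first FPGA project: divide the board clock down and blink an LED.
    module blinky #(
        parameter CLK_HZ = 100_000_000   // assumption: a 100 MHz board clock
    ) (
        input  wire clk,
        output wire led
    );
        reg [31:0] counter = 32'd0;
        reg        state   = 1'b0;

        // Count one second's worth of clock cycles, then wrap.
        always @(posedge clk)
            counter <= (counter == CLK_HZ - 1) ? 32'd0 : counter + 1;

        // Toggle the LED each time the counter wraps (roughly once per second).
        always @(posedge clk)
            if (counter == CLK_HZ - 1)
                state <= ~state;

        assign led = state;
    endmodule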


Or an Arty! [1] It's got an Arduino shield connector and is super affordable.

[1] https://www.xilinx.com/products/boards-and-kits/arty.html


Which FPGA vendor is most "Linux-friendly"?


Having worked with both Xilinx & Intel/Altera recently, I can agree that Xilinx works quite nicely.

But Altera works equally well and feels much more natural to me!

Their entire toolchain is very Unix-like, as can be seen from an example Makefile like [1], and so far I have had no problems moving projects etc. from Windows to Linux and back.

Whereas for Xilinx, tools like [2] or [3] have to exist just to build your project from the console, and I won't even start on the problems I've had with moving projects around, or with the Xilinx Platform Cable under Linux, which ended up with me doing remote ILA'ing only via a Windows box (at least that works without a problem).

Unfortunately since Intel bought Altera it seems that they are pulling more and more out of the normal FPGA market, which can make it harder to get support.

Regarding the free tooling, Altera also provides a "Quartus Lite" version which gives you basically the same feature set as the Xilinx WebPack, for the Altera world.

The Altera toolchain is unfortunately more tightly bound to SuSE or Red Hat than the Xilinx one.

I haven't worked with Lattice yet, but e.g. [4] looks super promising, and their toolchain is said to be very Linux-friendly as well.

[1] https://github.com/cferr/dsp/blob/master/quartus/Makefile

[2] https://github.com/cambridgehackers/fpgamake

[3] https://github.com/slaclab/ruckus

[4] https://www.crowdsupply.com/1bitsquared/icebreaker-fpga

Edit: formatting


Xilinx's tool suite works quite nicely on Linux, and there are excellent "starter kit" FPGAs from some of their partners. I have nothing but good things to say about the Arty line of boards from Digilent (I own one or more of each of them). As you start to approach the higher end of things with larger designs, the boards from Xilinx begin to become your best option unless you've got connections. Note that for small-ish designs the Artix 200 has a reasonable amount of space, and you can simulate and implement for anything up to the Kintex UltraScale+ 5 (including things that don't actually fit in the 5, for simulation) using the same free-as-in-beer WebPack license.


TinyFPGA BX is pretty good (all open source tooling), but generally everyone has Linux tools. Edit: unless you meant which one would boot Linux best :P The answer is "the expensive ones".


Ha, yes I meant tooling. But what if you want to scale your design to a different platform? Would the open source tools still work?


For simulation and formal verification, yes, absolutely. Icarus Verilog, Verilator, and SymbiYosys have all been used for large commercial designs.
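
To give a flavor of the formal side: the usual SymbiYosys pattern is to embed properties in the design behind an `ifdef FORMAL guard, roughly like this sketch of mine (a real run also needs a small .sby config file, omitted here):

    // A saturating counter with a formal safety property.
    module sat_counter #(parameter MAX = 10) (
        input  wire       clk,
        input  wire       rst,
        input  wire       inc,
        output reg  [3:0] count
    );
        initial count = 0;

        always @(posedge clk) begin
            if (rst)
                count <= 0;
            else if (inc && count < MAX)
                count <= count + 1;
        end

    `ifdef FORMAL
        // The solver tries to find any reachable state violating this.
        always @(posedge clk)
            assert (count <= MAX);
    `endif
    endmodule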

For synthesis, the only FPGA families currently supported by an open source flow are the Lattice iCE40 and ECP5 [0]. The latter is something you can be decently productive with, and it can fit quite a bit of logic (think: Amiga reimplementation, PCIe interfacing, etc.).

If you'd like to port synthesizable code from the open source world to the commercial world, this _should_ just work as long as you're willing to rewrite any physical interfacing code (since that depends on the hardware blocks available in a particular family) and stick to high-quality Verilog. But that's the same as porting across any other FPGA families.

Disclaimer: I work with SymbioticEDA, who develop and provide commercial support for some open source digital logic tooling, like Yosys and Nextpnr.

[0] - https://github.com/YosysHQ/nextpnr


Here is an online course based on the book "Digital Design and Computer Architecture, 2nd Edition" by Harris & Harris, which teaches you processor design using FPGAs.

https://blog.digilentinc.com/teaching-computer-architecture-...

The boards are from Digilent, whose "Analog Discovery 2" and "Digital Discovery" multi-tools are extremely useful for learning Embedded Systems Programming.


Silicon Catalyst (https://siliconcatalyst.com/) operates an incubator that supports IC design and development with access to commercial tools. Book learning only goes so far; then apprenticeship and experience become necessary. To really understand how to make integrated circuits you have to make a few.


Pick up some data sheets for IC parts that might be useful in a system you'd like to design. These are published by the manufacturer, and they contain a lot of design requirements about how to properly lay out a PCB, and some information about the theory of operation of the parts. You can piece together a lot of practical information this way.


I see a lot of digital design resources mentioned, but if the application is ML, then perhaps analog computing is an interesting alternative approach because here the intermediate results need not be exact.


The great Niklaus Wirth wrote a book:

"Digital Circuit Design for Computer Science Students: An Introductory Textbook"

It is different from other standard texts and well worth studying.


What kind of ML research are you doing?


Search YouTube for Onur Mutlu; he shares his lectures.



Start here:

Boolean Logic & Logic Gates: Crash Course Computer Science #3

https://www.youtube.com/watch?v=gI-qXk7XojA

It's a nice, simple, easily watchable and highly visual overview of what you're going to start learning.

From there, it's virtually guaranteed that YouTube will suggest more relevant videos, and you'll be on your way.

Then later, you might want some books on "Digital Logic" (that's the keyword to search for).

To go up from that level of abstraction, I recommend "The Personal Computer from the Inside Out: The Programmer's Guide to Low-Level PC Hardware and Software (3rd Edition)" By Murray Sargent III and Richard L. Shoemaker: https://www.amazon.com/gp/product/0201626462

To learn Assembly Language: Randall Hyde's Art Of Assembly Language http://www.plantation-productions.com/Webster/www.artofasm.c...

To learn Compilers: Compiler Construction, by Niklaus Wirth http://www.ethoberon.ethz.ch/WirthPubl/CBEAll.pdf

Let's Build A Compiler, Jack W. Crenshaw & Marco van de Voort: https://www.stack.nl/~marcov/compiler.pdf

A Small C Compiler, by James E. Hendrix (sorry, this one is behind a pay/registration wall) https://www.scribd.com/document/289762150/Small-C-compiler-v... https://en.wikipedia.org/wiki/Small-C

Tiny C Compiler https://bellard.org/tcc/

Wikipedia articles related to compilers:
Compilers, general: https://en.wikipedia.org/wiki/Compiler
BNF: https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_form
Shunting Yard Algorithm: https://en.wikipedia.org/wiki/Shunting-yard_algorithm
Parse Trees: https://en.wikipedia.org/wiki/Parse_tree

Here's a great way to visually explore how various compilers create assembly language instructions: https://godbolt.org/

Up the abstraction level from all of that are LISP and LISP-like languages, TensorFlow, ML, and related high-level abstractions -- but you probably are more aware of those than the lower levels...

Anyway, good luck! It's a wonderful journey to undertake!



