I've programmed both, though not a PIC for many years now... PICs are just weird.
The AVR is a pretty typical Harvard RISC: 32 registers, memory load/store, direct/indirect jumps, a stack pointer pointing at RAM, etc. It's still 8/16-bit, so some pairs of 8-bit registers work like 16-bit registers.
The PIC has one accumulator register; RAM is a set of (manually selected) register banks. Instructions operate on the accumulator and an operand in the selected bank. The stack is totally separate from other RAM, so it's fixed size (eight levels, maybe?) and can only be used by call/return.
I don't want to be judgmental; there are certainly some strange virtues to PIC, like the lack of any real state to save/restore on ISR entry/exit; but they really just don't fit my mental concept of a CPU.
Exercise for the reader: Given the stack and register bank selection, which one of these has better C compilers?
(Still, if you're learning something new, gamble is right: ARM is the future.)
PIC is clearly designed for non-C assembly code, with all global variables. It's ancient.
PIC32 is actually a MIPS core (sorta, kinda, maybe). It actually uses GCC as its compiler (good luck building your own GCC for it, though). However, there isn't much appeal with the CM3 now on the scene.
PIC24, the 16-bit PIC variant, is much improved in this respect. You have 16 working registers (though one of these is used as the stack pointer), four of which are shadowed for fast save/restore (e.g., so the registers can be used in an interrupt handler), memory is directly accessible (most instructions can operate on both memory and registers equally), and the instruction set is very simple and regular.
I have not used it with C (I have so far written my code in PIC assembly), but they claim the instruction set is very compiler friendly and at a guess, I'd say they're right since the memory access is very regular and most instructions also come in convenient three-operand forms.
I'm not sure if the instruction set for the PIC24 is basically a 16-bit variant of the PIC32's (I have not looked into these in detail yet), but from the code samples I've seen, it seems to be. So PIC24, like PIC32, seems to be a RISC-based core.
I actually like the PIC24. Aside from the painful lack of (assembly) documentation outside the datasheet and various reference manuals (I guess nobody programs them in assembly these days? I suppose the same problem probably exists for other newish microcontrollers too?), it is very nice and easy to work with. The assembly language is very simple and easy to use. But since the Cortex-M3 is similarly priced yet faster (~100 MHz vs. 40 MHz for a PIC24H), is 32-bit, and has a ton of features and peripheral support, it's hard to choose PIC over it. I will probably look into switching to CM3 for my next project.
Seriously, Cortex-M3 microcontrollers are much more powerful than any AVR or PIC at the same price, with roughly the same level of complexity as a high-end PIC/AVR.
ARM already dominates high-end embedded systems, and is quickly penetrating the low end. Atmel and Microchip both see the writing on the wall; control of Atmel's ARM line was a major factor in Microchip's failed hostile takeover bid for Atmel several years ago. If you have any professional interest in embedded systems, it may be a better bet to invest time in ARM rather than an 8-bit architecture.
There are a few arguments left in favor of PIC/AVR: ultra low power consumption, pin count, and price for the low-end chips, DIL package availability, and support for hobbyists in the forums. I'm not sure a difference of $1/unit and a few uW/MHz is relevant for hobbyists, and the Arduino is probably a better solution for people who aren't comfortable working with SMD. (Which honestly isn't that hard, anyway...)
ARM is awesome. Especially the low-end Cortex chips.
However, these chips are not that easy for hobbyists. None of them come in DIP packages, so instead of simply soldering them onto a prototyping board or sticking them in a breadboard, you'll have to find SMD adapters or make your own PCB.
How do the low-end ARM devices compare to 16-bit and 32-bit AVR and PIC offerings?
Recently, I have been doing a lot of development for both the (8-bit) AT90USB162 and the (16-bit) PIC24HJ. The PIC24HJ range of 16-bit PIC microcontrollers packs an awesome punch at 40 MIPS, has mostly 1-cycle instructions, and has two hardware SPI peripherals (which, for me, is important; the AT90USB162 only has one). I have been interested in ARM, though, so I'm wondering if it would be worth switching before I become too reliant on these microcontrollers.
Cortex-M3 runs circles around any 8-bit parts in most settings: single-cycle instructions, hardware multiply and divide, 32-bit registers, good interrupt-processing support. There are even power-efficient parts (Energy Micro EFM32, STM32L, etc.). The other nice part about CM3 is the family support: it's one compiler toolchain and one debugger toolchain. With CMSIS (a software library standard) it's "easy" (if you don't use a specific peripheral in a specific way) to move code across vendors as well.
PIC24 is a good half step, but the proprietary toolchains are a major turn off. PIC32 is at least a MIPS core (with some vague gesturing on Microchip's part for their GCC port - good luck actually building their horrible code dump though), but doesn't appear to be energy-use competitive with CM3 or CM0.
The LPC1768 is quite popular and you should never have problems with stock. I got started with a $30 development board from NGX (google "1768 blueboard") paired with a $30 USB jtag programmer from the same store. The older LPC2148 was also quite popular for hobbyists, but it's ARM7 not CM3.
Since I was more interested in learning the micro than screwing around with toolchains, I just spent the money and got a personal license for Rowley CrossWorks ($150). It took me under an hour to go from software installation to flashing an LED, so I'm pretty happy with that decision.
I've never done any development for the non-8-bit Atmel/Microchip products, so I can't really compare. The NXP Cortex-M3 I'm currently using for a project has one SPI and two SSP peripherals.
My impression from talking to other engineers is that there isn't a lot of investment in those architectures (financial or emotional...) and most people think that economies of scale will make ARM dominant in the long term. From a business perspective, neither company seems to believe they can afford to be without an ARM product line.
They come with a free license for code_red's customized version of Eclipse, with support for firmware up to 128KB. It's apparently also possible to cobble together a fully-OSS toolchain, but I haven't tried yet.
Recipe for an open source, GCC-based ARM toolchain:
- GCC
- Newlib (an implementation of the C standard library aimed at embedded systems)
- Binutils
- GDB (optional; great if you pick up a JTAG debugger and OpenOCD)
I pushed out a cross-compiler via this recipe that survived an entire semester of usage in labs along with an ARM Cortex-M3 board.
No limitations on RAM usage, full optimizations available. As far as I've seen, GCC does a nice job using the ARM thumb2 instruction set when you enable optimizations.
I felt compelled to put together my own linker script, which was one of the more painful parts. I'm not sure if the default linker script would work or not.
I've been meaning to put this toolchain recipe online somewhere, along with a tool I put together for flashing the GCC-produced executables to the Cortex-M3 via USB DFU.
- CMSIS for your part should already come with binutils-compatible linker scripts. The three I use (ST, NXP, Energy Micro) all come with them already. Obviously you'll need to modify the script if you're rolling your own bootloader or similar, but it should work otherwise.
- I like defaulting to the CodeSourcery G++ Lite toolchain. It's also ARM/Thumb/Thumb2 multilib-capable for non-CM3 parts (like ARM7/ARM9).
- USB DFU is not universal. NXP has a cool idea (mass storage device, though buggy when used with Linux without mtools). ST's version kinda sucks (try cribbing from the Maple code, maybe). No idea on Atmel.
I mostly just wanted to roll my own cross compiler, plus it was easier (for me) to distribute to the network machines that way.
USB DFU is a device class specified by the USB standard. I like the idea of following a standard more than getting locked in to a company because I have only implemented their bootloader protocol.
I've been using the STM32 series Cortex-M3s, and unfortunately ST puts its own little layer ("DfuSe") on top of DFU.
It would be nice if these companies would just go ahead and conform to some bootloader protocol (USB DFU!) so developers can stop wasting time there trying to talk to boards.
I agree on the bootloader front. I wish they would just settle on a single implementation. NXP's version is however useful, since it requires no tools to operate (delete file from "disk", copy new file to "disk").
Something based on the Cortex-M series (http://www.sparkfun.com/products/10664) probably is a better ARM starter board for someone who's getting started on the embedded stuff. Cortex-A series ARMs are completely different beasts in terms of complexity.
I second that.
The BeagleBoard is amazing. It can do real time (the original one can handle roughly 25 fps) computer vision, thanks to its built in DSP. The newer one should be even better (but I've not heard as much about it).
Leaflabs makes Maple boards that are somewhat similar but less expensive, under $50.
They are based on ARM Cortex M3. Worth checking out. ( http://leaflabs.com/devices/maple/ )
Some of them come in DIL packaging (at least the little ones). The architecture has a GCC port and two free-as-in-small-glasses-of-beer commercial IDEs. The assembly language is bearable. The cheapest development board (with single-stepping debugger support) is $4.30.
Have they sorted out the Mac support? I have one and haven't been able to get the toolchain to communicate with my Mac Pro, and there's no way I'm booting into Linux just to push firmware.
"PIC vs. AVR" means two different things depending on your context. The intelligent context is discussing which family best suits somebody new to microcontrollers (which thankfully is what this link is about). The moronic context is entrenched hobbyists arguing about why the family they chose is superior.
Anybody who reads this headline and already understands what the acronyms mean is probably going to assume it's the latter.
The time stamp from the webpage is "May 17, 2011 20:07"!
But if it is outdated, a PIC-AVR-ARM update would be interesting. Personally I've used all three, but nowadays I prefer PIC because of the wide-ranging PIC24/dsPIC line and Microchip's great stack offerings.
At the time there were many schematics available to program a PIC right off the serial or parallel port. I used one of those until I built a WISP628 which is an upgraded version of the idea.
This article manages to mention just about every insignificant detail (who cares about the crappy IDEs they ship with?) and almost none of the actual properties of the chip!
You would expect at the very least memory comparisons, clock speeds, and power consumptions. Maybe a mention of timers, ADCs, UARTs, etc.
I think it concentrated more on the aspects of how pleasant each is to use. The analysis was light on the technical side because ladyada targets more hobbyists (like myself :D ) and prototype creation rather than people that want to create a product that will be manufactured, reproduced and distributed on a large scale.
I haven't used PIC personally, but I have used AVR before and found it quite pleasant and flexible, though in all fairness I can't compare it to PIC. I will say that there seem to be more resources and documentation for PIC than AVR (I bought a really good book on embedded devices that was based on PIC, and though the examples were universally helpful, they were targeting the PIC crowd specifically).
You'd think so.... I have no idea why the embedded world is so into what IDE they can use. Too many hardware engineers turned firmware engineers wanting an easy solution?
I know that a lot of people hate Arduinos, but I like the flexibility of prototyping on an Arduino, moving the same code to a raw ATmega328, and then (for teeny projects) maybe port to an ATtiny for cost and space savings.
One thing that Lady Ada forgets to mention is that Microchip is working on MPLAB X, a cross-platform version of their MPLAB IDE. I've used it on both Windows and OS X and it works well.
It runs on Windows, OS X, and Linux.
This might be their way to get back into AVR territory. At least when it comes to hobbyists/makers.