AJIT, a ‘Made in India’ Microprocessor (researchmatters.in)
287 points by mkbkn on April 22, 2019 | hide | past | favorite | 108 comments



Most people miss the real value of such projects. They serve two important purposes. First, they are intended to build a local pool of expertise and capability around semiconductor design and fabrication within the country. Second, although the chip itself is laughably underpowered compared to a modern ARM or Intel processor, it has one characteristic that makes it very interesting to the Indian government: they know exactly what is in the chip. No Intel Management Engine crap. No hardware backdoors. This could be a serious point in favour of the processor, if it is used in SCADA applications in sensitive industrial manufacturing processes.


> Most people miss the real value of such projects. They serve two important purposes: First: they are intended to build a local pool of expertise and capability around semiconductor design and fabrication within the country.

This has real strategic implications for India. So much modern military hardware depends on microprocessors. So what happens to India, if most of the chip fabs fall into the hands of hostile foreign powers? (Or, what happens if foreign powers who have or are near the chip fabs become hostile in the future?)


> In the first stage, AJIT has been manufactured in the government-owned Semiconductor Laboratory (SCL), Chandigarh, with a technology that offers the smallest building block of the size 180 nanometers. The researchers also plan to commercially manufacture the processor using more advanced techniques that provide the smallest building block of size 65 nm or 45 nm.

-- TFA

Older nodes, being well-understood mature technologies, can be manufactured domestically.


> Older nodes, being well-understood mature technologies, can be manufactured domestically.

On the modern battlefield, one side having considerably more processing power than the other side might swing the strategic balance. This means more sophisticated missiles, ECM, battlefield coordination, targeting, etc... all on the side having access to the superior chip fab technology in wartime.

Now think about the chip fabs in Taiwan.


> On the modern battlefield, one side having considerably more processing power than the other side might swing the strategic balance. This means more sophisticated missiles, ECM, battlefield coordination, targeting, etc... all on the side having access to the superior chip fab technology in wartime.

At least for space applications, it is common to use very old processes because of certification and, more importantly, processors that use larger processes are easier to radiation-harden. I don't know whether this also holds for military applications, but it seems at least plausible to me.


Do the sophisticated missiles etc. have powerful chips in them? I would assume the primary objective would be things like reliability, redundancy, etc. Beyond that, probably enough horsepower to deal with some control systems (so numerical differentiation, etc), but these I imagine would typically be microcontrollers and not full processors, right?

I imagine the design process requires a lot of high power computing, but when one gets down to implementing field devices, they typically aren't doing anything too high powered.


It depends, it really does. One of the local aviation companies I got to tour had two processes: one for SMD (mostly automated) and one for THT (mostly soldered by hand). On the newer SMD processes, the most popular computing chips were Xilinx FPGAs, which were everywhere. The way it was explained to me was that the FPGA can handle internal redundancy [1] without any additional space cost, and that one of the main reasons they used FPGAs was that it makes it a lot easier to handle backwards compatibility with older stuff. Another reason FPGAs are so useful is that they can cram a lot more calculations into less space, making the entire unit more compact. [2]

With that out of the way: yes, missiles now have a lot more computing power in them, even if they do more or less the same job. Accuracy is generally the thing that improves here: as the physics models are expanded to be more correct, and as the number of sensors and the way they are combined evolves, the end result is a better product. (And more Defence money, to boot!)

I've gotta say one more thing, though: a lot of the stuff in the military, and that goes to it, is built off now-ancient processes, because A) contracts, B) if it works, don't fix it, and C) standardization. So you may not see an FPGA in every missile, but they're getting there.

[1] this doesn't count as far as redundant parts are concerned, but it does give the engineers (and, by proxy, the customers) more trust in the outputs from the devices.

[2] It also costs way less space (and therefore money, to some degree, because mil-spec enclosures are expensive) to get two 1U servers than it does to have one vacuum-tube-operated mainframe.
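The "internal redundancy" alluded to in footnote [1] is commonly done as triple modular redundancy (TMR): the same logic is instantiated three times in the FPGA fabric and a majority voter masks a fault in any single copy. A minimal sketch of the voting logic (my own illustration, not from the comment above):

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote over three redundant module outputs.

    Each output bit takes whatever value at least two of the three
    copies agree on, so a fault in any one copy is masked.
    """
    return (a & b) | (b & c) | (a & c)

good = 0b1010_1100
faulty = good ^ 0b0100_0000   # one copy corrupted by a bit flip
assert tmr_vote(good, good, faulty) == good
```

In hardware this is a single level of gates per bit, which is why it costs essentially no extra area on an FPGA that has spare fabric anyway.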


> I don't know whether this also holds for military applications, but it seems at least plausible to me.

AFAIK, it is generally true. Military versions of the Intel x86 chips were generally one branding cycle behind: when hardened 486 chips were being used in F-16s, 1st-gen Pentiums were the norm for civilian use.


I don’t get it. You have a home-grown chip; how does that translate into spare parts for a MiG? Or ammunition for the MiG?


It’s a step towards a fully self-sufficient defense industry. Something the US had better learn as well (especially in infrastructure), in case we end up at war with China and suddenly nobody in the US can buy any electronics.


I think it's more so an important stepping stone for centralising their future military hardware, and having it built in India - not necessarily that it will be compatible with their current hardware.


How is the backdoor situation when you license ARM and implement it yourself on an FPGA? I'm assuming there are big sections that are still black boxes you just paste in, but I would think that would still help contain the potential exploitation vectors a lot. Like, if you were willing to design the Ethernet peripheral/stack yourself, you'd immediately eliminate a lot of unknowns.


That's a good question. I've worked on a couple of products with those and never thought about it.

It seems to me that the interesting angle to this would be to avoid manufacture and instead spend your time on radical CPU designs on FPGA to be made from scratch later. No doubt that doesn't meet the needs of the folks in India.


Long ago I remember briefly reading some research articles about FPGA vulnerabilities. I don't know if it's practical, but people designing critical systems are justifiably paranoid.


The ultimate security posture is to assume that you're defending against an unlimited-resource nation-state that's specifically targeting you.

Obviously, there is no such thing as "complete security", but you can try to secure what you can; that's why people concern themselves with what may seem like unlikely avenues of attack.

I mean, depending on how critical your system is, it can really change how you operate. Get a virus on your computer? Some people use anti-virus, and when it says it "removed the virus" they're happy. Other people would only accept a complete OS reinstall from a fresh ISO made on a different machine. Still others would require new hardware: it's possible (and has been seen in the wild) to reflash the ROM of the motherboard, or potentially many other low-level chips, through software, and persist the virus that way. Is it likely? No. It just depends.


FPGAs consume a heck of a lot of power. Totally not Indian.


The first point seems to be the major focus of the project - it was developed entirely within IIT Bombay, which is one of the top 10 schools of India. As to the second point, the people in power within the Indian government, even the self-appointed "technocrats", are ill-informed on cybersecurity and generally only take on endeavors that appeal to a largely uneducated electorate. The Indian government seems to be focused on technology that helps meet basic bureaucratic needs, like Aadhaar (https://en.wikipedia.org/wiki/Aadhaar).


They could have used the RV32 or even RV32E architecture (RV32 with only 16 registers, which gets rid of a lot of area in the smallest chips) and made it far more useful. The RV architecture can scale down quite a bit; I don't buy the claim that it just won't fit in 180nm. That's the node the first Pentium IIIs were designed for!
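For a rough sense of the RV32E area argument (back-of-envelope numbers of my own, not from the thread): in a tiny in-order core the architectural register file is a large share of the flip-flop count, and halving it from 32 to 16 registers roughly halves that storage.

```python
XLEN = 32                      # RV32: 32-bit-wide registers
# x0 is hardwired to zero in both variants, so it needs no storage.
rv32i_bits = (32 - 1) * XLEN   # RV32I: x0..x31 -> 992 bits of state
rv32e_bits = (16 - 1) * XLEN   # RV32E: x0..x15 -> 480 bits of state
print(rv32i_bits, rv32e_bits)
```

On an old, large node where every flip-flop is expensive, that difference is exactly the kind of saving RV32E was designed for.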


> No hardware backdoors. This could be a serious point in favour of the processor, if it is used in SCADA applications in sensitive industrial manufacturing processes.

This is an important point.

In the future, polities who can't or won't fabricate their own chips will lose their independence to ones that do.


[flagged]


Please don't do this here.


It's based on SPARC V8, which means it's only 32-bit. The blokes who built it say that it can be used in servers. Why would someone want to use a 100 MHz 32-bit chip in servers?!

They say that the price would be around Rs 100. At slightly less than that you can get the Allwinner A13, which is a 1 GHz Cortex-A8 ARM with a built-in Mali GPU. Why anyone would want to use this in a commercial setting is beyond me. But it's a brilliant achievement on India's part, to make their first indigenously developed processor.


The article makes it quite clear that the project is aimed at jump-starting India's industry. To achieve their goal, their first step was a relatively low-complexity project conducted by an inexperienced team.

Although their PR effort is outlandish, I'm pretty sure they are aware of the limits of this sort of project.


The driver seems to be this:

"We are planning to use AJIT in the receivers being developed for NavIC or IRNSS (the Indian Regional Navigation Satellite System), an indigenous navigation system for the Indian subcontinent," said one of the processor's designers in a Reddit AMA (ask me anything).

Article: https://www.indiatimes.com/technology/news/meet-ajit-1st-ent...

Reddit AMA: https://www.reddit.com/r/india/comments/bfrh0g/indias_first_...


Do you have another resource that mentions it being SPARC V8 based? I didn't see that in this article. Also, doesn't Oracle own the SPARC ISA? Would they have to license this? Any insight into why they wouldn't have gone with RISC-V instead of SPARC?


AJIT uses SPARC-V8 [1]. There's another project, the Shakti, which uses RISC-V [2].

[1] https://www.indiatimes.com/technology/news/meet-ajit-1st-ent...

[2] https://shakti.org.in/


Yes, I was also quite surprised not to see SHAKTI mentioned here, since it seems to be a far more advanced project. [Edit: After reading a bit more I see the aim of this project is to build local fab experience, whereas SHAKTI seems to be about developing an Indian-designed server chip which is manufactured abroad. Both are independent and worthwhile goals.]

Some of their team contacted me about porting Fedora/RISC-V to SHAKTI. (Unfortunately because SHAKTI doesn't support the Compressed extension, which both Fedora and Debian require, straight Fedora/RISC-V binaries won't run on SHAKTI and so the whole distro will need to be rebuilt).
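Incidentally, whether a binary needs the Compressed extension is recorded in the file itself: the RISC-V ELF psABI sets the EF_RISCV_RVC bit (0x1) in the ELF header's e_flags when compressed instructions are in use. A small sketch of checking it (my own illustration, not part of the port work):

```python
import struct

EF_RISCV_RVC = 0x0001  # e_flags bit set when the C (compressed) extension is used

def uses_compressed(elf: bytes) -> bool:
    """Check the RVC flag in a little-endian RISC-V ELF header."""
    if elf[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    is64 = elf[4] == 2               # EI_CLASS: 1 = ELF32, 2 = ELF64
    off = 0x30 if is64 else 0x24     # e_flags offset in the ELF header
    (e_flags,) = struct.unpack_from("<I", elf, off)
    return bool(e_flags & EF_RISCV_RVC)
```

binutils' `readelf -h` decodes the same bit, printing "RVC" in the Flags line, which is a quick way to see why a stock Fedora/RISC-V binary won't run on a core without the C extension.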


Hi rwmj,

I think there's been some blur in the goal here. SHAKTI's aim is not just to build an Indian-designed server chip manufactured abroad. For instance, if you look at their tapeouts page (http://shakti.org.in/tapeouts.html), you can see that they have taped out a chip in India as well, at 180nm.

Also, the Compressed extension has since been added to SHAKTI, and a beta version is now available publicly. The reason for that contact was to try to port Fedora to the already taped-out chip, which does not support compressed instructions.


I didn't know that Intel made their fab lines available but apparently they do.


More on the India built version of SHAKTI: https://link.medium.com/L25lUFUo6V

Runs at 70 MHz


The key difference between the AJIT and Shakti seems to be the fabrication partner. AJIT is fabricated locally. Shakti was built at an Intel Fab using its 22nm process. They serve different purposes.


What makes the design tied to the fabrication partner though? Why couldn't this team have used the Shakti design?


Because it's 64 bit and significantly complex, probably far too large for the 180nm process.


The SHAKTI they taped out with Intel 22nm FinFET is AFAIK conservatively an order of magnitude less complex than the Pentium III, which was successfully manufactured on 250nm and 180nm.


SPARC V8 would mean 32-bit. Which is somewhat curious, as servers are mentioned as a potential market in some of their material.


Does anyone know the license for SPARC-V8?

I would imagine it would be important to use an open ISA for such a strategic project.


It's mostly open. There are some annual dues you have to pay, which range from very low for academia to five figures per year for commercial manufacturers. https://sparc.org/faq/


This is insane. With this in mind, why would they not use RISC-V?

My best guess is that they started this long enough ago that RISC-V wasn't in usable condition yet.

Hopefully, they'll redo the frontend.


SPARC already has an ecosystem thanks to Sun/Oracle. Gaisler did an open implementation, the Leon3. Sun/Oracle did the T1 and T2. Registration to say it was SPARC compatible was only $99 when I looked at it years ago. My only question was why the heck isn't someone cranking them out for the security or FOSS markets?

Didn't get a chance to find out. Now we have RISC-V chips being developed, with lots of companies getting behind it. Still easier to get ahold of SPARC chips, though. I'll also add that quite a few projects in academia used SPARC, OpenPiton being one that might justify using SPARC over RISC-V, especially for a NUMA machine. :)

http://parallel.princeton.edu/piton/#


RISC-V is getting quite the ecosystem, though.

IIRC, an effort made by Western Digital has made it possible to cross-compile stuff for RISC-V.

Debian works just fine on RISC-V silicon. Right now the big thing seems to be getting real-time, hardware-supported JavaScript processing working so that the modern web works on it well.


> real-time, hardware-supported JavaScript processing

You mean JIT, right?

The benefits of being interpreted fall flat when, in practice, everyone needs to run a highly tuned just-in-time compiler.


Yes, that. Words were failing me, as was Google.


The cross compiler looks like it was mostly contributions from SiFive, not WD. WD uses it, and references it.


At scale, it's much cheaper than, say, ARM, where you pay a percentage of revenue based royalty.


Yes, and it likely made sense if RISC-V wasn't ready back when they started this project.


There are actually a few open-source SPARC implementations, including the first two Niagara revisions.

https://en.wikipedia.org/wiki/SPARC#Open_source_implementati...


Thanks. I was about to write about no need for yet another instruction set.


Why? There's still a lot of opportunity for advancement in this area.


AMA by the designer of this processor in the India subreddit:

https://old.reddit.com/r/india/comments/bfrh0g/indias_first_...


I interacted with the founder of Powai Labs back when I was a master's student there, working as a sysadmin :-).

Good on them!


With the type of comments here, I think they'd also expect everyone to start learning to program by writing a kernel instead of a hello world. Yes, it is underpowered and relies on old tech, but if they just want to test the waters for a commercial chip industry, it makes much more sense to start with something small that can check the infrastructure, instead of going for a 7nm chip on their first try.


There's also a huge market for small, low-cost, low-complexity chips. Even if they produce something that's underpowered compared to the competition, it still may be perfectly suitable for a kitchen appliance or similar.


It brings home what a high bar modern chip fabrication tech is.

How many countries in the world have the resources and ability to manufacture a relatively modern CPU without outside assistance?


Slightly off topic: my favorite clothing brand, Robert Graham, is made in India. Allegedly, that was the only place that could make what the founder wanted; he'd been searching for nearly a decade till he found a manufacturer in India that showed/proved what he wanted could be done.

Sure, some of the clothing from RG can be a bit gaudy; it's the more subdued stuff I'm into. Most importantly, the build quality is literally off the charts. I have shirts from RG that I've had for 4 years that still look brand new - even the collar is still crisp.


http://www.robertgraham.us/men/tees.html

$298 T-Shirts, ha. At that price I'm surprised they don't try to upsell a weave that affords limited ballistic protection.


$128 T-Shirts, $298 for one with a fancy beaded graphic.

For materials fancier than cotton you can easily spend 10x that, https://us.loropiana.com/en/p/Man/Shirts/Girocollo-FAF6689?c...


They finally made a chip. It's on 180nm: a node that chipmakers were on 18 years ago. To put that into perspective, even eASIC retired their offering on the 90nm node (2004-2005). University students regularly do deep sub-micron. SiFive targets modern nodes with micro-style cores running faster than this. I'm probably just going to keep recommending modern alternatives.


"AJIT is currently able to run one instruction per clock cycle and at speeds between 70-120MHz, but they expect to achieve 400-500 MHz clock speeds in the next upgrade. It's built on a 180nm technology, though that will eventually be bumped up to 65nm."

https://www.indiatimes.com/technology/news/meet-ajit-1st-ent...


Intel went up to 1.8 GHz on 180nm, just saying.


At what point did it become apparent that adding more pipeline stages to boost the GHz number wasn't going to help? Was it at 180 or 90nm?

Seriously asking, because just "xyz did N GHz" makes no sense in this era.


I thought the bigger problem with the claim was that it didn't mention that Intel did that with:

1. Ultra-expensive, full-custom design.

2. A massive budget from a massive amount of sales to fund No. 1.

Most ASICs being made use standard cell wherever possible because the companies, even very profitable ones, either couldn't afford full-custom or didn't think the cost was worth it. The few that do it are kind of like an elite class of chip makers with piles of money. So it seems improper to compare what a smaller effort on low-tech nodes can do with an elite chip maker on what was then a cutting-edge node. Maybe smaller players can do it today with lessons learned, current tech, lower cost of labor, and so on. Still, be skeptical.

Note: I am keeping up with old techniques and processes at 350nm and up for subversion-resistant chipmaking. We stopped being able to visually look for backdoors at around 250nm or so. That means chips that will be verifiable by large numbers of people will have to be above that. I think I'd get 100-400MHz like those in this article, though, given my counter-arguments above.


Everything you say is accurate, but grad students have to start somewhere. If they can make a 65nm version that runs at, say, 500 MHz, that would be better. Remember that "real work" was done on desktops with a SPARC V8 chip that ran at 75 MHz or less.


I remember doing real work on CPUs such as a Z-80 at 4MHz. How our standards and expectations have changed...


Yeah, didn't Windows 95 run on 100 MHz x86 processors? And Linux used to be blazing fast on them too (compared to today, where Linux seems to be slower than Windows on the same hardware - yeah, I am looking at you, Ubuntu).


There's also an LTE chip built by Zoho-backed engineers: https://timesofindia.indiatimes.com/business/india-business/...


Amazing they've managed to get this done despite all the government corruption and bureaucracy. But let's see if it actually gets manufactured and widely deployed.

India's government is the country's own worst enemy...


Any evidence of corruption or bureaucracy? Or is it just an opinion? In the last few years India has specifically worked on corruption and bureaucracy; in the ease-of-doing-business ranking it has gone from 100+ to 53 in the last 10 years. https://en.wikipedia.org/wiki/Ease_of_doing_business_index

The Indian government can definitely improve (as can every government in the world). But since 1991 all Indian governments have been taking slow steps in the right direction, with increased transparency and less corruption and bureaucracy (opening up the economy, digitizing government, simplifying taxes, the Right to Information Act).


I am an Indian expat and can rant endlessly on government corruption in India. Most Indians living in the US are well aware of the graft in their mother country, but many Indians in India choose not to acknowledge it. A truly remarkable act of corruption occurred recently: https://en.wikipedia.org/wiki/2016_Indian_banknote_demonetis...


You posted the evidence in your own comment. India is far from the top rank (just above Mexico) and last year they were ranked near the bottom.


Indian government is still extremely corrupt. Carefully managed PR campaigns are making it seem like things have improved considerably, but the fact remains that things are still pretty bad on the ground.


They specifically designed it for use within India's satellite navigation system, which probably also has very limited scope and usage requirements - that would explain it being under-powered.


If it's compliant with the SPARC V8 ABI, it would immediately have a lot of software and compiler support.


(•_•) Oh no, how much longer before we can expect an.. ( •_•)>⌐■-■ Ajit Pai?


[flagged]


> Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something.

https://news.ycombinator.com/newsguidelines.html


India disagrees.

FTA:

> A processor made in India offers more than just the cost benefits. It provides the country with autonomy and self-reliance in the electronics sector and reduces our dependence on technology imported from other parts of the world. It also ensures a secure system with no opportunity for any backdoor entry, thus preventing digital sabotage by other countries or malicious organisations.


Did nobody watch Gandhi? Home spun is important!


Home spun was important. Now business has changed and companies are in a race to get up the learning curve quickly and capture new markets. It might be better if they partnered for foreign direct investment (FDI) in exchange for tech assistance. They don't have much to offer otherwise. There are many who would like to get a piece of the action in India, and the companies that try to go it alone will miss out on the globalization wave.


Elsewhere in this thread people mentioned that these CPUs are based on designs from the 90s. Maybe they're useful for controlling your nuclear reactors or something, but they're not going to be useful for general computing needs.


Embedded CPUs vastly outnumber the ones used for general purpose computing, so that’s not necessarily a bad thing.


This is silly; general computing has been done forever on lesser processors.


Hence why you start with the stuff for controlling your nuclear reactors (and, perhaps more importantly, fighter jets and guided missiles) to develop the local engineering and industrial base. And then gradually evolve that into general computing.


They have to start somewhere. Despite what you think, it's almost impossible to create a 7nm fab from nothing. The fact they've made a 180nm chip which works is quite miraculous given how hard this technology is to master.


Yeah, the Russians went this way too... has anybody heard anything beyond one or two press releases?


Why don't you mention China? I've lost track of the number of cynical comments regarding their initial projects, and now China relies on the output of those research efforts to supply their HPC centers.

Hell, this blend of cynicism was also directed at ARM's offering.


What I'm trying to say is that it takes a hell of a lot more than a team of 20-30 to create something successful these days; it takes a lot more than a piece of silicon that executes code. The only advantage they have is the support of nationalism. ARM was massively successful in the mobile space before they considered desktop and server, and before people started knowing them.


India has a population of 1.339 billion. They'll have no problem finding/training more engineers to build up that team.


Baikal MIPS is available on dev boards you can buy: https://www.cnx-software.com/2018/09/26/baikal-t1-last-mips-...


They cost $1.44 USD, or INR 100.


It costs a lot more if you consider the cost of developing an actual product and not just a demo, and it costs even more when you have to iron out bugs in the silicon. When you buy a well-established chip you pay not only for the silicon but for the maturity of the platform and ecosystem.


That “a lot more” is relative to the country’s currency value and its average high-tech-sector salaries... in this case, it would cost “a lot more” to do the same in the US vs India.


[flagged]


They need to add some features from the Transmeta processor, using the technique of rewriting code on the fly. That way, this could also be "A JIT" processor.


Sad to see this as the most popular HN comment. HN~Reddit ?


I forgot that humor wasn't allowed on HN.


came here to see if somebody said this, thanks


Boooo. Your jokes are bad and you should feel bad! :-D


Moderators, I'm attempting to flag this post, but I do not see the flag on this post's page.


(I'm not a mod)

There's a small karma threshold before the [flag] link appears. I think it's about 30.


[flagged]


Why?


Pajit is a racist/derogatory way of addressing or stereotyping people of the Indian subcontinent, similar to how people call Muslims "Abdul", etc.


It also misses the mark because they were probably already going for a play on words with Ajit. Only, you know, without the hatred.


The SBC could be the AjitPai.


I thought it was a reference to the FCC's current anti-net-neutrality chairman, whose nick using the common "last initial + first name" scheme, would indeed be 'pajit': https://en.wikipedia.org/wiki/Ajit_Pai


Why would using a regular human name for your chip be racist? If somebody calls a compiler Mike, is it racist?

I mean, if a French guy created a board named Michel, I would find it pretty cool.

Am I missing an English-specific use of Pajit?


If a black person names their son Tyrone, there's nothing wrong with that. It's a normal human name. If someone walks up to a black guy they've never met before and says, "Sup, Tyrone?", that's racist. In the context of this - let's say this was a group of black students from Morehouse, Tuskegee, or Florida A&M - it can be racist to say, "They should have named it Tyrone."

Racism is all about context.


+1 for you as you took the time to explain instead of down-voting blindly.


It's really hard to tell when someone is asking a good faith question and when someone is trolling. I try to put my faith in humanity. Granted it also helps that your posting history indicates that you're not a typical troll.


I'm Indian, and have lived in various parts of India for years at a time. I have never come across anyone whose name was Pajit. I've only ever seen "Pajit" because of racist trolls online.


I've only seen it written as Pajeet, but yes, also only in that context.


Never seen it online myself so I wouldn't know.



