Not too long ago I was reading a blog post on SemiWiki that started by listing reasons why a company should invest in its own chip design.
I'm struggling to find the article on the site again. It contained one or two arguments I hadn't thought of myself.
Does anyone remember it and can point me to that article?
Not too much, I'm afraid. If I remember correctly, the list comprised about five items, three of which I had also found on my own:
fit-for-purpose (e.g. just the right memory size, just the right power supply)
easy adaptation of software components, e.g. OS (no surprises in the microcode, no undocumented commands, registers or workarounds)
cost reduction (comes with both of the above)
"Doing our own chip" is one of those nasty topics that comes around every year, but nobody ever really looks into it.
The hand-waving argument is always "too expensive" or "massive R&D headcount required".
That might have been true ten or twenty years ago.
With the rise of IP vendors and fabless design companies for hire, many of those constraints are gone.
BUT, thanks to the old-fashioned opacity around pricing, it is really tough to get accurate numbers.
One thing that non-chip-designers may not know: doing a chip requires so much research and due diligence that you will end up far better informed. You become an integral part of a very powerful ecosystem that leverages trillions of dollars of R&D. Buying chips off the shelf, not so much.
Fit-for-purpose -- just the right memory size, just the right power supply
Easy adaptation of software components -- no surprises in the microcode, no undocumented commands, registers, or workarounds
Cost reduction -- comes with both of the above
Competitive differentiation -- building an economic moat; difficult to replicate/copy
Specialization -- moving key functions into hardware for performance and lower power consumption
An earlier view of the technology roadmaps, and perhaps a say in them
Software bring-up
Thank you guys!
Actually, the differentiation topic was the one we highlighted ourselves; I forgot to list it.
The business I am currently working in is being hit hard by technological disruption (discrete component --> embedded component --> integrated function on an SoC, you know the game).
The embedded stuff is more or less done.
The integrated stuff is not, and IP plays an interesting role here.
But dealing with IP in order to market your own chips for a dying business model is just nonsense. Nevertheless, engaging with IP and the fabless ecosystem in a forward-looking way is definitely worth pursuing.
It was important to get this distinction across.
That happened yesterday, so thanks for the quick help!
True, co-processors for long-integer arithmetic are indeed important for speed.
I am always impressed by clever DES designs that encrypt within one CPU cycle.
On the other hand, these things are nothing new. You would not want to base an IP business on crypto co-processors.
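For readers outside the crypto niche, a brief illustration of the "long integer arithmetic" in question (my own sketch, not from the thread): the workhorse operation in RSA-style public-key crypto is modular exponentiation on numbers hundreds of bits wide, which a co-processor implements with wide hardware multipliers. In software it is the classic square-and-multiply loop:

```python
# Illustrative sketch: square-and-multiply modular exponentiation,
# the long-integer operation that dedicated crypto co-processors
# accelerate in hardware (equivalent to Python's pow(b, e, m)).

def mod_exp(base: int, exponent: int, modulus: int) -> int:
    """Compute (base ** exponent) % modulus one exponent bit at a time."""
    result = 1
    base %= modulus
    while exponent > 0:
        if exponent & 1:                     # fold in this bit's factor
            result = (result * base) % modulus
        base = (base * base) % modulus       # square for the next bit
        exponent >>= 1
    return result

# A 2048-bit RSA operation runs this loop ~2000 times over multi-word
# integers; that is the multiply-heavy inner kernel moved into silicon.
print(mod_exp(7, 560, 561))
```

The point of the example is only to show where the cycles go, not how any particular co-processor is built.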
I always wonder why a company such as Facebook or Microsoft would want to share its know-how in a particular area and rely on an IDM like Intel to develop a specialized chip for it. Eventually Intel will use what it learned to build something to sell to the whole world, which means all of Facebook's or Microsoft's competitors will get the benefit too. It doesn't make sense.
Given rising design complexity and costs at cutting-edge nodes, I thought we were heading for a limited range of commodity digital chips or IP, or perhaps FPGAs, with differentiation being achieved in software or in the FPGA.
The big system companies getting into ASIC design has surprised me. Early Google hardware was basically built from off-the-shelf PCs.