At TSMC’s OIP Symposium recently, Xilinx announced that they would not be building products at the 10nm node. I say “announced” since it was the first time I had heard it, but maybe I just missed it before. Xilinx would go straight from the 16FF+ arrays that they have announced but not yet started shipping to the… Read More
A Brief History of FPGA Prototyping
Verifying chip designs has always suffered from a two-pronged problem. The first problem is that actually building silicon is too expensive and too slow to use as a verification tool (when it happens, it is not a good thing and is called a “re-spin”). The second problem is that simulation is, and has always been, too slow.
When Xilinx… Read More
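To put that speed gap in perspective, here is a rough back-of-envelope comparison of RTL simulation against an FPGA prototype; the cycle counts and clock rates are illustrative assumptions of mine (they vary enormously by design and tool), not figures from the post.

```c
#include <stdio.h>

/* Rough, illustrative comparison of RTL simulation vs. FPGA prototyping.
 * All numbers are assumptions made for the sake of the arithmetic:
 *   - booting an OS on the design under test: ~10 billion cycles
 *   - full-SoC RTL simulation throughput:     ~100 cycles/second
 *   - FPGA prototype clock:                   ~10 MHz
 */
int main(void) {
    double cycles_to_boot = 10e9;   /* assumed cycles to boot an OS        */
    double sim_rate       = 100.0;  /* assumed cycles/s in RTL simulation  */
    double fpga_rate      = 10e6;   /* assumed FPGA prototype clock (Hz)   */

    double sim_days  = cycles_to_boot / sim_rate  / 86400.0;
    double fpga_mins = cycles_to_boot / fpga_rate / 60.0;

    printf("RTL simulation: ~%.0f days\n", sim_days);     /* ~1157 days */
    printf("FPGA prototype: ~%.1f minutes\n", fpga_mins); /* ~16.7 min  */
    return 0;
}
```

Even if these numbers are off by an order of magnitude in either direction, that gap is what makes FPGA-based prototyping attractive for software bring-up and system-level verification.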
Secret Sauce of SmartDV and its CEO’s Vision
SmartDV started as a small setup in Bangalore in 2008 and has since become one of the most respected VIP (Verification IP) companies in the world. With a portfolio of 83 VIPs and growing, it has a large customer base that includes the top semiconductor companies. The company has grown significantly and is raring… Read More
NIWeek: Xilinx Inside
Being from Britain, I always read NI as Northern Ireland. After all, the official name of my country is the United Kingdom of Great Britain and Northern Ireland, which gives us the same problem as the United States of America: the full name is a mouthful. So we abbreviate the country to UK and call ourselves British or even Brits.… Read More
More FPGA-based prototype myths quashed
Speaking of having the right tools, FPGA-based prototyping has become as much about the synthesis software as about the FPGA hardware, if not more. This is a follow-up to my post earlier this month on FPGA-based prototyping, but with a different perspective from another vendor. Instead of thinking about what else can be done… Read More
Xilinx Datacenter on a Chip
I talked recently about the Intel acquisition of Altera, which seems to be all about using FPGA technology to build custom accelerators for the datacenter. Some algorithms, especially in search, vision, video and so on, map much better onto a hardware fabric than onto code running on a regular microprocessor.
So if the heart… Read More
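As a simplified illustration of the kind of computation that maps well onto a hardware fabric, consider a small fixed-coefficient FIR filter. The C sketch below is my own example, not code from the post, and the hardware-mapping remarks in the comments assume an HLS-style flow.

```c
#include <stdio.h>

#define TAPS 4

/* A 4-tap FIR filter over a sample stream.
 * On a CPU the inner loop runs as a sequence of instructions, one
 * multiply-accumulate at a time. On an FPGA, an HLS or RTL implementation
 * can unroll the loop into TAPS parallel multipliers and adders and
 * pipeline the whole datapath, producing one output sample per clock. */
void fir(const int *in, int *out, int n, const int coeff[TAPS]) {
    for (int i = 0; i + TAPS <= n; i++) {
        int acc = 0;
        for (int t = 0; t < TAPS; t++)   /* becomes parallel MACs in hardware */
            acc += in[i + t] * coeff[t];
        out[i] = acc;
    }
}

int main(void) {
    int in[8]       = {1, 2, 3, 4, 5, 6, 7, 8};
    int out[5]      = {0};
    int coeff[TAPS] = {1, 0, -1, 2};
    fir(in, out, 8, coeff);
    for (int i = 0; i < 5; i++)
        printf("%d ", out[i]);
    printf("\n");
    return 0;
}
```

Replicated many times across the fabric, that kind of fully pipelined datapath is why search, vision and video kernels can reach far higher throughput per watt than the same loop running as instructions on a microprocessor.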
Why Did Intel Pay $15B For Altera?
While I was at the imec Technology Forum, someone asked me “Why did Intel pay $15B for Altera?” (the actual reported number is $16.7B).
The received wisdom is that Intel decided that it needs FPGA technology to remain competitive in the datacenter. There is a belief among some people that without FPGA acceleration available for vision… Read More
Xilinx in an ARM-fueled post-Altera world
When the news broke about the on-again, off-again, and on-again Intel-Altera merger a few weeks ago, I checked off another box on my Six Degrees of Kevin Bacon scorecard. That plus a $5 bill gets me a Happy Meal at McDonald’s, but in a post-Altera world, it might be worth more.
On January 16, 2008, I’m sitting in a meeting with some Intel strategic marketing… Read More
Why is Intel going inside Altera for Servers?
You should be happy to hear that Intel will buy FPGA challenger Altera if you expect ever more power to be consumed in the datacenter! In 2013 the power consumption linked to server and storage IC activity, plus the electricity consumed by the systems cooling these high-performance chips, reached 91 BILLION kWh (or the… Read More
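To get a feel for how large 91 billion kWh per year is, here is a quick conversion to average continuous power; the annual figure is the one cited in the post, while the comparison to roughly 1 GW power plants is my own illustrative yardstick.

```c
#include <stdio.h>

/* Convert 91 billion kWh per year (the figure cited in the post)
 * into an average continuous power draw. */
int main(void) {
    double kwh_per_year   = 91e9;
    double hours_per_year = 365.0 * 24.0;               /* 8760 h */
    double avg_gw         = kwh_per_year / hours_per_year / 1e6;

    /* ~10.4 GW: roughly the continuous output of ten large (~1 GW)
     * power plants, just for servers, storage and their cooling. */
    printf("Average power: ~%.1f GW\n", avg_gw);
    return 0;
}
```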
Will Dark Silicon Dictate Server Blade Architecture?
Does the evil-sounding phenomenon known as Dark Silicon create a big opportunity for FPGA vendors, as was recently predicted by Pacific Crest Securities? John Vinh posits that using multiple cores as a method of scaling throughput is flattening out, and that using FPGAs to perform computation can help off-load work and thus overcome… Read More
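The dark silicon argument itself is easy to sketch numerically: if each process node roughly doubles transistor density while per-transistor switching energy improves more slowly, a fixed power budget forces a shrinking fraction of the die to be active. The scaling factors below (2x density, 1.4x energy improvement per node) are conventional illustrative assumptions of mine, not figures from the post.

```c
#include <stdio.h>

/* Illustrative dark-silicon arithmetic (assumed scaling factors):
 *   - transistor density:            2.0x per node
 *   - energy per transistor switch:  1.4x better per node
 * At a fixed power budget the active fraction of the die scales by
 * 1.4 / 2.0 = 0.7 per node, i.e. ~30% more "dark" silicon each generation. */
int main(void) {
    double active = 1.0;   /* fraction of the die usable at the starting node */
    for (int node = 1; node <= 4; node++) {
        active *= 1.4 / 2.0;
        printf("After %d node(s): ~%.0f%% of the die can be active\n",
               node, active * 100.0);
    }
    return 0;
}
```

That growing pool of idle area is what makes spending die space on specialized accelerators, FPGA fabric included, look attractive.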