Microsoft has formed a new research group around AI. I know some will see this as yet another satanic plot from the evil empire. Personally I am neutral on MS (and happy that Nadella replaced Ballmer), and I feel the more companies investing in a domain, the better for that domain. But what do I know.
Quote:
"Larus describes the many redesigns as an extended nightmare—not because they had to build a new hardware, but because they had to reprogram the FPGAs every time. “That is just horrible, much worse than programming software,” he says. “Much more difficult to write. Much more difficult to get correct.” It’s finicky work, like trying to change tiny logic gates on the chip."
This situation can be greatly improved by using C# for design and debug, compiling the source code into a syntax tree, and emitting HDL.
Existing FPGA tools are focused on physical design rather than logic design and debug, and HDL is a poor fit for logic design entry.
I am an old-time logic designer and, to show feasibility, wrote a basic parser that emits microcode for an FPGA design rather than HDL.
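To make the source-to-HDL idea concrete, here is a toy sketch of the flow described above: parse an expression into a syntax tree, then walk the tree to emit a Verilog assign statement. The posts discuss C#; Python is used here only to keep the illustration short, and all names (`to_verilog`, `emit`) are hypothetical, not anyone's actual tool.

```python
# Toy source -> syntax tree -> HDL flow. Python's ast module stands in
# for the C# syntax tree the poster describes; the names are made up.
import ast

VERILOG_OPS = {ast.BitAnd: "&", ast.BitOr: "|", ast.BitXor: "^"}

def emit(node):
    """Recursively translate a syntax-tree node into a Verilog expression."""
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.Invert):
        return "~" + emit(node.operand)
    if isinstance(node, ast.BinOp):
        op = VERILOG_OPS[type(node.op)]
        return "(" + emit(node.left) + " " + op + " " + emit(node.right) + ")"
    raise ValueError("unsupported construct: %r" % node)

def to_verilog(signal, expr):
    tree = ast.parse(expr, mode="eval").body          # source -> syntax tree
    return "assign %s = %s;" % (signal, emit(tree))   # tree -> HDL

print(to_verilog("y", "(a & b) | ~c"))
# prints: assign y = ((a & b) | ~c);
```

The point of the exercise is that the designer debugs in a language with real tooling, and HDL only appears at the very end as generated output.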
Microsoft also has support for logic programming, so it could do logic and data-flow design with existing capabilities. Data-flow design is key in FPGA work, just as it was when they decided to process data while bypassing the CPU. Another win for heterogeneous processing.
Does MS have a CAD-type tool for designing data flow diagrams? I have to use Altera Quartus for block-diagram (.bdf) editing, but C# works for modelling, parsing, logic simulation, etc.
Probably intended for my blog on MS, FPGAs and datacenters, but it still got to the right blogger. I am guessing here, but I'm not entirely sure their problem was only with logic creation. Getting very high performance (essential in this context) probably also requires some tweaking (or more) at the gate level, as it does when generating very high-performance ARM cores in ASIC design.
That tweaking/redesign applies to long paths that may not exist after design iterations and logic fixes are done.
Computers were once brought up using slow clocks that were gradually increased as tweaking was done.
I am trying to point out why the MS exec was so frustrated with the finicky, unfriendly tools -- I am too. That is why I simulate the logic first and then create Verilog for physical design.
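The "simulate first, generate Verilog after" approach can be sketched in a few lines: exhaustively check a software model of a small block against its intent, and only emit HDL once the check passes. This is a minimal hypothetical illustration (again in Python for brevity, not C#), not the poster's actual tool.

```python
# Simulate the logic in software before any Verilog exists.
from itertools import product

def mux(sel, a, b):
    # Behavioral model of a 2:1 mux, debugged entirely in software.
    return (a & ~sel | b & sel) & 1

# Exhaustive truth-table check against the design intent.
for sel, a, b in product((0, 1), repeat=3):
    assert mux(sel, a, b) == (b if sel else a)

# Only after simulation passes is HDL emitted for physical design.
verilog = "assign y = (a & ~sel) | (b & sel);"
print(verilog)
```

For a block this small the exhaustive check is trivial; the payoff is that logic bugs are found in seconds in a debugger rather than in hours of FPGA recompilation.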