Yesterday I met with David Hsu who is the marketing guy for Synopsys’s cloud computing solution that they announced at their user-group meeting earlier this year. It was fun to catch up; David used to work for me back in VLSI days although he was an engineer writing place and route software back then.
David admits that this is an experiment. Nobody really knows how EDA in the cloud is going to work (or even if it will work) but for sure nobody is going to find out sitting meditating in their office. So Synopsys took what they considered a bounded problem:
- High-utilization, otherwise it will be too difficult to get even initial interest
- Not incredibly highly-priced, so that the cloud solution doesn’t immediately undercut the existing business
- Consumes lots of cycles when used, so the scalability of the cloud is a genuine attraction
VCS simulation seemed to meet all those aims. Verification is, depending on who you ask and how you count, 60-70% of design. So certainly high utilization. VCS is not a $250K product. Of course nobody, not even Synopsys, really knows what customers are paying for it since they do large bundled deals. But not incredibly highly priced for sure. And it consumes lots of cycles when used. Lots. Especially coming up to tapeout when it may be necessary to run the entire RTL verification suite in as short a time as possible. Under those circumstances, a thousand machines for a day is a big difference from 10 machines for 3 months.
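To put some back-of-the-envelope numbers on that (mine, not Synopsys's, and assuming the regression suite splits cleanly across machines), the total compute is roughly the same in both cases; what changes is the wall-clock time:

```python
# Rough comparison of the two ways to burn ~20K machine-hours of regression.
# Assumes the suite is embarrassingly parallel, which real regressions only
# approximate. Illustrative numbers, not Synopsys's.

scenarios = {
    "10 machines for 3 months": (10, 90 * 24),   # (machines, wall-clock hours)
    "1000 machines for a day": (1000, 24),
}

for name, (machines, hours) in scenarios.items():
    machine_hours = machines * hours
    print(f"{name}: {machine_hours:,} machine-hours, "
          f"{hours:,} hours wall-clock ({hours / 24:.0f} days)")

# 10 machines for 3 months: 21,600 machine-hours, 2,160 hours wall-clock (90 days)
# 1000 machines for a day: 24,000 machine-hours, 24 hours wall-clock (1 days)
```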
The cloud solution is sold by the hour, all inclusive. The precise price depends on how much you buy and all the usual negotiations, but it is of the order of $2.50 to $5/hour. The back end is Amazon’s cloud, and the sweet spot is batch-mode regression testing, where the thing everyone is most interested in optimizing is wall-clock time.
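What that pricing implies for the thousand-machine burst is easy to sketch (again my arithmetic, assuming the quoted all-inclusive rate is charged per machine-hour, which is how David described it):

```python
# Cost of the "thousand machines for a day" burst at the quoted hourly rates.
# Assumes the all-inclusive rate is charged per machine-hour.

machines, hours = 1000, 24
for rate in (2.50, 5.00):
    print(f"${rate:.2f}/machine-hour -> ${machines * hours * rate:,.0f} for the burst")

# $2.50/machine-hour -> $60,000 for the burst
# $5.00/machine-hour -> $120,000 for the burst
```

Real money, but presumably a different order of magnitude from owning a thousand licenses and the hardware to run them, all of which would sit idle between peaks.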
The biggest challenge has been security. If a customer wants to buy a VCS simulator, an engineering manager can cut a PO. If a customer wants to ship the crown jewels of their company out of the building, then legal, senior management and even, in some companies, the Chief Security Officer need to get involved. Some of this resistance is pure emotion, not driven by numbers. David told me of one meeting where a lawyer asked an engineering manager “how can you risk the company’s survival like this?” I suggested that an appropriate response from the engineering manager would have been “how can you risk the company’s survival by delaying all our tapeouts?” What is rational to an engineer is emotional to a lawyer, and vice versa.
But the underlying driver for the business is strong. Large companies are doubling the size of their server farms every two years (Moore’s Law for server farms, I guess). Increasingly, companies want to handle a base load, maybe 75-80% of peak, in-house and offload the remaining 20-25% to something like the Synopsys cloud solution.
I asked David if the business is growing faster or slower than expected. He admitted it was slower but also pointed out, reasonably, that nobody has a clue how fast it should grow–it is a new market. But it does seem to be starting to catch on. The biggest attraction is that knob to turn that an engineering manager has never had before: you can reduce the time to run a regression by spending more money. But not much more money. And you don’t end up with licenses and hardware that will sit unused until the next peak load comes along.
To me one of the biggest challenges for EDA in the cloud is that nobody has a single-vendor flow. But if you have to keep moving data out of one cloud into another, especially with the size of design databases today, that is probably not tractable. David admitted this was a problem: they have customers who want to use Denali (now Cadence) with the Synopsys cloud solution, just as they do in their own farms. But in the cloud that requires more than just handing out two purchase orders; it requires Synopsys and Cadence to co-operate. How’s that working these days? Conceivably a 3rd-party edacloud.com (I just made up the name) could integrate tools from multiple suppliers and deliver the true value of making everything work together, but it is probably not a realistic financial model to buy all the tools required up-front as a “normal” multi-year license. It might be easier for a 3rd party to license Denali from Cadence and VCS from Synopsys and put them up on Amazon, but whether that would work is not clear. Ultimately it is customers who have the power to drive this. But as with the open PDK situation, even that might not be enough.
Moving an entire design database (once it gets to physical) is not simply a matter of copying it across the net. You can put it on a disk drive and FedEx it to Amazon, and they will do this. In fact that is apparently how Netflix got its database of streamable video onto Amazon, although they shipped whole file-servers. But in the iterative loops of any design process this seems unwieldy, to say the least.
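To see why copying across the net is the sticking point, here is a crude transfer-time estimate (the database size and link speeds are my illustrative assumptions, not anyone's real numbers):

```python
# How long it takes to push a physical design database to the cloud over
# the wire. Database size and sustained link speeds are assumptions.

db_terabytes = 2                       # assumed size of the design database
for link_mbps in (100, 1000):          # sustained throughput in megabits/s
    seconds = db_terabytes * 8e12 / (link_mbps * 1e6)
    print(f"{db_terabytes} TB over {link_mbps} Mbps: {seconds / 3600:.0f} hours")

# 2 TB over 100 Mbps: 44 hours
# 2 TB over 1000 Mbps: 4 hours
```

An overnight disk drive beats the slower case comfortably, which is presumably why Amazon offers the service; doing it on every design iteration is another matter.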
Anyway, VCS in the cloud is clearly promising, but it is too soon to tell whether it is a harbinger of things to come or a backroad.
For further details, visit the Synopsys cloud page where you can find the white papers too.