CEO Interview: Deepak Shankar of Mirabilis Design

by Daniel Nenni on 06-11-2021 at 6:00 am

The founder of Mirabilis Design, Mr. Shankar has over two decades of experience in the management and marketing of system-level design tools. Prior to establishing Mirabilis Design, he served as Vice President, Business Development at MemCall, a fabless semiconductor company, and at SpinCircuit, a joint venture of industry leaders Hewlett Packard, Flextronics, and Cadence. He started his career designing network simulators for US federal agencies and managing discrete-event simulators for Cadence. His extensive experience in product design and marketing stems from his association with the EDA industry in multifaceted roles. He holds an MBA from the University of California, Berkeley, an MS in Electronics from Clemson University, and a BS in Electronics and Communication from Coimbatore Institute of Technology, India.

Electronic system design as a topic has been muddled since Gary Smith first defined it in 1996. How true has Mirabilis Design stayed to the original definition?

Gary Smith had a vision. He introduced electronic system-level (ESL) design at the Hayes Mansion in San Jose, and I was lucky to be in the audience, a newbie straight out of college. He explained the trends for the next 10 years: why conceptual studies were an important part of the product lifecycle, how designs were diverging across market segments, why product design was going to get complicated, and how system design was the new panacea. It took almost 20 years for his vision to be realized. Why did he bring up system design, and why did he think it was important? At that time, semiconductor and product teams worked in analog, digital, DSP, networking, and software silos with a huge wall separating them. He said these silos would have to break down for future products to survive. At that time, companies were vertically integrated. He said there would be developers to make sub-systems and integrators to put them together.

System design is the definition of an architecture to meet the requirements. It is all-encompassing, covering everything from an IP block, a processor, an Electronic Control Unit, a radar, and a satellite to a supercomputer. The architecture can include electronic hardware, software, analog, digital, sensors, networks, and the associated interfaces. The concept was slow to take off because there was no single solution that could really bring the teams together.

This is where Mirabilis Design comes in. We focused our solution on system design and full architecture coverage. While other companies went out of business or pivoted to virtual prototypes for software development and SystemC for performance validation and network design, we stayed focused on system design and assisting the architect. Of course, we expanded the scope of system design to include the hardware, software, networks, and power. Our product, VisualSim, can be used for microarchitecture exploration of a Network-on-Chip and processor core, latency studies of an SoC or processor, and the quality of service of an automotive Electronic Control Unit.

We bring together best practices from multiple domains, a focused library of modeling components that covers all the analysis requirements of systems and semiconductors, and a methodology that flows from requirements to product validation and from collaborative engineering to field debugging. Mirabilis Design started off helping researchers study trends and explore what the next generation of interfaces, processors, and standards should look like. Early customers studied PCIe, Network-on-Chip, and teraflop processors. This helped keep us grounded.

You were among the first to advocate power analysis as an integral part of system design. What benefit does the power-performance trade-off bring to a design team?

When we look at a system, there are several requirements. In the past, performance was the only real requirement, though there were always cost, area, and weight. Engineering focus was on latency and throughput. As we moved to distributed systems built from semiconductors and embedded software, we added quality of service and efficiency. Then came reliability and failure requirements with the advent of functional safety standards. Power has now become critical because of the evolution to apps and their massive power drain on everything: watches, processors, and automobiles. The measured cost is activity per watt. Power is a major deterrent to adding more functionality. If you can develop power-management algorithms for different use cases, then you can add new features and still keep the power below the threshold.

The performance-power trade-off is not hard to incorporate into an architecture model, but the model must be conceived from the beginning with this goal in mind. The first benefit is that it brings the power architect and the system architect to the same workspace. The second is that all requirements of the architecture are studied in the same architecture model. This means that you eliminate all surprises before you start integration. The third is that you have a single document to work from.

Designers struggle with power modeling because they are not sure of the states, the transitions between states, the power in each state, and when to switch states. VisualSim has a table that allows the user to enter power information as a complex expression. As the workload is executed in the system model, each device moves from one state to another. The user does not have to add any functionality; it is built into the library components and into the modeling language. Statistics and a power consumption diagram are also provided.
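The table-driven approach described above can be illustrated with a minimal Python sketch. This is not VisualSim's actual API; the state names and power values are hypothetical, and the point is only the mechanism: each device carries a table mapping states to power draw, events move it between states, and energy is integrated over the time spent in each state.

```python
# Illustrative sketch of table-driven power-state modeling (NOT VisualSim's
# API; all states and values are hypothetical). Energy is accumulated as
# power-in-state multiplied by the time spent in that state.

class PowerStateModel:
    def __init__(self, states, initial):
        self.states = states          # {state_name: power_in_watts}
        self.state = initial
        self.t = 0.0                  # current simulation time (s)
        self.energy = 0.0             # accumulated energy (J)

    def advance(self, new_state, at_time):
        """Charge energy for time spent in the current state, then switch."""
        self.energy += self.states[self.state] * (at_time - self.t)
        self.t = at_time
        self.state = new_state

    def average_power(self):
        return self.energy / self.t if self.t else 0.0

# Hypothetical DRAM-like device with active, standby and power-down states.
ddr = PowerStateModel({"active": 0.9, "standby": 0.3, "power_down": 0.05},
                      "standby")
ddr.advance("active", 2.0)        # 2 s in standby
ddr.advance("power_down", 5.0)    # 3 s active
ddr.advance("standby", 10.0)      # 5 s powered down
print(ddr.energy)                 # 0.3*2 + 0.9*3 + 0.05*5 = 3.55 J
print(ddr.average_power())        # 0.355 W
```

In a real system model the `advance` calls would be driven by the workload (requests arriving, queues draining) rather than scripted, which is what lets the same model answer both performance and power questions.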

Design teams have always complained about the lengthy modeling effort, questions about accuracy, and the lack of modeling resources. Mirabilis Design has been around for a while. Why do you think the mood has changed on this?

When we started off, Mirabilis Design had a small set of libraries and a C++ interface. This meant that customers had to build every component of their semiconductor or system using a combination of programming, queues, resources, traffic, and report generators. As we evolved with our customers, we learnt that the best way to raise awareness of system modeling and increase adoption is to build a massive library of fully configurable components and application-specific templates. It is impossible for one single company to build every possible variation of a processor, bus, or memory. So, we developed templates that incorporate all the parameters required to achieve 95% accuracy. When the user enters the values pertaining to an ARM A78, a SiFive U84, a CMN-600, or a DDR4-2400 from Micron, the respective block is generated. The nice part is that this configuration can be completed in hours. This template-based model construction is the game changer and has really made every processor and SoC designer look at the VisualSim modeling methodology.
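The template idea can be sketched in a few lines of Python. This is a hypothetical illustration, not Mirabilis' library: a template defines the full parameter set for a family of parts with sensible defaults, and entering a specific part's values yields a configured block while rejecting parameters the template does not know about.

```python
# Illustrative sketch of template-based component generation. The template,
# parameter names, and the ARM A78-class values below are hypothetical,
# chosen only to show the mechanism.

PROCESSOR_TEMPLATE = {
    "cores": 1, "clock_mhz": 1000, "issue_width": 2,
    "l1_kb": 32, "l2_kb": 512, "pipeline_depth": 8,
}

def make_block(name, template, **overrides):
    """Generate a configured block: defaults from the template, overridden
    by part-specific values; unknown parameters are rejected."""
    unknown = set(overrides) - set(template)
    if unknown:
        raise ValueError(f"unknown parameters: {unknown}")
    return {"name": name, **template, **overrides}

# Hypothetical values for an A78-class core (illustrative numbers only).
a78 = make_block("ARM_A78", PROCESSOR_TEMPLATE,
                 cores=4, clock_mhz=2600, issue_width=4, l1_kb=64)
print(a78["clock_mhz"])  # 2600
print(a78["l2_kb"])      # 512 (default retained)
```

The validation step is what makes such templates practical at scale: a user configuring a part from a datasheet gets an immediate error on a mistyped parameter instead of a silently wrong model.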

Architecture coverage is never discussed, except in academia. Is it a panacea or a game changer?

Architecture exploration has always been an afterthought. This is because companies did not have the resources or the expertise to develop models. Companies that have committed the resources, and either built the capability around VisualSim or developed it from scratch, have seen substantial benefits in project schedule savings, quality, and the prevention of mismatches between sections of the product. Based on our customer surveys, we have seen schedule savings of 40% and a reduction in bugs and rework of close to 80%. For those that take advantage of it, it is a game changer. Today the discussion is always around corner cases. That is completely insufficient. It is really about boundary cases and identifying use cases that the architect was not aware existed. The other important consideration is that the architecture team has control over the justification of the specification.

Mirabilis Design has over 500 library components, all of them built in-house. Why not just depend on vendor models, and how do you ensure accuracy against the real hardware?

Vendor models are not uniform. They come in different languages, at different levels of accuracy and abstraction, and are typically focused on performance validation. The majority of vendors do not have models at all. Vendors start off providing models and then stop after they reach a certain size. Another problem is that vendors focus on the behavior of their own component; they do not consider integration with similar components from other vendors to create a system model. Most vendor models support specific scenarios and are not set up for the degrees of freedom required for architecture exploration.

Accuracy is extremely important. Mirabilis Design tests its models for multiple levels of accuracy: no run-time errors, correct task latencies, power consumption, and functionality. We use a combination of vendor data and publicly available benchmarks to test. We also work with customers and utilize their data. This way the whole industry benefits from the accuracy. Our accuracy has been as high as 98% for DDR4 power consumption and 90%-plus for an x86. A stochastic model can be tuned to 85%-plus accuracy.

Accuracy is not a single point in time. It must hold in the context of a full system and across multiple traffic rates, clock speeds, flit widths, etc. Also, the accuracy must match behavior, timing, throughput, and power consumption. We spend about 3-4X the modeling time on testing.

Does system design have a methodology or is it simply a mish-mash of tools, ideas, documents and discussions?

System design is a methodology, and a rigorous one. The flow must always be top-down, with the ability to loop back at every stage. You first build and test small sub-systems or IP blocks. You use these blocks to build bigger systems. At every stage you validate the block behavior against the requirements. Then you assemble the full system. You run diagnostics to test the model responses against the requirements. You identify the scenarios or sets of parameters that fail completely, the configurations that pass, and the ones that are boundary cases. Once the model is tested, you run it for a large number of configurations, topologies, workloads, tasks, and other considerations. Each of these runs goes through the same diagnostic sequence. Finally, the model becomes the dynamic specification for the development team, a demonstration and configuration vehicle for customers, and a means of early validation of both the hardware and software. Any design team that follows this strict methodology will greatly benefit from the goals of system design.
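The sweep-and-classify step of this methodology can be sketched as a short Python loop. Everything here is hypothetical: the stand-in latency model, the requirement value, and the 15% boundary margin are invented for illustration. The point is the shape of the process: run the model over a grid of configurations, then sort each result into pass, fail, or boundary against a requirement.

```python
# Illustrative sketch of sweeping configurations through a model and
# classifying results against a requirement. The model, requirement, and
# margin below are hypothetical stand-ins, not real product data.

def model_latency_us(cfg):
    # Stand-in for a full system model: latency rises with load and
    # falls with clock speed.
    return 1000.0 * cfg["load"] / cfg["clock_mhz"]

REQUIREMENT_US = 0.5   # hypothetical latency requirement (microseconds)
MARGIN = 0.15          # within 15% of the limit counts as a boundary case

def classify(latency):
    if latency > REQUIREMENT_US:
        return "fail"
    if latency > REQUIREMENT_US * (1 - MARGIN):
        return "boundary"
    return "pass"

# Grid of configurations: three clock speeds x three load levels.
configs = [{"clock_mhz": c, "load": l}
           for c in (800, 1200, 1600) for l in (0.3, 0.5, 0.7)]

results = {}
for cfg in configs:
    results.setdefault(classify(model_latency_us(cfg)), []).append(cfg)

for verdict in sorted(results):
    print(verdict, len(results[verdict]))   # boundary 1 / fail 3 / pass 5
```

In practice each run would feed the same diagnostic sequence described above, and the boundary list is the valuable output: configurations the architect might not have known were marginal.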
