(I changed the title of this piece as an experiment) Paul McLellan recently wrote about new ventures crossing the chasm (getting from initial but bounded success to a proven, scalable business). That got me thinking about the EDA market in general. In some ways it has a similar problem: stuck at $5B or so with single-digit growth rates, on the left side of a chasm separating it from an at least conceivably much broader market. EDA isn't going to get a bigger share of the semiconductor pie, so now we look for ways to expand upward into software and embedded systems. That's one way to grow the market, but are there different, or at least complementary, ways to expand? One opportunity may be network architecture design and analysis, an emerging (and therefore potentially fast-growing) domain to which EDA techniques and principles could plausibly be adapted.
It doesn’t take a lot of thought to realize that a network looks a lot like a netlist. Of course there are differences: most or all connections are bidirectional, the “signals” are far more complex than 1’s and 0’s, and the nodes are far more complex than logic gates. But if obstacles like that were insuperable, we’d still be using SPICE to simulate logic. Differences aside, there may be real opportunities to apply netlist tool concepts to networks.
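To make the analogy concrete, here is a minimal sketch in which a gate-level netlist and a router-level network are described with the same adjacency structure, and a single traversal computes the fanout cone of a gate or the set of routers reachable over links. All names and the topology are invented for illustration; the only real difference captured here is that the network's links are stored in both directions.

```python
# A toy netlist and a toy network in the same adjacency-list form.
netlist = {  # directed: driver -> fanout gates
    "and1": ["or1"],
    "or1":  ["ff1"],
    "ff1":  [],
}
network = {  # bidirectional links, stored both ways
    "r1": ["r2", "r3"],
    "r2": ["r1", "r4"],
    "r3": ["r1", "r4"],
    "r4": ["r2", "r3"],
}

def reachable(graph, start):
    """Transitive reachability: the fanout cone in a netlist,
    the reachable routers in a network -- the same traversal."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

Note that bidirectional links create cycles, so a network traversal can return to its starting point; that is one of the adaptations (along with richer "signals") that network tools would have to make routinely.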
This idea is not new, but neither is it fully developed. SIGCOMM, the ACM’s special interest group on data communications, held a conference this August in which half a day was devoted to a tutorial on network verification. An extract from the tutorial introduction makes this point: “One can also view a network as a circuit using an EDA (Electronic Design Automation) lens …. If design rule checking is analogous to static checking, what is the analog of synthesis? … These analogies have led networking researchers to frame a new research agenda, made compelling by the ubiquity of cloud services, called Network Verification. They ask: what are the equivalents of compilers/synthesis tools, debuggers, and static checkers for networks?”. If that isn’t a clarion call for EDA innovators searching for a new direction, I don’t know what is.
One example is static analysis that checks router configurations for potential errors, where routers may learn routes that are not usable or, conversely, fail to learn routes that are. More recent efforts aim to formally verify reachability of IP addresses and to define semantics for networks that could provide a foundation for proofs of correctness. Motivated by Software Defined Networks (SDNs), there is also work on specifying requirements above the fairly atomic level at which individual routers are programmed, raising the abstraction to network policies so that individual router configurations can be derived automatically in a synthesis/compilation step from that higher-level specification. And there are analogs to ATPG (here, automatic test packet generation) and coverage analysis, used for end-to-end testing and performance analysis in networks. There is more besides; this appears to be a very fertile area of research.
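The flavor of such static checks can be sketched in a few lines. The model below is a deliberately simplified invention: each router has a forwarding table mapping destination prefixes to a next hop, longest-prefix match picks the route, and a symbolic trace of a packet detects the classic configuration bugs (black holes and forwarding loops) without sending any traffic. Real tools operate on far richer rule sets, but the idea is the same.

```python
import ipaddress

# Hypothetical four-router topology; an empty table means local delivery.
forwarding = {
    "r1": {"10.0.0.0/8": "r2", "10.1.0.0/16": "r3"},
    "r2": {"10.0.0.0/8": "r4"},
    "r3": {"10.1.0.0/16": "r4"},
    "r4": {},
}

def next_hop(router, dst):
    """Longest-prefix match on the router's forwarding table."""
    addr = ipaddress.ip_address(dst)
    matches = [ipaddress.ip_network(p) for p in forwarding[router]
               if addr in ipaddress.ip_network(p)]
    if not matches:
        return None  # no route at this router
    best = max(matches, key=lambda n: n.prefixlen)
    return forwarding[router][str(best)]

def trace(src, dst):
    """Follow a packet hop by hop through the tables, statically."""
    path, seen, router = [src], {src}, src
    while True:
        hop = next_hop(router, dst)
        if hop is None:
            # No route: delivery if this router owns the address
            # (modeled as an empty table), otherwise a black hole.
            return path, "delivered" if not forwarding[router] else "dropped"
        if hop in seen:
            return path + [hop], "loop"  # forwarding loop detected
        path.append(hop)
        seen.add(hop)
        router = hop
```

For instance, `trace("r1", "10.1.2.3")` follows the more specific /16 route through r3 to r4 and reports delivery, while an address no table covers is flagged as dropped; formal reachability tools generalize exactly this question to all headers at once.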
A very interesting aspect of analysis in this domain is that it can, and often will, be applied to live networks rather than networks at the design stage. Application in field deployment was always a holy grail for EDA, because it would take you past the very small universe of designers who might use your tool to the potentially much larger universe of field-deployment and maintenance specialists. That is what may make this direction so compelling: growing from a total market of, say, 10-20K chip designers to a market of hundreds of thousands or millions of IT/IS and networking engineers.
Of course this won’t be easy, but neither is returning interesting value to investors in a slow-growing, mature market. EDA principles will carry over, and maybe some techniques too, but a lot of invention and new development will be required. And there is the small matter of whether and when this market will actually take off; you don’t want to get too far ahead of the parade. That said, there are indicators. The ACM tutorial pointed to the growth of cloud services. The potentially significant growth of the IoT will compound network complexity beyond anything we understand today, and software defined networks seem likely to become more commonplace. In that new reality, wouldn’t you expect automated design, optimization and verification tools to become essential? The ACM certainly seems to think so.