Any SoC or IC design project, whether implemented at a single design site or across multiple sites, requires data management tools to handle things such as a central data repository and revision management of files, so that work can be effectively coordinated among team members. Given the challenge of meeting ever-shrinking time-to-market windows, semiconductor companies are creating design centers wherever it is economically viable to operate or where a substantial talent pool is available. As a semiconductor professional myself, I have often wondered about the ramifications for design schedules and design efficiency when collaborating across globally-dispersed design centers.
I realized the impact when I met one of my old acquaintances during my regular morning walk last Sunday. When I enquired about his rather irregular morning walks, he explained that it was due to the very long hours he was compelled to work. He elaborated that his design team was struggling to collaborate with team members in different parts of the world. Curious, I politely asked whether it was due to communication issues. From my friend's explanation, I understood that some groups within the company relied on an open source revision management system that uses mirrored sites; even so, file transfers into the repository, as well as the ‘rsync’ between the mirrored sites, were taking a long time. Since the data management software was open source, support was very limited. Moreover, they had been using customized scripts to set triggers, which needed to be upgraded, and the person who maintained the scripts at his local office had left. Other groups were not using any data management system at all and relied on name-based file versioning.
My poor friend was stuck with late-night sync-up meetings twice a week, scheduled to find a common time for all sites, and of course with managing to understand the accents and the different ways English is spoken in different regions. Even though they held local sync-ups for the analog, digital and firmware parts of the SoC, they still needed to regularly consolidate all the work to verify that everyone was working with the latest versions, for example, the latest behavioral models from the analog designers. As a result of the chaos, his tapeout schedule had slipped, something his managers were not too happy about.
Given how disorganized the data management setup was, and the constant meetings my friend needed to hold to keep everyone in sync, it was no wonder he was stressed out. I remembered my old library design days, when I had experienced headaches regularly mirroring hierarchies for proper regression runs, and when we had to babysit the transfer of hundreds of design, data and executable files to and from customer sites in the USA.
Figure: Globally-dispersed design centers collaborating on a design
In my opinion, owing to the complex nature of the design flows and designs being created, having a robust data management tool is no longer a luxury but an absolute necessity, especially when multiple design centers are concerned. It helps keep the engines of the design project well oiled and allows the designers to be more productive. Most importantly, it helps reduce the unnecessary stress in designers’ lives.
So, what are the requisites a data management tool must have in order to enable a design team, dispersed across different sites globally, to be more productive? Should we look at using open source software for data management or should we rely on commercial software? Here are some of the considerations I would think through:
· While I am a believer in open source software, and some open source tools are extremely good, the support is rather limited. The last thing I would want in a high-pressure project is to be stuck on an issue and forced to browse through online forums to find a solution. Commercial data management solutions such as ClioSoft’s SOS and Perforce score in this area in terms of the level of support provided. Moreover, commercial data management vendors are more amenable to working with you to customize the software to meet the unique requirements of your company.
· Commercial data management solutions provide a better feature set and roadmap compared to open source software.
· Although networks have large bandwidth these days, designers still suffer latency while accessing files from their central repositories. Having a cache management system at each of the design sites, with regular automatic sync-ups with the data in the central repository, would help eliminate the latency problem. The local cache system should be configurable to store the commonly used revisions of the data, specific to the local design site. An additional advantage of this approach is that in the event of network downtime, designers at the local design centers could continue working without interruption.
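To make the idea concrete, here is a minimal Python sketch of a site-local cache with fall-through to the central repository. The function name `resolve` and the `central_root`/`cache_root` arguments are illustrative, not any vendor's API, and the "repositories" are plain directories for the purpose of the sketch.

```python
import shutil
from pathlib import Path

def resolve(rel_path, central_root, cache_root):
    """Return a local copy of rel_path, preferring the site cache.

    On a cache miss the file is pulled once from the central repository
    and kept in the cache; subsequent reads never cross the WAN, and
    they keep working even if the central site becomes unreachable.
    """
    cached = Path(cache_root) / rel_path
    if cached.exists():
        return cached                              # cache hit: no WAN round trip
    source = Path(central_root) / rel_path
    cached.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(source, cached)                   # populate cache on first miss
    return cached
```

A real cache manager would also age out unused revisions and sync in the background; the point here is only the fall-through lookup that hides WAN latency after the first access.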
· Network disks from vendors such as Netapp are not cheap. Hence from a cost perspective, it may not be prudent for the designers to copy the entire design database into their work area as the disk space usage can very easily balloon up. Ideally, the designers should be able to check out only the files that they want to modify into their local work area, while referring to the local site cache for the rest of the files.
· For the semiconductor industry, especially for analog, mixed-signal and backend designers, does it make sense to use a hardware configuration system such as ClioSoft’s SOS, which is built for big data sizes and performance, or a software configuration system such as Perforce? The requirements of the semiconductor industry to some extent differ from those of the software industry.
· No matter how much precaution a design team manager takes, a common problem they continue to face at the time of a design tapeout is the number of changes made at the last minute. As a result, there are problems such as the central repository not being up to date, or unsolicited fixes going in at the last minute, which tend to de-stabilize the release. This can be resolved by maintaining specific release tags and ensuring that all files carry them before the release and after all open defects against them have been fixed. One of the requirements, therefore, would be the ability to set as many tags and labels as one wants during a design project cycle. This helps considerably while coordinating with team members at multiple locations.
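The tag-completeness check this implies can be sketched in a few lines of Python. The manifest mapping each file to its set of tags is an illustrative data model, not any particular tool's schema.

```python
def untagged_files(manifest, tag):
    """List the files that do not yet carry the given release tag.

    manifest maps file path -> set of tags on that file's head revision;
    an empty result means the release is tag-complete.
    """
    return sorted(f for f, tags in manifest.items() if tag not in tags)
```

Run against the full design hierarchy just before tapeout, an empty list is the green light; anything it returns is a file someone touched without re-tagging.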
· In order to restrict further changes in the files with release tags, there should be strict access control mechanisms that set the files to read-only mode the moment they are promoted with a release tag. In fact, access control must be used to allow or restrict access to particular groups whenever required.
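A promotion hook along these lines could look as follows in Python; `freeze` is a hypothetical name for whatever a DM tool runs when a file is promoted with a release tag, not a real tool's API.

```python
import os
import stat

def freeze(path):
    """Strip all write permissions once a file is promoted with a release tag.

    A stand-in for the access-control step a DM tool would apply on
    promotion, so tagged files cannot be quietly modified afterwards.
    """
    mode = os.stat(path).st_mode
    os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))
```

In practice the access control would live in the DM server rather than in filesystem bits, so it also covers new checkouts at every site; the filesystem version shown here is the simplest local enforcement.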
· Design managers should be able to leverage off existing reports or create their own reports to determine the audit trail or to determine whether the same version of the IP is being used by all the team members.
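One such report, flagging IPs that are checked out at different versions at different sites, might look like this in Python; the site-to-versions mapping is an illustrative stand-in for whatever a DM tool's audit API would return.

```python
def version_mismatches(site_views):
    """Report IPs used at different versions across sites.

    site_views maps site name -> {ip_name: version}; the result maps
    each inconsistent IP to the per-site versions, for the manager's report.
    """
    by_ip = {}
    for site, view in site_views.items():
        for ip, version in view.items():
            by_ip.setdefault(ip, {})[site] = version
    return {ip: versions for ip, versions in by_ip.items()
            if len(set(versions.values())) > 1}
```

An empty report confirms every team member is on the same version of each IP; a non-empty one tells the manager exactly which site needs to re-sync.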
· There must be an integrated defect tracking system that can be reviewed for the required fixes so that one can ascertain that the required fixes from other parties are done.
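The cross-check against a defect tracker reduces to a simple filter; the defect records below are hypothetical dictionaries standing in for an integrated tracker's data, not a real tracker's API.

```python
def open_defects_blocking(release_files, defects):
    """Return the defects that must be closed before release.

    defects is a list of records like
    {"id": "BUG-12", "file": "top.v", "status": "open" or "closed"};
    only open defects filed against files in the release block it.
    """
    files = set(release_files)
    return [d for d in defects
            if d["file"] in files and d["status"] != "closed"]
```

Because the check is mechanical, it can run on every promotion rather than only at the final review, catching a reopened defect the day it happens.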
· There should also be a mechanism to take snapshots of the design very easily, along with an easy way to maintain tags and labels for hand-off of specific modules. This is very useful for designers when they are running different experiments in different directories to meet the performance criteria. It is extremely easy to lose track of which directory has the correct results, so having a mechanism to tag the directories would be extremely useful.
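A directory snapshot can be reduced to a content fingerprint recorded under a tag. This Python sketch uses a plain dict as a stand-in for a DM tool's tag registry; the names are illustrative.

```python
import hashlib
from pathlib import Path

def snapshot(directory, tag, registry):
    """Record a content fingerprint of an experiment directory under a tag.

    Hashes every file's relative path and contents in a stable order, so
    a tagged directory can later be identified among many experiment runs.
    """
    digest = hashlib.sha256()
    root = Path(directory)
    for f in sorted(root.rglob("*")):
        if f.is_file():
            digest.update(f.relative_to(root).as_posix().encode())
            digest.update(f.read_bytes())
    registry[tag] = digest.hexdigest()
    return registry[tag]
```

Two directories with identical contents yield the same fingerprint, so the designer can tell at a glance which experiment directory produced the tagged, signed-off results.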
Do you have any other insights into what a data management tool for a semiconductor company must have? Comments are welcome!