“Testing can only prove the presence of bugs, not their absence,” the famous computer scientist Edsger Dijkstra once stated. That notion rings true for the many college teams that took part in the Hack@DAC competition held during DAC 2018 in San Francisco. The goal of this competition is to develop tools and methods for identifying security vulnerabilities in SoC designs that use both third-party IP (3PIP) and in-house cores. The trustworthiness of such SoCs can be undermined by security bugs unintentionally introduced during IP integration.
During a six-hour final trial, the finalists were asked to identify and report security bugs in an SoC released to them at the start of the day. The teams mimicked the role of a security research team at an SoC integrator, trying to find security vulnerabilities and quickly dispatch them back to the design team so they could be addressed before the SoC goes to market. The bug submissions were scored in real time by industry experts, and the team with the highest score was declared the winner.
At the end of the competition, both Hackin’ Aggies from Texas A&M University and The Last Mohicans from IIT Kharagpur were declared winners. I subsequently interviewed Professor Michael Quinn of Texas A&M, who has actively shepherded the school’s team through the competition. We were joined by the Cadence staff who coordinate the university programs: Dr. Patrick Haspel, Cadence Global Program Director of Academic and University Programs, and Steve Brown, Marketing Director of Verification Fabric Products. Some excerpts from the Q&A session are included in the second half of this article.
DAC and Verification Engineers
Based on DAC 2015-2017 statistics, about 38% of attendees are engineering professionals and about 10% are from academia, as shown in Figure 1. Although there are other venues, such as IEEE-sponsored events, that involve academia, academic participation in industry-sponsored events such as DAC or DVCon can be viewed as an indicator of how much interest and engagement there is across the ecosystem. Based on a subset of the statistics, verification engineer attendance consistently ranks third, after CAD/application and design engineers (see Figure 2).
A well-rounded verification engineer needs proficiency in both design implementation and functional verification techniques. We are accustomed to college programs that train design engineers, computer scientists, and process engineers, but not so much programs tailored for verification engineers. This prompts the question: how should we prepare these candidates to be more adaptable to industry requirements?
Cadence Academic Ties
Aside from their own R&D dollars, EDA companies innovate through various synergistic partnerships among their ecosystem’s members, including customers and academia. Being at the forefront of the EDA ecosystem, Cadence has actively fostered a strong relationship with academia through the Cadence® Academic Network program, which facilitates the exchange of knowledge by co-organizing educational events and training and by providing access to the latest Cadence technologies. Several notable subprograms related to this venture are tabulated here:
Interview with Professor Michael Quinn
Texas A&M University has been part of the Cadence Academic Network program and ranks first in Texas in terms of student enrollment. The university has launched its 25-by-25 initiative, which targets an engineering enrollment of 25,000 by 2025, and this year it boasts the largest freshman female engineering class in the country. Its electrical and computer engineering programs were recently ranked 12th and 10th, respectively, among public universities.
The following excerpts are from the Q&A session with Professor Quinn:
Could you comment on the current research emphasis in the area of simulation/verification?
“From the verification standpoint, the biggest area getting looked at is associated with security. Texas A&M has a whole new department that has grown up over the past few years, very well endowed, and it’s about security design, architecture, and also verification. I think their biggest push in these areas is in formal. Formal-based approaches, not so much functional,” Prof. Quinn said. “By the way, we did (the contest) without using the formal tool,” he quipped.
Which Cadence tools do you use?
“My class uses all the simulation and visualization tools, such as Xcelium. It starts just as an engineering verification job, with specification planning using Cadence VPlanner. The students then develop the verification environment using the Cadence UVM-based methodology, which is superb as it supports the current IP design methodology,” said Prof. Quinn. It also allows the students to incrementally do bottom-up verification and integration work, starting with low-level IP and progressing to the SoC level. A key strength of such an approach is the ability to seamlessly reuse work previously done at the lower levels. Subsequent verification and debug involve running random tests and using vManager to tie the various aspects of planning, testing, tracking, and analysis together. Finally, Indago is used to efficiently manage the debugging process.
Should the school program be geared towards software development mastery, hardware design proficiency or hands-on applications for EE candidates?
He believes we need all of the above. Companies are looking for more well-rounded designers who can transition between different projects. To this end, he aspires to have more courses that are interdisciplinary and experiential in nature; a multi-faceted curriculum that brings together logic designers, architects, and software folks would greatly enhance the learning experience.
What is the current state of engagements with Cadence?
When the program started in 2016, only a handful of verification engineers entered the industry from Texas A&M; it has now contributed 100 or more. “It’s a win-win solution,” he said. Recalling his alma mater, Drexel, he sees the value of a co-op program as a good training ground for incoming graduates. His wish is to continue these efforts and share his instructional work with an expanded network.
According to Dr. Haspel, part of the Cadence Academic Network team’s responsibility is to connect industry’s need for trained engineers with students, enabling them not only to learn but also to be valuable to prospective employers. This is achieved by working with universities on curriculum alignment, partnering with schools that have the right mindset to collaborate, and positioning them as recruiting targets. Furthermore, he said, “Sometimes it is the pipeline thing, but it is also the responsibility of the ecosystem…”
What is your impression of the DAC presentations with respect to HW design?
Prof. Quinn was intrigued by the conference advising EDA vendors to pay more attention to big data, machine learning, security, and data analytics; he concurs that it is the right feedback. He anticipates that post-silicon data will be able to contribute to verification, which previously was not possible, since data does not stop at tapeout. One may need coverage monitoring, possibly at the customer site, using workload monitoring to feed results back into the simulation process. It is a big closed loop.
As the famous quote goes, “Tell me and I forget, teach me and I may remember, involve me and I learn.” At DAC 2018, the Texas A&M team demonstrated a slice of the fruitful outcomes of the EDA industry’s collaboration with academia. Kudos to Cadence and the Aggies! Experiential learning really makes a difference.