One of the unsung heroes of our digital world is the modest voltage converter. Batteries and wired power sources rarely match the supply requirements of advanced ICs. Leading-edge ICs have multiple voltage domains and very often, as in the case of processors, use dynamic voltage scaling to conserve power. Power converters have come a long way over the years; certainly, no mobile device could survive on the fully analog converters of the past.
The job of converting DC voltages moved from linear voltage regulators to switching-based buck converters. Mentor has published a case study that illustrates how the digital content of DC/DC converters has grown as the requirements for these converters have changed. The original analog-controlled buck converter, based on an analog PID, used off-chip passives to regulate the output. However, the need to shrink designs, reduce BOMs and even integrate buck converters into large SoC packages led to the search for alternative techniques to improve output quality.
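As background, an ideal buck converter's averaged output is simply the input scaled by the switching duty cycle; the control loop (originally an analog PID) adjusts that duty cycle to hold the output at its target. A minimal illustration (the voltages here are arbitrary examples, not figures from the white paper):

```python
def buck_vout(vin, duty):
    """Ideal (lossless, continuous-conduction) averaged buck output: Vout = D * Vin."""
    assert 0.0 <= duty <= 1.0
    return duty * vin

# e.g. stepping a 3.3 V input down to a 1.1 V rail needs a duty cycle of 1/3
print(buck_vout(3.3, 1.1 / 3.3))  # -> approximately 1.1
```

Real converters deviate from this ideal because of switch and inductor losses, which is part of what the control loop must compensate for.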
Replacing the analog PID with a digital control circuit eliminates the need for external passives, helps compensate for transistor imperfections and allows for more control over the stability of the converter. According to the Mentor white paper, there are a number of tradeoffs to consider in the choice of ADC resolution and DPWM selection. These tradeoffs become more complicated when improving the load transient response of the converter during the activation of aggressive power-saving modes in the chips it supplies. Mentor cites systems like Intel Speed Shift technology as a source of rapid transitions between high-power and low-power states that can cause voltage droop on the supply rail. To combat this, Mentor evaluated digital feed-forward compensation to improve load transient behavior.
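To make the idea concrete, here is a hypothetical sketch of a digital PID loop regulating an averaged buck-converter model, with an optional feed-forward term keyed to the sensed load current. All component values, gains and the load-step scenario are illustrative assumptions, not taken from the Mentor white paper, and the continuous model glosses over the ADC quantization and DPWM resolution tradeoffs the paper discusses:

```python
# Hypothetical sketch: digital PID + feed-forward control of an averaged
# buck-converter model. Values and gains are illustrative only.

VIN = 3.3               # input voltage (V)
VREF = 1.1              # regulation target (V), matching the 1100 mV rail in the text
L, C = 4.7e-6, 100e-6   # output filter inductance (H) and capacitance (F)
R_ESR = 0.1             # lumped series resistance of the inductor path (ohms)
DT = 1e-6               # controller sample period (s)

def simulate(feed_forward, steps=4000):
    """Apply a load step and return the minimum output voltage (higher = less droop)."""
    kp, ki, kd = 0.02, 500.0, 1e-7   # illustrative digital PID gains
    kff = R_ESR / VIN                # feed-forward duty boost per amp of load
    v_out, i_l = VREF, 0.2           # start near steady state at light load
    integ, prev_err = 0.0, 0.0
    v_min = v_out
    for n in range(steps):
        i_load = 0.2 if n < steps // 2 else 1.0   # load step halfway through
        err = VREF - v_out
        integ += err * DT
        deriv = (err - prev_err) / DT
        prev_err = err
        duty = VREF / VIN + kp * err + ki * integ + kd * deriv
        if feed_forward:
            duty += kff * i_load     # react to the load before the error grows
        duty = min(max(duty, 0.0), 1.0)
        # averaged buck dynamics: L di/dt = D*Vin - Vout - R*iL ; C dv/dt = iL - iLoad
        i_l += (duty * VIN - v_out - R_ESR * i_l) / L * DT
        v_out += (i_l - i_load) / C * DT
        v_min = min(v_min, v_out)
    return v_min
```

Comparing `simulate(True)` against `simulate(False)` shows the feed-forward path holding the output closer to its target through the load step, which is the qualitative behavior the white paper targets; the actual design operates on quantized ADC samples driving a DPWM rather than the continuous duty cycle used here.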
While the above approaches are likely to be effective at solving design problems and helping the system meet its design specifications, they also introduce new design and verification challenges. Foremost among these is the smaller simulation time step needed to capture the circuit behavior. Simulation runtimes explode when rule-based analog-to-digital simulation integration is used. Mentor's new Symphony AMS simulation environment uses Boundary Elements (BEs) to connect the digital and analog domains and speed up simulation.
In the white paper, Mentor simulated a fully digitally controlled buck converter on a server with a 16-core Xeon E5-2682 v4 CPU, comparing runtimes between Symphony and one of the incumbent AMS simulation environments. Symphony running on one thread was 10.31X faster, reducing 187.8 hours of simulation to 18.2 hours. With eight threads the advantage grows to 42.2X, with a runtime of only 4.45 hours. Along with this impressive performance gain, the BEs also provide improved visualization of the analog and digital portions of the design.
Mentor looked at the effectiveness of the proposed design in limiting Vout droop during load transients. Simulation showed the droop reduced by 250mV at a supply voltage of 1100mV. They also examined how well the converter compensated for higher internal loss by increasing the duty cycle. Because this effect is usually observed over a longer time interval, it has previously proven difficult to model. In their simulations, Mentor Symphony showed a 25mV improvement in voltage stability over a period of 1.4us, with a residual droop of only 5mV. To validate the results, a test chip was fabricated. The Mentor white paper goes through the silicon measurements to illustrate the accuracy of the simulation results.
I review a lot of white papers from various vendors, and I have to say that this one in particular was very informative and backed up with meaningful real-world data. More information about Mentor Symphony is available on the Mentor website.