## How to design a good DC/DC conversion circuit?

Re: How to design a good DC/DC conversion circuit?

Designing a DC voltage converter that works reliably is a difficult problem. That's why there are so many chips on the market to handle the difficult parts.

At the most basic level, switching converters use an inductor's ability to store energy in its magnetic field to convert voltage to current, or vice versa. You charge the magnetic field by pumping current into the inductor for a certain amount of time, then discharge the field by drawing current out of the inductor for a certain amount of time. The amount of energy that comes out equals the amount of energy that goes in.

The unit of energy is the Watt-second, or Joule, and in electronics, power in Watts is the product of voltage and current. That means the energy going into and out of the inductor is a product of voltage, current, and time. Voltage converters work by holding the current constant, but changing the amount of time the current flows into and out of the inductor.

The amount of energy in an inductor's magnetic field grows with the square of the current flowing through the inductor, so the field steals energy from the current any time the current tries to increase, and dumps energy back into the current any time the current tries to decrease. A voltage converter has to increase the current while it flows into the inductor to charge the magnetic field, then decrease the current while it flows out to pull that energy back out of the field. For a voltage converter to work efficiently, the average current flowing out needs to equal the average current flowing in.

If the amount of energy that goes into the inductor equals the amount of energy that comes out, and the average current flowing in is the same as the average current flowing out, we're left with two parameters to play with: voltage and time.

If we send an average of 1A @ 1V into an inductor for 2us, a total of 2uW-s of energy flows into the inductor. If we draw 2uW-s of energy out of the inductor in 1us, and the average current remains 1A, the voltage has to be 2V.
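That tradeoff is easy to check numerically. Here's a quick sketch of my own (not from the original post): with energy and average current held constant, the output voltage is just the stored energy divided by current times discharge time.

```python
# Energy balance for an ideal inductor-based converter:
# E = V * I * t, with E and the average current I held constant.

def output_voltage(v_in, i_avg, t_charge, t_discharge):
    """Voltage needed to move the stored energy back out in t_discharge
    at the same average current it went in with."""
    energy_in = v_in * i_avg * t_charge       # Joules (Watt-seconds)
    return energy_in / (i_avg * t_discharge)  # V = E / (I * t)

# 1A @ 1V for 2us in, same 1A average out for 1us:
v_out = output_voltage(v_in=1.0, i_avg=1.0, t_charge=2e-6, t_discharge=1e-6)
print(v_out)  # -> 2.0, matching the worked example above
```

Halving the discharge time doubles the voltage; stretching it out drops the voltage below the input, which is the same mechanism a buck converter uses in reverse.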

That's the basic mechanism of DC voltage conversion: constant energy and constant average current, with a tradeoff between voltage and time.

Over longer periods of time, the on/off nature of the current flowing in and out of the inductor works like PWM, changing the average current. A PWM circuit that transmits 1A for 2us out of every 3us averages out to a DC current of about 667mA. A PWM circuit that transmits 1A for 1us out of every 3us averages out to about 333mA.
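The duty-cycle averaging works out like this (a throwaway sketch of mine, not from the post):

```python
def avg_current(i_peak, t_on, t_period):
    """DC average of a current that's i_peak for t_on out of every t_period."""
    return i_peak * t_on / t_period

print(avg_current(1.0, 2e-6, 3e-6))  # ~0.667 A: on 2us out of every 3us
print(avg_current(1.0, 1e-6, 3e-6))  # ~0.333 A: on 1us out of every 3us
```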

Averaging the current over time removes the 'time' part of our energy equation, leaving voltage and current, or power, flowing in and out of the converter. At the large scale, ignoring all the switching that happens around the inductor, voltage converters are constant-power devices: to get 1W out, you have to send 1W in.

That checks with the numbers we used above: 667mA @ 1V is 667mW, and 333mA @ 2V is also about 667mW.
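Spelled out as arithmetic (trivially, with the same numbers):

```python
# Both sides of an ideal converter carry the same power:
p_in = (2/3) * 1.0   # ~667 mA average @ 1 V in  -> ~0.667 W
p_out = (1/3) * 2.0  # ~333 mA average @ 2 V out -> ~0.667 W
print(abs(p_in - p_out) < 1e-9)  # True: power in equals power out
```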

In practice, keeping the energy flowing out equal to the energy flowing in, and keeping the average output current equal to the average input current, is a lot harder than it sounds. It's easy for unexpected changes in the load current to throw off the balance. It's also easy for the inductor to fall into a state called 'saturation'.

In a saturated inductor, increasing the current doesn't increase the magnetic field any further. The inductor just acts like a low-value resistor with a lot of current flowing through it, and starts to generate heat.
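A very rough way to picture saturation (my own toy model with made-up component values, not from the post): below the saturation current the stored energy follows E = ½LI², and past it the field stops growing, so extra current buys no extra stored energy and only makes heat.

```python
L_NOM = 10e-6   # 10 uH nominal inductance (assumed value for illustration)
I_SAT = 2.0     # saturation current in amps (assumed value for illustration)

def stored_energy(i):
    """Approximate stored field energy, clamped once the core saturates."""
    i_eff = min(i, I_SAT)          # the field stops growing at I_SAT
    return 0.5 * L_NOM * i_eff**2  # E = 0.5 * L * I^2

print(stored_energy(1.0))  # energy still rising below saturation
print(stored_energy(3.0))  # same as stored_energy(2.0): the field is maxed out
```

Real cores saturate gradually rather than with a hard clamp, but the shape of the problem is the same: past I_SAT, pushing harder doesn't store more energy.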

Saturation creates other operating states where the total power flowing into the inductor equals the total power flowing out, but instead of boosting the voltage by storing energy in the inductor's magnetic field, the energy gets converted to heat, which will eventually burn up the circuit.

So a practical DC voltage converter is about 20% actual voltage converter and 80% sensors and compensation circuits to keep the converter from dropping into a bad state that will destroy it.
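As a cartoon of that 80% (entirely my own sketch, with made-up thresholds; real controller ICs do this with comparators and compensation networks at MHz rates), the supervisory logic boils down to decisions like:

```python
def switch_should_close(v_out, i_inductor,
                        v_target=5.0,      # desired output voltage (assumed)
                        i_sat_limit=2.0):  # inductor current limit (assumed)
    """Decide whether to keep charging the inductor this switching cycle."""
    if i_inductor >= i_sat_limit:
        return False          # back off before the inductor saturates
    return v_out < v_target   # only pump in energy when the output needs it

print(switch_should_close(v_out=4.8, i_inductor=1.5))  # True: keep charging
print(switch_should_close(v_out=4.8, i_inductor=2.5))  # False: overcurrent
```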

Today's voltage converters use switching frequencies in the megahertz range so the inductors can be small, which adds a whole new layer of problems from parasitic inductance and capacitance of the PCB, the time it takes transistors to switch on and off, and so on. Experienced switching converter designers have a large bag of tricks they've picked up through experimentation, and have seen enough switching converter failures to know what caused any given problem in a design under test.

For general circuit design, it's best to pick an integrated circuit with the specs you're looking for and build the reference circuit in the datasheet. That will at least get you a circuit that works some of the time, and fails in the most basic ways. As you get familiar with those issues, you'll pick up the knowledge to start trying other kinds of circuits.

Re: How to design a good DC/DC conversion circuit?

928928 wrote: Thank you so much.

Check the Linear Technology website (they're now part of Analog Devices):

1. They have awesome parts, including very nice DC/DC switchers.
2. They have application circuits that work.
3. They have LTspice, a free SPICE-based circuit simulator.
4. They have LTspice schematics for their application circuits, so you can play with changing components and see the effect.

Power Management ICs - you probably want to start at the "Switching Regulators" page:
https://www.analog.com/en/products.html ... management

LTspice:
https://www.analog.com/en/design-center ... lator.html

Disclaimer: I've been using Linear switching regulators in my designs for a while, and LTspice is awesome. I'm a working EE and a happy Linear customer.
ka1axy
