Buck Converter Design Problem

### Author Topic: Buck Converter Design Problem  (Read 1519 times)

#### jegues

• Newbie
• Posts: 6
##### Buck Converter Design Problem
« on: October 30, 2012, 10:11:13 PM »
See attached figure for design requirements.

I decided to experiment, solving for the outcomes L, C, ΔiL, IL and R given a selected switching frequency, input voltage and output voltage.

To obtain the values of these outcomes in a simple manner I made several assumptions, stated below.

Assumptions:

• Diode and Transistor voltage drops are 0
• Minimum inductor current is zero, i.e. Imin = 0 (we are operating on the edge of CCM, so IL = ΔiL/2)
• The value for the output current was always selected such that the output power was 100 W. **NOTE: We are neglecting the power dissipated in the series resistance of the source (i.e. Prin = IL²·D·rin) because for all values of IL and D I found this to be small in comparison to the 100 W.
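Under these assumptions the standard ideal buck relations give the outcomes in closed form. A minimal sketch of that calculation follows; the 200 V input, 50 V output, 50 kHz frequency and 1% ripple target are made-up example numbers, not the values from the attached figure:

```python
# Ideal buck converter at the CCM/DCM boundary (Imin = 0),
# lossless switch and diode, fixed 100 W output power.

def buck_boundary(vin, vout, f_sw, pout=100.0, dv_ratio=0.01):
    d = vout / vin                         # duty cycle, D = Vout/Vin (ideal)
    i_out = pout / vout                    # average output (= inductor) current
    r_load = vout / i_out                  # load resistance giving 100 W
    di_l = 2.0 * i_out                     # boundary CCM: IL = dIL/2, so dIL = 2*IL
    l = vout * (1.0 - d) / (di_l * f_sw)   # from dIL = Vout*(1-D)/(L*f)
    dv = dv_ratio * vout                   # allowed output ripple (1% assumed)
    c = di_l / (8.0 * f_sw * dv)           # from dVout = dIL/(8*f*C)
    return d, i_out, r_load, di_l, l, c

d, i_out, r, di_l, l, c = buck_boundary(vin=200.0, vout=50.0, f_sw=50e3)
print(f"D={d:.2f}  Iout={i_out:.2f} A  R={r:.1f} ohm")
print(f"dIL={di_l:.2f} A  L={l*1e6:.1f} uH  C={c*1e6:.1f} uF")
```

Sweeping `f_sw`, `vin` and `vout` through this function reproduces the kind of table of outcomes described above: L and C shrink as the switching frequency rises, which is the usual trade-off against switching losses.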

Attached below is a figure of the table of outcomes.

Are my assumptions "valid enough" that these results have actual relevant meaning and/or insight?

Can someone help me interpret my results? Am I moving in the right direction?

#### KevinIV

• Electronics God
• Posts: 1352
##### Re: Buck Converter Design Problem
« Reply #1 on: November 02, 2012, 03:51:23 PM »
The input voltages are very high, which makes the control circuit difficult to design: an ordinary op-amp and PWM controller couldn't be used directly. The main pass transistor should be a FET with a low on-state Vds (low Rds(on)) or a bipolar transistor switched between cutoff and saturation to achieve higher efficiency. The inductor-capacitor filter is absolutely necessary; choosing the values might be done by load testing.

#### KevinIV

• Electronics God
• Posts: 1352
##### Re: Buck Converter Design Problem
« Reply #2 on: November 02, 2012, 06:42:21 PM »
I'll add that the inductor-capacitor filter is not acting as a simple filter here, but is effectively dividing the voltage: the output settles at the duty-cycle average of the switched input. The required L and C values depend on the frequency of the PWM.