Electronics-Lab.com Community

# Watts, amps?

## Recommended Posts

I am a little confused about these. If somebody could define them for me and help me understand the difference, I would be glad.

##### Share on other sites

Potential difference
The difference in electrical potential between two points, measured in volts. An analogy would be the pressure difference between a balloon and the outside air.

Current
The amount of charge (electrons) passing through a conductor per second, measured in amperes, "amps" for short. An analogy would be the flow of air in the pipe connected to the balloon.

Watts
The amount of energy dissipated per second, in joules per second, which is measured in watts. This is equal to potential difference * current.
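The definitions above boil down to one relation: power = voltage * current. A minimal Python sketch (not from the thread, names are my own) of that relation:

```python
# P = V * I: power in watts equals potential difference in volts
# times current in amperes.
def power_watts(volts: float, amps: float) -> float:
    return volts * amps

print(power_watts(12.0, 2.0))  # 24.0 W
```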

##### Share on other sites

still confusing  :'(

##### Share on other sites

Ya confused? Try this: 746 watts = 1 horsepower.

##### Share on other sites

Sorry about that, I just had to say it.
volts * (times) amps = watts
And if you had a 110-volt, 100-watt light bulb, how many amps would it draw?
amps = watts / volts, so 100 / 110 ≈ 0.91 amps.
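The light-bulb calculation above rearranges P = V * I into I = P / V. A quick Python sketch (my own helper name, not from the thread):

```python
# Rearranging P = V * I gives I = P / V: the current a load draws.
def current_amps(watts: float, volts: float) -> float:
    return watts / volts

# A 100 W bulb on 110 V:
print(round(current_amps(100, 110), 2))  # 0.91 A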

##### Share on other sites

Well, what I understand is that amps are how much current is being drawn through the wire, right?
And what I've heard is that watts are what is being emitted, is that right?
But what is really confusing is how some power supplies, such as inverters for your car to make mains power, and solar cells and such, are rated in watts. Why would the power supply care how much current the load draws from what it puts out? To a certain extent, I mean, you can't have a 100-watt light bulb connected to a solar cell rated for 5 watts; it doesn't work that way. But I still don't get why the rating is in watts and not amps.

##### Share on other sites

Electrical power that is being used is measured in Watts.
A 100V/1A lightbulb or heater uses 100 Watts.
A 1V/100A lightbulb or heater uses 100 Watts.
A 10V/10A lightbulb or heater uses 100 Watts.
They all produce about the same amount of light or heat even though their voltage and current differ a lot.
Everyone knows that a 100W lightbulb is bright, 60W is medium and 40W is not very bright.
Nobody knows the brightness of a 40A lightbulb without also knowing its voltage then multiplying them into Watts. ;D
Understand now?
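The three example bulbs above can be checked in a few lines of Python (a sketch of my own, not from the thread): very different voltage/current pairs, identical power.

```python
# Different V/I combinations that all dissipate the same 100 W.
def power_watts(volts: float, amps: float) -> float:
    return volts * amps

for v, i in [(100, 1), (1, 100), (10, 10)]:
    print(f"{v} V x {i} A = {power_watts(v, i)} W")
```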

##### Share on other sites

> Well, what I understand is that amps are how much current is being drawn through the wire, right? And what I've heard is that watts are what is being emitted, is that right?

Watts are the units of power, the amount of energy that's being consumed per second.

> But what is really confusing is how some power supplies, such as inverters for your car to make mains power, and solar cells and such, are rated in watts. Why would the power supply care how much current the load draws from what it puts out? You can't have a 100-watt light bulb connected to a solar cell rated for 5 watts; it doesn't work that way. But I still don't get why the rating is in watts and not amps.

A 12V 1A power supply is rated for 12W, or 1A at 12V; giving the power output in watts is just a different way of describing the same thing. You could connect a 5W bulb to a 10W cell. As long as the load is smaller than the maximum output of the supply, you won't have a problem, unless a minimum load is given in the specification.
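The rule above (load wattage must stay under the supply's maximum wattage) can be sketched in Python; the helper name and figures are my own illustration, not from the thread:

```python
# A supply's maximum power is its rated volts times its rated amps.
# A load fits if it draws no more watts than the supply can deliver.
def load_fits(supply_volts: float, supply_max_amps: float,
              load_watts: float) -> bool:
    return load_watts <= supply_volts * supply_max_amps

print(load_fits(12, 1, 5))    # True: a 5 W bulb on a 12 W supply is fine
print(load_fits(12, 1, 100))  # False: a 100 W bulb would overload it
```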
##### Share on other sites

Ok, I think I get it now. Thanks!

##### Share on other sites

Hi Cody,

You can print this one out and nail it to the wall above your workbench!

Nice!

##### Share on other sites

Oops, I've got to watch what I type late at night. Sorry about that; my math was wrong.
Very sorry,
gogo
