Electronics-Lab.com Community


Potential difference
The difference in electrical potential between two conductors, measured in volts. An analogy would be the pressure difference between the air inside a balloon and the outside air pressure.

Current
The amount of charge (electrons) passing through a conductor per second, measured in amperes (amps for short). An analogy would be the flow of air in the pipe connected to the balloon.

Watts
The rate at which energy is dissipated, in joules per second, also known as watts. This is equal to potential difference * current.
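The relationship above can be sketched in a few lines of Python. The function name and example values are my own, chosen just to illustrate P = V * I:

```python
def power_watts(volts, amps):
    """Power dissipated in watts: potential difference times current."""
    return volts * amps

# A made-up example: a 12 V supply delivering 2 A dissipates 24 W.
print(power_watts(12.0, 2.0))  # 24.0
```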


Well, what I understand is that amps measure how much current is being drawn through the wire, right? And what I've heard is that watts measure what is being emitted, is that right?

But what is really confusing is how some power supplies, such as inverters for your car that make mains power, and solar cells and such, are rated in watts. Why would the power supply care how much current is drawn from it? To a certain extent, I mean, you can't have a 100 W light bulb connected to a solar cell or something rated for 5 W, it doesn't work that way. But I still don't get why the rating is in watts and not amps?


Electrical power that is being used is measured in Watts.
A 100V/1A lightbulb or heater uses 100 Watts.
A 1V/100A lightbulb or heater uses 100 Watts.
A 10V/10A lightbulb or heater uses 100 Watts.
They all produce about the same amount of light or heat even though their voltages and currents are very different.
Everyone knows that a 100W lightbulb is bright, 60W is medium and 40W is not very bright.
Nobody knows the brightness of a 40A lightbulb without also knowing its voltage then multiplying them into Watts. ;D
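The three bulbs above can be checked with a quick sketch. The voltage/current pairs come straight from the post; the loop just multiplies them out:

```python
# (volts, amps) pairs from the examples above.
bulbs = [(100, 1), (1, 100), (10, 10)]

for volts, amps in bulbs:
    # All three dissipate the same power despite very different V and I.
    print(f"{volts} V x {amps} A = {volts * amps} W")
```

All three lines print 100 W, which is the whole point: wattage tells you the brightness directly, while amps alone do not.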
Understand now?


Well, what I understand is that amps measure how much current is being drawn through the wire, right? And what I've heard is that watts measure what is being emitted, is that right?

Watts are the units of power: the amount of energy that is being consumed per second.


But what is really confusing is how some power supplies, such as inverters for your car that make mains power, and solar cells and such, are rated in watts. Why would the power supply care how much current is drawn from it? To a certain extent, I mean, you can't have a 100 W light bulb connected to a solar cell or something rated for 5 W, it doesn't work that way. But I still don't get why the rating is in watts and not amps?


A 12 V, 1 A power supply is rated for 12 W, or 1 A at 12 V; giving the power output in watts is just a different way of describing this. You could connect a 5 W bulb to a 10 W cell: as long as the load is smaller than the maximum output of the supply, you won't have a problem, unless a minimum load is given in the specification.
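The load-versus-rating rule above can be sketched as a tiny helper. The function name and example wattages are hypothetical, just to illustrate the check:

```python
def load_is_ok(load_watts, supply_watts):
    """A load is fine as long as its draw does not exceed the supply rating."""
    return load_watts <= supply_watts

print(load_is_ok(5, 10))    # 5 W bulb on a 10 W cell: True
print(load_is_ok(100, 5))   # 100 W bulb on a 5 W cell: False
```

Note this only checks the maximum rating; as the post says, some supplies also specify a minimum load, which this sketch ignores.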
