Supplying more or less current than a device needs??

Christopher A. Glaves

Guest
I have a device I need to buy an AC adaptor plugpack for, but I do not
want to fry it, so I could use some advice as to which plugpacks will
work.

Quick question: what happens if you supply more or less current than
a device needs? Also, my understanding is that the voltage needs to be
exact or else it won't work at all.

Stamped on the box it says it requires "11-18V DC 1A (1000mA)".

Looking online, I found 3 different AC adaptor plugpacks:

#1 supplies DC output voltages of 5, 7.5, 12, 15, 18, and 23 V, with
current of up to 1000mA maximum
http://www.dse.com.au/cgi-bin/dse.storefront/3f2ccb2e03efad7e273fc0a87f9c071b/Product/View/M9916

#2 supplies DC output voltages of 3, 4.5, 6, 7.5, 9, and 12 V, with
current of up to 1500mA maximum
http://www.dse.com.au/cgi-bin/dse.storefront/3f2ccb2e03efad7e273fc0a87f9c071b/Product/View/M9917

#3 supplies a DC output voltage of 12 V, with current of up to 600mA
maximum
http://www.dse.com.au/cgi-bin/dse.storefront/3f2ccb2e03efad7e273fc0a87f9c071b/Product/View/M9925


#1 looks spot on.

#2 can give me 12V, which looks good, but what would the "current of up
to 1500mA maximum" do if I plugged it in?

And #3 gives me 12V, but again, what would the "current of up to 600mA
maximum" do if I plugged it in?

So again, basically: what happens if you supply more or less current
than a device needs? Also, is it true that the voltage needs to be
exact or else it won't work at all?

Advice and web pages for guidance are greatly appreciated.

Thanks for any help,

Chris
 
On Sun, 3 Aug 2003 09:30:04 +0000 (UTC), Mike Ring
<mike.ring@Michaelbtinternet.com> wrote:

Christopher A. Glaves <cglaves@yahoo.com> wrote in
news:5ojpivoop5k2on7ck885nqn44as5uo0fpo@4ax.com:

I have a device I need to buy an AC adaptor plugpack for, but I do not
want to fry it, so I could use some advice as to which plugpacks will
work.

Quick question: what happens if you supply more or less current than
a device needs? Also, my understanding is that the voltage needs to be
exact or else it won't work at all.

Provided the volts are correct you *cannot* supply more current than is
required; the resistance of your device sets this, i.e. a 15W and a 1000W
bulb on 230V mains each decides for itself how much current it will draw.
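
As a quick illustration of that point, here is a minimal Python sketch
(not from the thread; only the bulb figures come from the example
above). The load, not the supply, determines the current:

    # Each load sets its own current draw: I = P / V.
    # Bulb figures from the example above; purely illustrative.
    MAINS_VOLTS = 230.0

    for watts in (15.0, 1000.0):
        amps = watts / MAINS_VOLTS       # current this bulb draws
        ohms = MAINS_VOLTS ** 2 / watts  # effective (hot) filament resistance
        print(f"{watts:6.0f} W bulb draws {amps:.2f} A (about {ohms:.0f} ohms)")
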
I think I am getting what you are talking about here, but it may be my
fault for not stating it clearly enough. So if I understand... a plugpack
will, let's say at 12 V, "offer" the device up to 1500 mA. The device I
plug it into will only take as much current as it needs, and in this
case that's 12V 1000mA. So if I used a plugpack of 12V but 600mA,
nothing would work because there wouldn't be enough current, and
probably nothing would happen? But use the 1500mA plugpack and all will
be fine, because the device will take only what is needed.

It's starting to make sense now that all the electronics books talk
about current being like flow in a pipe.

Can you confirm my suspicions?

Thanks for the help!!

Chris
 
Christopher A. Glaves <cglaves@yahoo.com> wrote:

Quick question: what happens if you supply more or less current than
a device needs? Also, my understanding is that the voltage needs to be
exact or else it won't work at all.

There are regulated supplies and unregulated supplies; if there is no
mention of regulation, it is an unregulated supply.

The regulated supplies give a precise voltage.

The unregulated supplies give a very imprecise voltage; usually they
give a much higher voltage when little current is drawn.

An unregulated supply may be marked "12V 1000mA", or sometimes
"12-18V max 1000mA". It may give up to 18V at very low currents, and
something like 12V at 1000mA.

You should never try to draw more current than the maximum marked on
the supply. If you try to draw 2000mA from the example supply above,
the voltage will sink to, say, 6V, and your device will not work
properly. The supply will get very warm and maybe blow a fuse; in some
cases, if it is of poor quality, it may even start a fire.
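
To see why, here is a rough Python sketch of my own (the linear model
is an assumption for illustration, but it reproduces the figures above)
treating the supply as an ideal 18V source behind a fixed internal
resistance:

    # Simplified model of the unregulated "12V 1000mA" supply above as
    # an ideal 18V source behind a fixed internal resistance. The
    # linear model is an assumption for illustration only.
    V_NO_LOAD = 18.0                  # volts with no load connected
    V_RATED, I_RATED = 12.0, 1.0      # marked output: 12V at 1000mA
    R_INTERNAL = (V_NO_LOAD - V_RATED) / I_RATED  # works out to 6 ohms

    def output_volts(load_amps):
        """Terminal voltage at a given load current."""
        return V_NO_LOAD - load_amps * R_INTERNAL

    for load_ma in (0, 1000, 2000):
        print(f"{load_ma:4d} mA load -> about {output_volts(load_ma / 1000.0):.0f} V")

    # 0 mA gives 18V, 1000 mA gives the marked 12V, and a 2000 mA
    # overload drags the output down to about 6V, as described above.

A real supply does not droop exactly linearly, but the trend is the same.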

If you need exact voltage you need a regulated supply.

Which devices will work with an unregulated supply is not an easy
question to answer. Many devices are made to work with unregulated
supplies and may accept a voltage a lot higher than they are marked
for; other devices will get destroyed.

If you are not sure, use a supply which fits both the voltage and
current markings, because unregulated supplies give the marked voltage
when the current drawn is at or somewhat below the rated maximum, but
not too far below it.

If you use a 12V 1000mA unregulated supply for a device which needs
12V 10mA, you may cause it to run on 18V 15mA, which may destroy it if
it is poorly made.
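
Using the same assumed 18V / 6-ohm model as in the sketch above, the
light-load case works out like this:

    # Same assumed model as above: a 10 mA load barely pulls the output
    # down, so the device sees nearly the full no-load voltage.
    V_NO_LOAD, R_INTERNAL = 18.0, 6.0
    print(f"10 mA load -> about {V_NO_LOAD - 0.010 * R_INTERNAL:.2f} V")  # ~17.94 V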

As you have noticed, there are unregulated supplies which can be set to
many different voltages, like 1.5, 3.0, 6.0, 7.2, 9.0, and 12V.
When you use such a supply, set it to its lowest range, connect it to
the device and turn the device on, then switch the supply upwards
until the device works properly, and leave it there.

I have used that method several times and it works well; it reduces
the risk of destroying a device when you have lost the original
supply and need to replace it.


--
Roger J.
 
Roger Johansson <no-email@home.se> wrote:

The unregulated supplies give a very imprecise voltage; usually they
give a much higher voltage when little current is drawn.

Let me add one thing.
If you have a valuable device and are not using the original supply
for it, you can reduce the risk of destroying it by making it a habit
to turn it off and on at the mains switch feeding the power supply,
instead of using the power button on the device itself.

Remember that such an unregulated supply can give a much higher voltage
when not loaded. That means that if the supply is connected to the
mains all the time and the device is turned on later, the device will
be subjected to the higher voltage at the moment it is turned on,
before it starts to draw the full current.

So a 12V device will be subjected to 18V for a brief moment when it is
turned on, until things have stabilized.

If the mains is switched instead, the power supply's output voltage
rises gradually; as it rises, the device starts to draw more current,
and by the time the voltage reaches 12V the device is already drawing
the current needed to keep it at 12V. The voltage never reaches 18V.

It also reduces the current drawn to zero when you are not using the
device, and you avoid other risks, like lightning destroying your
devices or something being shorted while you are not around.

So it is advantageous to use a single mains switch to turn on and off
a system which is powered by one or more wall warts.

In many cases this is not practicable and things work well with the
power supply turned on at all times, but it might be worth thinking
about in some cases.


--
Roger J.
 
