techknight
Well-known member
Circuits only draw the current they need from a power source. Once a circuit starts to fail, much more current can be drawn in a quick surge or latch-up condition, which can lead to a cascading failure if the supply is able to deliver that current. From a service standpoint, lower wattage/current availability is preferable: if the supply can't push out the amount of current the "surge" is requesting, FAR fewer components are damaged.
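To put some rough numbers on it (these values are made up purely for illustration): the damage done at the fault is the power dissipated there, P = V x I. Say a shorted junction ends up looking like about 0.5 ohms across a 5 V rail. Unrestricted, it wants I = 5 / 0.5 = 10 A, which is P = I^2 x R = 10^2 x 0.5 = 50 W burning inside one small part. If the supply can only source 1 A before it folds back, the voltage across the fault collapses to about 1 x 0.5 = 0.5 V, and dissipation at the fault is held to roughly 1^2 x 0.5 = 0.5 W, which the part can often survive.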
Maybe I should have been more clear. I have a tendency not to be clear sometimes.
But all this babble is moot if your circuitry is operating perfectly. You could hook it up to a 7.5 V supply rated for 400 A if you wanted to; it doesn't matter, because the machine is only going to draw the current it needs. It's when a failure begins that the available current matters. So if a transistor shorts, or a cap starts to short, and your supply can push 10 A, it can cause serious damage and cascading failure, because the current availability is there.
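Ohm's law is why the rating alone is harmless (again, illustrative numbers): a healthy board that looks like about 3.75 ohms across its 7.5 V rail draws I = V / R = 7.5 / 3.75 = 2 A, whether the supply behind it is rated for 3 A or 400 A. The rating is a ceiling, not a push. The danger only shows up when a fault drops that resistance toward zero and the supply is willing to fill the gap.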
Think of it like a bomb or something: if the detonator false-triggers and its yield is only 1 ton (the available power), it'll do far less damage than if it had a 10 megaton supply (maybe a bad example). So basically, if your power supply is current limited and a transistor shorts, it might get warm or something, but that's it. If you had a crapload of available current, consumption would drastically increase and cause the motherboard or other failed hardware to explode or catch fire.
But wattage is voltage times current (P = V x I). I am thinking of voltage as a steady supply, yes, with only the current variable: as current increases, wattage increases, the same as with voltage. But I'm assuming a rock-steady voltage in my explanations. Of course, there's also the factor that if your power supply is rated for only 2 amps and a short is pulling a 4 amp load, your voltage will drop significantly, preventing further damage, because it's a lower-wattage power supply than one that provides 4 or 6 amps.
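Working that last scenario through (hypothetical numbers): suppose the short looks like 1.25 ohms on a 5 V rail, so it's asking for 5 / 1.25 = 4 A, which would be 4 x 5 = 20 W at the fault. A supply that can only source 2 A will sag; the voltage across the fault drops to roughly I x R = 2 x 1.25 = 2.5 V, so dissipation at the fault is held to about 2 x 2.5 = 5 W, a quarter of the damage. A 4 A or 6 A supply would happily hold the rail at 5 V and deliver the full 20 W into the failed part.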
Well anyway, I am a repair technician by trade, so that's kind of my train of thought on the subject, even if it means nothing next to what others were saying. But hey....