NerdKits - electronics education for a digital generation



Support Forum » Is too much amperage a bad thing?

May 10, 2012
by Pew446

I'm very confused right now. I am trying to power my ATmega168 with a 5 V / 1.2 A power supply. If the chip takes 200 mA and my power supply outputs 1.2 A, that would definitely break it, right? Or since all the voltage is used, would it not matter? I've been searching all over the internet and can't find a clear answer for the life of me. Thanks for the help!

May 10, 2012
by pcbolt

Pew446 -

The current rating on a power supply usually indicates the maximum amperage that can be drawn before a fuse blows or the supply is damaged. The actual amperage is governed by the resistance between the positive and negative leads. So if you connected a 1000 ohm resistor between red and black, you can use Ohm's law to find the current being pulled through the circuit: V = I * R, so 5 = I * 1000, which gives I = 0.005 A, or 5 mA. The power supply doesn't force 1.2 amps through the circuit.

The ATmega168 has a maximum current rating of 200 mA, but if you hook it up with no LEDs or other outputs, it will draw far less current (usually around 5 mA). BUT, if you were to set an output pin high and connect that pin directly to ground without any resistance, you could draw enough current through the MCU to cook it. That's why it's a good idea to add a resistor between an MCU pin and an LED that is connected to ground (it's not mandatory, just a good idea).

