Hi. I should probably already know this, but I don’t, so I thought I’d ask here.
If I have a charger rated at 1.5 A and then try to charge a 2 Ah battery, what will happen?
- Is the battery going to be charged, but slower?
- Charged but not fully?
- Not charged at all?
You’re mixing up two units.
Let’s say your battery has a capacity of 2 Ah (ampere-hours) and your charger outputs 2 A (amperes). This means the charger would need 1 hour to fully charge your battery. Now take a 1 A charger: it would take 2 hours to charge the same battery.
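To make the division explicit, here’s a quick Python sketch (the function name and numbers are mine, and this ignores charging losses and taper, which make real charges take longer):

```python
def charge_time_hours(capacity_ah, charger_current_a):
    """Ideal charge time: capacity (Ah) divided by charging current (A)."""
    return capacity_ah / charger_current_a

print(charge_time_hours(2.0, 2.0))            # 2 Ah battery, 2 A charger -> 1.0 hour
print(charge_time_hours(2.0, 1.0))            # same battery, 1 A charger -> 2.0 hours
print(round(charge_time_hours(2.0, 1.5), 2))  # OP's case, 1.5 A charger -> 1.33 hours
```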
Clear and concise. Thanks.
Ah is capacity, A is flow rate. As long as the voltages are correct, it’ll charge eventually.
Ah == size of the bucket, A is the size of the hose you fill it with.
Very easy to understand analogy, thanks for that. It makes it very clear how this works. Doesn’t sound like too small a hose should cause problems then?
A straw for a swimming pool would be bad, but I think you’re ok
Bad only in a time sense. There’s nothing bad safety-wise; slow charging decreases heat and prolongs battery life.
So 2 Ah really means “2 amp-hours”. It measures how much charge the battery holds: basically, you can draw 2 amps for one hour from the battery under ideal conditions (i.e., it’s not an old battery, and everything about it is perfect: perfect temperature, perfectly isolated from vibration, all sorts of things. We never see that in the real world, however.)
alternatively, you could draw 1 amp for 2 hours, or 4 amps for half an hour.
1.5 amps is the rate at which the charger supplies charge to the battery, again under ideal conditions (temperature, etc.). A 1 amp charger would charge slower, while a 2 amp charger would charge faster.
The important thing is that the voltages match: 12 V or 18 V tend to be the most common. If the charger has too much amperage, it’s fine; the battery will only draw what it can use. Too little, and it’ll just take forever to charge.
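The draw examples above are the same division run the other way; a tiny sketch, with names of my own choosing and assuming ideal conditions (real batteries deliver less capacity at high draw):

```python
def runtime_hours(capacity_ah, load_current_a):
    """Ideal runtime: how long the battery can sustain a given current draw."""
    return capacity_ah / load_current_a

print(runtime_hours(2.0, 2.0))  # draw 2 A -> 1.0 hour
print(runtime_hours(2.0, 1.0))  # draw 1 A -> 2.0 hours
print(runtime_hours(2.0, 4.0))  # draw 4 A -> 0.5 hours
```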
Too much amperage for a circuit is fine; for a battery it is not. A circuit presents a known resistance and draws what is needed at a specific voltage. A battery’s resistance is very low when empty and changes over the course of a charge, and certain chemistries can be damaged above 1C.
Good answer and explanation. Thank you.
Great answers already; I’d like to add a few things. Lithium batteries like to be charged at 1C, meaning that if they have a capacity of 1.6 Ah they like to be charged at 1.6 A and will (theoretically) be fully charged in one hour (in practice it takes longer, though). You can charge most of them at a higher rate (e.g. at 3.2 A), but they should be rated for it. If you don’t know better, stick to a lower amperage. Too low an amperage should not be an issue (though I’m not sure); it will take ages, though.
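The C-rate is just charging current divided by capacity; a small sketch with the numbers from this comment (the function name is mine):

```python
def c_rate(charge_current_a, capacity_ah):
    """C-rate: charging current expressed as a multiple of the capacity."""
    return charge_current_a / capacity_ah

print(c_rate(1.6, 1.6))  # 1.0  -> charging at 1C
print(c_rate(3.2, 1.6))  # 2.0  -> 2C, only OK if the cell is rated for it
print(c_rate(1.5, 2.0))  # 0.75 -> OP's charger on a 2 Ah battery, gentle
```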
The voltage has to fit! Batteries change their voltage over the course of charging/discharging. This is more pronounced in older NiMH or NiCd (discard those) batteries, but it’s still relevant. A 12 V charger can charge your batteries to 12 V. Most chargers can handle different voltages and will usually select the correct voltage depending on the battery. This is important if you are charging Li-ions or LiPos (two types of lithium batteries), as overcharging them is NOT advised. They generally peak at around 4.2 V per cell, I think, with a nominal (working) voltage of 3.7 V per cell. Some chargers require you to select the number of cells. Charging a 2-cell LiPo with 12 to 13 volts is a bad idea, I wager, and charging a 3-cell LiPo with 9.something volts won’t work (it won’t drain it, but the pack’s voltage is higher than what the charger provides; current would like to flow the other way if it could).
TL;DR: If you can adjust the charging parameters manually, or you are doing something the manual states you shouldn’t be doing, educate yourself. If, however, you only plug your phone (with a painfully small battery at 2 Ah) into a 1.5 A × 5 V = 7.5 W charger, you are perfectly fine.
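The cell-count arithmetic above can be sanity-checked the same way (constants and names are mine; 4.2 V and 3.7 V are the typical lithium-cell figures mentioned in the comment):

```python
CELL_MAX_V = 4.2      # typical Li-ion/LiPo full-charge voltage per cell
CELL_NOMINAL_V = 3.7  # typical nominal ("working") voltage per cell

def pack_voltage(cells, per_cell_v):
    """Series pack voltage: number of cells times per-cell voltage."""
    return cells * per_cell_v

print(round(pack_voltage(2, CELL_MAX_V), 1))  # 8.4  -> why 12-13 V into a 2-cell pack overcharges it
print(round(pack_voltage(3, CELL_MAX_V), 1))  # 12.6 -> why ~9 V can't push current into a 3-cell pack
print(1.5 * 5.0)                              # 7.5  -> the 1.5 A x 5 V = 7.5 W phone charger
```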
The charger is made for the battery; it’s just that the newer batteries have somewhat more capacity. Thanks for all the extra details.