The battery reportedly has a rated capacity of 10,000mAh. It appears that a vivo phone may soon join the growing list of smartphones featuring 10,000mAh or...
This is a great example of why mAh is a stupid metric. Double the voltage and you have the same capacity but through the power of marketing you can mislead people into thinking the battery life is increased.
The proper metric is Wh.
Not in electronics. Wh is a pretty difficult metric to work with in electronics design (and thus in speccing how long a battery will last).
I have said this before: ICs use mA as their consumption rating because they might have an operating range of 1.8V-3.6V, or 4.5-10V, or something like that, and they consume about the same current across that spectrum while the power varies. This is why low power systems often use 1.8V.
Batteries also vary in delivered power at a constant load. A circuit can pull a constant 100mA, but a lithium ion cell delivers 420mW at full charge (4.2V) and then falls continuously until, near cutoff (2.5V), it delivers only 250mW, almost half as much.
What is easier to calculate?
Integrating across a variable voltage domain for the source, and then, since each component draws variable power, integrating each component over its voltage range as well. Oh, and the remaining battery capacity in Wh is also nonlinear, so when estimating state of charge you have to balance a nonlinear source against a differently nonlinear load, integrated over time, all on a 200MHz MCU (and that's fast) that is trying to do 50 other, more important things.
Versus: the battery can deliver this set current for this long, the circuit pulls this much current, so the battery lasts X hours. Estimating life cheaply is just:
"get starting SoC from memory and voltage"
"measure current once"
"measure current again"
Current * time = mAh used.
Save.
Again, not saying it is “correct”, but significantly easier on all levels.
For the consumer, why does it matter? There are absolutely no specs ever given for actual power used. Does your phone use 1W or 5W or 100mW on average? Never given.
Batteries are literally just "bigger = better". Using Wh instead of mAh would not change this at all. The only thing it would do is expose the 1% that try to fudge the numbers while everyone else just fudges power consumption.
Oh, you got phone X because phone Y only had a 13Wh battery instead of 18Wh. Too bad, phone X uses an average of 9W and only lasts for 2 hours; phone Y used 0.5W.
Absolutely agree that mAh should never have been used as a measure of battery capacity, but increasing the voltage (while keeping the same actual capacity in Wh) makes the mAh rating lower, not higher.
A 1 Wh, 1 V battery can provide 1 A (1000 mA) for an hour (1 V * 1 A = 1 W, 1 Wh / 1 W = 1 h) - that would be 1000 mAh.
A 1 Wh, 2 V battery can only provide 1 A for half an hour (2 V * 1 A = 2 W, 1 Wh / 2 W = 0.5 h), and that gives you only 500 mAh.
lower, higher, whatever, it’s manipulation of pointless numbers.
Chinesium phones claim 20000mAh at the same voltage as the usual 5000mAh phone batteries. Wh would be better, yet for now we have what we have: the same voltage batteries for most phones.
Is this not measured at a standard 3.7V for phone batteries?
Edit: In the article it’s measured at 4.53V.
Usually yes, but it doesn’t have to be.