Hello, and thank you for having this site! I have a novice question, please: why do the listed output amperages of many power adapters specify DC amps ("A") when what I actually measure looks like AC microamps instead? And why does the decimal point often seem to be off by two places?
For example, I'm using a Klein Tools MM200 multimeter to test the output of a small USB hub power adapter with the following printed specs on the adapter:
Input: 100-120V ~ 50-60Hz
Output: 5V = / 2.6A
When I test the DC voltage, it's correct (5.27 V DC), but when I test the DC amps, it seems totally wrong (0.006 A). However, when I switch to testing amps in AC mode, the number matches better (0.025 A), but it's still off by orders of magnitude. Why do I need to be in AC mode for a DC output adapter? Why is the decimal off by two places?
I have another laptop adapter whose amperage output is only close to correct if I switch to AC mode and move the decimal over by one place: the listed output is 1.5 A DC, but it measures 0.031 A in DC mode and 0.12 A in AC mode.
Finally, when I test the output of a second laptop charger, the voltage is correct and the amps at least read in DC mode, but while the charger lists the output at 3.3 A, the measured output is 0.033 A. The decimal is off by two places again.
I know that if I move the multimeter selector from A to mA to uA, the decimal will be "fixed," but the reading still doesn't match what's printed on the adapter/charger, which lists just "A", not "mA" or "uA".
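For what it's worth, the decimal shifts between the meter's A, mA, and uA ranges are just powers of ten (1 A = 1,000 mA = 1,000,000 uA). A minimal sketch of that prefix arithmetic, using the readings from above (the helper names are my own, just for illustration):

```python
# SI prefix arithmetic: 1 A = 1_000 mA = 1_000_000 uA.

def amps_to_milliamps(amps):
    """Convert a reading in amperes to milliamperes."""
    return amps * 1_000

def amps_to_microamps(amps):
    """Convert a reading in amperes to microamperes."""
    return amps * 1_000_000

# Readings mentioned in the question, re-expressed in smaller units:
print(amps_to_milliamps(0.006))   # 6.0   -> 0.006 A is 6 mA
print(amps_to_milliamps(0.033))   # 33.0  -> 0.033 A is 33 mA
print(amps_to_microamps(0.006))   # 6000.0 -> 0.006 A is 6000 uA
```

So switching the selector from A to mA only rescales the same measurement; it doesn't change the underlying current, which is why the number still doesn't match the 2.6 A printed on the adapter.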
Thank you for any insight!