100 mA Is a Myth

A lot of electronics ends up being connected to USB - whether it needs a computer or not. USB and its 5 V have become an unofficial standard power supply for the world. And thus it makes me crazy when I hear from people that you can only draw 100 mA from USB without going through USB enumeration or else something horrible is going to happen. And that is bullshit.

Yes, there is such a thing as USB negotiation where each device can request a certain amount of current from its host. Older USB devices could ask for up to 500 mA (in units of 100 mA) while USB 3.0 devices have a maximum of 900 mA (in units of 150 mA). If you go higher than that, you get into the category of USB Power Delivery 3.0 and the beauty of multiple voltages, which we'll conveniently ignore here so we can deal only with 5 V.
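
If you are curious what that request actually looks like on the wire, here is a minimal sketch of a USB 2.0 configuration descriptor in C. The field names come from the USB specification; the struct layout itself is just illustrative. The interesting part is bMaxPower, which is counted in 2 mA steps, so a value of 250 means "I may draw up to 500 mA".

```c
#include <stdint.h>

/* USB 2.0 configuration descriptor (illustrative layout). */
typedef struct __attribute__((packed)) {
    uint8_t  bLength;             /* descriptor size in bytes (9) */
    uint8_t  bDescriptorType;     /* 0x02 = CONFIGURATION */
    uint16_t wTotalLength;        /* length of this plus following descriptors */
    uint8_t  bNumInterfaces;
    uint8_t  bConfigurationValue;
    uint8_t  iConfiguration;
    uint8_t  bmAttributes;        /* 0x80 = bus powered */
    uint8_t  bMaxPower;           /* maximum current, in 2 mA units */
} usb_config_descriptor_t;

static const usb_config_descriptor_t cfg = {
    .bLength             = 9,
    .bDescriptorType     = 0x02,
    .wTotalLength        = 9,      /* a real device adds interface/endpoint descriptors */
    .bNumInterfaces      = 1,
    .bConfigurationValue = 1,
    .iConfiguration      = 0,
    .bmAttributes        = 0x80,   /* bus powered, no remote wakeup */
    .bMaxPower           = 250,    /* 250 x 2 mA = 500 mA requested */
};
```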

The misconception around USB negotiation comes from the ability of USB devices to self-report their maximum usage and, likewise, the ability of the chipset to say no if multiple devices go over some internal limit. Ideally, if you have four USB ports on the same power bus with a total of 1 A available and you connect four devices using 300 mA each, three would get a positive response and the fourth would get its request denied.
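
For illustration, here is a hypothetical sketch of that "ideal" host-side accounting in C - a shared 1 A budget where each request is granted only while it still fits. It does not model any real chipset; it just shows the bookkeeping the specification imagines.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

static uint32_t budget_ma = 1000;    /* assumed shared bus budget */

/* Grant a request only if it fits into the remaining budget. */
static bool grant_power(uint32_t requested_ma) {
    if (requested_ma > budget_ma) {
        return false;                /* over budget - request denied */
    }
    budget_ma -= requested_ma;
    return true;
}

int main(void) {
    for (int dev = 1; dev <= 4; dev++) {
        printf("device %d asking for 300 mA: %s\n",
               dev, grant_power(300) ? "granted" : "denied");
    }
    return 0;
}
```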

What might not be completely clear from this story is that the bus has no means to either measure current or enforce a device's removal. The whole scheme depends on the device accurately reporting its (maximum) usage and actually turning itself off if it receives a "power denied" response. And yes, this self-reporting works about as well as you can imagine. As there is no way for the computer to ensure either the accuracy of the data or the device's compliance, everybody simply ignores it.

Computer manufacturers decided to save some money and not verify a device's consumption. So what if a device that reported 300 mA is using 350 mA? Should you disconnect it just because it uses a lousy cable with a big loss? Why would you even care if that is the only device connected? What do you do if that device jumps to 500 mA for a fraction of a second? What do you do with devices that report nothing (e.g. coffee heaters)? Is nitpicking really worth a bad user experience ("my damn device is not working!") and is it worth the extra cost to implement (and test)?

While there were some attempts at playing "power cop" in the early days, with time all manufacturers decided to over-dimension their power bus to handle more power than the specification requires and simply place a cheap polyfuse on it to shut things down in the case of a gross overload.

Such friendly behavior has culminated in each port of pretty much any laptop made in the last five years being capable of 1 A minimum. And you can test it - just short the data lines together (or use UsbAmps' high-power option) - and you will see you can easily pull 1 A out of something officially specified for half of that. And that is without any power negotiation - courtesy of the USB Battery Charging specification.

This leniency from manufacturers in turn generated a whole category of completely passive devices. Why the heck would a USB light have a chip more expensive than all its other components together just to let the computer know its power usage? That is just wasting money when you can get the power whether you inform the computer or not.

Devices that have to communicate with the computer over USB kept their self-reporting habits just because they had to use slightly smarter chips to interface with the USB bus. And all those chips had to have power negotiation built in to be certified, so there was literally no extra cost in using the feature.

And even then they would fudge the truth. Quite often an external CD drive or hard disk would actually need more than 500 mA to spin up, and there was no way in the early days to request more than 500 mA. So they lied.

Such (lying) behavior was essentially blessed later by the USB Battery Charging specification, which uses not a USB message but voltage levels as the limiting factor. It essentially says you can pull as much as you want while the voltage stays high enough; once the voltage starts dropping, ease off a bit. That point can come at 1 A, 1.5 A, or 2 A - it all depends on the power source/computer you are using.
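
In code, that logic is about as simple as it sounds. Here is a rough device-side sketch in C, assuming hypothetical helpers read_vbus_mv() and set_charge_current_ma() for measuring VBUS and controlling the charger; the 4.4 V droop threshold and the step size are illustrative, not quoted from the specification.

```c
#include <stdint.h>

extern uint32_t read_vbus_mv(void);                  /* hypothetical: measure VBUS in mV */
extern void     set_charge_current_ma(uint32_t ma);  /* hypothetical: set charger current limit */

#define VBUS_DROOP_MV 4400u   /* back off once VBUS sags below this (illustrative) */
#define STEP_MA        100u
#define MAX_MA        2000u

void ramp_charge_current(void) {
    uint32_t current_ma = 500u;          /* conservative starting point */
    set_charge_current_ma(current_ma);

    /* Keep asking for a bit more while the voltage holds up. */
    while (current_ma + STEP_MA <= MAX_MA && read_vbus_mv() > VBUS_DROOP_MV) {
        current_ma += STEP_MA;
        set_charge_current_ma(current_ma);
    }

    /* Voltage started sagging - ease off a bit. */
    if (read_vbus_mv() <= VBUS_DROOP_MV && current_ma > STEP_MA) {
        set_charge_current_ma(current_ma - STEP_MA);
    }
}
```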

Device manufacturers have become quite versed in this too. Since the USB Battery Charging specification was done in a hardware-friendly manner, there was no extra cost to bear. The only task was to determine whether the computer supports the (then new) specification. If yes, pull current until the voltage starts dropping; if not, limit yourself to 500 mA. How do they recognize it? You've guessed it - by checking whether the data lines are shorted together.
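
For the curious, here is a simplified device-side sketch in C of that check - roughly what BC 1.2 calls primary detection: drive a small voltage onto D+ and see whether it shows up on D-. The helper functions are hypothetical GPIO/ADC wrappers and the thresholds are approximate.

```c
#include <stdbool.h>
#include <stdint.h>

extern void     drive_dplus_mv(uint32_t mv);   /* hypothetical: source ~0.6 V onto D+ */
extern uint32_t read_dminus_mv(void);          /* hypothetical: measure D- in mV */

bool charging_port_detected(void) {
    drive_dplus_mv(600u);                      /* BC 1.2 uses roughly 0.5-0.7 V here */

    /* On a charging port D+ and D- are tied together, so the voltage
     * shows up on D-; on a regular data port D- stays near ground. */
    return read_dminus_mv() > 300u;            /* threshold roughly 0.25-0.4 V */
}
```

If the check fails, the device assumes a plain data port and limits itself to the old 500 mA.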

Due to the powers of backward compatibility you can pull essentially as much current as you want from your PC. Yes, you are still limited by the fuse (quite commonly 2 A or more per port pair), you still have thin USB cables and their voltage drop (with the accompanying losses) to deal with, and yes, devices - especially phones - will still self-limit to avoid tripping the aforementioned fuse. But otherwise it is the wild west.

Unless you are making a device that has to be certified, stop worrying about power negotiation. If your device already has a USB transceiver onboard and you need to program it anyhow, go for the standard and configure the current you need. But if you don't have such a chip, or programming is just too much of a hassle, simply ignore it and the world is not going to self-destruct. I promise.

6 thoughts to “100 mA Is a Myth”

  1. No idea where you got the idea that a USB bus cannot measure current or disconnect a device. They most certainly can, and do. For example, macOS gives this message: “To prevent damaging your computer, the USB device drawing too much power has been disabled. Other devices may have also been disabled. When you disconnect the device drawing too much power, your other USB devices will be enabled again.”

    For sure, not every USB host implementation in existence is going to actually implement current monitoring in both hardware and software. But the common hosts (computers, phones, and tablets) most certainly do. Will they whine about a brief overcurrent? Probably not. And yeah, they’re generally fairly tolerant, but if a device goes too far, it will get shut down. (There are ICs in the host implementation that actually can disconnect the power.)

    And in addition to the “smart” disconnecting, they do, as you said, typically have fuses (mostly self-resetting fuses like polyfuses, sometimes one-time) in case of gross overloads, as might happen with a short circuit.

    1. The idea comes from every USB (pre-Type-C) implementation I have ever tried to pull current from.

      Even the Mac implementation doesn’t really measure USB current, nor does it compare it against what the device asked for. It just detects whether current usage goes way too high, using a simple comparator. The easiest way to check this is to make a simple MCP2221A-based board (any chip with a programmable current request will do) and set its enumeration level to 100 mA while pulling 500 mA or higher (a resistor will do). macOS will not complain in that situation. It will complain if you go over 650 mA on the old MacBook Pro I’ve tried it on, but I can bet this limit changes based on the exact model and it might even vary slightly between machines in the same series.

      So yes, most USB ports have some form of protection or other. Some are even smartypants like Macs, but none that I’ve seen actually do “current accounting” based on what the device reports during its USB negotiation process.

  2. “What might not be completely clear from this story is that the bus has no means to either measure current or enforce a device’s removal.”

    Not true. iOS devices measure the current and enforce the device’s removal.

    1. Maybe they limit it if you go overboard, but I actually tried a few MacBooks and each of them allowed 500 mA when the device only asked for 100 mA.
