I was just thinking: if you're on your computer anyway, would charging your phone from it just use some of the excess electricity that your computer would have wasted, or would it be worse than charging your phone from a wall charger while using your laptop separately?
The other answers have touched upon the relative efficiencies between a phone charger and a desktop computer’s PSU. But I want to also mention that the comparison may be apples-to-oranges if we’re considering modern smartphones that are capable of USB Power Delivery (USB PD).
Without any version of USB PD – or its competitors like Quick Charge – the original USB specification only guaranteed 5 V and up to 500 mA. That’s 2.5 W, which was enough for USB keyboards and mice, but is pretty awful for charging a phone. But even an early-2000s motherboard would provide this amount, since the spec requires it.
The USB Battery Charging (USB BC) spec brought the limit up to 1500 mA, but that’s still only 7.5 W. And even in 2024, there are still (exceedingly) cheap battery banks that don’t even support USB BC rates. Motherboards are also a mixed bag, unless they specifically say what they support.
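Just to put numbers on those tiers, it’s straight P = V × I arithmetic (these are spec ceilings, not what any given port actually delivers):

```python
# Quick P = V * I arithmetic for the USB power levels mentioned above.
# These are spec ceilings; any given port or charger may offer less.
levels = [
    ("USB 2.0 default",      5.0, 0.5),   # 2.5 W
    ("USB Battery Charging", 5.0, 1.5),   # 7.5 W
    ("USB PD 9 V profile",   9.0, 3.0),   # 27 W
    ("USB PD 20 V / 5 A",   20.0, 5.0),   # 100 W
]

for name, volts, amps in levels:
    print(f"{name:>22}: {volts:>4} V x {amps} A = {volts * amps:.1f} W")
```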
So take, for example, the Samsung S20 (from the last smartphone era that shipped a charger in the box): the included charger is capable of 25 W charging, and so is the phone. Unless you bought the S20 Ultra, which comes with the same charger but supports 45 W charging.
Charging the S20 Ultra on a 2004-era computer will definitely be slower than on the stock charger. But charging it with a 2024-era phone charger would be faster than with the included charger. And then your latest-gen laptop might supply 60 W, but because the phone maxes out at 45 W, it makes no difference.
You might think that faster and faster charging should always be less and less efficient, but it’s more complex since all charging beyond ~15 Watts will use higher voltages on the USB cable. This is allowable because even the thinnest wire insulation in a USB cable can still tolerate 9 volts or even 20 volts just fine. Higher voltage reduces current, which reduces resistive losses.
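To make the “higher voltage, lower current, lower I²R loss” point concrete, here’s a rough sketch; the cable resistance is just an assumed round number, not a measured figure:

```python
# Rough illustration of why PD raises the voltage instead of the current.
# Assumes a 0.25-ohm round-trip cable resistance; real cables vary.
CABLE_RESISTANCE_OHMS = 0.25

def cable_loss_watts(power_w, voltage_v):
    current_a = power_w / voltage_v                  # I = P / V
    return current_a ** 2 * CABLE_RESISTANCE_OHMS    # P_loss = I^2 * R

for volts in (5, 9, 20):
    loss = cable_loss_watts(27, volts)               # same 27 W delivered
    print(f"27 W at {volts:>2} V: ~{loss:.2f} W lost in the cable")
```

(Delivering 27 W at 5 V would also need 5.4 A, which is more than even a 5 A e-marked cable is rated for; that’s another reason the spec reaches for 9 V and 20 V.)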
The gist is: charging is a patchwork of compatibility, so blanket statements on efficiency are few and far between.
Your computer doesn’t “waste” electricity; power draw is on-demand. A PSU generally has three “rails”: 12 V (this powers most of the devices), 5 V (for peripherals/USB), and 3.3 V (IIRC memory modules use this). Modern PSUs are switched-mode power supplies, which use a switching voltage regulator that is more efficient than a traditional linear regulator.
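To illustrate the regulator point with toy numbers: a linear regulator burns the whole voltage difference as heat, so its best-case efficiency is just Vout/Vin, whereas a decent switching regulator converts most of the power regardless of the step-down ratio (the 90% figure below is just a typical-ballpark assumption):

```python
# Toy comparison of linear vs. switching regulation for a 12 V -> 5 V step-down.
V_IN, V_OUT, LOAD_A = 12.0, 5.0, 2.0

# Linear regulator: input current equals output current,
# so efficiency can never exceed Vout / Vin.
linear_eff = V_OUT / V_IN                        # ~42%
linear_waste_w = (V_IN - V_OUT) * LOAD_A         # dropped across the regulator as heat

# Switching regulator: assume ~90% conversion efficiency (ballpark).
switch_eff = 0.90
switch_waste_w = V_OUT * LOAD_A * (1 / switch_eff - 1)

print(f"Linear:    {linear_eff:.0%} efficient, {linear_waste_w:.1f} W wasted")
print(f"Switching: {switch_eff:.0%} efficient, {switch_waste_w:.1f} W wasted")
```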
The efficiency of the PSU/transformer would be what determines whether one or the other is more wasteful. Most PSUs (I would argue any PSU of quality) will have an 80 Plus rating that defines how efficiently it can convert power. I am not familiar enough with modern wall chargers to know what they’re testing at… I could see low-end wall chargers using more wasteful designs, but a high-quality rapid wall charger is probably close to, if not on par with, a PC PSU. Hopefully someone with more knowledge of these can weigh in.
Charging at higher voltage is more efficient (at least for electric vehicles), but a phone battery is so relatively small that it barely matters. Avoid wireless charging, as it is extremely inefficient.
USB PD (Power Delivery) actually does use a higher voltage for more wattage. Standard USB is limited to 5 V at 0.5 A, and sometimes up to 5 V at 2 A on quick chargers. But PD chargers can give 20 V at 3 A for 60 W, or even 5 A (100 W) with properly rated cables. The newer PD spec (Extended Power Range) even goes up to 48 V at 5 A for 240 W. This is all determined by a negotiation between the charger, the cable (which does have a small chip for this purpose), and the device, so PD chargers must support multiple voltages.
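The negotiation boils down to: the charger advertises the voltage/current combinations it offers, the cable’s chip caps the current, and the device picks the best offer it can actually use. A very loose sketch of that selection logic (not the real PD wire protocol, just the idea; the numbers are made up but typical):

```python
# Very loose sketch of PD-style power negotiation: the source advertises its
# power data objects (PDOs), the cable caps the current, and the sink picks
# the highest-wattage offer it can actually use.
SOURCE_PDOS = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 5.0)]  # (volts, amps)

def negotiate(pdos, cable_max_amps, sink_max_watts):
    best = None
    for volts, amps in pdos:
        amps = min(amps, cable_max_amps)           # cable chip limits current
        watts = min(volts * amps, sink_max_watts)  # device limits total power
        if best is None or watts > best[2]:
            best = (volts, amps, watts)
    return best

# A 100 W laptop: a plain 3 A cable caps it at 60 W, a 5 A e-marked cable allows 100 W.
print(negotiate(SOURCE_PDOS, cable_max_amps=3.0, sink_max_watts=100.0))  # (20.0, 3.0, 60.0)
print(negotiate(SOURCE_PDOS, cable_max_amps=5.0, sink_max_watts=100.0))  # (20.0, 5.0, 100.0)
```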
Modern gallium nitride (GaN) based phone chargers can be around 95% efficient.
The very best, most expensive PC power supplies on 115 V AC will only reach 94% efficiency, and only at the very specific point of 50% of the power supply’s rated wattage. So if you have a 500-watt power supply and aren’t using almost exactly 250 watts, you aren’t getting that 94% efficiency. Regular power supplies under normal, variable load conditions are going to be somewhere in the 80% efficiency range. If the PC is idle, that efficiency can drop to 20% (but it’s fine because it’s only a few watts).
https://forum.level1techs.com/t/super-high-efficiency-300-400w-psu/184589/2
So using a modern gallium nitride standalone charger will be more efficient. It will be far more efficient if you use that standalone charger instead of charging off your PC while your PC is idle.
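As a back-of-the-envelope comparison, assuming the efficiency figures above and ignoring the extra 12 V → 5 V conversion on the motherboard (which only makes the PC path look worse):

```python
# Rough wall-draw comparison for delivering 25 W to a phone.
# Efficiency numbers are ballpark assumptions, not measurements.
PHONE_W = 25.0

scenarios = {
    "GaN wall charger (~95%)":        0.95,
    "PC PSU, typical load (~80%)":    0.80,
    "PC PSU at low load (assume 60%)": 0.60,
}

for name, eff in scenarios.items():
    wall_w = PHONE_W / eff
    print(f"{name:>33}: ~{wall_w:.1f} W from the wall ({wall_w - PHONE_W:.1f} W lost)")
```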
If you’re bothered about overall waste, consider that some batteries degrade more slowly if you charge them more slowly. I tend to prefer a slow charger when I can.
I remember seeing an experiment saying that the difference is negligible. Even if it isn’t, it’s far more important to keep your battery between 20 and 80 percent at all times.
“excess electricity that your computer would have wasted”
Yeah, this is not how electricity or computers work.
It reaaaaaally doesn’t matter. If you charged your phone from empty to full every day it might cost you a dollar. Per year.
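If you want to sanity-check that ballpark, the math is short; the battery size, charging efficiency, and electricity price below are assumptions you can swap for your own numbers:

```python
# Ballpark yearly cost of one full phone charge per day.
# All three inputs are assumptions; plug in your own.
BATTERY_WH = 15.0          # roughly a 4000 mAh battery at ~3.85 V
CHARGE_EFFICIENCY = 0.80   # combined losses in charger + phone
PRICE_PER_KWH = 0.15       # USD

daily_kwh = (BATTERY_WH / CHARGE_EFFICIENCY) / 1000
yearly_kwh = daily_kwh * 365
print(f"~{yearly_kwh:.1f} kWh/year, about ${yearly_kwh * PRICE_PER_KWH:.2f}/year")
```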
I was just curious; also, I was thinking more about energy efficiency, for environmental reasons.
In that case, you can make it a point to charge when the grid is “cleaner” - usually overnight. Your electricity costs may be cheaper then anyway.
The Apple Home app shows a grid forecast for your location, with cleaner times highlighted in green. I’m sure they pull this info from the utility company, so the info should be available in other smart home apps or maybe even your utility’s website.
But like others said, phone charging is very minimal. We’re talking about a 20W charger vs. say, a 1500W air fryer. Running larger appliances off-hours is a bigger deal - dishwasher, laundry, etc.
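To put that comparison in numbers (the charge time and cook time are rough assumptions):

```python
# Energy comparison: one full phone charge vs. one air-fryer session.
phone_charge_wh = 20 * 1.0      # ~20 W charger running for about an hour  -> ~20 Wh
air_fryer_wh    = 1500 * 0.33   # 1500 W appliance running ~20 minutes     -> ~500 Wh

print(f"Phone charge: ~{phone_charge_wh:.0f} Wh")
print(f"Air fryer:    ~{air_fryer_wh:.0f} Wh (~{air_fryer_wh / phone_charge_wh:.0f}x a phone charge)")
```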
Overnight? I thought it would be cleaner during the day, because that’s when the sun shines. I haven’t had an iPhone in a while, but I’ll have a look into grid forecasts. I still use an air fryer; not sure what the wattage is, though I would assume it’s similar to an oven.
I’d like to remind everyone of the “vampire effect” of wall-wart chargers: if you just leave them plugged into the wall waiting for you to connect a device, you’re constantly wasting a bit of electricity. That should also factor into the efficiency comparison with the already-plugged-in computer or laptop.
Someone correct me, but unless your charger is warm to the touch, this is a very insignificant amount of power over a year.
Your TV pretending to be off probably draws more every couple of days.
Anything with a remote, anything with a screen, way too many “computerized” appliances. Leaving a computer on draws far more than any of those.
If you want to minimize standby power without giving up too much convenience, libraries or power companies will usually let you borrow an AC power meter free of charge.
You can use that to check the standby power of your various devices. For example, I have an amplifier that pulls nearly 15 W in standby; since finding that out, it lives on a smart plug.
However my TV pulls less than 1W, and at that point I prefer the convenience of just being able to use the remote to turn it on.
(Also keep in mind with the smart plug solution that the plug itself will pull a little bit of power too, this will pretty much always be <1W though.)
The vampire effect was a real problem with older transformer-based wall warts. If you still have one that feels heavy and solid, or has a single fixed input voltage and frequency on the label, this is you.
If it feels light, almost hollow, or the label shows a wide range of input voltages and frequencies, it uses an order of magnitude less power and really doesn’t add up anymore. I believe it’s something like a couple of cents per year.
Edit: found something from 14 years ago calculating the worst case scenario of leaving a wall wart plugged in as 7¢/month but I don’t believe it’s anywhere near that expensive for a modern one …. Although it’s probably more telling that I didn’t find anything more recent with a price estimate.
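For anyone who wants to check figures like that themselves, the arithmetic is trivial; the standby wattages and electricity price below are just placeholders:

```python
# Yearly cost of leaving something plugged in, for a given standby draw.
PRICE_PER_KWH = 0.15  # USD; adjust for your utility

def standby_cost_per_year(standby_watts):
    kwh_per_year = standby_watts * 24 * 365 / 1000
    return kwh_per_year * PRICE_PER_KWH

# Modern wall wart, old transformer wall wart / TV, amplifier from the comment above.
for watts in (0.1, 1.0, 15.0):
    print(f"{watts:>5} W standby: ~${standby_cost_per_year(watts):.2f}/year")
```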
Unless your chargers are generating a noticeable amount of heat (and they shouldn’t be), the amount of electricity they are using is simply negligible. Electricity cannot simply evaporate into thin air.