# Forum


Heads up: 120V charging less efficient

While you can charge at 120V (and several people on this forum do this or have done this while waiting for a 240V plug to be installed), it turns out to be quite a bit less efficient in energy usage than charging using any 240V power source.

Tesla has recently updated their charging calculator (click here: http://www.teslamotors.com/charging#/calculator) and by playing with the dials you can see how much actual energy the car needs to charge using different plugs and power sources.

If you use a 120V plug to charge instead of a NEMA 14-50 240V plug, you'll use about 32% more energy (and money) to charge up your car.

Electricity is cheap enough that this isn't the end of the world: if you drive 1,000 miles a month (12,000 miles a year) at \$0.12/kWh, that works out to an extra \$12.70 per month in electricity costs.
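That \$12.70 figure can be sanity-checked with a quick back-of-the-envelope calculation. The consumption figure below (~330 Wh/mile) is an assumption chosen to match the quoted number, not something stated in the thread:

```python
# Rough check of the extra monthly cost of 120V charging quoted above.
# 330 Wh/mile is an assumed average consumption, not a Tesla spec.
miles_per_month = 1000
wh_per_mile = 330            # assumed average consumption
rate_per_kwh = 0.12          # $/kWh
overhead = 0.32              # ~32% extra energy at 120V vs 240V

kwh_from_battery = miles_per_month * wh_per_mile / 1000   # 330 kWh/month
extra_cost = kwh_from_battery * overhead * rate_per_kwh
print(f"${extra_cost:.2f}")  # ~$12.67, close to the $12.70 quoted
```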

Well, is part of this related to the vampire drain that happens when you have to leave the car sitting for so much longer?

This just refers to the inefficiencies when charging. Of course, vampire drain would also account for some kWh depleted from the battery.

A modern transformer that lost 30% of its energy converting between voltages would be tossed out the window. That's not what's happening here. I'm sure this has to do with more realistic assumptions in the calculator about other drains during the extended charging period.

It would be helpful for someone with actual knowledge of the calculator assumptions to chime in.

I am not positive about this, but when I got my Roadster the technician told me that 240V charging is more efficient because the air conditioner runs during charging to keep the batteries cool. Charging at 120V takes longer, which means more air-conditioner run time, which means more energy needed. It makes sense to me, but I cannot confirm this assumption.

Makes sense. The car needs a minimum level of current to perform basic maintenance while charging, i.e., maintaining the battery temperature and other parameters. This stays pretty much the same as you ramp up the current, so the higher the current the more efficient the charging because the current used by the basic battery maintenance functions becomes a much smaller percentage of the total current being used.
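The fixed-overhead idea above is easy to put in numbers. This is a toy model with assumed figures (a 300 W housekeeping load, standard 120V/12A and NEMA 14-50 240V/40A draws), not measured Tesla values, but it reproduces roughly the efficiency gap the calculator shows:

```python
# Toy model: a fixed "housekeeping" load (battery thermal management,
# electronics) runs for the whole charge. 300 W is an assumed figure.
def charge_efficiency(charge_power_w, overhead_w=300):
    """Fraction of drawn power that actually reaches the battery."""
    return charge_power_w / (charge_power_w + overhead_w)

p_120v = 120 * 12    # 1.44 kW from a standard 120V/12A outlet
p_240v = 240 * 40    # 9.6 kW from a NEMA 14-50 at 40A
print(f"120V: {charge_efficiency(p_120v):.0%}")   # ~83%
print(f"240V: {charge_efficiency(p_240v):.0%}")   # ~97%
```

The fixed load is the same either way; it just gets amortized over far more charging power at 240V.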

Cooling could be part of it, but I doubt it accounts for that much. Less heat is generated at lower charging power, so if the cooling were running as hard as it does at 240V, wouldn't the batteries actually be getting cold?

@soma, a lot of what happens with the battery cooling also depends on ambient air temperature. So even at 120V, we'll probably see variation between a Model S in Arizona and one in Minnesota.

Um, but this issue is about a decrease in charging efficiency vs. voltage, independent of ambient temperature...

I think we're missing something here.
Here's what I know that helps me misunderstand this:

The line loss is related to the current by the formula P = I^2 * R, where I is the current and R is the resistance of the circuit.

If you double the current, to (nominally) double your charging speed, then the power lost to that resistance goes up by a factor of 4 (the resistance itself doesn't change). Now, in this case the resistive loss is a relatively small part of the total power usage, but it tells me that there should be decreased efficiency as the current rises.
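The I²R scaling is simple to illustrate. The resistance value below is an arbitrary assumption for the sake of the example, not a real wiring or battery figure:

```python
# Resistive loss scales with the square of current: P_loss = I^2 * R.
# 0.1 ohm is an assumed fixed circuit resistance for illustration.
def resistive_loss(current_a, resistance_ohm=0.1):
    return current_a ** 2 * resistance_ohm

print(f"{resistive_loss(12):.1f} W")   # loss at 12 A
print(f"{resistive_loss(24):.1f} W")   # doubling the current quadruples the loss
```

So on its own this term would favor *lower* current, which is the opposite of what the calculator shows; the fixed housekeeping load during a longer charge evidently dominates.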

Of course, I am not correcting for the different 'R' values of the wires, and maybe the smaller 120V cabling makes more of a difference than I think.

I know enough to be dangerous. Anyone else know enough to be safe?

Yeah, the dangerous part is that that formula applies to DC resistive loads (and to AC in the absence of reactive elements). I think the main question here is whether this efficiency loss is something about the AC voltage conversion, or about other things going on in the car during charging.

Any constant load/loss would be a higher percentage of a lower total current.