Falling lithium-ion battery prices to drive rapid storage uptake (pv-magazine-usa.com)
138 points by okket on Aug 3, 2017 | hide | past | favorite | 177 comments


I wonder how much of this is simply the Gigafactory coming online at full production + others trying to keep their factories running.

If you look at Tesla's plan to be making 10,000 Model 3s a week, with a nominal 90 kWh battery in each, that is 900 MWh a week of capacity, or 46 GWh a year. But it is all tied up in cars, of course[1].
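That back-of-envelope arithmetic can be checked directly (keeping the nominal 90 kWh figure used here, which later comments revise downward):

```python
# Back-of-envelope: battery capacity tied up in Model 3 production.
cars_per_week = 10_000
kwh_per_car = 90          # nominal figure from the comment; actual packs are smaller

mwh_per_week = cars_per_week * kwh_per_car / 1_000   # 900 MWh a week
gwh_per_year = mwh_per_week * 52 / 1_000             # ~46.8 GWh a year

print(mwh_per_week, gwh_per_year)
```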

The power companies have been screwing around with rate plans to keep grid-tied solar from being as economical as it could be, so I do see a lot of people, myself included, moving to a whole-house storage system at some point.

[1] I fully expect a TV show to have a scene where they need more power and the 'genius' hero looks out the window and sees a row of electric cars charging and "hacks the system to feed the car's energy into the power grid for the extra energy needed to save the day" along with the gratuitous cars exploding as they are being drained too quickly but reaching the minimum level of power at just the right time to save the world.


Utilities are actually planning on using parked electric cars for storage.

https://en.wikipedia.org/wiki/Vehicle-to-grid


I don't know why, but this somehow reminds me of an excellent paper[1] by lcamtuf about parasitic data storage (keeping data "on the cable").

[1] http://lcamtuf.coredump.cx/juggling_with_packets.txt


Wouldn't this wear out the battery life?


A little, so you pay the car owner for the use, and only drain the batteries when the difference between peak and off-peak power prices justifies it. In NZ there is a market for power, and you can see how the peak to off-peak prices create the arbitrage opportunity.

http://www.em6live.co.nz


Hell no!

As soon as someone departs for a long trip and the battery isn't charged, there will be a huge class action lawsuit.

The lemon law attorneys will also have a field day!

This is a really dumb idea.


I'm ignorant as to existing implementations but the system I heard of to solve this is:

Your car or charger module has electricity spot prices available to it and you can select a strategy for it to automatically follow.

E.g.

* Rapid charge (charge battery, ignore price)

* Economise (charge battery at low price)

* Make money (charge/discharge according to price to profit from the difference)
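A minimal sketch of what such a strategy selector could look like; the names and price thresholds here are purely illustrative, not from any real charger API:

```python
# Hypothetical strategy selector: given the electricity spot price,
# decide what the charger should do. Thresholds are illustrative only.
CHEAP = 0.05      # $/kWh - buy below this
EXPENSIVE = 0.20  # $/kWh - sell above this

def action(strategy: str, spot_price: float) -> str:
    if strategy == "rapid":
        return "charge"                                  # ignore price entirely
    if strategy == "economise":
        return "charge" if spot_price <= CHEAP else "idle"
    if strategy == "make_money":
        if spot_price <= CHEAP:
            return "charge"                              # buy low
        if spot_price >= EXPENSIVE:
            return "discharge"                           # sell high
        return "idle"
    raise ValueError(f"unknown strategy: {strategy}")
```

A real implementation would also respect a minimum state of charge and the departure time the owner sets.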


It doesn't have to happen against anybody's will.

As you plug in your car, you set the time you need it charged by, and you charge at a discount in exchange for your cooperation.

If you change your mind and use the car earlier than expected, you know what you signed up to. No need for a lawsuit.


Of course they aren't 90 kWh; more like 55 and 75 in the two options. Still, that is 32 GWh per year in battery production from a single factory alone. It shouldn't be hard to reach 1 TWh a year worldwide if electric vehicles are ever to have anything like mentionable penetration.

In a way, electric car batteries have been shown to outlive their cars by 2x-3x (there are already several Teslas that were driven beyond the median lifetime of a car and lost just a few percent of capacity). So when a car is trashed, the battery is essentially still new. They can be bought out for, say, half the price and used for electricity storage. If there are a billion electric cars in the world instead of the current billion gas-powered ones, and they average 55 kWh, of which 10% is lost while they are in a car and another 20% while they are used for grid storage, on average they will have 45 kWh of capacity. Grid storage takes 6-hour capacity (pumped hydro plants usually have 6 hours of storage, enough to compensate for any daily cycles), so 7.5 kW of power per battery, or 7.5 TW overall... the world has 6.3 TW of installed generation capacity.
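Those numbers can be reproduced; the fleet size, loss percentages and 6-hour window are the comment's own assumptions, with the "45 kWh" read as the average capacity over the battery's second life:

```python
# Reproduce the second-life grid-storage estimate from the comment.
cars = 1_000_000_000                     # assumed future global EV fleet
pack_kwh = 55                            # average pack capacity when new

after_car = pack_kwh * (1 - 0.10)        # ~49.5 kWh left entering second life
end_of_life = after_car * (1 - 0.20)     # ~39.6 kWh when finally retired
avg_kwh = (after_car + end_of_life) / 2  # ~45 kWh average over second life

hours = 6                                # pumped-hydro-style 6 h of storage
tw_total = (avg_kwh / hours) * cars / 1e9   # ~7.4 TW vs ~6.3 TW installed
print(round(avg_kwh, 1), round(tw_total, 1))
```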

Probably grid storage will be a side business of car makers, because a storage plant will need to use the same type of battery throughout. Maybe they'll even offer a 10% discount on a new car on the condition that the battery remains theirs and is returned when the car is scrapped.


Any (non-Tesla?) source? The Model S has only existed for nine years.

Also, a car that needs a $20,000+ battery replacement will obviously become utterly worthless prematurely.

Usage and weather most certainly have a huge impact as well.

Don't get me wrong, I'm interested in a used Model 3 when that time comes. But that hinges on being able to replace the battery for a reasonable cost, reasonable even for an older car.


It has only existed for 7 years, but some small fraction of cars are driven much more than others. The typical lifetime of a car is 15 years, so why are you surprised that some are driven 2x more than average? Of course there are many such cars, maybe a whole 1% of those made in the first year - hundreds of them. That is not an extreme, one-in-a-million case; a one-in-a-million case looks like this: http://www.tilburyautosales.com/used/Ford/2014-Ford-F-250-2d...

No, Teslas will never need battery replacement. On the contrary, batteries outlive cars severalfold (yes, they can be bricked by grossly violating their usage rules in a way the BMS can't prevent, but this is not wear and tear - it is equally easy to do with a brand new or 10-year-old car; it's approximately the equivalent of filling a gasoline car with diesel). With normal, averagely intense usage and a good driver, a battery will last 40-50 years, which is much, much more than it makes sense to keep using the car itself.


https://www.quora.com/Whats-the-life-expectancy-of-a-Tesla-M...

"Tesla Tech Talk (06/13) - Speaker's main points: 1) battery degrades everyday 2) battery degradation is non-linear over time; meaning it starts very very slow, but after 4-5 years, it gets faster 3) after the first 5 years, degradation may be as low as 5%. But by the 8th year, they expect about 30% degradation."

Batteries will without any doubt seriously affect second hand markets for EVs. Especially when considering cold climates and the fact that the range can be quite limiting even when the battery is brand new.


https://electrek.co/2016/11/01/tesla-battery-degradation/?pr... from the same page

In 2013 there was no data to tell what happens after 8 years of typical use; the first Model S was just a year old by then, and even the more intensely used ones didn't have that much use - the equivalent of 8 years of normal use. Now there are many that were driven beyond the median lifetime of a vehicle.


I have no idea what to make of the claim that no further degradation is expected. It's not as if batteries are a new, poorly understood technology; batteries don't behave like that.

Is that an ad?

It is expected for the capacity to drop off significantly with age. The main weapon Tesla has used against this is good cooling and sensible charging patterns, but that is just stalling the inevitable.


>with a nominal 90 kWh battery

Model 3 has a base 60kWh battery, with upgrade available to 75.


I don't believe that's correct. The Model 3 has both a smaller range and lighter weight than a Model S did at 60kWh, indicating a smaller battery.

Tesla hasn't confirmed how large the pack is at all, but they have specifically confirmed it's less than 60.

I think it's either 40 or 50.

https://electrek.co/2016/04/26/tesla-model-3-battery-pack-co...


Ah thanks, for some reason I read the 60 as 90.


>I wonder how much of this is simply the Gigafactory coming online at full production + others trying to keep their factories running.

The "Gigafactory" is a drop in the ocean if you count the no-name Chinese factories, but it will be one of the biggest market players outside of China.


We have different ideas about 'drops in the ocean' :-) See this article (http://fortune.com/2017/06/28/elon-musk-china-battery/) for some numbers. Projected Gigafactory production (as more come on line) could continue to be 25 to 35% of the global production of batteries.

My father-in-law, who is a big booster of South America, feels that if Argentina, Brazil, and Chile got together they could move up the 'value chain' and produce batteries for export. I do know that much of the raw lithium comes from Argentina and Chile so I expect there is some truth to that idea.


I have no idea where they get those figures. The Chinese ministry of commerce posted back in 2015 that the amount of battery capacity used by the biggest Chinese EV makers alone (excluding PHEV manufacturers) was 16.9 GWh. And battery sales to car makers make up less than 10% of the market in China.


Here is an educated guess that it takes a kilogram of lithium salts to make a 10kWh battery[1]. (not my work but it looks like a reasonable process). That can help you create an estimate of how much lithium salt production is required to meet various market requirements.

It is just another way of checking various numbers. You can start from world production of Lithium salts. And remember that they are used in all lithium batteries not just the rechargeable type like we're talking about here.

[1] http://large.stanford.edu/courses/2010/ph240/eason2/
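Using that estimate of roughly 1 kg of lithium salts per 10 kWh, a quick converter (the 35 GWh example is just a Gigafactory-scale number from this thread):

```python
# Rough lithium-salt requirement, from the linked ~1 kg per 10 kWh estimate.
KG_SALTS_PER_KWH = 0.1

def tonnes_of_salts(gwh: float) -> float:
    """Tonnes of lithium salts needed for `gwh` of cell production."""
    return gwh * 1e6 * KG_SALTS_PER_KWH / 1_000

print(tonnes_of_salts(35))   # a ~35 GWh/year plant needs ~3,500 t of salts
```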


I don't think the "EV" numbers are comparable to "car" numbers in China. Electric scooters and rickshaws are extremely popular there (and have a much larger fraction of their market segments than electric cars).

Source: traveling in big cities and tourist areas.


Also Musk has said that he's considering building 2-3 more gigafactories, but we don't know how fast those will be built. It could be over the next decade or so.


And by "drop" you mean a blob of water the size of a small moon.


The TV plot sounds like an updated version of Operation Yashima.


Pretty sure they will also use the batteries in their Powerwalls as well (I seem to remember that the grid storage in CA was produced with batteries from the Gigafactory).

I would imagine they would quickly switch the Powerwall to use the new battery size that the Gigafactory is producing.


The powerwalls and powerpacks use a different chemistry from the cars, and are still using the 18650 size going forward.

It's trivially easy to change the size of cylindrical batteries. Li-ion batteries are essentially a tape, and the second to last step of production is to roll the tape up and insert it into the battery. Different cells just require different cans and widths of tape.


I'm pretty sure Powerwall 2 uses the same 2170 cells as the Model 3. https://en.wikipedia.org/wiki/Tesla_Powerwall#Technology and see the note on 2170 cells on this page https://en.wikipedia.org/wiki/List_of_battery_sizes#Lithium-...


My mistake! You are correct.


What about a different chemistry, like Li-sulfur? How big a change would the Gigafactory require?


Most likely very little. Sulfur cathodes would just be deposited on the tape like any other cathode, and the Gigafactory could freely switch to any other standard chemistry after some setup time and adjustments. More radical (and also much less likely) changes would have bigger impacts. Lithium metal batteries would probably require replacing some significant machinery. Solid electrolyte or ceramic batteries could need almost an entirely new production line. Metal-air batteries would be almost completely incompatible.

Realistically, there are probably no changes coming in the next 30 years that will require rebuilding anything significant.


Germany has plans and investment commitments for a 34 gigawatt-hour plant that will add even more supply to the global battery market.

https://www.bloomberg.com/news/articles/2017-08-03/germany-g...


Will the Gigafactory operate on weekends? ~1,400 cars per day... I'd love to see that on video.


The Gigafactory operates seven days a week with two shifts.


Didn't they change it to three shifts to reduce injuries caused by human error from tiredness?


God I hope so. Ford figured this shit out over 100 years ago... 3 shifts maximized factory output. Also having 5 day vs 6 day work weeks didn't significantly reduce factory output, but helped significantly reduce turnover.

I swear, it's Musk's biggest weakness, not realizing the costs of mistakes people make when they're tired. I couldn't get through his recent biography because I kept seeing mistake after mistake being chronicled that wouldn't have happened if people were working reasonable hours. I figured the setbacks to SpaceX due to sleep-deprived mistakes amounted to at least 6 months before I had to stop reading, somewhere around their early launches...


There are known techniques for running a large 3-shift operation effectively. One is that all employee-facing services must be available on all shifts. Yes, this means running 3 shifts in human resources, so someone working 12 to 8 can straighten out their insurance problem. Safety and repair staff must operate on 3 shifts. Food services must be as good late at night as they are during the day. Reasonably high level management has to be present on all shifts.

William Bratton, who headed the NYPD (twice) and the LAPD, was big on this. When he first started running the NYC transit police, he'd get on the subway late at night and ride around, talking to the cops on duty. Once the rank and file got used to seeing the commissioner on the subway, things started to improve.

A useful test for your local police department in a city of any size - find out who's the highest ranking cop on duty at 1 AM on a Saturday morning, peak time for most city PDs. If it's a lot lower rank than who's on during the day, the department is not well managed. Such police departments are derisively called "9 to 5 departments" in the business.


It is not uncommon for factory workers to prefer 12 hour shifts and working alternating 3 and 4 day work weeks. That staffs a 24x7 operation with four crews of workers.


There is a good chance that what people prefer does not correlate with what works best. People plan for their lives, the company plans for efficient work. This isn't uncharted territory, and I have never heard of any study claiming 12 hour shifts are more productive / safer / less error prone.


Interesting. That might work assuming workers get an appropriate amount of rest. I know it works well for nursing staff to have longer shifts. There's definitely a cost to handing off tasks, though I expect this to be near zero in a factory setting.

I just know that people get really dumb when they're tired, and it's the same kind of dumb as oxygen deprivation, where people actually think they're not affected.

I'd be curious if these 12 hour shifts are preferences for personal reasons, or if there are measurable benefits in the factory as well.


I heard they wanted better safety but didn't think they meant "more sleep".


What is this? 1 car per minute?


Depends on how many hours it's running. 24? 20? 16? Someone mentioned two shifts, so I guess it's going to be above 10.


1,440 min/day.

86,400 sec/day

So, yes.
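Spelled out, assuming round-the-clock production:

```python
# 10,000 cars a week works out to roughly one car a minute.
cars_per_week = 10_000
cars_per_day = cars_per_week / 7            # ~1,429, the ~1,400 quoted above
cars_per_minute = cars_per_day / (24 * 60)  # 1,440 minutes in a day
print(round(cars_per_minute, 2))            # just under one car a minute
```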


Using lithium-ion batteries for stationary storage is interesting because their original advantage was all about light weight (lithium is the lightest metal element).

Now, it seems the development effort devoted to lithium-ion has paid off so well that they are the best OVERALL battery solution. The single-minded effort we put into cell phones and cars is so strong that it now reaches all the way to homes and the grid.

To me, this suggests that all sorts of cool technologies might be possible if we as a society could concentrate on making them work.


>Using lithium-ion batteries for stationary storage is interesting because their original advantage was all about light weight (lithium is the lightest metal element).

Not entirely. The real killer feature of LiPo, which has allowed things like the Roadster and Model S to exist, is their high discharge, or "C", rating. The power density of LiPo is far beyond any previous chemistry available for batteries at scale. Energy density and power density are two completely different things [0].

[0] https://www.quora.com/What-is-the-difference-between-power-d...


In the last three decades battery technology was mostly about powering small, lightweight gadgets. Now that there's massive amounts of money to be made with car batteries and grid-level storage, I certainly expect some great battery technologies that focus on that in the next decade or two.

That's also why I don't care about all these fears about us not having enough lithium: nobody is going to power the world on lithium, it's just the cheapest method right now because of the massive amounts of research money already dumped into it, and great economies of scale. Other technologies will catch up as their use-cases become more valuable.


> That's also why I don't care about all these fears about us not having enough lithium

Who is afraid of that? Take a look at this: https://en.wikipedia.org/wiki/Abundance_of_elements_in_Earth...

Lithium is #33, estimated at 20 ppm of the crust. We presently mine 64000 tons per year. It is more common than lead, which is #37 and only 14 ppm. Yet we mine 4.8 million tons of lead per year. Now examine boron. #41, 10 ppm and 9.4 million tons per year. Are these unsustainable? Are we on the verge of peak boron and peak lead? Probably not. We have twice as much lithium so we probably aren't going to run out of lithium even if demand increases by a hundred-fold.

I am not a geologist, so maybe much of this lithium is more inaccessible than I'm assuming. But everyone who says we are going to run out of lithium has been looking only at the known deposits, ignoring the possibility that more mines will open to meet demand.


Lithium has gone up in price about six fold in the last 15 years and is currently about 4 times as expensive as lead per metric tonne.

I suggest that something is wrong with your assessment.


Why is there anything wrong with his assessment? Availability affects price, sure, but so does demand. Pretty sure lithium is in higher demand than lead.


Lithium is a lot harder to produce than lead. That's certainly most of the price difference, regardless of abundance in the crust.

But statements like this should really be talking about the "economically extractable ore", and it needs to include economic thresholds. There's a price curve there, and I suspect that for lithium it is a steep one because of how complicated its manufacturing is relative to lead.

I don't know if we're near the limit though. I don't really buy it personally, I'm much more worried about rare earths.


Production of lithium from hard rock isn't very different from lead, and lithium production from brine is far easier.

Hard rock lithium is most commonly extracted in the form of spodumene which is about 6-7%[1] (8% when totally pure) lithium oxide. Rich lead ore may have 3-8% lead content. Obviously comparing the two is difficult since lead is about 20x denser, but they're on the same order of magnitude. As a ratio of rock in to metal out by volume there is probably a bit more lithium, but by mass lead ore is probably 2-3x more concentrated.

For both lead and spodumene, the ore is crushed, separated by density, ground, conditioned (treated with acid to remove organics/slime), froth-float separated, cleaned, filtered and dried[2]. That gives you metal that is ready to be reduced in a furnace (or electrolytically or chemically, for lithium). Both processes use the exact same machinery and chemicals (sulfuric and hydrofluoric acid, and NaOH). Both processes are almost identical except that the lithium is collected at the low-density end and lead is collected at the high-density end.

Brine production is much easier and involves a well, a pump, a bulldozer and a bunch of sodium carbonate (water softener)[3]. It is significantly cheaper than any other metal production method, as well as low-energy, low-waste and low-impact. The only downside is there is a lot of water loss to evaporation. It's a relatively small amount of water, especially compared to a farm of the same size, but brine mines exist exclusively in places without much water to start with (otherwise the brine would have washed into the ocean already). It's a tradeoff.

Regardless, the difficulty in producing lithium is most certainly not most of the price difference. First, annual global lithium production is ~36,000 tonnes vs 5 million tonnes for lead, a 140x difference. Lithium's lower density could account for a 2-3x price per tonne difference, but it is vastly more likely that economies of scale are the culprit.

Not only that, but the current lithium price surge is entirely due to market contraction. It takes 2-3 years to build a mine, and longer to decide to build one. Tesla wrecked the battery market in less than two years, becoming the largest battery consumer. There simply hasn't been time to open new mines yet. The prices will drop as soon as they catch up.

Finally, the majority of lead is produced by recycling. That 5 million tonnes mined each year is less than 50% of all lead. The recycling industry obviously drops the price of lead immensely, and lithium will take advantage of recycling too, when and if it ever needs to.

>But statements like this should really be talking about the "economically extractable ore"

There is no such thing, really. Not even for oil. The content of the ground is not mapped that well. As soon as people go looking for lithium, the reserves will skyrocket just like they did with oil. Trust the crustal content more than the global reserves. Lithium is so cheap and ubiquitous that nobody has gone looking for it since the '50s. Seriously, that's the last time the USGS did a survey of US lithium supplies.

>I'm much more worried about rare earths.

Don't be. They're only used in hybrid cars (neodymium) and even then they aren't needed. You don't need magnets to make a motor, just steel and copper. The other green use for rare earths (tellurium) is in thin-film solar panels, which make up 5% of all solar panel production. Rare earth elements aren't even the first choice for these technologies, and a contraction in supply will push them out of favor entirely. Seriously, the only reason we use thin-film solar is because it's cheap, even though it's less efficient. If it's not cheap, then we won't use it at all. Good riddance.

[1]: http://encyclopedia2.thefreedictionary.com/Lithium+Ores

[2]: https://www.911metallurgist.com/blog/froth-flotation-spodume...

[3]: https://www.thebalance.com/lithium-production-2340123


Thanks for the fantastic post!

My rare earth worries are based on this DOE report:

https://energy.gov/sites/prod/files/edg/news/documents/criti...

It actually calls out dysprosium as the biggest problem, with neodymium, terbium, europium, yttrium, and indium as the next biggest worries.

This is fairly old though, I read it back in 2010. I've read about a number of breakthroughs on higher strength-to-weight magnetic materials and I'm sure solar has responded to this report accordingly as well. It's comforting to know that things seem to have worked out seven years later!

I think a lot of the stress in 2010 was because China was limiting exports and it became clear that the USA was at a strategic disadvantage because of vulnerability to Chinese actions. Glad that Obama took it so seriously by supporting funding to develop those new technologies.


Yeah, lithium's power/weight advantages have driven so much investment that it now has awesome power/volume advantages too. The result is that even things which don't care about weight (http://www.storagereview.com/apc_backups_pro_500_lithium_ion...) are starting to switch from lead acid, because lead acid is no longer the super cheap solution.

That is something I've found really curious, because if you tear into just about any lead acid battery it still looks like technology from the 1950s. Just about everything else has shrunk, become lighter, used less material, etc. But lead acid's last major "breakthrough" was in the 1970s with gel cells. I would have expected lead acid to look more like a capacitor at this point (thin layers of material rolled/folded up), but it's still fairly thick plates sitting in electrolyte with non-existent charge control. Surely some creative person sitting in a lab could come up with a way to minimize sulfation, or to decompose the lead sulfate within the battery without basically melting it down and building a new one.


There is no possibility of ever running out of lithium for battery production. It is an extremely common element, and the actual lithium makes up about 3% of a LiPo cell by mass.


We do this a lot. CCD image sensors are inherently more suited to astrophotography, but so much development has been put into CMOS that it is catching up to (and may surpass) CCD. A single widespread technology, with more applications & development effort, surpasses a better but more niche technology.

It's an interesting phenomenon. In the case of batteries, I wonder if we fully account for all the costs though. Lead-acid, for example, is a very sustainable (long-lasting, highly recyclable) battery technology that is inherently suitable, but if the full mining/recycling/disposal costs of lithium-ion are not priced in, the market won't account for this.


I've been watching Joel Salatin's talks on his farming methods. Been surprised to hear that there is a huge amount of 'low tech' innovation occurring in farming within the past 50ish years because long-used materials and parts that were historically expensive to produce are now order(s) of magnitude cheaper.

- His mobile chicken coops need to be lightweight. Before thin-bladed sawmills, milling lumber to be smaller than ~2" was cost prohibitive as each cut had a 0.25" kerf. Making 1"x1" planks would mean nearly half of the wood goes into the waste pile!

- Cheap electric fences allow pastured hog farming to be mobile. Pigs will wreck strong fences, but a lightweight deterrent allows quick placement and removal.


LiS will be even more popular with grid utilities. Super high specific energy, but low overall discharge weight + low cost of Sulfur means a popular, stationary battery perfect for grid tie in. Ironically, it's even less dense than li-ion.

The current issue is commercialization (i.e., it doesn't exist yet) and the low conductivity of the sulfur cathode (meaning a low discharge rate to avoid self-heating) - it could be a decade or more away.


In principle redox flow batteries should offer better scaling properties for stationary storage, but Li batteries have a process advantage of several years. So the picture might change down the road as long as there is still enough money flowing into alternative research.


I wonder. One of the reasons internal combustion has lasted so long is great boatloads of cash get showered on the technology trying to eke out that last 0.2% of efficiency, and when that's the situation eventually someone has a bright idea that makes a bigger difference.

If lithium ends up with the same advantage, other technologies that ought to be better may end up lagging even in the long term.


I see what you are saying. But maybe some big company like Panasonic or Siemens will decide to put the needed bucks into flow batteries. Or maybe China will do what it is doing with PVs and decide to push them hard.


Li-ion batteries have a high energy density in terms of both weight and volume, when compared to other types of batteries. And yes, it's all relative to which tech you're comparing them to, but it's worth remembering that they are both lightweight and physically small, for the amount of energy they can store.


Don't lithium-ion batteries degrade extremely quickly, though? Won't these all be down to half capacity or so within a decade?

Is it really worth investing in gargantuan quantities of energy storage if they need to be replaced every few years? I'd be eyeing EDLC technologies or something like molten salts over giant stationary phone/laptop/car/etc batteries.


Li-ion batteries have the second slowest self-discharge of any battery and the longest cycle life/easiest upkeep.

A li-ion battery will do about 500 discharge cycles from 0-100%. It'll do several thousand from 20-80%, and tens of thousands of cycles from 30-70%. If you have a few days of storage, the battery as a whole will last an incredibly long time even when cycled every day.

As for why: when the battery is kept at a high or low voltage, it puts an overpotential on one side or the other of the battery. There is more or less lithium on one side, and it tries to diffuse across the battery very slowly. Some of it never makes it to the other side; it reacts with the electrolyte and is lost, reducing capacity.

The closer it is to 50%, the less lithium is pulled/pushed out into the electrolyte.
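A toy lookup based on the ballpark cycle counts above (illustrative figures from this comment, not datasheet values):

```python
# Rough cycle life as a function of the state-of-charge window used.
def cycle_life(low_pct: int, high_pct: int) -> int:
    window = high_pct - low_pct
    if window >= 100:
        return 500        # full 0-100% cycles
    if window > 40:
        return 3_000      # e.g. 20-80%
    return 20_000         # shallow cycling, e.g. 30-70%

# Cycled once a day over a 20-80% window:
print(cycle_life(20, 80) / 365)   # roughly 8 years of daily cycles
```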


Huh, interesting. You learn something new every day, thanks!


They degrade extremely quickly in the usage pattern of a typical consumer device (alternating between full charge and full discharge). They last fairly long if you keep them between 30% and 80% charge.

It's not the most long-lasting technology, but we replace most infrastructure every 20 years already. A 10-year lifetime wouldn't be that bad.


> They last fairly long if you keep them between 30% and 80% charge

wait. does that mean that, effectively, for a careful Tesla owner seeking to maximize battery life, most of the time her practical range will be roughly 70% of the stated value?


Teslas have a "trip" mode. During normal operation they charge to 80%, but if you put it in trip mode it charges to 100%.


No, they don't degrade that fast. They'll be good for at least 15 years based on charge/discharge cycle data Tesla has collected from vehicles.


$200/kWh by 2019.

Assuming that households consume between 15 kWh and 30 kWh of electricity daily on average, and you want storage capacity for 1/3 of daily consumption to make wind or solar energy work, you need 5-10 kWh per household. That's $1,000-$2,000 per household. If that battery lasts 10 years, it's $100-$200 per year.

It's workable and scales if we are able to reuse lithium. When does that happen? (recycling is not reuse)
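The cost arithmetic above, spelled out:

```python
# Household storage cost at the projected battery price.
PRICE_PER_KWH = 200          # projected $/kWh by 2019
LIFETIME_YEARS = 10

results = {}
for daily_kwh in (15, 30):                   # household daily consumption range
    capacity = daily_kwh / 3                 # store 1/3 of a day's use
    cost = capacity * PRICE_PER_KWH
    results[daily_kwh] = (capacity, cost, cost / LIFETIME_YEARS)
    print(f"{daily_kwh} kWh/day -> {capacity:.0f} kWh pack, "
          f"${cost:.0f} up front, ${cost / LIFETIME_YEARS:.0f}/year")
```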


That wouldn't work for much of the South where summer usage for a house can easily hit 100 kWh/day for air conditioning. The hottest part of the day usually starts at about the end of the prime solar hours. Plus you'll want massive overcapacity of your batteries as deep discharges drastically reduce their useful lifespan.


Wow - that's a crazy amount of energy. As a Brit who's never been to the south of the US, is that figure really true? 100 kWh is about 300 miles in a Tesla, or boiling a kettle constantly for about 30 hours.

If that's really accurate, then I'm staggered at how much energy folks must be using to keep cool, and makes my efforts at saving energy seem paltry in comparison.


Yes, that is a crazy amount. I live in the South, and the highest we hit is about 35kWh on extremely hot weekend days. Our typical summertime use is more like 20kWh/day -- but according to our utility, we're using about 1/2 of what a typical neighbor uses.

Based on this very scientific single-neighborhood sample, I'd guess 40-60kWh/day would not be unusual during the peak summer months.


In the US, there are very large variations in house interior volume, insulation, and usage patterns. An upper-middle-class house built on a slab to standard code in Dallas with a stay-at-home mom, for example? I wouldn't bat an eyelash at 100 kWh/day on the hottest and wettest days. A condo apartment in downtown Houston? That would be excessive.

US insulation requirements in Dallas are R-20 in the walls [1]. The PassivHaus standard is R-40 to R-60. I believe Germany requires R-5 at a minimum. Mainstream US construction design and practice still doesn't account for thermal bridging, which defeats much of the insulation we're throwing into our buildings.

[1] https://eepartnership.org/wp-content/uploads/2016/02/Texas-2...


Texas also produces the most wind power of any U.S. state.

https://en.wikipedia.org/wiki/Wind_power_in_Texas

The source in wikipedia is out of date, but it's still true as of Q3 2016:

http://www.awea.org/Resources/Content.aspx?ItemNumber=9488


The Texas summer is brutal [1].

Last month my home averaged between 100 and 120 kWh per day [2].

Luckily we rarely have to heat in the winter - although it's not uncommon to have the A/C running for Christmas.

[1]: http://www.statesman.com/news/local/wipe-that-brow-austin-re...

[2]: http://imgur.com/a/45oGu


>Last month my home averaged between 100 and 120 kWh per day [2].

That's half of my entire energy usage for a month...

Have you considered a swamp cooler?


Swamp coolers work in areas that are hot and dry, like New Mexico or Arizona.

They do not work well if it's hot and humid, for two reasons. One is that the water doesn't evaporate as well, so they don't cool well. The other is that the whole point is comfort, and in a humid climate you don't achieve that unless you remove both heat AND humidity, but swamp coolers add humidity.

Texas has some desert areas (western part of the state), but most of the population lives in the part of the state that isn't a desert.

The good news is with modern construction techniques and equipment, you can cut energy usage WAY down. You can cut energy usage a lot with stuff like a radiant barrier in the attic, attic vents, lots of insulation, double pane windows, a "tight" house with little air leakage, proper angles to keep direct sun from heating up the house at the wrong times, and a high-efficiency AC unit that you service and maintain properly. In fact, a lot of those things are actually more beneficial than getting solar panels.


I've heard they are not effective with our extremely high humidity levels.


I live in New Jersey and just had solar panels installed on my roof. Peak power is about 7.8 kW. On clear days, power production in the summer is about 50 kWh.

I have been keeping daily logs this summer and find that my daily power consumption swings between about 25 kWh on cool days to 65 kWh on really warm days. Definitely a lot lower than 100 kWh mentioned previously.


Needless to say, NJ is very different from the South, and in any case this is highly dependent on the size of the house and the amount of insulation.


Remember, the south of the US is in the subtropics, roughly level with the north of Africa. Imagine living in Egypt and not running the AC constantly.


I spent much of last year in Southeast Asia within ten degrees of the equator and rarely had A/C. It wasn't a problem for me, though my computer was often unhappy.

I went to Egypt as a teenager and remember the temperature in Sharm el Sheikh cracking 50C. We would literally sprint from the A/C in the hotel across the beach to the water because the sand was too hot to walk.

Equivalent latitude doesn't mean equivalent climate. As an example, which of these cities is further north, Buffalo, NY, USA or Cannes, France? Hint: it's the one known for topless sunbathing, not the one known for sub-zero outdoor football games.


According to WolframAlpha, the hottest temp on record in Cairo is 118°F, while Phoenix, AZ has hit 123°F. Even the average high is 5°F higher in Phoenix.


There are a lot of people living in Egypt who don't have to imagine that.


Yes, but I was talking to a British fellow.


But obviously you shouldn't make such big/poorly insulated houses then.


I can't find the site now, but there was a fairly compelling argument that it is overall still more energy efficient for people to live in Arizona, with A/C dropping the temp from 90F to 70F, than in a northern state where for 6 months of the year you have to heat the house from 20-40F up to 70F.


That seems likely. In imprecise terms:

A/C efficiency (the "coefficient of performance") is measured as the ratio of heat energy moved outside to the energy put into the air conditioner, which is usually >> 1 unless there is a huge temperature difference between inside and outside.

An ideal furnace has a 1:1 ratio of heat energy injected into the house to energy used to run the furnace. Heat pumps can get over 1:1 ratios, but come with caveats and are more expensive than A/C units.

Also, you can make it feel cooler with a dehumidifier, which often is more efficient than cooling. For furnaces, you need a sweater, but your article assumed a 70F indoor target.
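A toy model of that argument. The COP values (3 for an air conditioner, 1 for resistive heat) and the per-degree-day load constant are illustrative assumptions, not measured figures for any real house:

```python
# Toy seasonal HVAC comparison: electricity = thermal load / COP, with the
# thermal load taken as proportional to the indoor/outdoor temperature gap.
def season_electricity_kwh(temp_gap_f, days, cop, kwh_per_degree_day=1.0):
    thermal_kwh = temp_gap_f * days * kwh_per_degree_day  # heat to move or add
    return thermal_kwh / cop                              # electricity consumed

cooling = season_electricity_kwh(90 - 70, days=180, cop=3.0)  # Arizona-style summer
heating = season_electricity_kwh(70 - 30, days=180, cop=1.0)  # northern winter, resistive
print(cooling, heating)  # the larger gap plus COP 1 makes heating far costlier here
```

The point the model illustrates: cooling fights a 20F gap with a COP of ~3, while resistive heating fights a 40F gap at COP 1, so the heating case burns several times the electricity despite identical season length.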


Did you mean s/likely/unlikely/?


Even if you compare poorly built/insulated houses, cooling is much less efficient (per degrees vs outside temp) than heating.

Heating can also be made more efficient easily with insulation, and heat generation through CHP or geo heatpump.


Let's not get ahead of ourselves: 100 kWh is not the lower limit of how much energy you need for a temperature-controlled home in the South for a day.

This is the figure in an environment where electricity costs nothing and there is zero pressure from home design to the actual AC unit to reduce consumption. Let's face it, this attitude isn't going to fly in the future. You could probably cut that figure by a good 10% by installing your choice of solar panels or a bunch of trees, just from less sun hitting the structure.

(Cut it in half if you consider the average American in the South lives in a freakishly large house fit for a queen with entourage and is cooling all of it..)


That's the air conditioning running 24/7. In the EU you'd also get to around 100 kWh per day if you ran home air conditioning 24/7.


> That wouldn't work for much of the South where summer usage for a house can easily hit 100 kWh/day for air conditioning.

It should be noted that peak energy expenditure for air conditioning happens more or less at exactly the moment when the influx of energy from the Sun is also at its peak. That's, like, a gigantic hint that mother nature is dropping on you, hoping you'll notice it. The disease, and its cure, are gushing forth from the same place at the same time.

Probably most of the energy for air conditioning in the future will not come out of storage, but will be produced at the moment it's consumed, by solar panels or some other solar technology.


From the grandparent's post:

> "The hottest part of the day usually starts at about the end of the prime solar hours."


For those interested in more info on this, it's called Temperature Lag. There's also the related phenomenon of Seasonal Lag, which is where the hottest part of the year comes somewhat after the period of maximum solar radiation, not during it.

https://en.wikipedia.org/wiki/Diurnal_temperature_variation#...

https://en.wikipedia.org/wiki/Seasonal_lag


There's still significant overlap, it should be used.


It's only prime solar hours if you point your solar panels due south. That gets you a higher figure on paper, but if you need the power later in the day you could point them a bit more to the west and get a lot more usable power when you need it. With solar panel prices falling, it's a race to see if it's cheaper to over-provision solar panels or batteries.


I would think that for that use case, using thermal storage should be more efficient? Don't store electricity to run AC later, but run AC to cool down water or the ground beneath your house or something, and then cool your house with that later.


Or build something like an Earthship (http://earthship.com/). I know that these can air condition a house in New Mexico through geothermal heat transfer, however no idea how they handle the more humid summers of the South East.


They work, but aren't as comfortable as in drier climes. Humidity is tough to control with low-energy, low-maintenance solutions. Unfortunately, many people's discomfort perceptions are heavily influenced by humidity.


I recently moved from Nashville (quite humid) to Flagstaff (quite dry) and the difference is amazing. Large temperature swings are barely noticeable in the dry climate, but would require substantial changes in weight of clothing in the humid one.


That would require a huge amount of work and expense, and at the end of the day cold water isn’t very good at cooling off giant masses of air and furniture.


They are sized for bigger installations, but time shifting energy consumption using ice storage is basically an available commercial product.

Here's one company that makes them.

http://www.calmac.com/


There are far cheaper ways to keep your house cool, though they require an upfront investment.

http://energyblog.nationalgeographic.com/2013/09/17/10-myths...


That figure is unbelievable. A good aircon running at full power can pump around 4,000 kcal of heat per hour using 1 kW of electricity, even at +40C and 40% humidity outside.

4,000-5,000 kcal per hour is more than enough to keep a 50m2 room with huge windows at 21-22C in scorching heat.

If your house is huge, like a 200m2 McMansion, your ventilation does no heat recuperation whatsoever, and you have near-nil heat insulation (like most housing in the US), you will still barely use 50 kWh per day even if you live in the middle of the Sahara.


200m2 is about 2/3 of the area of what a McMansion would be. Many are twice that.


> deep discharges drastically reduce their useful lifespan.

That applies to lead acid batteries. Lithium ones are much more forgiving on that (as long as you stay within design limits.)


A well insulated water tank that is electrically chilled during peak solar and used to cool down air later in the day/night could solve that storage problem without any cell degradation. Theoretically, with perfect insulation, this could even reduce the total electricity demand, when the conversion from electricity to a temperature gradient happens during a time when the base temperature is low.
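A rough sizing sketch for that tank idea. The 15 C temperature swing and the COP of 3 are made-up but plausible assumptions; water's volumetric heat capacity (~1.163 Wh per litre per degC) is the only fixed physical input:

```python
# Litres of chilled water needed to displace a given amount of AC electricity.
# The AC's COP converts electricity displaced into heat that must be absorbed.
WH_PER_LITRE_PER_C = 1.163  # specific heat of water, in Wh/(L*degC)

def tank_litres(electric_kwh_displaced, cop=3.0, delta_c=15.0):
    thermal_wh = electric_kwh_displaced * 1000 * cop  # heat the AC would have moved
    return thermal_wh / (WH_PER_LITRE_PER_C * delta_c)

print(f"{tank_litres(10):.0f} litres to shift ~10 kWh of AC electricity")
```

So shifting ~10 kWh of afternoon AC load needs a tank on the order of 1,700 litres, which is large but not absurd for a house, and it never loses cycle life the way a battery does.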


100 kWh/day is just incredible. The average consumption per person in the nordics is about 1500 kWh per YEAR.


That doesn't quite seem right: http://shrinkthatfootprint.com/average-household-electricity...

(Interestingly Canada's per capita consumption is comparable to the US despite being much further north)


I should have edited my post to say just Denmark. Sweden and particularly Norway use much more energy.


I would have thought that heating would require a lot of energy in the Nordic countries - but perhaps that doesn't use much electricity? I live in the NW US, in a heavily-insulated passivhaus with a 5kW PV array on my roof, and my average usage is 13kWh / day (though I am also charging a PHEV vehicle's battery some days, and still using an inefficient old hex-core Xeon workstation on top of heating and cooling, heating water, cooking etc.).


Electrical heating is not used when it can be avoided. It is much better to use waste heat from energy production and industry to heat your home: https://en.wikipedia.org/wiki/District_heating


Actually, in Norway most of the heating is done with electricity (they have loads of hydro power).


I believe they don't use electricity to heat houses at -20C, and few live in detached houses.


If it's 10 cents per kWh, that's only 10 dollars per day or 300 per month.

Investing in insulation or other things might or might not make sense. High electricity price for a long future period would also encourage investment.

Alternatively, constructing your house in a benign climate, if you do a lifecycle analysis and look at the upkeep cost.


Most likely not using electricity for heating/cooling?


Even if the hottest part of the day is past prime solar hours, with a solar system you won't be discharging the battery as fast, since the system will be powering the AC for most of the day. Also, with that much usage, it might be time to investigate insulation.


Why wouldn't recycling be an option? Chances are that, if large batteries become commonplace, it will be cheaper to mine lithium from used batteries than to do so from the earth because they will have very high lithium content.

Also, covering 1/3 of daily use seems low to me. It wouldn't get one through a few windless winter days with dense cloud cover, for example.


Is it possible that perfect is the enemy of good here? You could choose a lower ratio of battery storage to average use and just use non-carbon neutral peaker plants for the few days in winter where there's a shortfall of renewable energy. Still a huge boost to renewable energy, still a huge reduction in carbon emissions.


Extraction of lithium from old batteries is five times more expensive than mining lithium. At some point it may become profitable.

On the other hand, there is lots of lithium. If the environmental impact of lithium mining going through the roof is not a problem, there is no reason to reuse.


>Extraction of lithium from old batteries is five times more expensive than mining lithium.

Even as an upper bound that's still a silly thing to say. It means comparing a lab-scale extraction with massive multinational, highly-automated mining operations. At scale lithium recycling would certainly be much cheaper than five times more expensive.


It is probably also cheaper to extract lithium from seawater than from old batteries.


There is another way of thinking about the economics of this. IIRC the typical price is 12 cents per kWh. You would need to cycle the battery's full capacity into sold energy no fewer than 2,000 times just to break even. That's about 5 1/2 years of constant monetized operation (which is a ludicrous proposal). And this assumes no cost on the energy production side.
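The arithmetic behind those figures, assuming a pack price of $240/kWh (a plausible near-term number; the comment doesn't state the price it used):

```python
# Break-even cycle count: pack price divided by revenue per full cycle,
# ignoring generation cost, round-trip losses, and degradation. The
# $240/kWh pack price is an assumption.
def breakeven_cycles(pack_usd_per_kwh, sale_usd_per_kwh=0.12):
    return pack_usd_per_kwh / sale_usd_per_kwh

cycles = breakeven_cycles(240)
print(round(cycles), "cycles, or", round(cycles / 365, 1), "years at one full cycle per day")
```

At $240/kWh that's 2,000 cycles, i.e. roughly 5.5 years of daily full cycling, matching the figures above.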


Electricity price is more like .26€ here(~.31$), and this is certainly not the most expensive place in the EU. Mostly network charges and taxes.

Going off the electric grid should become fairly easy. Cheap solar and storage should make this feasible for an eco-conscious household...


12 cents is cheap! In Denmark the average price is 37 cents per kWh.



I couldn't find the information about reusing lithium on those links?


Are you asking if you couldn't find it?


Sorry, I thought you meant grid storage. Tesla has a link on their site about their closed loop recycling program.


> recycling is not reuse

Reusing the lithium from the batteries means recycling the batteries.


>It's workable and scales if we are able to reuse lithium. When does that happen? (recycling is not reuse)

I assume you mean reuse is not recycling, i.e. using old batteries as lower-capacity storage is not workable indefinitely.

First off, the most pressing issue is that there is no practical replacement for cobalt in many li-ion applications. Chemistries without cobalt, like lithium titanate or iron phosphate, carry a price and weight premium and reduced capacity. Cobalt is rarer than lithium, makes up more of the battery, and is harder to mine. The price might go up very quickly past a certain point if supply from the DRC becomes tighter.

Lithium is very widespread and relatively abundant but most importantly its final impact on the price of a battery is only a few percent. If it becomes more expensive to reclaim or mine, it will still have a pretty small impact on the cost of batteries.


Sulfur beats cobalt on energy density by a significant margin; its theoretical specific energy is 5x as large, although the chemistry is much more complex: https://www.researchgate.net/profile/Hong-Jie_Peng/publicati...

There is one company in Colorado making Li-S batteries at small scale: http://www.solidpowerbattery.com/


Li-S + silicon batteries would be a dream: an immediate 8x boost in capacity for a given weight, with the size halved.

Unfortunately, the damn thing tries to swell to 200% of its size when you charge it, and lasts for six cycles. Someone may figure out how to make it work, but most likely it'll use nanotech (like A123 did with FePO4, or a graphene coating). The fact is it's such a hard problem that there are very few ways it could be done cheaply. Right now Li-S is only applicable to very, very niche applications. Solar Impulse used them, at a very high price and with limited longevity.


> By 2025, the world’s base of cumulative installed storage capacity will reach 52 GW, IHS Markit says, up from around 4 GW today. Last year, 1.3 GW of grid-connected storage was deployed globally, and this rate is poised to accelerate to 4.7 GW a year by 2020, and 8.8 GW annually by 2025.

GW is not a unit of energy storage capacity. It's a unit of power.


GW is, however, a characteristic of an energy storage system, and just a valid and important measurement as the amount of energy it stores.


> GW is, however, a characteristic of an energy storage system, and just a valid and important measurement as the amount of energy it stores.

Wrong.

The watt is a measure of power, not of stored energy. These are different physical notions. So, yes, you could say that some storage facility can output up to so many GW, but that's actually a measure of how much energy per second can exit the system when it's being used.

The proper unit for measuring an amount of energy that's stored in the system is the Wh (or its multiple the GWh), which has the same dimension as the joule, which is the standard unit of energy.

GW is like how many liters of water per second can exit the storage tank when it's being emptied. GWh is like how many liters of water total are stored in the full tank. And yes, both are important when designing a battery system. But saying "GW is [...] a valid and important measurement as the amount of energy it stores" would fail you high school physics.

GWh is stored energy (or simply - energy). GW is the rate at which energy moves from point A to point B, which is power.

Here's a brief introduction to the basic meaning of these terms:

https://cleantechnica.com/2015/02/02/power-vs-energy-explana...


GW is a very important characteristic of an energy storage system, especially as a ratio of GWh. A technology could store 100TWh, but if it can only release 1kW, I'm not interested.

The comment you replied to said "just a valid and important measurement as the amount of energy it stores". The comment didn't say power was the same thing as energy, it says that the power is as important as the energy. I don't think this would fail high school physics.


Let me quote that thing for you again. Parse it again, more slowly this time, and see for yourself:

> GW is [...] a valid and important measurement as the amount of energy it stores

There's no way to sugarcoat it. This is like saying km/h is a measurement of distance.


Once you fix the obvious typo there, it says "GW is [...] _as_ valid and important measurement as the amount of energy it stores"

Which is entirely true... Both the overall capacity, and the rate at which it can deliver power are critical metrics for an energy storage system.


You're reading it like this:

> GW is [...] a [..] measurement as the amount of energy it stores.

Which doesn't parse. If the text read "of" instead of "as", I might agree with you. As it stands, it doesn't correctly parse either way. Given the full context of the statement, the more likely original intention was that:

> GW is [...] just a[s] valid and important [a] measurement as the amount of energy it stores.


>Wrong.

At no point do you actually say what was wrong about what you quoted. What do you feel is in error?

>GWh is static energy (or simply - energy). GW is energy moving from point A to point B, which is power.

Yes, obviously. This is HN. Generally it pays to assume that people know what they are talking about around here.

It may be that you don't care about the power rating of the storage instance, but a utility does! They care about the power rating just as much as the total storage amount, as they are both essential design parameters. Particularly in the case of lithium ion, since many applications use discharges or charges less than an hour long (lithium ion batteries are typically designed with a W:Wh ratio of 1:1 - 1:4).

The original article is citing a market report from IHS Markit. Now, I can't find the particular news release for this one, but IHS Markit uses both GWh and GW when talking about storage, and as market researchers, if they were messing that up the report would be worthless. So let's look at one of the recent reports:

https://technology.ihs.com/590967/global-battery-energy-stor...

and one easier to check number:

> This was largely a result of over 100 MW of projects being completed and commissioned in California in early 2017 as part of Southern California Edison and San Diego Gas and Electric’s response to the Aliso Canyon gas leak.

Let's look at this other report on that 100 MW:

https://www.greentechmedia.com/articles/read/aliso-canyon-em...

>(CPUC) expedited the approval of around 100 megawatts of energy storage in Southern California Edison and San Diego Gas & Electric territories, in response to the Aliso Canyon blowout.

>Tesla, Greensmith Energy and AES Energy Storage celebrated the completion on Monday of three large-scale lithium-ion battery projects totaling 70 megawatts -- consisting of 20 megawatts, 20 megawatts and 30 megawatts, respectively.

Now, these are all megawatts! And they mean it, because the AES batteries were 37.5MW and 150MWh, and the Tesla battery is 30MW and 80MWh:

http://www.utilitydive.com/news/inside-construction-of-the-w...

So yes, they mean GW and not GWh, and yes, that is a very meaningful statistic when talking about the grid. It's not the only one, and for bystanders, maybe they care about the Wh more than the W, but it's not like you can design a system without knowing both.
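Both figures matter because their ratio is the discharge duration at full power. A quick sketch using the MW/MWh numbers quoted above (labels are just "AES" and "Tesla" per the comment, not official project names):

```python
# Power (MW) and energy (MWh) together characterize a battery: energy/power
# gives how long the system can sustain its full rated output.
projects = {"AES": (37.5, 150), "Tesla": (30, 80)}  # name: (MW, MWh)

for name, (mw, mwh) in projects.items():
    print(f"{name}: {mw} MW / {mwh} MWh -> {mwh / mw:.1f} h at full power")
```

So the AES systems are four-hour batteries and the Tesla system runs under three hours at full output - two very different grid roles hiding behind similar MW numbers.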


> At no point do you actually say what was wrong about what you quoted. What do you feel is in error?

The whole comment past that point is an explanation why that statement is wrong. Read it.

You're clearly confused about the difference between power and energy. There's no meaningful discussion on this topic until you figure that out.


It is obvious from epistasis' comment that (s)he is fully aware of the difference between power and energy, and also that both are relevant properties to consider when designing an energy storage system.


Please explain this concept with your understanding of power and energy:

> lithium ion batteries are typically designed with a W:Wh ratio of 1:1 - 1:4


For those of us too lazy to read about the difference between power and energy, there is EEVblog's recent video explaining it:

https://www.youtube.com/watch?v=YdbhnmA4M9g


The maximum output is interesting to know, but if you asked how much my car's fuel tank held and I told you "100mph," you'd be understandably confused.


But we're not talking about cars, we're talking about the electrical grid.

In the current stage of the grid and storage, storage capacity is used not for long duration transmission of energy, but shorter term filling in of power gaps. In the past few years that's been frequency regulation, and now the replacement of peaker plants, or instantaneous response when waiting the 10-15 minutes for a peaker plant to come online.


A watt is a measure of power, not distance. The equivalent unit for a car would be horsepower. You could express the capacity of your fuel tank in horsepower hours. Since both are measurements of power and horsepower is defined in terms of watts you could even measure the capacity of your fuel tank in kilowatt hours. Similarly, you can define the output of your engine in kilowatts.


Maximum discharge speed (which is one of the very few storage characteristics I can think of that is measured in GW) is pretty much irrelevant for storage systems. Add a few quick-discharge capacitors that cost almost nothing and boom -- your max discharge speed shoots through the roof (though only for a very short time).

Maybe sustained charge speed is a (little bit) more meaningful thing you can express in GW.

I am with the parent -- this is either a typo or a journalist quoting something he knows nothing about (what's that "h" in GWh? Strange capitalization; probably a typo; let me strike that).


Your instincts would be wrong on this one. This is a technical article that talks about all the right things for grid storage: value stacking, $/kWh (note the h), various install locations such as behind-the-meter, etc. They know what they are talking about.

Currently, grid discussions happen mostly in the MW or GW space. Maximum instantaneous discharge speed is pretty much the most important characteristic for storage, up until very very recently, since it was mostly used for frequency regulation. Lately more and more articles are discussing total energy capacity instead, as this is discussed in wider circles than just those interested in the grid. Also note that the duration of discharge for a storage system is often implicit because it is deployed in an energy market where the bids are on fixed time periods (e.g. 4h).

Ideally we'd know both the GWh and the GW, but the collated stats have been mostly GW so far.

Guessing that they don't understand GWh vs. GW is pretty far off base. Though this may be a confusion of units that happens often in discussion with lay folk, it's not much of a concern when talking to people that are discussing the grid.


I think what threw me off was when they say "capacity".

I don't work in power systems, I work with low-voltage DC stuff, so when I think "the capacity of the battery system" I think it would mean the maximum stored energy.

But "The maximum amount that can be produced" is also a valid definition of capacity, and I'm guessing this definition might be used in power generation more often ("the generator is operating at half capacity").


Exactly.

If you're interested, the context is that capacity was historically used in the utilities industry to refer to generation capacity, which is in MW or GW. E.g., the capacity of a power plant could be 500 MW, which for the decades of power production preceding renewables, could be sustained indefinitely as long as you're feeding it fuel.

By extension, when you talk about battery capacity in the context of the electrical grid, you're talking about the MW or GW of generation that you can replace during peak loads. The ability to distribute batteries across a grid to meet peak demand (and defer infrastructure/peaker plant construction) is the best way (today) to justify investments in batteries.


Well, it depends on who wrote the article. I see science journalists get this wrong frequently.

And really, as you say, everybody talking about storage systems should be quoting both energy and power.


> They know what they are talking about.

In this article, which is what we're discussing here, it is very clearly either a typo, or it's some journalist spouting off above their pay grade.


Why do you think that? I read a lot of about grid storage, and I can't come to the same conclusion as you on this one. It's very common to talk about GW in isolation of GWh for storage systems in the press, and for GW to be talked about before GWh.

And if you're used to thinking about utilities, it makes a lot of sense, you think about power capacity, and then the duration afterwards. GW is a more direct way to approach that. If you want to get duration from GWh and GW, you have to do a mental division, which is slower than multiplication.


I had the same initial reaction, but I think epistasis is probably right.

This is the best reason why they should quote both numbers: so we can be sure they haven't gotten confused and quoted energy as watts.


OK, thank you, good to know. I will re-read with this in mind to give this a fresh look.


It's weird and not usually reported like that. Possibly it is a typo, but either way it's relatively closely coupled to storage capacity: li-ion is typically used around 1C, so ~1.3 GWh of capacity was deployed, within a factor of two or so.
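For reference, the C-rate is what links the two units. A sketch of the conversion (the 1.3 GW figure is from the article; the alternative C-rate is illustrative):

```python
# At a C-rate of 1 (full discharge in one hour) the GW and GWh numbers
# coincide; slower batteries store proportionally more energy per GW of power.
def capacity_gwh(power_gw, c_rate):
    # C-rate = power / capacity, so capacity = power / C-rate
    return power_gw / c_rate

print(capacity_gwh(1.3, c_rate=1.0))  # one-hour battery
print(capacity_gwh(1.3, c_rate=0.5))  # two-hour battery
```

Which is why the "within a factor of two or so" caveat above holds: typical li-ion grid deployments sit between roughly 1C and C/4.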


I paid $354/kWh last month for 24kWh of batteries (inc 20% tax), and I got a free car that goes around it


What kind of electric car costs $8500? Used Leaf?


I just bought a 2011 Leaf with 40k miles in mint condition for $5,000. They are insanely cheap right now, and amazing cars if the range works for you (I get about 60-70 miles on mine). Also, with the free quick chargers everywhere in the Bay Area, I literally drive for the cost of car insurance now. I think it's a combination of consumer fear and lack of knowledge on used-car dealers' part that has left these things dirt cheap.


I heard from a friend that the Leaf's battery pack had a major redesign of the cooling system in recent models, that might have driven down the value of used older versions with worse anticipated battery wear.


Yeah, that's why I got the 2014 version. Turns out it also had the 6.6kW charger, even though it wasn't advertised as such, that was nice. You'd think the Nissan dealer would know something about the cars they were selling.


Hah, I misinterpreted "storage" here, and thought this was going to be about Li-ion cells getting cheap enough to replace the capacitors in the Power Loss Protection circuits in hard drives/SSDs/NVMe boards.

Which doesn't make sense, after giving it a second's thought—batteries wear out after just a few years, while capacitors (very nearly) don't, so it'd put a hard lifetime on drives. (Though, for server use-cases with constant workloads, they have hard lifetimes anyway...)

But it's an interesting question! If—rather than UPSes as specialized enterprise-level hardware—we instead had Li-ion cells on computer motherboards (picture a big brother to the CR-2032 CMOS battery) that kept the disks alive for up to, say, 30 minutes after power loss—could we architect our storage subsystems differently?

(Sorry if that's a complete tangent from the topic at hand, but I'm not sure where I'd post this otherwise!)


Keeping the disks alive is certainly doable. At a rough estimate of 10W per drive, even storage servers would only need very moderate batteries. But what would you gain by that?

More interesting (in my eyes) would be to power down spinning disks and processors, but keep caches and ram alive. Then once power is restored, you could basically supply power and continue where you left off, without any reboots etc.

Then again, power loss should be fairly infrequent, and you have to architect against connection cuts or physical disruptions anyway.
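The 10 W-per-drive estimate above makes the battery sizing easy to sketch. The drive count and cell voltage below are assumptions for illustration:

```python
# Energy to keep drives spinning through an outage is just power x time;
# dividing by nominal cell voltage gives a rough li-ion cell size.
def battery_wh(drives, watts_per_drive=10, minutes=30):
    return drives * watts_per_drive * minutes / 60

def cell_ah(wh, cell_voltage=3.7):  # nominal li-ion cell voltage
    return wh / cell_voltage

wh = battery_wh(drives=12)  # a hypothetical 12-drive storage server
print(f"{wh:.0f} Wh, about {cell_ah(wh):.1f} Ah at 3.7 V")
```

So even a dense storage server needs only a laptop-battery-sized pack for a 30-minute disk hold-up, which supports the "very moderate batteries" point.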


Congratulations, you just invented the ACPI S3 power state. Also known as standby or suspend to ram. I've been pretty happy with it since ~1998 when I started returning hardware that didn't support it properly. Although these days even linux on a laptop tends to work properly with S3 and S4 (hibernate, or suspend to disk). That said the concept was also part of the earlier APM standard which was a lot more hit/miss.

Anyway, it's great. I used it today to ride out a nearly hour-long power outage with a couple of fairly trivial UPSs connected to my NAS, desktop, etc. The desktop was already in standby because that is its default state after being idle for 30 mins. But the NAS went into that mode when the UPS indicated 70% battery life.

The problem of course is that UPSs/etc have miserable power conversion efficiencies when supplying just a couple watts of power, like 10-30% efficiency, so they tend to have max run-times less than 2 hours even when completely unloaded.

More on point, for servers, if your server is lightly loaded, or loaded for only parts of the day, using WOL to wake it and going into standby after a few minutes can save a fair amount of power. With SSDs, the resume times are generally less than a second unless you have bad cards/drivers that take forever to reinit.


If you have a look through some of the Google datacenter tours you'll see they actually implement this idea on a massive scale, though I believe they use lead-acid batteries... Every motherboard in the datacenter has a small battery capable of powering it for a long enough period of time for the backup generators to kick on and take over.

It's definitely a clever idea! It makes maintenance much less risky because the backup is distributed, and it also avoids having a single large UPS fail right when it takes over the load!


I guess the hardware does exist, but I was thinking of a different use-case for it: not for fault-tolerance per se (i.e. waiting for the power to come back on or the generator to kick in), but rather for giving the disk cache up to 30 minutes (after the rest of the machine is well-and-truly cut from power) to flush to the disk itself before finally shutting off, rather than a few seconds to empty its pending writes and park the disk head.

You could, therefore, have a really, really big disk cache, and a relatively slow-to-flush disk. With such an architecture, you could build systems that use disk+cache the way Optane NVMe is being used, without having even needed to invent Flash memory to get fast, highly-parallel writes first.


This is how battery-backed RAID cards used to work. An on-card rechargeable battery could keep the write buffers alive for about 24-48 hrs to flush to disk after you restored power in your datacenter.

Now, they use super-capacitors and a flash memory device to flush the volatile RAM buffers to flash, then on the next power-up the firmware flushes the flash buffers to disk. This all happens within the RAID controller, so the host OS thinks the write was already confirmed once the PCIe transactions finished and the data blocks landed in the controller's RAM.


> kept the disks alive for up to, say, 30 minutes after power loss

Not relevant to disks, but you could have battery-backed DRAM? Commonly used in RAID controllers as a cache.

The main problem with storage architecture is too much hardware that lies about whether it's actually committed a write.


Portable computers sort of do that.

I had a vague idea that I had heard something similar before:

http://www.zdnet.com/article/google-reveals-secret-server-ha...


>IHS Markit expects li-ion battery prices to fall below $200/kWh by 2019

In that article they cite a figure of $200, while you can already buy an assembled 1 kWh battery pack in China, in retail quantities, for around $120 today.


Is that NCA or NMC?

Typically grid storage batteries are rated to 10+ years or ~5000 cycles or something on that order of magnitude.

One way to get your cells to last that long is to reduce the depth of discharge; that may account for some of the price difference.
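To put rough numbers on it: the cost per kWh actually cycled is the pack price divided by (capacity × depth of discharge × rated cycles). A toy calculation in Python, with made-up but plausible figures:

```python
def cost_per_kwh_cycled(pack_price: float, pack_kwh: float,
                        cycles: int, depth_of_discharge: float) -> float:
    """Pack price divided by total energy delivered over its rated life."""
    energy_delivered = pack_kwh * depth_of_discharge * cycles
    return pack_price / energy_delivered

# e.g. a $200, 1 kWh pack rated for 5000 cycles at 80% DoD:
# 200 / (1 * 0.8 * 5000) = $0.05 per kWh cycled
print(cost_per_kwh_cycled(200, 1, 5000, 0.8))
```

The trade-off is visible immediately: halving the depth of discharge doubles the per-kWh cost unless the shallower cycling buys you correspondingly more cycles, which for li-ion chemistries it often does.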


Most likely consumer-grade LiCo cells with a carbon anode. LiFePO4 modules are twice as expensive; demand for them certainly exceeds supply.


Any medium-to-large commercial campus with time-of-use billing and large electrical bills would likely benefit from building a 500 kWh to 3 MWh li-ion battery bank, a 480 VAC inverter, and the transfer and switchgear to buy electricity off-peak and use it at peak. The prices of lightly used inverters and LFP module packs make it compelling to evaluate. For example, Jehu Garcia (@jag35) is involved in a 1 MWh bank for a manufacturing customer in SoCal that looks really P&L sensible.
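A back-of-envelope sketch of that arbitrage in Python; all prices and the round-trip efficiency here are placeholder assumptions, not a real tariff:

```python
def daily_arbitrage(kwh_shifted: float, offpeak_price: float,
                    peak_price: float, round_trip_eff: float) -> float:
    """Net daily savings from charging off-peak and discharging at peak."""
    cost = kwh_shifted * offpeak_price                    # energy bought off-peak
    revenue = kwh_shifted * round_trip_eff * peak_price   # energy delivered at peak
    return revenue - cost

# e.g. shifting 1000 kWh/day at $0.08 off-peak vs $0.25 peak, 90% round trip:
# 1000 * 0.9 * 0.25 - 1000 * 0.08 = 225 - 80 = $145/day
print(daily_arbitrage(1000, 0.08, 0.25, 0.9))
```

Even with these hypothetical numbers you can see why the peak/off-peak spread matters more than the absolute price: the round-trip losses come straight off the peak-side revenue, so a narrow spread can make the whole exercise a wash.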



