The US grid battery fleet is about to double – again (canarymedia.com)
47 points by mfiguiere on Jan 15, 2024 | 79 comments


How long does the typical battery last before it needs significant new investment (major maintenance or replacement)? In other words, how long before we need to install another 30 GW not to increase capacity, but to maintain it?

I'm not a pessimist but I know that at this stage in an adoption cycle, it's all gain, but that's not the long-term reality. I'm wondering what the long-term looks like.

In part, if the capacity keeps doubling, replacing 30 GW will be trivial compared to the new capacity. But eventually that will flatten out.


Tesla Megapacks come with a 15-year "no defect" and "energy retention" warranty. A 10- or 20-year "performance guarantee" is available for an additional cost. (An example of Li-ion-based storage; I assume it won't differ wildly for other producers.)

I think they can in principle last longer if used well. You can control the temperature, you can use weather forecast (as part of some network-wide forecast of future required capacity) to decide how much you should charge the batteries. You'd use the "unhealthy" charging/discharging only in extreme situations, while in the day-to-day cycles you'd keep the batteries within the healthy voltage range.


Two years ago, when I was considering installing a battery along with my solar panels, here are the estimates I made:

1. Additional cost of installing a 13.5kWh Tesla Powerwall was quoted as being around £12,000

2. Warranty was 80% capacity after 10 years

Now, my experience with batteries in general is that once they start to go, they deteriorate pretty rapidly; so 80% after 10 years to me basically means treating 10 years as the lifetime of the battery.

£12k for 10 years is £1.2k per year, or £100 per month: that is, the battery would have to allow me to save £100/month in electricity just to break even -- and that's before factoring in the cost of capital (i.e., if I took that £12k and invested it somewhere else, I'd have a lot more than £12k after 10 years).

Even if the batteries managed to eke out a reasonable capacity for 20 years -- which seems pretty unlikely to me -- I'd still have to somehow save £50/month to break even. All in all the battery just didn't make financial sense for me as an individual.
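
A rough sketch of that break-even arithmetic in Python (the 4% alternative-investment return is just an illustrative assumption, not part of the quote):

  # Rough home-battery break-even, using the quoted figures above.
  cost_gbp = 12_000       # quoted installed cost
  years = 10              # treat the 80%-at-10-years warranty as the useful life
  alt_return = 0.04       # assumed annual return if the cash were invested instead

  simple_breakeven = cost_gbp / (years * 12)        # ~100 GBP/month
  foregone = cost_gbp * (1 + alt_return) ** years   # ~17,800 GBP after 10 years
  with_capital = foregone / (years * 12)            # ~148 GBP/month needed

  print(f"{simple_breakeven:.0f} GBP/month ignoring cost of capital, "
        f"{with_capital:.0f} GBP/month counting forgone returns")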

One expects that the companies doing this on an industrial scale have done the math and found economies of scale that make it profitable; but it's not out of the realm of possibility that, through a combination of optimism, overselling by battery companies, and FOMO, they've drastically overestimated the performance of their batteries, and will find 15 years down the road that the batteries are beginning to fail without having paid back the capital investment used to buy them.


If I'm reading the website right, UK prices for Tesla Powerwall units are now: "1 Powerwall £6,000, 1 Gateway £900": https://www.tesla.com/en_gb/powerwall/get

I think you may be too pessimistic about it dying immediately after reaching 80% capacity, given that most homes won't fully discharge a thing that size most days, and 10 years is 3,652 days, which seems like a reasonable number of cycles to 80% for a variety of specific batteries (some do more, some do less; but not only do I not know which is cheapest or what Tesla uses, I wouldn't understand the answer if they gave it to me).

Also, for grid users, it probably matters how many cycles[0] rather than how many years, and how much they cycle them depends on the entire rest of the grid. Also, as space isn't the limiting factor[1], any unit can probably be kept in the system down to 5% of initial capacity.

[0] even that's a simplification, if you can store more joules in total over the lifetime by never charging to a full cycle

[1] look how space inefficient hydro dams are, and yet we still use them; battery tendency towards spicy pillows, however, is always a concern.


Is there a reason you did the math with a 13.5kWh battery? Do you heat a large house electrically? Do you need to charge an EV at night off that battery?

Most people can get away with a significantly smaller battery - if your goal is to just get a normal house through a single night. That goal can be achieved pretty economically, especially if you're willing to use grid electricity on mornings with little sun on your panels or nights with unusually high demand.


Basically the quote we got from the local company gave us the option of a 10kWh non-Tesla battery for about £10k and a 13.5kWh Tesla battery for £12k. The math doesn't change that much between the two. I had also looked into battery sizing previously based on our annual usage; but that was over 3 years ago so I don't remember what I came up with.


What is the point of a grid battery if you don't heat your house electrically, particularly from the point of view of climate change, which is meant to be the real reason we're doing this?


The difference is mostly financial. Lots of people around me install/upgrade their rooftop PV system right now, and most of them get a small battery to make it through the night. Both a new PV system and a battery is subsidized, and the money is easy to get.

Heat pump upgrades to existing houses are much more expensive, and for some reason the subsidies are incredibly complicated to get, and there are no guarantees for exactly how much you'll get approved.

Baby steps, I guess. Financial incentives (and the extreme price drop in batteries and PV modules) mean we're starting with switching normal household electricity consumption to renewables. We'll do household heating... some other time.


I'm pretty skeptical "energy through the night" is financially viable for anyone. Whenever I've looked into it, the cost-per-kwh on a battery blows away any savings over the cost of just paying for off-peak electricity.

Which makes sense of course: if it didn't, then it would be more cost-effective for larger players to simply build grid batteries and arbitrage the power cost down to parity.

The only real value I've seen is when you're selling your own battery capacity under grid-stabilization programs (where some other power company is managing virtual power plants). And I'm still fairly skeptical that's even a sensible business model (i.e. how much real estate cost can you be saving versus the cost of installing distributed batteries as opposed to centralizing them?)


If you buy a Tesla Powerwall, you also get a system that keeps running if you have a power outage.

There are plenty of areas which would not mind paying a little bit more for this feature.

If it doesn't make sense for you, all good.

The batteries will get even cheaper I'm pretty sure.

And the big batteries make their money through grid stabilisation.

But there are a few companies that can leverage your battery at home so it acts as part of a big battery.


£50 isn’t that hard to save.

The Octopus off-peak rate is 7p vs the normal 30p/kWh. If your bill is £70/month for electricity it suddenly becomes £20, so there is your £50 immediately. My bill is £140, so I would save £100.
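
Rough arithmetic behind that, assuming the whole bill is unit charges and everything can be shifted off-peak (both generous assumptions):

  # Shifting a 70 GBP/month bill from the peak to the off-peak rate.
  peak_rate, offpeak_rate = 0.30, 0.07   # GBP/kWh
  monthly_bill = 70.0                    # GBP at the peak rate

  usage_kwh = monthly_bill / peak_rate        # ~233 kWh/month
  offpeak_bill = usage_kwh * offpeak_rate     # ~16 GBP/month
  print(f"saves ~{monthly_bill - offpeak_bill:.0f} GBP/month")   # ~54 GBP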


Warranty or not, I don't think I'm relying on Tesla's performance claims as data. I know, it's warrantied - 'but wasn't it your fault? What humidity level did you store it in? The moisture sensor says ...'


We will probably also switch to different chemistries for grid batteries. Sodium batteries for example are cheaper. Iron oxide is also interesting.


If it keeps on doubling like this, a fraction of the installed base may be affected by this some 10-15 years down the line. That's kind of how the math works out. Most batteries around that time will be new ones. And that seems to be the time line that these things are sold for. They might have quite a bit of life in them beyond that of course. It's not like they just stop working.

In any case, we're talking about first and second generation products here adapted from batteries designed for the automotive industry. It works and it's great but there are better options coming to market that are more optimal for grid storage. Ten years from now, this market is going to be very different in terms of cost, performance, longevity, and volumes produced and deployed.

The world's electricity production is around 25 PWh (25,000 TWh) per year. So, you can do some back-of-the-envelope math on that. We're going from hundreds of GWh to TWh of batteries produced per year in the next years (overall, not just grid). Each of those batteries is good for thousands of charging cycles. So if you were to charge and cycle each battery every day to cover 1/365th of 25 PWh, you end up needing around 70 TWh of batteries. We don't actually need that much of course, since there are other options on the grid. But it shows that it could be doable and might be what we have in the field in a few decades. A few TWh of battery on standby goes a long way and might just sit there fully charged most of the time.
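
A minimal check of that figure:

  # One day's worth of global electricity, from the ~25 PWh/year figure above.
  world_twh_per_year = 25_000
  one_day_twh = world_twh_per_year / 365
  print(f"~{one_day_twh:.0f} TWh/day")   # ~68 TWh, i.e. roughly the ~70 TWh mentioned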


I don't know the actual numbers on that, but I do know that stationary storage lasts much longer than e.g. EV batteries, since there are no weight restrictions, so they're cycled more gently, temps are better, they can be kept in a more optimal window, you can use things like LFP that are heavier, and BMS has more latitude with more cells to work with.


The doubling of production is also what drives the learning-curve cost decreases, at a rate of 20% per doubling. That is what has reduced battery costs by 90% over the last two decades and is widely predicted to continue.

https://ourworldindata.org/learning-curve
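
Quick sanity check that those two numbers are consistent (20% per doubling vs. a 90% drop over two decades):

  import math

  # How many doublings of cumulative production does a 20%-per-doubling
  # learning rate need to cut costs by 90%?
  learning_rate = 0.20
  doublings = math.log(0.1) / math.log(1 - learning_rate)
  print(f"~{doublings:.1f} doublings")   # ~10.3, i.e. one every ~2 years over 20 years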


It's easy to explain to finance people. It's a form of commodity speculation. Storage is a buy-low, sell-high business, with a daily cycle. This expansion will continue until peak and valley prices start to level out.


> easy to explain to finance people. It's a form of commodity speculation

Energy trades unlike any other commodity due to its perishability. There is nothing easy about trading power, though if someone has a background in derivatives, that helps. (It's also irregularly perishable.)

For the lay person, I'd recommend reading up on Fabozzi's Fixed Income before trying to explain anything to anyone about trading power.


There is significant cost in storage though, so there will still be peaks and valleys.


Doubling in 2024 is great but the estimate for 2025 is nowhere near doubling again. How many doublings will we need to store enough energy that solar can replace all non-renewables? I imagine it's still a lot.

Edit: this page claims we need 84 times what we had in 2022, the estimate for 2024 is ~3x higher, so call it ~5 more doublings after this year. https://www.alsym.com/blog/how-much-energy-storage-do-we-nee...


The opposite needs to happen, industry needs to adjust to taking advantage of regular daily energy price swings. Plenty of industries can.

Instantaneous pricing for electricity motivates people and industry to make better choices.

Very cheap energy will enable interesting things too.


Indeed.

As a consumer in Ohio, I find variable-rate pricing kind of scary: it's nice to be able to run the clothes dryer without any concern over what time of day it is. We don't do variable-rate pricing here (at least for residential users).

But I'd manage to sort it out in pretty short order, I think: It usually isn't very important to me what time of day the clothes get dried (as long as they don't sit long enough to get stinky), so I would indeed be motivated to pay attention to that. I, myself, would even become motivated to automate it so that the clothes begin drying automatically when energy is [relatively] cheap.

I can also imagine things like automating water heating: Burn Joules when it is cheap to do so, and store [most of] them for later use at a time that may be more convenient to me. (The math gets interesting on this one.)

Having very cheap energy be available occasionally would also be neat: I've got computationally-intensive tasks that need to get done some day. These cost real money to run on the hardware that I have. It sure would be nice if I could save some money by only doing these tasks when power is cheaper. (Right now, I do try to optimize timing them for energy efficiency. For instance, in the summer it is cheaper to run them on cool nights when the windows might be open than during a hot, sunny day when the aircon might barely be keeping up.)
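
A minimal sketch of that kind of automation, assuming you have some price feed and a way to switch the load on; get_price and start_load here are hypothetical placeholders, not a real API:

  import time

  # Defer a flexible load (dryer, water heater, batch compute) until the
  # spot price drops below a threshold. get_price and start_load are
  # placeholders for whatever price feed and smart switch you actually have.
  PRICE_THRESHOLD = 0.10   # assumed $/kWh

  def run_when_cheap(get_price, start_load, poll_seconds=300):
      while True:
          if get_price() <= PRICE_THRESHOLD:
              start_load()
              return
          time.sleep(poll_seconds)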


Realistically both things will need to happen. There is a lot of demand that can’t be time shifted by more than a few hours. We need storage for a few weeks of consumption eventually.


Storing even 10% of US electricity usage from summer to winter could be in the tens of trillions of dollars. Spending $10,000 to store $2 of electricity in a battery is not scalable. Other forms of energy storage are the future.
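
Order-of-magnitude check on that claim (the $150/kWh pack price is an assumption for illustration; installed costs are higher):

  # Storing 10% of annual US electricity in batteries.
  us_twh_per_year = 4_000     # rough annual US electricity consumption
  fraction_stored = 0.10
  usd_per_kwh = 150           # assumed pack-level battery cost

  stored_kwh = us_twh_per_year * fraction_stored * 1e9   # TWh -> kWh
  print(f"~${stored_kwh * usd_per_kwh / 1e12:.0f} trillion")   # ~$60 trillion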


Anyone else starting to get the feeling that the idea of "base load" power was a scam? By "base load" I mean the insurmountable blocker for renewable adoption.

The storage industry has already proven it can basically double national installed capacity from one year to the next: It did so in 2023 and 2022, and in 2021 it more than tripled the previous year’s tally. The continuation of this trend just gets more impressive with time: A few years ago, doubling storage capacity only meant building 1 gigawatt. Now, the industry is looking at adding 14 gigawatts in a year, requiring an unprecedented amount of work at project sites around the country.

It's hard to believe how quickly the renewable revolution is happening. The rise of AI? What about the rise of renewables, holy hell.

I was browsing panels the other day for my upcoming project to convert our entire house to solar (we only bought last year; it's top priority for me, for my children's future), and I cannot believe that I can buy top-of-the-range 550W panels for about $200 USD per panel. My electricity bill is about $400 a month. It's totally ridiculous. I cannot believe more people haven't caught onto this. I'll be able to pay off the whole system, including batteries, in about two years' worth of electricity bills (granted, I'm mostly installing it myself). It's wild.


I've been in your boots for 15 years, and I've barely convinced anyone. Even my thermal panels are viewed with suspicion, and people don't believe me when I tell them my hot water comes from them, with a little help from an electric heater in the winter. They pay for themselves once a year, easily. Frustratingly, the more common doubt is "if it's so good, how are there so few people doing it?"

At this point I gave up. I enjoy my free electricity, and even earn some money, but quietly.

A pending change in households is going to 24/48V DC. The inverters are costly, and they feel stupid when you know a lot of appliances have an AC/DC converter to undo what your $3K inverters are doing 10 meters away, losing maybe 20% in the process.


Every time I bring up that subject, people get aggravated and point out that lower voltage means larger cables, which is true, but I wonder if the tradeoff of paying more for thicker cables is worth the simplicity it brings.


It might pay for itself eventually.

Suppose I'm making toast -- a process that normally uses around 1100W, anywhere in the world.

At 120V, that's about 9A, which is fine for a 15A branch circuit running with 14AWG wire (the very cheapest of wire that we ever use for this stuff in the States).

At 48V, making toast requires about 22A, and thus needs something more like 10AWG wire, which uses about 2.5x as much copper.
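
The arithmetic, for anyone checking (the cross-sections are standard AWG figures):

  # Current draw for an ~1100 W toaster, and the copper in the wire gauges mentioned.
  power_w = 1100
  for volts, gauge, mm2 in [(120, "14AWG", 2.08), (48, "10AWG", 5.26)]:
      print(f"{volts} V: {power_w / volts:.1f} A on {gauge} (~{mm2} mm^2)")
  # 120 V -> ~9.2 A; 48 V -> ~22.9 A; 5.26 / 2.08 ~= 2.5x the copper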

But that bigger wire only needs to be purchased and installed one time, while any efficiency gain gets to be kept (presumably) forever.

And that efficiency gain (achieved by having fewer, simpler electronics between the sun and the toast in my kitchen) means a smaller solar array, and a smaller battery bank -- stuff that does wear out eventually.

So why not do both?

Why not stick with high voltage (because it's cheaper), and also switch some things over to DC?

Neither the wires in my walls nor the electromechanical bits of my toaster know or care if things are being powered with 120VAC or 120VDC. The toast comes out the same either way.

Why must voltage also decrease?


What simplicity? Houses aren't limited from being off-grid because of 120VAC conversion efficiency, and the new hotness in batteries is HVDC, which makes chargers cheaper because it aligns battery voltage to common DC MPPT voltages from solar panels.

The industry has been literally moving away from low voltage DC recently.


> Frustratingly, the more common doubt is "if it's so good, how are there so few people doing it?"

Do you think that it was prohibitive from a cost perspective 15 years back? Was it always this economically viable? I mean, at ~$200 a panel it's a true no-brainer; maybe the initial outlay was a little too much for people way back when?

On the other hand, I guess with electricity prices peaking in many parts of the world, it would've easily paid itself off by now for most people.


Yes, it made less sense 15 years ago. I got some help from the government with the loan. IIRC, I calculated the break-even at 7-8 years. In fact, I also bought a sun tracker, because at the time it was cheaper than buying more panels.


> my thermal panels

How cold is it where you're at? I'm weighing putting in a new tankless boiler system, as well as a rooftop PV, but thermal solar might be a better bet.


Solar thermal works well even in northern latitudes, for example this municipal bath uses them extensively: https://maps.app.goo.gl/w571B7hTXCUBpN7Q6

Edit: This building is of course a bit exceptional in that it is both well-aligned (south-facing) and has no trees or other things shading it. Another thing working in its favour is that reflections from water, ice, and snow also provide a non-negligible boost in captured energy.


Winters average 5-10ºC, summers 15-20ºC (nights included). Winter months average 100 hours/month of sun.


> Anyone else starting to get the feeling that the idea of "base load" power was a scam

I don't think it was in the past, it's just becoming obsolete, piece by piece. Each method of more traditional power production has different capabilities for ramping up and down, in descending order: gas, hydro, coal, nuclear. Now we have renewables entering the market, which so far have more or less had to be matched with gas peaker plants for scaling up and down. Batteries are obviously putting downward pressure on peak energy generation.

Furthermore, we've had the classic paradigm of electricity demand, where if I put a load onto the grid, like turning on my oven or flipping a light switch, it must function. Now we have electric cars, heat pumps, hot water heaters, and even in parts of Scandinavia washing machines, which schedule themselves to run during off-peak times.

Where we find ourselves now is market forces working themselves out, with investors buying into battery storage, and homeowners switching to time-of-use billing for their energy bills to take advantage of cheap electricity at night when charging their cars.

In energy politics we obviously still hear the term base load, but it's now nothing more than rhetoric of an outdated era.


> I don't think it was in the past, it's just becoming obsolete, piece by piece.

This is what I'm questioning though, 30 years of hand waving about "base load", and all the stories about how renewables aren't sufficient, but then, oh wait, actually, we can probably do it now.

Maybe, just maybe the tech wasn't there, but it is convenient that when push comes to shove, we do have the technology. If the investment was there 30 years ago, it feels like we could've made a lot more progress. But the narrative persisted.


How much energy storage do you think those 14 gigawatts of batteries represent? How long do they provide 14 gigawatts for?

Then go cost out how much it would take to deal with, say, 3 days of solar underproduction due to grey skies for your house. In most studies, the capacity factor of a solar plant is about 25% at best, so that 550W panel is worth about 137W averaged over the course of a year, presuming you can store all of it.
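
To make that concrete, a rough sizing sketch; the 20 kWh/day household figure is just an assumption for illustration:

  # Riding out 3 grey days on stored energy alone (ignoring the reduced but
  # non-zero solar output on those days, so this is an upper bound).
  panel_w = 550
  capacity_factor = 0.25
  avg_panel_w = panel_w * capacity_factor      # ~137 W averaged over the year

  daily_use_kwh = 20                           # assumed household consumption
  storage_kwh = daily_use_kwh * 3              # ~60 kWh, vs 13.5 kWh for one Powerwall
  print(f"~{avg_panel_w:.0f} W average per panel; ~{storage_kwh:.0f} kWh to coast 3 days")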


In this case, yes, why can't my energy requirements be supplied by a fully charged grid battery, solar from somewhere else, a gas peaker, or something similar?

Doesn't seem like a compelling case for running coal power plants 24/7 to be honest.


For the same reason they're not now, and why you currently don't have batteries and are only just now considering solar panels: cost.

For example, here's the breakdown of the Australian NSW energy regulators supply and demand dashboard: https://aemo.com.au/en/energy-systems/electricity/national-e...

See the scale on the right for demand? The bottom is 6,000 MW. That's 24/7, all year round, pretty much. My home state never drops below 6 GW of constant, continuous demand. That's baseload. Doesn't matter what it's made of, doesn't matter what its components are; if you want to avoid brownouts or blackouts, then at all times there must be at least 6 GW of generation available overnight.

So, applying the 1:3 rule-of-thumb for LiFePO4 power:energy, overnight we have a period of at least 8 hours where we need at least 48 GWh of storage - and we're going to use all of it. Of course, that's a number where you just scrape through - because to recharge that storage, you're going to have to supply at least double that amount of energy to support the baseload while you do it. So now you need 96 GWh of daily generation. But solar doesn't have the capacity factor for it, remember - 25% at best, over time. So optimistically we'll need to deploy about 384 GW of solar to charge that system. Only...we can't rely on that either, because 25% is...an average over time. And we absolutely have to charge those batteries to make it through the following night.

But wait: there's a big mismatch here. We can't just amortize over 384 GW of solar. Because all of that solar might be generating at full power during the day. Or it might be under-performing, or not performing at all. We have this massive surplus we need to have, but our batteries - 48 GWh of them - are going to give us maybe 16 GW of power, and likely they'll be able to absorb energy slower than that (i.e. charging would be maybe 90+% efficient). We can't charge them faster than 16 GW: that big array is solely to try and meet an average amount of charge to get us through the next night - provided nothing else goes wrong. And we can't use it efficiently, because we also need the batteries during the day. Clouds over a solar plant kill the output instantly, so the battery has to step in to compensate and retain grid stability.

So the actual amount of battery capacity we need to get us through one night is going to get considerably larger than 48 GWh (16 GW). In fact, ideally we actually need...pretty much 384 GW of batteries. Because if our arrays perform well, we need to be able to soak all that power up to have enough charge to get through the night, but we also need enough batteries to sustain the arrays going down during the day and needing to run the grid off the storage momentarily...but we can't afford not to be charging, because on average we're only getting 96 GW - but the lows and highs are very far from that number.

So from that one bit of analysis - and making no accounting for emergencies, equipment failures, efficiency of individual components (i.e. 90% battery charge efficiency + 10% losses in transmission lines etc.) we're currently at a tally of 384 GW of solar, 384 GW of batteries, and we have no redundancy whatsoever in this system. Because we can't get a reliable 6 GW from solar.

Now obviously the picture gets better if you include other things: i.e. wind tends to match solar dips and does work at night, so a combined solar/wind capacity factor is usually about 50%, and with better modelling you could shave some of these absolute margins into more balanced ones. But the problem remains: you've got to charge the batteries, and there's a limited rate you can do it at. And it's a problem which gets worse for something like pumped hydro, because pumped hydro can have higher energy storage but it has much lower power output as a proportion - meaning it takes longer to charge (it would however be a good backstop for long-term storage if we could build enough of it - can we?). There are also some positives - i.e. over time that giant over-sized battery installation is going to get way better cycle life, since it's now >1000 GWh of storage capacity and we won't actually be using all of it, or even a fraction, very frequently. We actually have a pretty good buffer over time if we expand the generating capacity further, since we could go a couple of weeks with no sun without running down our buffer.

Of course...current Australian generating capacity for solar in 2023 - nationally - is 32.9 GW.[1] And globally...there's about 300 GWh of LiFePO4 in existence at all.[2] And the deeper you regularly cycle your batteries, the more expensive per unit they become.[3] Which is a problem, because I've just proposed installing ~US$154 billion of batteries (assuming the low-end cost-per-kWh estimate)[4], more than the entire world supply, to be able to adequately guarantee baseload electrical supply for one state of my relatively small country. Or about US$25 billion per reliable GW, in batteries alone. Which makes the current expensive nuclear power plants look downright cheap, and ITER would still be competitive when it's actually done.

[1] https://www.theguardian.com/environment/2024/jan/04/australi...

[2] https://www.lifepo4-battery.com/News/10-Largest-BATTERY.html

[3] https://gwl-power.tumblr.com/post/130701906811/faq-lifepo4-c...

[4] https://www.nrel.gov/docs/fy21osti/79236.pdf


> For the same reason they're not now, and why you currently don't have batteries and are only just now considering solar panels: cost.

No, I don't have solar or batteries because I only just purchased a property and ran out of time to do more work before winter arrived (snow). Maybe your point is I couldn't have afforded it ten years ago regardless?

Appreciate the long response. I honestly read your post as a lot of insurmountable hurdles when realistically, as you point out, we can do the numbers and we will build the solutions.


He, like me, comes from Australia. He's quoting you figures from NSW.

South Australia is another state in Australia. Unlike NSW, SA doesn't have any native deposits of coal or natural gas, so about a decade ago SA started renewable generation. It is now running at 70% renewable. That's a yearly average, not some peak figure. A 100% renewable day is pretty normal for them. That's the highest in the OECD. In 2026, they expect it to be 85% https://www.energymining.sa.gov.au/consumers/energy-grid-and...

I don't doubt they will get to 85%. XorNot's comments might be relevant for the remaining 15%, but I suspect he will find the sand moving underneath his feet. For example, he only talks about storage, but a large portion of SA's stability comes from overprovisioning. It's possible to overprovision because renewables are so cheap. I suspect EVs will have an impact. Even if they don't contribute to the grid, they can soak up all that overprovisioning.

Then there is storage he doesn't consider - like hot water systems. There are other similar loads, like water desalination and aluminium production. Many of these are switched loads. And they have been switched loads forever because matching generation to load isn't a new problem. Coal (nuclear is worse) has a comparable problem to renewables - it can't be switched off quickly. They needed somewhere to dump all that power. So among other things they made aluminium with it.

Economics drove the introduction of switched loads back when the price of power wasn't as unpredictable as it is today. Renewables have made wholesale electricity prices wild in Australia. As in, on most days the electricity price will go negative, i.e. they pay you to draw power. At other times the price goes through the roof. We have a retail system that shields us from that, of course, so you get a flat(ish) price, but you can opt to go through a retailer that exposes you to the wholesale price. If you do that, a battery currently has a better return than installing solar. My guess is that won't last forever, as a lot of households will install batteries, just as they have done with solar in Australia.

But only if they can get in before the hydrogen generators do. They are also eyeing off the cheap power that overprovisioning creates. https://www.dcceew.gov.au/energy/hydrogen They are looking at that as a natural gas replacement.

I have no idea where we will end up. But I'm pretty sure that if you only consider solar, wind, batteries and pumped storage, as XorNot does, then you have a very blinkered view of what the energy future holds.


South Australia is a much smaller energy market: https://aemo.com.au/en/energy-systems/electricity/national-e...

That graph actually does go to 0, but it peaks at only 2.5 GW. The grid is backstopped by two state interconnectors[1], with a third under construction. The existing two have 650 MW and 220 MW of capacity. The new one will have 800 MW of capacity. Just those two connectors sum to 0.87 GW, or 34% of the peak usage of the state. The overnight demand low is less than the capacity of those interconnectors.

SA backstops its baseload through other states (and there's nothing wrong with that, but someone, somewhere, has to have the generating capacity to provide it - if it's all batteries, then none of that helps).

[1] https://www.energymining.sa.gov.au/consumers/energy-grid-and...


There can be extenuating circumstances. Adding panels would void my roof’s warranty and I couldn’t afford a broken roof if something were to happen.

I’m good at math. I can see that solar would save me buckets of money for a small investment. It’s just that my hands are tied for other reasons.


I don't think it's a scam; scams require malice. It's sufficient that people don't expect exponential change even when it's been happening reliably for ages — after all, even when people do learn about exponential growth, many say things like it has a "knee" or reaches an "inflection point".


There's a survivor bias, many things do stop growing at some point, but these are harder to notice. You're more likely to perceive the things which did scale exponentially.


Sure — all exponentials eventually turn out to be sigmoids, or some quote to that effect — but I'm unclear why this matters as I'm suggesting most people don't even know what an exponential really is and what to expect from them in even the short term?


> it's sufficient that people don't expect exponential change even when it's happening reliably for ages

This sort of implied that people should have been expecting exponential change. But even if you know what exponential growth is, you couldn't know in advance with certainty if it would be the case for battery storage, renewables etc.


For batteries and renewables, the growth had already been exponential for a decade or more prior to the relevant conversations and threads, but people were (even last month on HN) looking at the currently installed capacity and responding as if that was all it could ever be.

(At some point the growth will slow, but that's never the criticism given in any replies I remember).


It's the same problem, you don't know when the exponential growth stops / fades.

Nuclear power was growing exponentially during 60s, 70s, 80s. Would it mean it has to continue growing exponentially through 90s and 00s?


It would be the same thing if the people I'm thinking of actually meant "I think the exponential will, or might, soon stop"; I'm asserting they didn't demonstrate any awareness of what an exponential even is, which is a precondition to be able to make the claim it was going to stop.


Baseload is indeed meaningless unless you put a number on it, in GW and GWh actually needed. People wielding the term without doing that (i.e. most of them) are basically insisting that unspecified amounts of energy are needed for unspecified amounts of time for unspecified calamities that may need said unspecified capacity. It's usually accompanied by handwavy statements about clouds, weather, and seasonal darkness, and the suggestion that we should instead put all our resources into building nuclear plants.

This is indeed bullshit, because as soon as you specify these numbers, it becomes a simple engineering challenge with some clear economics that you can model for different solutions.

The numbers for domestic solar are indeed such that most installations pay back within a few years in most parts of the world. The largest cost these days is not even the hardware but the installation, and getting the time of the certified experts who can do it. But even factoring in all that, you basically end up earning your money back.

If you plop down enough panels and batteries, you won't need anything else.


> it becomes a simple engineering challenge with some clear economics that you can model for different solutions.

This, 100% - this is exactly how I felt about it too. Once it's a simple engineering and (IMO) economics challenge, it turns out it's quite possible.


The US consumes approximately 4 trillion terawatts/hour of electricity. If this prediction holds true we will have 14 gigawatts of storage this year. You do the math and you tell me whether base load generation is still an issue or not.


> The US consumes approximately 4 trillion terawatts/hour of electricity.

It's roughly 3800 TWh (terawatt-hours) per year, no need to invent new units.


My bad!

So, the issue is that the US needs 3800TW of electricity every hour, and by the end of this year it will have 0.014 TW of battery capacity. It is clear that batteries are not replacing base load generation any time soon.


You're making the same mistake again. The operation in "Terawatt-hour" is multiplication, not division. It's not Terawatt-per-hour, it's Terawatt-for-an-hour. 1 TWh is the amount of electricity you consume when you run a 1 TW machine for 1 hour. Or a 2 TW machine for half an hour. Or a 500 GW machine for two hours.

> US needs 3800TW of electricity every hour

This alone should give you pause. "3800 TW of electricity" doesn't mean anything. A Watt is a unit of power. It's the flow of electricity, not an amount.
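
For concreteness, the conversion works like this (14 GW is the power figure from the article; it says nothing about how many hours the batteries can sustain it):

  # TWh per year is energy; dividing by hours in a year gives average power.
  us_twh_per_year = 3_800
  hours_per_year = 365 * 24                                 # 8760
  avg_power_gw = us_twh_per_year / hours_per_year * 1000    # ~434 GW average draw
  print(f"~{avg_power_gw:.0f} GW average; 14 GW is ~{14 / avg_power_gw:.0%} of that")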


I was trying to place everything in the same unit as much as possible so that we could compare, and it appears I failed miserably.

However, you seem to be really good at it. Could you tell me what proportion of the US's electricity needs (I didn't want to say power before, because then we'd be including things like oil) can be fulfilled with 14 gigawatts of batteries?


facepalm

Can I see the calculation? How did you get from 3800 TWh per year to the same amount per hour?


Are there players other than Tesla making these?


BYD, Hitachi, Siemens (energy).


a doubling of barely anything is still barely anything


Let's play chess. If I win, you send me some rice, ok?


In case anyone is unfamiliar: https://www.dr-mikes-math-games-for-kids.com/rice-and-chessb...

  "Oh emperor, my wishes are simple. I only wish for this. Give me one grain of rice for the first square of the chessboard, two grains for the next square, four for the next, eight for the next and so on for all 64 squares, with each square having double the number of grains as the square before."

  The emperor agreed, amazed that the man had asked for such a small reward - or so he thought. After a week, his treasurer came back and informed him that the reward would add up to an astronomical sum, far greater than all the rice that could conceivably be produced in many many centuries!


Against what timeframe are you betting a load of money on batteries displacing all fossil fuels?


Can the number of grains follow an S-curve?


A number of grains can apparently follow many different curves: http://foodfont.com/rice/


> The US grid battery fleet is about to double – again

So they now have two. /s


Gigawatts are not a unit of energy.... something something zalgo.


Installed capacity is sometimes measured by power output rather than energy storage capacity. Both measures are relevant.


Exactly, and with some types of batteries this is especially relevant. Redox flow batteries, which are being experimented with for grid storage now, use reservoirs for storing the anolyte and catholyte fluids. The part that allows those to interface, where the power is generated, is much smaller though. So you can take the same battery and trivially double or triple the storage capacity by simply using bigger reservoirs and more liquid, while the power delivery stays the same. These types of battery are potentially great for longer-term storage, but they aren't necessarily able to deliver the stored energy quickly. It could take days/weeks to deplete them and a similar amount of time to charge them back up.

Lithium-ion grid batteries are sort of the opposite. They can deliver a lot of power, but usually only for a few hours at maximum capacity until they are depleted. So you see companies install a few hundred MWh of capacity that can deliver most of that power in a few hours.


And yet only one is ever mentioned when batteries come up, despite the context being heavily implied to be "energy" not "power".


Depends on who you ask. Grid operators, since they balance instantaneous supply and demand, only care about the power available, expecting producers to maximize their profit and hence save the batteries for when they are most needed.

When explicitly designing an electrical system, say an off-grid cabin, both the energy stored and the instantaneous matching of supply need to pencil out.

In the large scale grid example we expect the market to handle the energy balance. Adding transmission links, secondary markets and so on when the market does not adequately match demand.

Generally speaking, batteries used for grid-scale storage today store about 4 GWh per 1 GW. This number comes from maximizing the utilization of the grid connection and matching the hours per day with the highest electricity rates.
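
In other words (a trivial sketch of the relationship):

  # Power (GW) and energy (GWh) are tied together by duration.
  power_gw = 1.0
  energy_gwh = 4.0                     # the ~4 GWh per 1 GW figure above
  duration_h = energy_gwh / power_gw   # 4 hours at full output
  print(f"{power_gw:.0f} GW sustained for {duration_h:.0f} hours")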


I always find talk of power weird. It does make sense when you need to replace some plant going down... But in general, storage capacity is much more important. That is, for how many hours, and at what price, can they meet demand? With hours here being some reasonable number above, say, four...


What I've seen in several articles is that peak power production seems to be as much a limiting factor as stored energy. I still haven't found a good explanation of why that is though. So if anyone could educate me, I'm all ears.


Let's take an absurd extreme case to illustrate - if you had a battery that could store 1 GWh but could only discharge at 1 mW, how would you power an aluminum furnace with it?

Assuming you are using batteries for continuous solar power, you basically need the batteries to be able to store enough energy to last through the night, and to be able to match the solar power plant's peak power - otherwise, the grid can't power the same things during the night that it can during the day, which was the whole reason we wanted the batteries in the first place.


Because grid batteries are currently used as peaking supply - if power plant production suddenly falls off by 10 GW, or there is 10 GW of additional demand, you have to quickly provide that 10 GW of power before other (cheaper) additional production can be ramped up.


My suspicion is battery life estimating is complicated and proprietary. So your depreciation due to wear depends on both the rate of charge and the rate of discharge.

My best guess is they shoot for a discharge rate of roughly C/3, meaning they can deliver the rated power for 3 hours. I suspect they keep the charge between 10 and 90%. So I think you can multiply watts by roughly 4 to get watt-hours.
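
Sketch of that guess:

  # Rated power for ~3 hours, using only ~80% of the nameplate capacity.
  usable_hours = 3.0        # rated power delivered for ~3 hours (C/3)
  usable_window = 0.8       # only the 10%-90% state-of-charge band is used
  nameplate_hours = usable_hours / usable_window
  print(f"~{nameplate_hours:.2f} h of nameplate energy per unit of rated power")  # ~3.75, i.e. roughly x4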

That said I've looked for actual numbers in kwh and not found much.


They’re a unit of power which is what is being described here. People want to know how much they can draw per unit time from each source.

Wind Turbines a GW

Nuclear Plant b GW

Battery Storage c GW

That’s a power measure so you use power units. It would make no sense to use energy units.


It's presented in Gigawatts here, but I expect the actual metric is probably something like "Gigawatts available for 99th centile of duration of demand peaks", as this would allow you to talk about capacity in the same units as grid supply and demand. The whole point of grid batteries is for near-instant availability, so total energy available is moot if you can't access it right away - therefore a metric of energy would be less useful. Anyway, that's my speculation.



