CO2 mitigation costs

With modern gas turbines, burning natural gas emits perhaps 0.4 kg CO2 per kWh. I generally use $75/ton as a baseline target for CO2 mitigation costs; that's around what you see from various reasonable approaches like biomass conversion. $75/ton * 0.4 kg/kWh = $0.03/kWh. Maybe that seems cheap to those of you living in California, but it's a large fraction of the cost of generating electricity in the US. (People in California are now paying >$0.30/kWh, mostly because of corruption, plus lawsuits over fires caused by the poor maintenance that corruption produced.)
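
For anyone who wants to poke at the numbers, here's that arithmetic as a quick sanity check; both inputs are just the rough figures above, not measured values:

```python
# Back-of-envelope CO2 mitigation cost of natural gas electricity.
co2_per_kwh = 0.4        # kg CO2 per kWh, modern gas turbine (rough figure)
mitigation_cost = 75.0   # $ per metric ton CO2, baseline mitigation target

# $/ton -> $/kg, then scale by emissions per kWh
print(mitigation_cost / 1000 * co2_per_kwh)  # 0.03 -> $0.03/kWh
```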

So, $0.03/kWh is the target that should be met for the CO2 mitigation benefit of replacing natural gas with (energy storage systems + renewables). There can be other benefits that justify extra costs:

  • If renewables are cheaper than eg natural gas, then that cost difference is compensation for storage costs. But of course, other power sources would probably still be needed for longer periods of low generation, and their cost per kWh probably increases when they're used less.
  • Small systems that can store energy locally can provide backup power when electricity grids are down.
  • Batteries can smooth out short-term power fluctuations.

But for the CO2 mitigation part, I think of ~$0.03 as being the target.

The grid energy storage systems I'm most optimistic about are (currently) water-compensated compressed air energy storage, and (for the future) chelate flow batteries. Below, I'll go through some of my cost estimation for those.

Li-ion grid storage is expensive

A couple years ago Tesla was charging $265/kWh for just grid storage batteries, not including transformers, power lines, buildings, etc. (Yes, BloombergNEF's figures were lower, but those were biased by Chinese subsidies.) $265 / $0.03 works out to at least 8833 cycles to break even, and presumably more once maintenance costs and interest rates are included.
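
To make the financing point concrete, here's a sketch; the 8%/year discount rate is my own arbitrary assumption, not anything Tesla publishes:

```python
# Break-even cycles at a $0.03/kWh-cycle storage budget.
pack_cost = 265.0   # $/kWh, batteries only
budget = 0.03       # $/kWh-cycle allowed for the CO2 mitigation value

print(pack_cost / budget)  # ~8833 cycles, ignoring financing entirely

# With one cycle per day and an assumed 8%/yr discount rate, the present
# value of the revenue stream is capped even for infinite cycle life:
r = 0.08
print(budget * 365 / r)    # ~$137/kWh -- it can never pay back $265/kWh
```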

Li-ion batteries do not last for 8833 cycles, especially if you charge and discharge them once a day. LiFePO4 battery lifetimes are generally overstated for grid storage applications: an SEI layer forms over time from reaction with the electrolyte, and charge cycles crack that layer, so cycle life and calendar aging interact in a way that makes battery life shorter than either figure alone would suggest. Some people were acting on the assumption that Li-ion battery prices would keep falling along a fitted trend line to below $100/kWh, but over the last year they actually went up. (This isn't the main point of the post, just context, and I'm not going to argue about it again here.)

There are lots of other proposed systems: various hydrogen systems, Form Energy, vanadium flow batteries, zinc flow batteries, gravitational energy storage, etc. When I say I'm most-optimistic about some particular systems, I don't mean "these are the systems I've heard of that I like the best". What I mean is that I understand the entire conceptual space and every serious proposal, and those designs seem like the best ones that current human societies are able to develop. If that wasn't the case, I wouldn't be writing this post.

compressed air

Gas turbines compress, heat, and expand air; you can store the compressed air for a while instead, but compression makes the air hot, and storing it wastes that heat. There are existing CAES systems using that approach. Efficiency and cost-effectiveness have not been great, despite those systems also burning natural gas. It's possible to store the heat so it's not lost, but that's more expensive.

Compressed air can be stored cheaply in salt caverns made by solution mining, but the variable pressure is really bad for efficiency: it changes the turbine operating conditions, and the pressure swings cause temperature swings. Also, the need for salt deposits limits location options.

It's possible to get constant-pressure compressed air storage by filling the storage chamber with water. A simple way to do that is to have a water reservoir on the surface, and an underground chamber at a depth where water pressure matches air pressure. Here's a video from Hydrostor, a startup pursuing this approach. But this introduces new issues.


If you're moving water in and out of a storage chamber, the chamber has to be waterproof, so it can't be made of salt. Mining hard rock underground in controlled shapes is expensive. For hard rock, mining costs are something like:

  • solution mining in suitable salt caverns: ~$25/m^3
  • surface pit: ~$30/m^3
  • block caving: ~$40/m^3
  • stoping: ~$130/m^3

I'd say "stoping" is the mining type most similar in difficulty to making underground caverns for CAES.

Underground hard-rock mining generally uses drill-and-blast, but blasting is often banned under cities, which is a big part of why tunnel boring machines are used. If you want to build storage locations in cities, that's a problem. There are non-explosive approaches, such as roadheaders, hydraulic breakers, and (disc-cutter) tunnel-boring machines, but for hard rock, drill-and-blast is the cheapest, which is why mining uses it. Roadheader mining costs vary greatly with rock properties: for coal they're cheaper than blasting, but for very hard rock they're very expensive, because cutting speed goes down and the carbide picks wear out faster.

For 70 bar of pressure, you need ~700 m of water. That's a normal depth for underground mining, not a big problem by itself, but small excavations at that depth are too expensive. Each CAES storage site would have to be large to keep costs down.

Another issue is that a little high-pressure air dissolves in the water. If a pressure drop makes water with dissolved gas start to bubble, the bubbles lower the density of the water column, which lowers the hydrostatic pressure further, which causes more bubbling - a runaway "champagne effect" that can send water spouting upwards. But this issue seems solvable.

Some cost improvements do seem possible:

  • Perhaps it's possible to use solution-mined salt caverns, by using a little extra effort to get them in the right shapes, then adding plastic or concrete linings.
  • A guy I know has been working on non-explosive mining machine designs, and says systems he calls "Grond" and "Undine" could do non-explosive hard-rock mining for <$70/m^3. (Rock fracture dynamics are actually very complex and interesting, or at least so I'm told.) Per the names, they involve impact hammers and water. Those designs do seem like they could basically do what The Boring Company had hoped to but failed at, so let me know if somebody wants that.

Energy in compressed gas is an integral of 1/x, so energy = volume * pressure * ln(pressure / ambient pressure). At 70 bar, 1 m^3 holds 8.26 kWh. Of course, that has to be adjusted for the temperature at expansion and for turbine efficiency; for now, let's suppose you get 9 kWh. Assuming everything underground costs as much as stoping, you're up to $14.4/kWh of capacity, and you need a reservoir on the surface too. Supposing reservoirs are cheap, we amortize costs over 10 years, and we average 75% charge/discharge per day, that's ~$0.0055/kWh. Not bad.
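
Here's that arithmetic in one place; the 9 kWh/m^3 and 75% utilization figures are the assumptions above, not measurements:

```python
import math

# Isothermal energy content of compressed air, and resulting capacity cost.
p, p0 = 70e5, 1e5                    # Pa: 70 bar storage vs ~1 bar ambient
# (70 bar also sets the ~700 m depth: h = p/(rho*g) = 70e5/(1000*9.81) ~ 714 m)
print(p * math.log(p / p0) / 3.6e6)  # ~8.26 kWh per m^3

kwh_per_m3 = 9.0    # assumed output after heat recovery and turbine losses
mining = 130.0      # $/m^3, everything underground priced like stoping
print(mining / kwh_per_m3)           # ~$14.4/kWh of capacity

cycles = 10 * 365 * 0.75             # 10-year amortization, 75% daily cycling
print(mining / kwh_per_m3 / cycles)  # ~$0.0053/kWh cycled (~$0.0055 rounded up)
```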

Considering the cost of electricity from natural gas, and the fact that we don't need high-temperature turbine blades, we can suppose turbines only cost $0.005/kWh if run continuously. Supposing a 1/4 duty cycle, that's $0.02/kWh. Let's say the renewable power lost from inefficiency is worth $0.01/kWh output.

We also need heat exchangers, and those could be more expensive than the turbines. Supposing $2 per W/K and a 10 K temperature gradient, that's $200/kW, or perhaps $34/kWh for 1-day storage. Let's say the heat exchangers add $0.01/kWh. Then you need something to store the heat, but water is cheap, so let's just ignore that for now. Heat exchanger cost also depends greatly on manufacturing methods, temperature, and pressure; gasketed plate heat exchangers for warm water are cheaper than high-temperature shell-and-tube heat exchangers for gas turbines.
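
As a sketch of where those numbers come from; the ~6 discharge hours just restates the 1/4 duty cycle assumed above:

```python
# Heat exchanger cost from specific cost ($ per W/K) and temperature gradient.
cost_per_w_k = 2.0   # $ per W/K of heat-exchange conductance (assumed)
delta_t = 10.0       # K, temperature gradient across the exchanger

cost_per_kw = cost_per_w_k / delta_t * 1000
print(cost_per_kw)                    # $200 per kW of heat flow

discharge_hours = 6.0                 # ~1/4 duty cycle for 1-day storage
print(cost_per_kw / discharge_hours)  # ~$33/kWh of capacity (text rounds to $34)
```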

Instead of using heat exchangers and some fluid, another option is sending compressed air through beds of eg sand. Such "packed bed heat storage" might seem cheaper, but this paper estimated ~$0.0685/kWh stored for packed bed heat storage - just for the tanks, rocks, and insulation. So, this approach is probably more expensive than using fluids but allows for higher temperatures.

We're now up to ~$0.0455/kWh incremental cost over generation. That's too high to be competitive for CO2 mitigation, and installations have to be large - but it's sort of competitive with nuclear power costs. I think better mining methods and various other improvements could plausibly bring that down to $0.04/kWh. There are lots of details to those potential improvements but that's good enough for this post.
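
For reference, the component stack behind that ~$0.0455 figure, using the per-component estimates above:

```python
# Rough incremental cost of water-compensated CAES, $/kWh delivered.
costs = {
    "underground storage":  0.0055,
    "turbines (1/4 duty)":  0.02,
    "round-trip losses":    0.01,
    "heat exchangers":      0.01,
}
print(sum(costs.values()))  # ~$0.0455/kWh over bare generation cost
```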

flow batteries

Let's estimate the cost of electrolyte for a chelated chromium-iron flow battery. EDTA is a common chelation agent; it doesn't actually work well for this, but it's a good enough approximation for production costs at large volumes.

  • 1 mol of Cr is ~$0.50
  • 1 mol of EDTA is ~$0.70
  • at 2 volts, 1 mol of electrons is ~0.054 kWh

So Cr plus EDTA comes to only ~$22/kWh. The iron side would probably be cheaper, maybe half as much. Electrolyte costs for flow batteries thus seem potentially very reasonable. The real problem is the cells that electrolyte would run through.
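
Here's the electrolyte arithmetic, using the rough prices above and assuming one electron per Cr:

```python
# Cost of the chromium-side electrolyte per kWh of storage capacity.
F = 96485.0                       # C/mol, Faraday constant
volts = 2.0                       # assumed full-cell voltage
kwh_per_mol = F * volts / 3.6e6
print(kwh_per_mol)                # ~0.054 kWh per mol of electrons

cr, edta = 0.50, 0.70             # $/mol, rough large-volume prices
print((cr + edta) / kwh_per_mol)  # ~$22/kWh, chromium side only
```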

Batteries involve immobile materials with variable charge state, and mobile ions with constant charge. Normally, the variable charges are insoluble in a liquid, but flow batteries are defined by everything being soluble, which means ion-selective membranes are needed. Those membranes are obviously more expensive than liquid, and they make flow batteries more expensive than regular batteries per watt.

Nafion (or Aquivion) membranes are fluorinated and expensive. There are lots of papers on cheaper membranes with somewhat better conductivity, so why aren't they used? Those papers generally don't say, but it's because they're not durable enough. The membranes have a tendency to get oxidized or broken apart, which is (mostly) why the expensive fluorinated ones are used. But there is a new-ish type of membrane that I think is promising: sulfonated phenylated polyphenylene. That seems suitable for hollow-fiber membranes.

With current membranes and current densities, and some rough estimation of other costs, and extrapolation from current systems like vanadium flow batteries, cells for chelated iron-chromium flow batteries seem ~$2000/kW for 90% efficiency, with large-scale production. That's too expensive. But with cheaper membranes and large-scale production, I think flow batteries could realistically be made for $300/kW, which might be $50/kWh for 1-day storage, not including the electrolyte. (Water desalination plants are much cheaper than that per membrane area, but also simpler. Still, they're useful as another reference point.)
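
Translating that power cost into capacity terms, with the same amortization assumptions I used for CAES (my numbers, not a vendor quote):

```python
# Cell cost per kWh for 1-day storage at a hoped-for $300/kW.
cost_per_kw = 300.0
discharge_hours = 6.0                 # ~1/4 duty cycle, as with CAES
print(cost_per_kw / discharge_hours)  # $50/kWh of capacity, cells only

cycles = 10 * 365 * 0.75              # 10-year life, 75% daily utilization
print(cost_per_kw / discharge_hours / cycles)  # ~$0.018/kWh cycled
```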

That's a lower cost than CAES, but perhaps an even bigger advantage is that flow batteries can be smaller and placed more flexibly. Flow battery systems could provide local backup power, which obviously has some extra value. (Salty water isn't flammable, unlike Li-ion batteries, so there are fewer safety issues.) Placing storage where electricity is used would also reduce the number of power conversions. Even if homes have batteries, rooftop solar still doesn't make economic sense without subsidies, but solar panels over parking lots are only slightly more expensive than panels in open fields.

seasonal storage

So far I've talked about 1-day storage, which helps with the sun not shining at night. It doesn't help with longer periods with little sunlight and wind, which can happen in the winter in Europe.

Storing compressed hydrogen or natural gas in underground salt caverns gives very cheap storage capacity, cheap enough for seasonal energy storage, but converting between electricity and hydrogen is much too expensive. Hydrogen fuel cells are expensive enough that burning hydrogen in gas turbines is better. I can't see this being economically practical, and I don't expect hydrogen from water electrolysis to be <$4/kg (before subsidies) anytime soon; see also this post.
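
A quick illustration of why the economics don't close; the 60% turbine efficiency is my assumption for a hydrogen-fired combined cycle:

```python
# Fuel cost per kWh of electricity from electrolytic hydrogen, capex ignored.
h2_price = 4.0      # $/kg, the optimistic price floor mentioned above
lhv = 33.3          # kWh/kg, lower heating value of hydrogen
turbine_eff = 0.60  # assumed combined-cycle efficiency burning hydrogen

print(h2_price / lhv / turbine_eff)  # ~$0.20/kWh for fuel alone
```

Even with free storage caverns, that's several times the $0.03/kWh mitigation target.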

If you'd need more than 5 days of energy storage, and can't use natural gas or coal, and don't have enough land to get that energy from biomass, I don't see anything potentially competitive with nuclear power.

comments

The grid storage situation would absolutely be better with more nuclear.

Aside from that, I agree with pretty much everything you wrote (I'm a cleantech consultant, I also do these kinds of analyses a lot). It's very well thought out.

I would add a few extra variables that might be worth considering.

  1. There are a lot of other sectors we're going to need to transition away from fossil fuels, in ways that are very energy intensive and, at present, very capital intensive to do any other way. (Right now, that means early plants need 24/7 power to be even close to viable, even with subsidies.) Chemicals, as well as liquid fuels for shipping and aviation, are the big obvious ones for me. Any substitute will be more electricity intensive, but we should be able to get capex down over time - possibly to the point that it makes sense to tolerate lower utilization rates when electricity is abundant, enabling overbuilding and reducing grid storage needs. But in any case, these shifts will change when and where we consume a significant fraction of world energy use.
  2. There are some early-commercial-stage solar technologies that can plausibly better spread production across the day and year, reducing daily storage requirements somewhat.
  3. We're going to build a lot of Li-ion battery vehicles anyway, which will add up to the equivalent of 1-2 days of energy storage. I know people don't want to shorten their own battery lifetimes, but using some of that for V2G, or being at all smart about how and where we build and implement charging infrastructure, could make a lot of sense cost-wise.
  4. Consumer-level IoT and real-time pricing as forms of demand response could help with the duck curve.
  5. The trajectory for PV is clearly that as-generated daytime power is going to get very cheap relative to grid power today, which means considerations for storage in another decade are likely going to be dominated by capex rather than round-trip efficiency.

I would also add that thinking of a fixed $/MT price/value for CO2 emissions abatement is not optimal, given how much easier some sources of emissions are to abate than others. If you can cut the problem in half but have no idea how to make the other half feasible, do it ASAP: you've doubled the available time to fix the remainder, plus you can then concentrate investment on the problems that turn out to still be hard in a few more years. You always go to war with the army you have, but in this case the enemy doesn't fight back. I, for one, think years of overly complicated policy regimes and attempts at forecasting future tech trajectories have made this whole space a lot more complicated, and a lot more expensive to address, than it needs to be.

Direct air capture is too expensive, sure, but if CO2 mitigation costs from biomass usage are competitive, you don't need to get CO2 emissions down to 0. In any case, we're not at the point of total mitigation being plausible yet.

There are some early-commercial-stage solar technologies that can plausibly better spread production across the day and year, reducing daily storage requirements somewhat.

Do you mean solar-thermal with molten salt energy storage?

Not specifically, though I do agree that thermal storage is worth pursuing, especially in cases where what you need is actually heat, whether industrial process heat or, in areas where it makes sense, district heating. I'm less convinced about the economics of it when we're talking about storing heat to then make electricity, but we'll see.

What I had in mind were some emerging technologies that can help reduce the efficiency penalty solar panels have in suboptimal conditions and outside peak daylight hours. Perovskite PV is one example, which could also get pretty cheap given how it's made and what it's made of. Another that's still very early and expensive is metamaterial waveguide films that basically work like CPV, without lenses, mirrors, or tracking; they can boost efficiency under any conditions and, if good enough, can also make it feasible to use more expensive high-efficiency multijunction cells.

From another angle, one that's been around a while but hasn't been very practical until recently, there are waste gasifiers that make syngas or hydrogen. Obviously we want to minimize production of all kinds of waste, but the fact remains that there's a lot of stored energy in discarded non-recyclable plastics and biomass, and these systems can capture over 50% of the energy content in waste that hasn't been sorted, while separating out the inorganics (glass, ceramics, metals) for recovery and recycling. We're starting to see some municipal use cases as well as hazardous waste use cases. And depending on the application, the syngas can be used to make dispatchable electricity, or in some cases to make synthetic hydrocarbons by coupling it with a Fischer-Tropsch process. It'll never be more than a small fraction of the total electricity mix, but stable, cheap piles of garbage and a 100 MT/day gasifier might be great where the alternative is multi-day or seasonal energy storage.

Eh, concentrated PV solar used to seem like a good idea, back when panels were expensive, but now all that stuff is more expensive than the actual solar panels. You physically can't increase incident light per surface area using a thin coating without tracking, so that solar metamaterial thing seems questionable from a basic physics perspective. But maybe you can explain how it works, exactly?

People have tried to do IGCC, but for power generation, gasification just isn't competitive with boilers. For stuff with higher water content than coal, it's even worse. Some people are working on wood gasification but that's just to make "renewable" plastics.

For sure, panels and land are cheap, and there's no good reason to increase $/W just to gain efficiency. Except sometimes on rooftops where you want a specific building to gain maximum power from limited space, but you obviously wouldn't use CPV with all the extra equipment in that kind of application.

The metamaterial thing (or to a lesser degree even just other advanced optics, like GRIN lenses) is that you can make thin films that behave, electrically, like non-flat optical elements. Metamaterials can give you properties like anomalously high or low absorption coefficients and refractive indexes, negative refractive indexes, and highly tunable wavelength specificity. In some designs (see: Kymeta, Echodyne and their sibling companies) you can make electrically reconfigurable flat, unmoving antennas that act like a steerable parabolic dish. The "how" involves sub-wavelength-scale patterning of the surface, and a lot more detail than I can fit in a comment.

And I don't mean IGCC, I agree about that. I have spoken with several dozen companies in the waste gasification space; their technologies vary quite a bit in design and performance, but at the frontier these companies can extract ~50% of the chemical energy of mixed organic wastes (with up to 20% water content) in the form of syngas (~30% if you have to convert back to electricity and can't use the extra heat), 2-4x what you get from a traditional incinerator+boiler (which are about 10-12% energy recovery).

I understand physics and material science as well as a grad student, you don't need to explain basic diffraction. What I'm asking is how these metamaterials increase solar power output. Are they increasing the light that hits the solar panel? Where would that light have otherwise gone, if not for the metamaterial thing?

I'm also confused by why waste gasification would be more energy-efficient than boilers. Are they comparing the chemical energy content of syngas to electrical power generation at 30% efficiency, or something? Gasification uses more energy input than burning stuff. It's better to convert methane to syngas and burn coal than vice-versa. And biomass is further in that direction, it's better to gasify coal and burn biomass than vice-versa. This is a basic fact and any startup who won't admit it is either delusional or lying.

Sorry, got it. Sometimes it's hard to guess the right level of detail.

First point: The comparison to make is "An area covered with solar panels" vs "an area covered with a metamaterial film that optically acts like the usual CPV lenses and trackers to focus the same light on a smaller area." The efficiency benefit is for the same reasons CPV is more efficient than PV, but without the extra equipment and moving parts. It will only ever make sense if the films can be made cheaply, and right now they can't. The usual additional argument for CPV is that it also makes it viable to use more expensive multi-junction cells, since the area of them that's needed is much smaller, but we may be moving towards tandem cells within the next decade regardless. In principle metamaterials can also offer a different option beyond conventional CPV, though this is my own speculation and I don't think anyone is doing it yet even in the lab: separating light by frequency and guiding each range of frequencies to a cell tuned to that range. This would enable much higher conversion efficiencies within each cell, reducing the need for cooling. It would also remove the need for transparency to make multi-junction cells.

Second point: I've talked to the people operating these gasification systems, not just the startups. The numbers are all consistent. Yes, gasification costs energy, and gasifying coal would not make sense (unless you're WW2 Germany). But the process can work with any organic material (including plastics and solvents), not just fossil fuels or dry biomass and the like, as long as the water concentration isn't excessive (I've been told up to ~20%), and consumes a bit less than half the energy content of the fuel. The rest is what you can get from the syngas, most of which is in the hydrogen, and fuel cells are about 60% efficient if you want to use it to make electricity. That's where the 30% number comes from. There are plants doing this using agricultural waste, MSW, ASR, construction waste, medical waste, hazardous waste, food waste, and other feedstocks that otherwise either have little value or are expensive to dispose of.

You can certainly make Fresnel lenses that focus light, but without some sort of active control, it's not physically possible to focus light from an unknown direction onto the same spot. That would violate thermodynamics: you could focus blackbody radiation to passively heat something above the temperature of its source. So what's the advantage of this metamaterial stuff over Fresnel lenses? (Those work well enough, but of course aren't quite economically practical.)

Also, that metamaterial company you mentioned, their website shows regular solar panels with this metamaterial coating on them, not a system that focuses light onto smaller PV panels.

I'm of course aware of the split-spectrum solar proposals using diffraction gratings, but that's another thing that died off with the fall in PV prices.


Coal gasification typically has exergy efficiency <50%; the best systems get ~60%, but biomass would certainly be worse than coal. Some plastics might be similar, I guess, but for wood you'd be looking at maybe 1.6x the losses, so something like 36% exergy efficiency from biomass to syngas. Then, with 60% conversion to electricity, you have ~22% overall efficiency - worse than boilers and steam turbines, and with higher capital costs.

Yes, some simulations of wood gasification have given better numbers, but I don't trust them. Coal gasification is much better understood, it's used on a large scale in China, we know how it performs in practice, and we know biomass gasification is worse.

Combined cycle gas turbines can do 60% efficiency, and fuel cells are more expensive than those, so you probably wouldn't use fuel cells.

In your view how do flow batteries compare with sodium-ion?

They're very different, and there are many possible types of both. To guess at what you want to know:

  • I don't expect Na-ion batteries to replace Li-ion.
  • Flow batteries have separate costs for power and capacity, while those are linked in Li-ion and Na-ion. Flow batteries only make sense when charging and discharging over at least a few hours.

I've seen a couple of videos about the relative strengths and weaknesses of different battery technologies but don't know anything about flow batteries. Mostly interested in whether sodium-ion would be better than flow batteries for grid storage. That video claims sodium-ion batteries are projected to have $52/kWh production cost in 2025 based on various industry projections (timestamp 30:47) with a speculative $40/kWh in 2030.

Are you saying sodium-ion won't replace lithium-ion even for grid storage, and why?

Those projected numbers seem to just be made up, extrapolated from graphs and hopes by consultants rather than based on proper techno-economic analysis.

I don't expect Na-ion to be cheaper than Li-ion. The disadvantages seem to outweigh lower material costs.