=energy =economics =global warming
CO2 mitigation costs
With modern gas turbines, burning natural gas emits roughly 0.4 kg of CO2 per kWh. I generally use $75/ton as a baseline target for CO2 mitigation costs; that's around what you see from various reasonable approaches like biomass conversion. $75/ton * 0.4 kg/kWh = $0.03/kWh. Maybe that seems cheap to those of you living in California, but it's a large fraction of the cost of generating electricity in the US. (People in California are now paying >$0.30/kWh, mostly because of corruption, plus lawsuits over fires caused by poor maintenance - which also traces back to corruption.)
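That target number is just the product of the two figures above; a quick sanity check in Python:

```python
# CO2 mitigation cost target, using the numbers above.
co2_per_kwh = 0.4      # kg CO2 per kWh from a modern gas turbine
cost_per_ton = 75.0    # USD per ton of CO2, baseline mitigation cost
cost_per_kwh = cost_per_ton * (co2_per_kwh / 1000.0)  # kg -> tons
print(f"${cost_per_kwh:.3f}/kWh")  # -> $0.030/kWh
```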
So, $0.03/kWh is the target that
should be met for the CO2 mitigation benefit of replacing natural gas with
(energy storage systems + renewables). There can be other benefits that
justify extra costs:
- If
renewables are cheaper than eg natural gas, then that cost difference is
compensation for storage costs. But of course, other power sources would
probably still be needed for longer periods of low generation, and their
cost per kWh probably increases when they're used less.
- Small systems
that can store energy locally can provide backup power when electricity
grids are down.
- Batteries can smooth out short-term power fluctuations.
But for the CO2 mitigation part,
I think of ~$0.03 as being the target.
The grid energy storage
systems I'm most optimistic about are (currently) water-compensated
compressed air energy storage, and (for the future) chelate flow batteries.
Below, I'll go through some of my cost estimation for those.
Li-ion grid storage is expensive
A couple of years ago, Tesla was charging $265/kWh for grid storage batteries alone, not including transformers, power lines, buildings, etc. (Yes, BloombergNEF's costs were lower, but those were biased by Chinese subsidies.) $265 / $0.03 implies at least 8833 cycles just to break even, and more once maintenance costs and interest rates are included.
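The break-even cycle count is just the capacity price divided by the per-cycle target:

```python
# Break-even cycle count for grid Li-ion at Tesla's quoted price.
price_per_kwh = 265.0   # USD per kWh of capacity
target = 0.03           # USD per kWh-cycle (the CO2 mitigation target)
cycles = price_per_kwh / target
print(round(cycles))    # -> 8833
```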
Li-ion batteries do not last for 8833 cycles, especially if you charge and discharge them once a day. LiFePO4 battery lifetimes are generally overstated for grid storage applications: an SEI layer forms over time from reaction with electrolyte, and charge cycles crack that layer, so cycle life and calendar aging interact, making battery life shorter than either limit individually. Some people were acting on the assumption that Li-ion battery prices would decline along a linear regression to below $100/kWh, but over the last year they actually went up.
There are lots of other
proposed systems: various hydrogen systems, Form Energy, vanadium flow
batteries, zinc flow batteries, gravitational energy storage, etc. When I
say I'm most-optimistic about some particular systems, I don't mean "these
are the systems I've heard of that I like the best". What I mean is that I
understand the entire conceptual space and every serious proposal, and those
designs seem like the best ones that current human societies are able to
develop. If that wasn't the case, I wouldn't be writing this post.
compressed air
Gas turbines compress, heat, and expand air; you can instead store the compressed air for a while, but the air gets hot from compression, and storing it wastes that heat.
There are
existing CAES systems using that approach. Efficiency and
cost-effectiveness have not been great, despite also using natural gas. It's
possible to store the heat so it's not lost, but that's more expensive.
Compressed air can be stored cheaply in salt caverns made by solution
mining, but the variable pressure is really bad for efficiency because it
changes the turbine conditions and the pressure changes cause temperature
changes. Also, this limits location options.
It's possible to get
constant-pressure compressed air storage by filling the storage chamber with
water. A simple way to do that is to have a water reservoir on the surface,
and an underground chamber at a depth where water pressure matches air
pressure. Here's a
video from Hydrostor, a startup pursuing this approach. But this
introduces new issues.
If water is moved in and out of the storage chamber, the chamber has to be waterproof, so it can't be made of salt. Mining hard rock underground in controlled shapes is expensive. Rough mining costs are something like:
- solution mining in suitable salt caverns: ~$25/m^3
- surface pit mining: ~$30/m^3
- block caving: ~$40/m^3
- stoping: ~$130/m^3
I'd say "stoping" is the mining
type most similar in difficulty to making underground caverns for CAES.
Underground hard-rock mining generally uses drill-and-blast, but
blasting is often banned under cities, which is a big part of why tunnel
boring machines are used. If you want to build storage locations in cities,
that's a problem. There are non-explosive approaches, such as
roadheaders,
hydraulic breakers, and (disc cutter) tunnel-boring machines, but for hard
rock, drill-and-blast is the cheapest, which is why mining uses it.
Roadheader mining costs vary greatly with rock properties; for coal they're
cheaper than blasting, but for very hard rock they're very expensive,
because mining speed goes down and the carbide picks wear out faster.
For 70 bar pressure, you need 700m of water; this is a normal depth for
underground mining, not a big problem, but small excavations at that depth
are too expensive. Each CAES storage site would have to be large to keep
costs down.
Another issue is that a little bit of high-pressure air
can dissolve in water. If pressure drops cause water with dissolved gas to
bubble, that decreases the hydrostatic pressure, and causes bubbling water
to spout upwards - the "champagne effect". But this issue
seems solvable.
Some cost improvements do seem possible:
- Perhaps
it's possible to use solution-mined salt caverns, by using a little extra
effort to get them in the right shapes, then adding plastic or concrete
linings.
- A guy I know has been working on non-explosive mining machine
designs, and says systems he calls "Grond" and "Undine" could do
non-explosive hard-rock mining for <$70/m^3. (Rock fracture dynamics are
actually very complex and interesting, or at least so I'm told.) Per the
names, they involve impact hammers and water. Those designs do seem like
they could basically do what The Boring Company had hoped to but failed at,
so let me know if somebody wants that.
Energy in compressed gas is an integral of 1/x, so energy = volume * pressure * ln(pressure / ambient pressure). At 70 bar, 1 m^3 holds 8.26 kWh. Of course, that has to be adjusted for temperature when expanded and for turbine efficiency; for now, let's suppose you get 9 kWh. Assuming everything underground costs as much as stoping, you're up to $14.4/kWh of capacity, and you need a reservoir on the surface too. Supposing reservoirs are cheap, costs are amortized over 10 years, and the system averages 75% of a full charge/discharge per day, that's ~$0.0055/kWh. Not bad.
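Those numbers can be checked directly; this sketch uses the isothermal-energy formula and the stoping cost from above (rounding explains the small gap from ~$0.0055):

```python
import math

# Isothermal energy of constant-pressure compressed air storage:
# W = P * V * ln(P / P0), the integral of 1/x over the expansion.
P0 = 1.0e5     # ambient pressure, Pa
P = 70.0e5     # storage pressure, Pa (70 bar)
V = 1.0        # m^3
joules = P * V * math.log(P / P0)
kwh = joules / 3.6e6
print(f"{kwh:.2f} kWh per m^3")   # -> 8.26 kWh per m^3

# Excavation cost per kWh, assuming stoping costs and 9 kWh/m^3 delivered.
capex_per_kwh = 130.0 / 9.0       # -> ~$14.4/kWh of capacity
# Amortize over 10 years at 75% of a full cycle per day (ignoring interest).
cycles = 10 * 365 * 0.75
print(f"${capex_per_kwh / cycles:.4f}/kWh")  # -> $0.0053/kWh
```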
Considering the cost of
electricity from natural gas, and the fact that we don't need
high-temperature turbine blades, we can suppose turbines only cost
$0.005/kWh if run continuously. Supposing a 1/4 duty cycle, that's
$0.02/kWh. Let's say the renewable power lost from inefficiency is worth
$0.01/kWh output.
We also need heat exchangers, and those could be more expensive than the turbines. Supposing $2 per W/K and a 10 K temperature difference, that's $200/kW, perhaps $34/kWh for 1-day storage. Let's say the heat exchangers add $0.01/kWh. Then you need something to store the heat, but water is cheap, so let's ignore that for now. Heat exchanger cost also depends greatly on manufacturing methods, temperature, and pressure; gasketed plate heat exchangers for warm water are cheaper than high-temperature shell-and-tube heat exchangers for gas turbines.
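One way to read those heat-exchanger numbers - the ~6 hours of discharge per day is my assumption, taken from the 1/4 duty cycle mentioned earlier:

```python
# Heat exchanger cost per kWh, with the assumptions above.
cost_per_w_per_k = 2.0    # USD per W/K of heat exchanger capacity
delta_t = 10.0            # K temperature difference across the exchanger
cost_per_kw = cost_per_w_per_k / delta_t * 1000.0    # -> $200/kW
hours_discharge = 24 * 0.25    # 1/4 duty cycle -> ~6 h discharge per day
cost_per_kwh_cap = cost_per_kw / hours_discharge     # -> ~$33/kWh
# Amortized over 10 years at 75% of a full cycle per day:
per_kwh = cost_per_kwh_cap / (10 * 365 * 0.75)
print(f"${per_kwh:.3f}/kWh")   # -> $0.012/kWh
```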
Instead
of using heat exchangers and some fluid, another option is sending
compressed air through beds of eg sand. Such "packed bed heat storage" might
seem cheaper, but
this paper estimated ~$0.0685/kWh stored for packed bed heat storage -
just for the tanks, rocks, and insulation. So, this approach is probably
more expensive than using fluids but allows for higher temperatures.
We're now up to ~$0.0455/kWh incremental cost over generation. That's too
high to be competitive for CO2 mitigation, and installations have to be
large - but it's sort of competitive with nuclear power costs. I think
better mining methods and various other improvements could plausibly bring
that down to ~$0.04/kWh. There are lots of details to those potential
improvements but that's good enough for this post.
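The ~$0.0455/kWh figure is just the sum of the component estimates above:

```python
# Tallying the incremental CAES cost components estimated above (USD/kWh).
components = {
    "underground storage": 0.0055,
    "turbines":            0.02,
    "efficiency losses":   0.01,
    "heat exchangers":     0.01,
}
total = sum(components.values())
print(f"${total:.4f}/kWh")   # -> $0.0455/kWh
```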
flow batteries
Let's
estimate the cost of electrolyte for a chelated chromium-iron flow battery.
EDTA
is a common chelation agent; it doesn't actually work well for this, but
it's a good enough approximation for production costs at large volumes.
- 1 mol of Cr is ~$0.50
- 1 mol of EDTA is ~$0.70
- at 2 volts, 1 mol of electrons is ~0.054 kWh

So, Cr + EDTA comes to only ~$22/kWh.
The iron side would probably be cheaper, maybe half as much. So, electrolyte
costs for flow batteries seem potentially very reasonable. The real problem
is the cells that electrolyte would run through.
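As a sanity check on the electrolyte arithmetic above (the 2 V cell voltage and per-mol prices are the assumptions stated there, with one electron transferred per Cr):

```python
# Electrolyte cost for the chromium side of a chelated Cr-Fe flow battery.
F = 96485.0    # C per mol of electrons (Faraday constant)
volts = 2.0    # assumed cell voltage
kwh_per_mol_e = F * volts / 3.6e6
print(f"{kwh_per_mol_e:.4f} kWh")    # -> 0.0536 kWh per mol of electrons

cr_per_mol = 0.50      # USD, rough Cr price
edta_per_mol = 0.70    # USD, rough EDTA price
cost = (cr_per_mol + edta_per_mol) / kwh_per_mol_e
print(f"${cost:.0f}/kWh")    # -> $22/kWh
```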
Batteries involve
immobile materials with variable charge state, and mobile ions with constant
charge. Normally, the variable charges are insoluble in a liquid, but flow
batteries are defined by everything being soluble, which means ion-selective
membranes are needed. Those membranes are obviously more expensive than
liquid, and they make flow batteries more expensive than regular batteries
per watt.
Nafion (or Aquivion) membranes are fluorinated and
expensive. There are lots of papers on cheaper membranes with somewhat
better conductivity, so why aren't they used? Those papers generally don't
say, but it's because they're not durable enough. The membranes have a
tendency to get oxidized or broken apart, which is (mostly) why the
expensive fluorinated ones are used. But there is a new-ish type of membrane
that I think is promising:
sulfonated phenylated polyphenylene. That seems suitable for
hollow-fiber membranes.
With current membranes and current densities,
and some rough estimation of other costs, and extrapolation from current
systems like vanadium flow batteries, cells for chelated iron-chromium flow
batteries seem ~$2000/kW for 90% efficiency, with large-scale production.
That's too expensive. But with cheaper membranes and large-scale production,
I think flow batteries could realistically be made for $300/kW, which might
be $50/kWh for 1-day storage, not including the electrolyte. (Water
desalination plants are much cheaper than that per membrane area, but also
simpler. Still, they're useful as another reference point.)
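Converting that power cost to a capacity cost depends on the discharge window; ~6 hours per day is my assumption:

```python
# Cell cost per kWh of capacity for 1-day storage,
# assuming roughly 6 hours of discharge per day (my assumption).
cost_per_kw = 300.0    # USD/kW, hoped-for large-scale cell cost
hours = 6.0
print(f"${cost_per_kw / hours:.0f}/kWh")   # -> $50/kWh
```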
That's a
lower cost than CAES, but perhaps an even bigger advantage is that they can
be smaller and placed more flexibly. Flow battery systems could provide
local backup power, which obviously has some extra value. (Salty water isn't
flammable, unlike Li-ion batteries, so there are fewer safety issues.)
Placing storage where electricity is used would also reduce the number of power conversions. Even if homes have batteries, rooftop solar still doesn't make economic sense without subsidies, but solar panels over parking lots are only slightly more expensive than panels in open fields.
seasonal storage
So far
I've talked about 1-day storage, which helps with the sun not shining at
night. It doesn't help with longer periods with little sunlight and wind,
which can happen in the winter in Europe.
Storing compressed hydrogen or natural gas in underground salt caverns gives very cheap storage capacity, cheap enough for seasonal energy storage, but converting between electricity and hydrogen is much too expensive. Hydrogen fuel cells are expensive enough that burning hydrogen in gas turbines is better. I can't see this being economically practical, and I don't expect hydrogen from water electrolysis to be <$4/kg (before subsidies) anytime soon; see also this post.
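To see why, here's a rough fuel-cost calculation; hydrogen's lower heating value is ~33.3 kWh/kg, and the ~50% turbine efficiency is my assumption:

```python
# Rough fuel cost of electricity generated from electrolytic hydrogen.
h2_price = 4.0       # USD per kg, an optimistic near-term price floor
h2_lhv = 33.3        # kWh of chemical energy per kg (lower heating value)
turbine_eff = 0.5    # assumed combined-cycle efficiency burning hydrogen
fuel_cost = h2_price / (h2_lhv * turbine_eff)
print(f"${fuel_cost:.2f}/kWh")    # -> $0.24/kWh for fuel alone
```

That's ~8x the $0.03/kWh mitigation target before even counting capital costs.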
If you need more than 5 days of energy storage, and can't use natural gas or coal, and don't have enough land to get that energy from biomass, I don't see anything potentially competitive with nuclear power.