Thursday 11 September 2014

"240 people died as a consequence of the Windscale accident"

I looked into the claim that "240 people died as a consequence of the Windscale fire". I couldn't find any evidence for it.

John Garland said: "The reassessments showed that there was roughly twice the amount as was initially assessed."

This would also have affected the number of cancers the accident caused, the authors said.

Previously, it was thought that the radiation would have eventually led to about 200 cases of cancer, but the new contamination figures suggest it could have caused about 240.

The claim of 240 deaths was made by Rebecca Morelle, a BBC journalist (ref 1). If John Garland made the claim himself, it must have been in ref 2 (unavailable online).

The radioactive fallout at Fukushima Daiichi was 1000 times that of Windscale, yet no one died from radiation at Fukushima Daiichi. 240 excess cancer deaths due to Windscale looks like a made-up number to me. 'Made up', as in extrapolated from a mathematical model: that model being LNT (linear no-threshold) extrapolated to zero. The 240 figure is also reported in a physics newsletter (ref 3). Given that the truth of the LNT model is entirely open to dispute, the 240 estimate must also be disputed. 0 looks like a more likely figure to me. LNT is only controversial when it is extrapolated to zero: at low radiation levels there's weak evidence to support LNT, and strong evidence to refute it.
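For what it's worth, here is a minimal sketch of how such an LNT estimate gets produced. The 0.05-per-person-sievert coefficient is the ICRP's nominal fatal-cancer risk figure; the collective dose is back-solved purely for illustration and is not taken from the Windscale assessments:

```python
# Sketch of an LNT (linear no-threshold) excess-death estimate.
# Assumption: ICRP-style nominal fatal cancer risk of ~0.05 per
# person-sievert, applied linearly all the way down to zero dose.
RISK_PER_PERSON_SV = 0.05

def lnt_excess_deaths(collective_dose_person_sv: float) -> float:
    """Predicted excess cancer deaths under LNT extrapolation."""
    return collective_dose_person_sv * RISK_PER_PERSON_SV

# Back-solving: a claim of 240 deaths implies a collective dose of
# 240 / 0.05 = 4800 person-Sv (illustrative, not a Windscale figure).
print(lnt_excess_deaths(4800))  # 240.0
```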

Ref 4 gives a good description of the cause of the fire.

  1. Rebecca Morelle, BBC News.
  2. Garland J A and Wakeford R 2007, "Atmospheric emissions from the Windscale accident of October 1957", Atmos. Environ. 41 3904–20. doi:10.1016/j.atmosenv.2006.12.049
  3. Institute of Physics, Environment Physics Group, Newsletter, Nov 2007.
  4. "The Windscale reactor accident - 50 years on", 2007 J. Radiol. Prot. 27 211. doi:10.1088/0952-4746/27/3/E02

Wednesday 10 September 2014

Electricity Dispatch | capacity credit | firm rating

Another copied PatLogan post.
PatLogan, replying to Viridis:

Wind most certainly isn't the back-up.

Anyone who's ever worked around grid management will tell you that the suitability of a generating technology as a back-up source is a function of its ability to be "dispatched" (i.e. called on) on demand.

Not least because if a back-up can't be called on reliably when needed, you need a back-up for the back-up.

The measure of dispatchability is "capacity credit" or "firm rating" (the latter is the UK term). It basically asks: if I've got "n" megawatts of that technology, what proportion of it am I 90% likely to be able to rely on when needed?

For conventional technologies (CCGT, hydro, nuclear, coal) it's basically the lifetime availability of the plant, so around the 90-95% mark.

For variable renewables, it's never higher than the capacity factor. Onshore wind's capacity factor tends to be about 27% in the UK; firm rating (depending on the site) is typically 8-14%. Solar's capacity factor in Germany is about 9%, suggesting a firm rating of 5-6%.

If I've got 10,000MW of solar, that means I can rely on it to provide 500-600MW operating as a back-up.
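A minimal sketch of that arithmetic, using the firm ratings quoted above:

```python
# Reliable back-up capacity = nameplate capacity x firm rating.
def firm_capacity(nameplate_mw: float, firm_rating: float) -> float:
    """MW you are ~90% likely to be able to call on when needed."""
    return nameplate_mw * firm_rating

# 10,000MW of solar at a 5-6% firm rating:
print(firm_capacity(10_000, 0.05))  # 500.0 MW
print(firm_capacity(10_000, 0.06))  # 600.0 MW
```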

Here's a thought. If I came to you and sold you car insurance that had a 6% chance of paying off when you needed it, would you buy it?

WilliamAshbless: This doesn't make sense. The UK has ~10.5GWe of wind 'capacity'. Wind has been delivering an average of about 5% of rated capacity for an entire week now. Sometimes last week it dropped well below even 5% (e.g. it was 4% earlier this morning).

PatLogan: It's a probability based number, William - the basic definition (although it tends to get more complicated in applications) is what proportion of rated (i.e. nameplate) power am I 90% likely to get if the plant is called on in any given half-hourly period.

So yes, there will be periods when available capacity is below that, just as there'll be others when it's higher. 90% of the time, I'd be able to get that amount or higher.

Wednesday 3 September 2014

Possible new UK nuclear builds (by PatLogan) ~ 17.7GWe

Another PatLogan repost from the Guardian CiF

Pat:

At the moment, we've got three consortia (and probably a fourth) each putting up several hundred million to buy the rights to develop one or more of the UK licensed sites.

  • EdF:CGNPC:CNNPC:Areva for Sizewell and Hinkley Point (in the ratio 51:20:20:9). Each site is 2x1600MW EPR.
  • Hitachi (100%) for Wylfa and Oldbury. Each site is 2x1300MW ABWR.
  • Toshiba (60%) and GdF-Suez (40%) for Moorside. The site is 3x1100MW AP1000.
  • And it's looking increasingly likely that a CNNPC:CGNPC consortium is buying Bradwell to build at least 2x1400MW CAP1400.

You were saying?

Me:

Summary:

Actual demand for electricity in 2012 was 35.8GW on average, and 57.49GW at its peak. Planned new nuclear builds will add 15.525GWe of gross capacity, with a possible 2.3GWe more at Bradwell. By 2024 all other nuclear plants apart from Sizewell B should have closed; add Sizewell B at 1198MW and the gross total comes to 19.023GW. Applying a 90% capacity factor, the UK could expect about 17.1GWe of average nuclear output by the late 2020s.

Consortium | Site | Locale | Type | No. | Unit (MWe) | Total (MWe gross) | Start
EDF Energy | Hinkley Point C | Somerset | EPR | 2 | 1650 | 3300 | 2015
EDF Energy | Sizewell C | Suffolk | EPR | 2 | 1650 | 3300 | 2017
Horizon | Wylfa Newydd | Wales | ABWR | 2 | 1380 | 2760 | 2019
NuGeneration | Moorside | Cumbria | AP1000 | 3 | 1135 | 3405 | 2020
CNNPC | Bradwell | Essex | Hualong-1 | 2 | 1150 | 2300 | 2022
Horizon | Oldbury B | Glos | ABWR | 2 | 1380 | 2760 | 2025
Total | | | | 13 | | 17825 |

Current nuclear power gross capacity is 9190 MWe, with 490 MWe (Wylfa) due to close before the end of 2015.
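A quick check of the arithmetic behind that summary (capacities from the table above; Sizewell B and the 90% capacity factor as stated):

```python
# Gross capacities (MWe) of the planned new builds, from the table.
new_builds = {
    "Hinkley Point C": 3300, "Sizewell C": 3300, "Wylfa Newydd": 2760,
    "Moorside": 3405, "Bradwell": 2300, "Oldbury B": 2760,
}
sizewell_b = 1198        # the one existing plant still open past 2024
capacity_factor = 0.90

gross_mwe = sum(new_builds.values()) + sizewell_b   # 19,023 MWe
average_gwe = gross_mwe * capacity_factor / 1000    # ~17.1 GWe
print(f"{gross_mwe} MWe gross, ~{average_gwe:.1f} GWe average output")
```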

Monday 1 September 2014

Another copied PatLogan post : Does Geothermal make sense in the UK?

No, I stated that 75% of ELECTRICITY comes from conventional hydro - which it does, and that leaves geothermal generation as a minority contributor to a small overall volume of generation. After all, your original question was about

drilling down deep enough to hit hot rock in order to power the steam turbines in a power station

You do understand the difference between total energy use and electricity usage, don't you?

So, let me explain the difference between the sorts of conditions that you need for generating power from geothermal, as opposed to just getting hot water.

Power station steam needs to be hot and dry - and chemically pure. If you have droplets of water in it, the blades of the turbine get eroded, and if you've got chemicals like sulphur in there, you'll get etching and corrosion. The latter means it's generally a very good idea not to take water/steam direct from the heat source, but to pass it through a heat exchanger - which means a loss of temperature of some tens of degrees.

The minimum temperature you'll realistically get away with at the turbine inlet is maybe 200C (otherwise you get those droplets) - and ideally 250C+.

Even in Iceland the number of sites with those sorts of conditions is limited. But Iceland's geologically very unusual - sitting on the Mid-Atlantic Ridge, with all its volcanic activity.

In the UK, the general geothermal temperature gradient is about 25C/km. That is, to get 250C temperatures, you need maybe a 10km bore. That's deep - a lot more than is routine in oil and gas exploration - but not infeasible.
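As a back-of-envelope check (the ~10C surface temperature is my assumption, not Pat's):

```python
# Bore depth needed to reach a target rock temperature, given the
# UK geothermal gradient of ~25C/km quoted above.
SURFACE_TEMP_C = 10.0       # assumed UK mean surface temperature
GRADIENT_C_PER_KM = 25.0

def bore_depth_km(target_temp_c: float) -> float:
    return (target_temp_c - SURFACE_TEMP_C) / GRADIENT_C_PER_KM

print(bore_depth_km(250))   # 9.6 km, i.e. "maybe a 10km bore"
```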

More than that, you need to transfer a lot of heat from the rocks to whatever's the working fluid. That means you need much more surface area than you get from just the bore walls.

So, when this was tried at the Camborne School of Mines, they came up with a system where you drilled two bores some metres apart (one for the fluid to go down, one for it to come up). They then needed to "frack" between the two - the idea being that the fractures give a path for the fluid to flow through and pick up heat on the way. When Camborne did it, they used explosives - you can't use water at pressure as in fracking for gas, as the rocks are too hot.

They found a number of problems: first, the cost of drilling. Second, the crack systems run in all directions from the bottom of the bore, and a lot of the fluid gets lost. Third, the amount of energy needed to pump the fluid through the crack system is high. Fourth, the crack system tends to close up, through a mixture of plasticity in the hot rock and chemicals leached from the rock depositing in restrictions in the cracks and blocking the flow.

End result: it's a phenomenally expensive way of getting small amounts of power. At current costs, it'd be north of even tidal/wave, which gets CfD support of £305/MWh.

As to "Are you a woman ?"

One of the brightest people on my BSc course was a woman. She finished up as technology director for one of the world's biggest bulk chemical manufacturers.

I suspect she's got the edge on you for clarity of thought and brains....

PatLogan - Comparing Solar vs Nuclear (EPR) costs

Perhaps you should note that the original Feed-in Tariff of 42p/kWh has been halved.

It has. And strangely enough, in the wake of that, the rate of new installations has collapsed.

The strike price for the new nuclear plant at Hinkley Point is over 3-times that.

You're a factor of ten out. £92.5/MWh is for 1000 kWh, not 100. So the price is 9.25p/kWh, and the solar cost is about 3 1/3 x the Hinkley price. Of course, the solar price omits the costs of either storage or back-up generation. Solar requires 100% back-up; nuclear about 10-15%.
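The unit conversion, made explicit (a sketch; £92.5/MWh is the Hinkley strike price quoted above):

```python
strike_gbp_per_mwh = 92.5
# £/MWh -> p/kWh: x100 (pence per pound), /1000 (kWh per MWh)
strike_p_per_kwh = strike_gbp_per_mwh * 100 / 1000
print(strike_p_per_kwh)  # 9.25 p/kWh
```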

They take the form of tax breaks

In fact, even larger tax allowances apply to renewables. Investment in renewables schemes attracts "Enterprise Investment Allowances" for individual investors at 30%. From the corporate perspective, all capital expenditure in renewables generation falls under "Enhanced Capital Allowances" - a far more generous regime than applies to any other form of generation, or to investments in oil and gas. In fact, it allows capital investment to be charged against tax at up to 100% in the first year.

relief from liability for consequent environmental and health issues

Similarly, intermittent renewables are exempt from penalties under the "Balancing and Settlement" code - i.e. from the costs they impose on the rest of the system.

PatLogan: Guesstimate for gas-fired CCS project

Another post copied from Guardian CiF comments by PatLogan

Hard to say exactly; there aren't any gas-fired CCS projects I'm aware of. But we can do some "rule of thumb" guesstimates.

Probably the best approach is to look at an IGCC (a coal gasification based generation technology) CCS plant. Probably the furthest advanced example is the Kemper County plant in the US.

An IGCC plant is basically made of three components: a coal gasification unit (which converts the coal into CH4), a gas reformer (which converts the CH4 into separate streams of H2 and CO2) and a CCGT optimised to burn hydrogen. The H2 from the reformer goes to the turbine, the CO2 to sequestration.

Were you to use gas, obviously the gasification stage would be redundant. The reforming stage would still be required.

Kemper's costing about $5.5Bn for a 580MW plant which can capture about 65% of the carbon content of the coal. In the US a CCGT unit without CCS would typically come in at about $800M-1Bn/GW.

So, crudely scaling Kemper to 1GW gives us $9.5Bn/GW, of which something like $1.2bn will be for the adapted CCGT.

That gives us $8.3Bn for the "balance of plant". If we crudely assume that's split half and half between the reformer and the gasifier, and that as a "FOAK" (First Of A Kind) plant it costs double what a series-build unit would, we get about $2.1Bn for a series-built reformer capable of supplying a 1GW unit.

That has the whole system coming in at about $3.1Bn - 3 to 3 1/2 times the cost of a standard CCGT. For comparison, that's not dissimilar to the cost of an AP1000 PWR unit.
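The scaling, step by step (a sketch using the figures above; the 50/50 balance-of-plant split and the 2x FOAK premium are the stated assumptions):

```python
# Crude scaling of Kemper County IGCC-CCS costs to a 1GW gas plant.
kemper_cost_bn, kemper_gw = 5.5, 0.58
per_gw = kemper_cost_bn / kemper_gw          # ~$9.5Bn/GW
adapted_ccgt = 1.2                           # adapted CCGT's share
balance_of_plant = per_gw - adapted_ccgt     # ~$8.3Bn
reformer_foak = balance_of_plant / 2         # 50/50 gasifier/reformer
reformer_series = reformer_foak / 2          # FOAK = 2x series cost
system = 1.0 + reformer_series               # series CCGT ~$1Bn + reformer
print(f"reformer ~${reformer_series:.1f}Bn, system ~${system:.1f}Bn")
# reformer ~$2.1Bn, system ~$3.1Bn
```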

The system also absorbs some of the output of the plant - we'll assume about 20% for a gas fired CCS CCGT. That equates to a 25% increase in fuel usage.

The underlying cost of power from a CCGT is usually about 80% fuel and 20% fixed (operations, capital and finance). In the UK gas can currently produce at breakeven at about £60/MWh.

So, gas cost is about £48/MWh at the moment, and other costs about £12.

On the basis of the above, you'd expect gas costs per MWh of output to rise to about £60/MWh, and other costs to rise to about £42/MWh - giving a total of £102/MWh.
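And the per-MWh arithmetic, as a sketch with the same figures:

```python
breakeven = 60.0                      # £/MWh, UK CCGT at breakeven
fuel, fixed = 0.8 * breakeven, 0.2 * breakeven   # £48 fuel, £12 fixed
fuel_ccs = fuel / 0.8                 # 20% of output absorbed -> £60
fixed_ccs = fixed * 3.5               # capital ~3.5x a standard CCGT
print(fuel_ccs + fixed_ccs)           # 102.0 £/MWh
```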

Which is somewhat more than the Hinkley C strike price, or that for new onshore wind. But it's predicated on a few key assumptions/omissions:

1 - that the technology can be matured to a stage where series build cost reductions can be reached.

2 - that gas prices stay where they are.

3 - that there's no carbon cost. I'd expect a system like the above to outperform Kemper in terms of carbon capture, but not to approach 100% - I'd guess in the 80-90% range, giving CO2 output of 45-90g/kWh. It'd be a small contribution, though.

4 - I've not the faintest idea of the costs of the sequestration system - compression, piping and disposal in geological formations. I'd be staggered if it came in under £5/MWh, though surprised if it were much over £10.

So, not easy to develop, and certainly not cheap!

Comparing wind and nuclear raw material costs (repost of Guardian CiF PatLogan post)

There are some really interesting comparisons around materials usage between wind and nuclear.

http://bravenewclimate.com/2009/10/18/tcase4/

A typical 2.5MW land-based turbine uses about 390 tonnes of steel and 1100 tonnes of concrete (ignoring access roads). It's got a 20-25 year operational life, and (in the UK) will average about a 27% capacity factor.

Output over life is

(2.5*25*0.27) = 16.875 MW-years.

So that's 23.1 tonnes of steel per MW-year and 65.2 tonnes of concrete per MW-year.

Take a 1620MW EPR (the most resource-intensive of the UK new-build designs). That's got a 60-year design life, at a capacity factor of 90%.

That's

(1620*60*0.9) = 87,480 MW-years

It uses 205,000 cubic metres of concrete (at about 2.4 tonnes/cubic metre, that's 492,000 tonnes) and 71,000 tonnes of steel.

That's 5.6 tonnes of concrete per MW-year, and 0.81 tonnes of steel per MW-year.

In other words, the wind option uses about 12 times as much concrete, and 28 times as much steel, per unit of output over its life.
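For completeness, the whole comparison in one place (figures as above):

```python
# Wind: 2.5MW turbine, 25-year life, 27% capacity factor.
wind_mw_years = 2.5 * 25 * 0.27              # 16.875 MW-years
wind_steel = 390 / wind_mw_years             # ~23.1 t/MW-yr
wind_concrete = 1100 / wind_mw_years         # ~65.2 t/MW-yr

# EPR: 1620MW, 60-year life, 90% capacity factor.
epr_mw_years = 1620 * 60 * 0.9               # 87,480 MW-years
epr_steel = 71_000 / epr_mw_years            # ~0.81 t/MW-yr
epr_concrete = 205_000 * 2.4 / epr_mw_years  # ~5.6 t/MW-yr

print(wind_concrete / epr_concrete)          # ~11.6x the concrete
print(wind_steel / epr_steel)                # ~28.5x the steel
```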