Last updated on March 1, 2013

It’s Friday, a thunderstorm is raging outside and I am on vacation, so there is not much else to do than some simple calculations. On a Swedish blog I made a comment about how everyone ignores the uranium elephant in the room when it comes to energy discussions. Pessimism abounds regarding the reserves of fossil fuels, and doomsday is commonly predicted. But I have long held the view that nobody can seriously claim humanity will ever run out of energy (which doesn’t exclude bumps on the road). Many might find that a naive idea, but some very simple arithmetic proves my point.

The Earth’s crust contains trace amounts of pretty much every element, including uranium and thorium. A nice gedankenexperiment (we physicists are obsessed with thought experiments) is to consider what would happen if humanity were restricted to just the average abundance of everything (a situation we will encounter in the far, far future unless space mining takes off). Energy is the most important resource, and the only energy sources abundant in reasonable amounts in the average crust are uranium and thorium; let’s restrict ourselves to uranium to be extra conservative. According to a nice list on Wikipedia, the average crustal abundance of uranium is about 3 ppm (2.7 rounded upwards). That means if we cut out a kilogram of crust we get 3 milligrams of uranium, quite a pathetic amount. But what is the energy content of those milligrams?

A rule of thumb in nuclear engineering is that fissioning one gram of actinides yields 1 megawatt-day of heat energy. A megawatt-day (abbreviated MWd) sounds like a weird unit, but it is simply equivalent to 24 000 kilowatt-hours (kWh). Our 3 milligrams in one kg of crust thus give us:

$$3\cdot10^{-3}\cdot24\,000=72$$ kWh

That is, if we completely fission the entire amount of uranium, which requires breeder reactors; our current light water reactors only utilize about 0.5% of the energy in the uranium. 72 kWh by itself might not say much, so let’s compare it to something else, like coal. Coal contains about 24 MJ of energy per kg, and 24 MJ is equal to 6.67 kWh. Strikingly, that implies our average kg of crustal matter has more than 10 times the energy content of a kilogram of coal! Hold on just a damned minute, someone will shout here: coal one can just throw into an oven, but one can’t just throw some crustal matter into a reactor. That is of course correct, but how much energy does it take to extract the uranium from average crustal rock?
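As a quick sanity check, the crust-versus-coal comparison can be sketched in a few lines of Python (all numbers taken straight from the text):

```python
# Energy content of the uranium in 1 kg of average crust vs 1 kg of coal.
# Assumptions from the text: 3 ppm uranium by mass, and the rule of thumb
# that fissioning 1 gram of actinides yields 1 MWd = 24,000 kWh of heat.
U_FRACTION = 3e-6        # 3 ppm by mass
KWH_PER_GRAM = 24_000    # 1 MWd of heat per gram fissioned

grams_u_per_kg_rock = U_FRACTION * 1000          # 3 milligrams
crust_kwh_per_kg = grams_u_per_kg_rock * KWH_PER_GRAM

coal_kwh_per_kg = 24e6 / 3.6e6                   # 24 MJ/kg, 3.6 MJ per kWh

print(crust_kwh_per_kg)                          # 72 kWh per kg of rock
print(coal_kwh_per_kg)                           # ~6.67 kWh per kg of coal
print(crust_kwh_per_kg / coal_kwh_per_kg)        # ratio ~10.8
```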

First we have to mine and crush the rock; we can use the Rössing uranium mine as a role model. Rössing used about $$1.9\cdot10^{15}$$ joules to process 10.7 million tonnes of ore (while also removing 40 million tonnes of waste rock). This entails blasting it from the bedrock, crushing it and so on, and finally producing $$U_3O_8$$, otherwise known as yellowcake. The energy cost ends up at 176 kJ per kilogram of rock, and 176 kJ is equal to about 0.05 kWh. Even if we assume the lower concentration of uranium will require more sulfuric acid etc. per tonne of ore, we can still see the energy margins are overwhelming: we can assume a 10 times higher energy cost per kg of rock and it still doesn’t change the picture. At the Talvivaara mine in Finland they plan to extract uranium at concentrations of 40 ppm, and that is to fuel today’s normal light water reactors, which only get about 120 kWh per gram of uranium compared to the 24 000 kWh actually in it. Very low grades are already economic even with the inefficient standards of today.
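The energy margin here is worth making explicit. A short sketch using Rössing’s figures from the text (and comparing against the 72 kWh/kg energy content calculated above):

```python
# Mining energy cost per kg of rock at Rössing, per the figures in the text.
mining_energy_j = 1.9e15          # joules used to process the ore
ore_kg = 10.7e6 * 1000            # 10.7 million tonnes, in kg

cost_kwh_per_kg = mining_energy_j / ore_kg / 3.6e6   # joules -> kWh

content_kwh_per_kg = 72           # uranium energy content of average crust

print(cost_kwh_per_kg)                              # ~0.049 kWh/kg mined
print(content_kwh_per_kg / cost_kwh_per_kg)         # margin of ~1,400x
# Even a 10x higher mining cost per kg still leaves a margin above 100x.
```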

Let’s continue by inserting some conservative assumptions. First, assume that the extraction of uranium from the crust is only 80% efficient; second, assume that the reprocessing of spent nuclear fuel is only 95% efficient (i.e. 5% of the fissile material is lost in each recycling) and that in each run through the reactor we burn 10% of the uranium. If we start with 1 kg of uranium fuel we effectively only use the energy in a fraction, T, of that kilo, expressed as:

$$ T = 0.1\cdot1 + 0.1\cdot0.9\cdot0.95 + 0.1\cdot0.9^2\cdot0.95^2 + 0.1\cdot0.9^3\cdot0.95^3 + \dots$$

In the first run through the reactor we extract 10% of the energy in one kilo. We then send 0.9 kilo to the reprocessing plant, which has 95% efficiency, so in the second reactor run we are left with 0.9*0.95 of one kilo, from which we once again extract 10% of the energy. We then send 0.9*0.9*0.95 kg to the reprocessing plant, and after that we have $$0.9*0.9*0.95*0.95=0.9^2*0.95^2$$ kg for the next round in the reactor, where we again extract 10% of the energy, and so on, repeated until infinity. The series can be expressed as

$$T=\sum\limits_{i=0}^\infty 0.1\cdot0.9^i\cdot0.95^i$$

which is a geometric series, and one can show it is equal to

$$T=\frac{0.1}{1-0.9\cdot0.95}\approx0.69$$

We multiply that by 0.8 to take into account that we only retrieve 80% of the uranium from the rock, and we end up using about 55% of the energy content in the rock. So instead of 72 kWh/kg of rock we get about 40 kWh/kg. Recycling can of course be more efficient than 95%, and one could possibly extend the time the fuel spends in the core to extract more than 10% of the energy per run.
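The closed form of the geometric series can be verified numerically, using the burnup and reprocessing figures assumed above:

```python
# Effective fraction T of the fuel's energy used, assuming 10% burnup per
# reactor pass and 95% reprocessing efficiency (the series in the text).
burn, reprocess = 0.10, 0.95

# Closed form: T = 0.1 / (1 - 0.9*0.95)
T = burn / (1 - (1 - burn) * reprocess)

# Cross-check by directly summing the series (truncated; the ratio 0.855
# makes the tail negligible long before 1000 terms).
T_sum = sum(burn * ((1 - burn) * reprocess) ** i for i in range(1000))

overall = 0.8 * T            # 80% uranium recovery from the rock
print(T)                     # ~0.69
print(overall)               # ~0.55
print(overall * 72)          # ~40 kWh per kg of rock
```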

We can apply this to some random mine. In northern Sweden there is a large open pit mine primarily mining copper. It produces about 30 million tonnes of ore every year; if we assume this ore contains uranium at a concentration equal to the crustal abundance, it means they dig up 90 million grams of uranium every year! The total energy contained in this ore is about 2160 TWh, and with the losses assumed above we can ultimately extract about 1200 TWh from those 30 million tonnes. **This is thermal energy; if we assume 33% efficient conversion to electricity we end up with 396 TWh/year. In 2010 Sweden used 147 TWh of electricity; in other words, the uranium contained in the waste rock from one single mine in Sweden is enough to provide over twice the electricity Sweden consumes today.** This of course means we would have to build about 20-30 breeder reactors of the PRISM or BN-800 type, along with reprocessing facilities, for it to be possible. The BN reactors are currently about 25% more expensive than light water reactors on a per-watt basis. Sweden built 12 light water reactors in the 19 years from 1966 to 1985, so it is not unachievable and is well in line with the industrial capacity of the country. Nor does it require new technology: the reactor type needed is already offered by reactor vendors, reprocessing technology is well understood, and no extra mining is needed. All that is required is to spray some sulfuric acid over the mine waste after Boliden has taken the copper, gold etc. out of the rock.
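The mine arithmetic, step by step, with the ~55% utilization assumed above:

```python
# One Swedish copper mine: 30 million tonnes of ore per year, assumed to
# contain uranium at the 3 ppm crustal abundance.
ore_kg_per_year = 30e6 * 1000            # 30 million tonnes, in kg
uranium_g = ore_kg_per_year * 3e-3       # 3 mg of uranium per kg of rock

heat_twh = uranium_g * 24_000 / 1e9      # kWh -> TWh (1 TWh = 1e9 kWh)
usable_heat_twh = heat_twh * 0.55        # losses assumed above
electricity_twh = usable_heat_twh * 0.33 # thermal -> electric conversion

print(uranium_g)          # 90 million grams of uranium per year
print(heat_twh)           # 2160 TWh of heat
print(electricity_twh)    # ~390 TWh vs Sweden's 147 TWh used in 2010
```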

In the world, about 2 billion tonnes of iron ore is mined every year. If that ore contains 3 ppm uranium, then the heat energy content is 144 000 TWh, and about 79 200 TWh of that heat can be utilized to produce 26 000 – 30 000 TWh of electricity. The world as a whole produced 19 120 TWh of electricity during 2011. So the mining waste stream from one sector of mining alone can provide more than enough uranium to produce all the electricity the world currently consumes. About 7000 PRISM reactors would, however, be required to utilize that uranium. But if a country of 7 million could build 12 larger reactors in 19 years, why can’t a world of 7 billion people build proportionally as many? The world has built that many coal and natural gas power plants, after all. If we want to do it over a period of 30 years it means about 233 reactors per year. With a price tag of one billion dollars per piece (about 3 times the cost per watt of new reactors in Korea or China), the build-up would require about 0.3% of the current gross world product.
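Scaling the same calculation up to the world’s iron-ore waste stream, with the same assumed losses and the 7000-reactor figure from the text:

```python
# World iron ore: 2 billion tonnes per year, assumed 3 ppm uranium.
ore_kg = 2e9 * 1000                       # 2 billion tonnes, in kg
uranium_g = ore_kg * 3e-3                 # grams of uranium per year

heat_twh = uranium_g * 24_000 / 1e9       # 144,000 TWh of heat
electricity_twh = heat_twh * 0.55 * 0.33  # ~26,000 TWh of electricity

reactors = 7000                           # PRISM-class units, per the text
per_year = reactors / 30                  # build-out over 30 years

print(heat_twh)          # 144,000 TWh of heat per year
print(electricity_twh)   # ~26,000 TWh vs 19,120 TWh produced in 2011
print(per_year)          # ~233 reactors per year
```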

What I haven’t mentioned so far is that the average crustal abundance of thorium is 9 ppm, 3 times as high as that of uranium, yet fissioning one gram of thorium releases as much energy as fissioning a gram of uranium. So in reality we only need a quarter of the waste stream of iron mining.

What is my point with this blog post? Simply to show that with just a small fraction of the mining we already conduct, we extract far more energy resources than the world needs. The normal, bland, boring rock beneath our feet contains vastly more energy than the finest coal or oil deposits. The cost of producing energy from this resource is at worst about 25% higher than the current cost of nuclear electricity. Simply put, the world will never run out of energy.

/Johan

At http://www.guardian.co.uk/discussion/comment-permalink/14314607 I worked out some of the same numbers. I guess this is the nub:

” … the leanest present-day chemical fuel ores …… the Alberta tarsands. At six percent tar, these contain 2.4 megajoules per kilogram and yield around 2 MJ as net energy. And this energy concentration is almost matched by ordinary rocks’ fissile content. Not uranium-for-breeders, not thorium, but U-235, the stuff we use now. “

Crazy, and at the same time greens are screaming and shouting that one cannot mine “low grade” uranium deposits and get a high EROEI.

What’s more, we already have massive in situ leach mining operations in progress all over the world. Uranium dissolves in water, so all rivers contain tons of uranium.

[…] the transition to breeder reactors removes all worry about fuel sufficiency (see e.g. here and here), so if we want to pay more than this for energy, it is an active choice, […]

[…] a PS: While I was looking for data I stumbled over this post (and no, I have not only been searching pro-nuclear sites) about uranium extraction from ordinary […]