I’ve been reading “Going Nuclear” by Tim Gregory on my brother’s recommendation, after he saw Tim give a talk on it.
To be clear, it’s written not just for a general science audience but for a completely general lay audience – even explaining units of measure and metric prefixes, as well as the basic physics. So, as an engineer in the capital facilities industries, including power of all kinds – as well as non-power nuclear applications – I’m not really his audience. In fact my two-before-last professional job was at the very same facility where Tim still works: Sellafield.

My agenda is a couple of levels more abstract than Tim’s – he’s focussing on the balance of psychology and scientific fact that has led to decades of failure to invest in the safest-and-longest-sustainable source of energy available to humanity – specifically nuclear power. My focus is on how the received-wisdom world-model we hold is inadequate to address the meta-crisis underlying the whole gamut of existential poly-crises (including the energy<>environment balance) requiring our individual and collective decision-making. The Psybernetics of individuals and organisations living in free-democratic societies seen as Complex Adaptive Systems – a deeper, wider, longer story. (Coincidentally, Sellafield was the facility where I first recognised “complexity” as a project systems discipline in its own right, alongside all the other specific technical expertise – another longer story.)
The short story here, as I say, is that nuclear power (fission now plus fusion later) is simply the safest-and-longest-sustainable source of energy available to humanity. Anyone who doesn’t recognise that needs to read Tim’s book. It’s not just the economic consequences of wasted opportunity. In terms of human and environmental risks and emissions, it really is as benign as renewable sources like wind and solar. We suffer far greater (yet practically harmless) radiation exposure from the environment and air travel, say, than from the workings of nuclear power – indeed the infamous linear-no-threshold basis for assessing exposure risk associated with the nuclear industry is given a thorough treatment. In terms of human health and fatalities, fossil fuel emissions were, and are, clearly far worse, but population-scale evacuations (like lockdowns, incidentally, but specifically Chernobyl and Fukushima, say) carry far greater human risks than any original source of potential danger in unplanned accidents and the like. Nuclear in particular suffers a shared paranoia rooted in misunderstanding relative risks, which is why, for me, it is an exemplar of the wider psychology-more-than-science nature of our meta-crises and political decision-making generally.
Short-termism is only the half of it. Tim also goes on to cover very long-term, small-scale nuclear power applications, such as those enabling greater space exploration and settlement – the final frontier. For us Brits, Sellafield is of course sitting on an enormous resource of both fissile and fertile nuclear fuel. He also covers the many different viable fuel / reactor combinations, the state of the art as well as their history. As an engineer, rather than a lay reader, probably the only criticism I would raise is this: early on, he explains that he’s using the simple term “radiation” throughout, without distinguishing between ionising and non-ionising forms – for simplicity, for his target audience – but I wonder if his arguments might actually be clearer in the later sections looking at overall exposures, including solar, if he had made that distinction?
But for now, if you or your friends don’t believe that:
Nuclear power really is the safest-and-longest-sustainable source of energy available to humanity, and that in terms of human and environmental risks and emissions it really is as benign as renewables.
You need to be reading and sharing “Going Nuclear” by Tim Gregory.
I shall be passing my copy along to the local pub book club.
=====
END
=====
Post Notes: Relevant to my agenda, not necessarily Tim’s, but so much is connected here.
Above, I used the idea of CAS (Complex Adaptive Systems) AND remembered that Sellafield was where I first saw “Complexity” as a specialist discipline in its own right. One way or another, I’ve always been in “Systems”. My own specialisms have evolved from aeronautics, structures and fluid dynamics, to the materials and integrity of pressure containment systems, to management systems, operating systems and information system models, to where I am now: focussed on the epistemology and ontology of the world models we use to make individual and collective decisions. That these were always complex (not just complicated) systems had always been a given for me. Systems “thinking” is simply a response to that “complexity”. Systems don’t come more complex than us, as collections of humans?
I have a number of ongoing dialogues with practitioners in this space, and there’s been a recurring topic about why some talk exclusively about either systems or complexity, and in either case why they insist on limiting their scopes to systems-science(s) or complexity-science(s). For me Complex Adaptive Systems Thinking is pragmatically about both and (obviously) science-informed so far as possible, but it’s more than science.
Dave Snowden is none of those, and despite being in the “Complexity Science” camp, he has recently been using the “CAS” short-hand, confirming the S is understood to be Systems (with no explicit mention of science). This was part of a thread where Dave was bemoaning “experts” who claim to understand CAS yet still talk about “root causes”. He’s right – the point of CAS is to understand that causal networks are emergent from the whole – but he also stated that CAS “deprivileged experts”, which is a little worrying not just for the dented egos of experts but for the general public’s loss of faith in experts, so it needed clarifying imho:
Me: Telling that you put the expert ‘help’ in scare quotes.
Probably the point where I’d differ is that there are many kinds of expertise in many dimensions – real expertise (like yours) is knowing which kinds are likely to help where.
Me: CAS deprivileges the wrong kind of expertise. (Edit/Add – to be clear, I agree with your main point about CAS vs “root-causes”.)
Me: Be interested to know, in your world, what the S in CAS stands for?
Dave: I’m ok with systems and expertise in root cause analysis is very valuable, but not when we have a CAS. Myers Briggs which I have also seen advocated as a CAS approach but it’s a crude form of categorisation and a pseudo-science so there is no value in any human context. Context matters
Me: Yes, context matters, and contrary to many information scientists, with CAS, context isn’t just “more content”.
Me: (Aside – seeing my use of “the wrong kind of expertise” there, some of my ex-colleagues may recall my use of the “wrong trousers” metaphor – for when we were barking up the wrong tree, looking for the wrong kind of software solutions.)
Lots more corollaries in there. So, to recap:
Context? The relevance of context, and its being more than simply “more content”, has been a long-standing item here. A given. But obviously in a sense, at some level – a physical data-storage implementation level, say – context might be seen as just more content; ironically, that’s why context, including the levels of abstraction between contexts, matters. Human context?
And “a” CAS case? For me, all interesting cases are CAS – open, interacting human individuals and organisations and their ecosystems – beyond any simple closed command-and-control system(s), however simple or complicated.
And “crude form[s] of categorisation and [dubious] science [have] no value in any human context”. OH YES! Pretty much the driver of my CAS agenda.
- Categorisation? Classifications – namings in context, in taxonomic ontologies – are inescapable and actually essential, despite all their pitfalls and unintended consequences. That’s why we have to have better ones than the current “received wisdom”.
- Honesty about the limitations of science? Hopefully this goes without saying, but public / political communications about what is or isn’t relevant science are severely compromised by that “received wisdom” I mentioned.
Onwards and upward.
=====
I’ll come back to those “wrong trousers” one day.
=====
