The Real Joke

I’m a big fan of Howard Webb (I defended him over the World Cup final Dutch fiasco, for example) and in fact in the offending game here he got both controversial decisions right – the Gerrard sending-off and the Berbatov penalty – in my opinion. Anyway, Kenny branded the latter decision a “joke” and Ryan Babel tweeted his photoshopped image of Webb in Man U colours.

Priceless irony, then, that a commenter on the BBC News page on the story, going by the handle “Victory Through Harmony”, should comment:

How sad have things got that now the players aren’t even allowed to have a pop at refs?

Harmony / pop does not compute, mate. In the heat of competitive battle, refs handle plenty of “pop” with minimal sanction – but pre-meditated pot-shots through high-profile post-match public channels they don’t have to take.

Dennett on Hacker

Since I’ve just started reading Bennett, Dennett, Hacker and Searle’s “Neuroscience & Philosophy”, I thought I’d link to the coincidence in the earlier post today, as a current reminder to myself to post some review notes.

My most considered reading of Bennett & Hacker was back in 2009. There I speculated on Dennett’s potential response, since Dennett was a clear target of that book. Wittgenstein’s importance means he’s in the background of many of these discussions, and just this morning I picked up the Rob Minto reference to Wittgenstein in the post below. Rob’s tutor was Hacker, which led me to follow my own links back to my earlier reading of Hacker and Bennett and to add a post-note there.

[I recently found that Dennett did respond, and have now obtained Bennett, Dennett, Hacker & Searle’s “Neuroscience and Philosophy”. A cover-blurb quote from Akeel Bilgrami of Columbia University suggests:

“If you can get sworn and unrestrained philosophical enemies such as Dennett and Searle to join forces against you, you must be … the controversialists of our time.”

Fascinating. That was pretty much my take on Bennett and Hacker – controversial by design, with a (hopefully) deliberate misreading of the scientific position on the philosophy of mind.]

Loop closed for now, until I have some notes on the Dennett response. On the basis of a quick scan, Dennett’s rebuttal certainly looks to be along the lines I imagined, and in line with my own reading of Bennett and Hacker. Hopefully we can move on from attack and rebuttal to some common-sense progress.

Just checked that Dennett is still OK after his major surgery in 2006.

[Told] by friends and relatives that they had prayed for him, he resisted the urge to ask them, “Did you also sacrifice a goat?”

Wound or Heal?

Interesting to contrast yesterday’s UK parliamentary knock-about between Cameron and Miliband with Obama’s call for healing in the rhetorical wars between US partisan politicians and commentators.

Blaming opponents for “all that ails the world” was unhelpful, he said.

Cameron and Miliband were downright personal in trading insults about each other and their colleagues – despite the thin veil of humour. Ridicule is not funny – OK in moderation, but not the basis of free and democratic progress. Politicians should focus on governance, not on auditioning as court jesters for Have I Got News For You.

[Post Note:

I was thinking about how to contrast Palin’s response with Obama’s, and found this on Sam’s Elizaphanian blog.

What Palin has in common with Wittgenstein?

Excellent. Rob Minto’s tutor was Hacker, no less. I was really taken with Bennett and Hacker’s “Philosophical Foundations of Neuroscience” on a second reading last year, and am now reading Bennett, Hacker, Dennett and Searle’s “Neuroscience and Philosophy”. Hacker meets my personal hero Dennett. Small world.]

The Full Macondo

And now the full and final commission report to the President on the BP Deepwater Horizon disaster, including Chapter 4 on the blow-out failure itself, reviewed previously.

Finally, to the American people, we reiterate that extracting the energy resources to fuel our cars, heat and light our homes, and power our businesses can be a dangerous enterprise. Our national reliance on fossil fuels is likely to continue for some time—and all of us reap benefits from the risks taken by the men and women working in energy exploration. We owe it to them to ensure that their working environment is as safe as possible. We dedicate this effort to the 11 of our fellow citizens who lost their lives in the Deepwater Horizon explosion.

I have to say, my general feeling is that the quality of the investigation and reporting is excellent. I sincerely hope the findings and recommendations – technical and rhetorical – are fully understood and actioned accordingly, avoiding knee-jerk “never again” simplifications.

As the Board that investigated the loss of the Columbia space shuttle noted, “complex systems almost always fail in complex ways.” Though it is tempting to single out one crucial misstep or point the finger at one bad actor as the cause of the Deepwater Horizon explosion, any such explanation provides a dangerously incomplete picture of what happened—encouraging the very kind of complacency that led to the accident in the first place. Consistent with the President’s request, this report takes an expansive view.

And as a little context, beyond this drilling operation, and beyond BP:

Since 2001, the Gulf of Mexico workforce—35,000 people, working on 90 big drilling rigs and 3,500 production platforms—had suffered 1,550 injuries, 60 deaths, and 948 fires and explosions.

[Post Note: Just assembling a collection of all my blog links on this subject:
11th Jan 2011 this post https://www.psybertron.org/?p=3699
10th Jan 2011 https://www.psybertron.org/?p=3694
6th Jan 2011 https://www.psybertron.org/?p=3689
2nd Nov 2010 https://www.psybertron.org/?p=3598
30th Sep 2010 https://www.psybertron.org/?p=3548
8th Sep 2010 https://www.psybertron.org/?p=3534
10th Jun 2010 https://www.psybertron.org/?p=3426
28th Apr 2010 https://www.psybertron.org/?p=3312

Remember, despite blaming no single failure above:

… the failures at Macondo can be traced back to underlying failures of management and communication.

Must collate a coherent summary report of my main proposals – the communication of information supporting management decisions at all levels, particularly how the criticality of information in context should drive which levels of management are informed of, and involved in, those decisions.]
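As a first sketch of that proposal – with entirely hypothetical criticality levels and role names, nothing drawn from the report or any real management system – the core idea is simply that the criticality of a piece of information, in context, determines how far up the management chain it must travel:

```python
# Illustrative sketch only -- criticality levels and role names are
# hypothetical, not drawn from the report or any real management system.
from enum import IntEnum

class Criticality(IntEnum):
    ROUTINE = 1
    SIGNIFICANT = 2
    SAFETY_CRITICAL = 3

# Which management levels must be informed of, and involved in, a decision.
ESCALATION = {
    Criticality.ROUTINE: ["shift supervisor"],
    Criticality.SIGNIFICANT: ["shift supervisor", "operations manager"],
    Criticality.SAFETY_CRITICAL: ["shift supervisor", "operations manager",
                                  "site director", "independent safety authority"],
}

def escalate(decision: str, level: Criticality) -> list[str]:
    """Return the roles that must be brought into this decision."""
    return [f"{role}: input required on '{decision}'"
            for role in ESCALATION[level]]

# Example: an anomalous test result is safety-critical by definition.
for notice in escalate("anomalous negative-pressure test", Criticality.SAFETY_CRITICAL):
    print(notice)
```

The table is trivial, deliberately so; the hard part, as the report makes clear, is classifying the information correctly in context in the first place.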

More on Macondo

I’ve now had time to read the whole US Commission report on the BP Deepwater Horizon disaster in the Gulf of Mexico – including the discussion sections that I’d deliberately not read earlier, so as not to be influenced when I published my initial conclusions. The picture is ever clearer.

“Most, if not all, of the failures at Macondo can be traced back to underlying failures of management and communication. Better management of decision-making processes within BP and other companies, better communication within and between BP and its contractors, and effective training of key engineering and rig personnel would have prevented the Macondo incident.”

My emphasis this time is on their positive use of “would” – i.e. without doubt. My own agenda here is to pick up those communication and decision-making aspects of business management systems, but as an engineer in the downstream business, and as a human, you have to feel for the guys who made the mistakes and struggled with their consequences, in many cases to their deaths.

It’s a long time since BP was a “British” company, and any finger-pointing between BP, Halliburton and Transocean is unhelpful. It is creditable, then, to notice lines in the official (US) report like

“As BP’s own report agrees …”

compared to

“Halliburton has to date provided nothing … “

or

“Halliburton should have …”

My point is that the responsibility is shared industrially (as the report concludes), and I see BP taking its share.

I make that point because I observed earlier that there is a hairy-arsed “wild-catting” culture at the sharp end of this industry, with a US frontier-freedoms mentality wherever in the world the operation happens to be. Any sophisticated business managing such operations – however good BP is – would be unlikely to change that “by design”, and in fact should think hard before attempting to do so.

Remember, this was one of the largest, newest and most sophisticated rigs in the world. There is a recommendation about the control and monitoring systems in use, particularly during the fateful period when the “kick” had already started and the fatal blow-out was on its way:

Why did the crew miss or misinterpret these signals? One possible reason is that they had done a number of things that confounded their ability to interpret [the] signals ….

In the future, the instrumentation and displays used for well monitoring must be improved. There is no apparent reason why more sophisticated, automated alarms and algorithms cannot be built into the display system to alert the driller and mudlogger when anomalies arise. These individuals sit for 12 hours at a time in front of these displays. In light of the potential consequences, it is no longer acceptable to rely on a system that requires the right person to be looking at the right data at the right time, and then to understand its significance in spite of simultaneous activities and other monitoring responsibilities.

Hard to argue with that? But it is very important to distinguish decision-making from decision-support. We are all relying on a tremendous amount of experience and judgement, not to mention risk-taking balls, at the upstream sharp end of the business, drilling into the unknown. There will be blood? Hopefully not, but it is part of the risk. There are some clear management and control-system safety-critical steps in all these processes, which need to be treated as such, with fail-safe steps where needed, but we should be careful not to (try to) automate all risk out of the system. People are highly ingenious at bypassing systems that prevent them doing their job, and applying controls in the wrong places can counter-intuitively increase the risks. We need systems that support people doing their jobs, not systems that take them out of the loop entirely. There is good reason why the human eye is brought to bear on these processes. Proper risk assessment is one thing, but knowing when to do it and what to do with the result needs focus.
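To make the distinction concrete, here is a minimal sketch of a decision-support alarm – channel names, units and thresholds are all hypothetical, not taken from the report or any real rig system. It flags two classic kick indicators (pit gain with the pumps off, and flow-out exceeding flow-in) and alerts the driller; crucially, it takes no automated action on the well itself:

```python
# Illustrative sketch only -- channel names, units and thresholds are
# hypothetical, not taken from the report or any real rig system.
from dataclasses import dataclass

@dataclass
class WellReading:
    timestamp: float          # seconds since start of monitoring
    pumps_on: bool            # rig pumps circulating?
    pit_volume_bbl: float     # active mud-pit volume (barrels)
    flow_in_gpm: float        # mud pumped into the well
    flow_out_gpm: float       # mud returning from the well

PIT_GAIN_LIMIT_BBL = 10.0     # hypothetical alarm threshold
FLOW_IMBALANCE_GPM = 50.0     # hypothetical alarm threshold

def kick_alerts(baseline: WellReading, current: WellReading) -> list[str]:
    """Decision-support only: raise alerts for the driller and mudlogger,
    never take automated action on the well itself."""
    alerts = []
    pit_gain = current.pit_volume_bbl - baseline.pit_volume_bbl
    # Pit gain with the pumps off is a classic kick indicator.
    if not current.pumps_on and pit_gain > PIT_GAIN_LIMIT_BBL:
        alerts.append(f"Pit gain of {pit_gain:.1f} bbl with pumps off")
    # More mud returning than pumped in suggests influx from the formation.
    imbalance = current.flow_out_gpm - current.flow_in_gpm
    if current.pumps_on and imbalance > FLOW_IMBALANCE_GPM:
        alerts.append(f"Flow-out exceeds flow-in by {imbalance:.0f} gpm")
    return alerts
```

The design point is that the last line of defence stays human: the system surfaces the anomaly so the right person need not happen to be looking at the right data at the right time, but the judgement call remains theirs.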

There are a number of other things also borne out by the report.

If you’ve never actually experienced a disaster first hand, it is difficult to appreciate that one is actually taking place; denial is naturally human – the hope that it is anything but that. By definition, the safer the industry in general, the fewer participants have the necessary experience. The captain of the Titanic comes to mind. Drills and simulations of the worst-case risks become all the more important to take seriously. This point is so important it makes it into the summary paragraph above.

Integrity & pressure testing is something of which I have considerable experience. Such testing inevitably occurs late in the process, as early as possible naturally, but nevertheless towards the end of the job. Inevitably the consequences of failing such a test can therefore have great business delay, cost and rework consequences, and all the attendant contractual responsibility wrangling that might entail. So, paradoxically, it is at the integrity / pressure test point when you most want failure to occur. Such tests may be potentially destructive by design and if it’s going to fail, this is precisely when we need it to happen, when the health and safety risk is lowest and the business value risk almost at its peak. You need to be looking for failure here. It takes balls to fail a pressure / integrity test, and the people & processes here need real authority and independence from the business productivity roles. I already mentioned the need to acknowledge safety criticality in levels of surveillance and regulation imposed from outside the working team. Again the report (and BP’s own actions since their own investigation) well recognize this issue. There really should have been (almost literally) alarm bells ringing before this test process even started. It could hardly have been more critical.

From the most significant failure point to an incidental one – though both are examples of communication of information for decision-making in the summary paragraph: the confusion about whether or not the specified centralisers had actually been delivered and were available as the correct type (design class), affecting the decision on the centralising arrangement actually deployed. Several ironies in that inconclusive chain of decisions, which provided the unfortunate quote used as the headline in the report.

“Who cares. It’s done … we’ll probably be fine …”

Supply-chain confusion about the type of materials actually delivered and available. How hard can it be for supplied items to be marked, and systems informed, with their true class (type)? One for the information-modelling and class-library aspects of the ISO 15926 day job.
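By way of illustration – a minimal sketch in the spirit of an ISO 15926-style class library, with entirely hypothetical identifiers and attributes, not actual reference-data entries – of the simple check that the confusion made impossible: does each delivered item actually instantiate the class the design specified?

```python
# Illustrative sketch only -- identifiers and attributes are hypothetical,
# not actual ISO 15926 reference-data library entries.
from dataclasses import dataclass

@dataclass(frozen=True)
class ItemClass:
    class_id: str             # reference-data library identifier
    description: str

@dataclass
class SuppliedItem:
    serial_no: str
    class_id: str             # the class this item actually instantiates

# Hypothetical reference-data entries for two centraliser designs.
BOW_SPRING = ItemClass("RDL-0001", "Bow-spring centraliser with stop collars")
SLIP_ON = ItemClass("RDL-0002", "Slip-on centraliser")

def verify_delivery(specified: ItemClass,
                    delivered: list[SuppliedItem]) -> list[str]:
    """Return serial numbers of delivered items whose class does not
    match the specification -- the check missing from the confusion
    over what had actually arrived on the rig."""
    return [item.serial_no for item in delivered
            if item.class_id != specified.class_id]

# Usage: the engineer specifies bow-spring; the manifest says otherwise.
delivery = [SuppliedItem("C-101", SLIP_ON.class_id),
            SuppliedItem("C-102", BOW_SPRING.class_id)]
print(verify_delivery(BOW_SPRING, delivery))   # -> ['C-101']
```

If every supplied item carries its true class identifier, and the systems at both ends of the supply chain are informed of it, a mismatch like this is a one-line query rather than an inconclusive argument on the rig floor.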