Google+ buzz = new Wave?

I’m liking the buzz around Google+, and from seeing only the free “tour” (no working account yet), I like the fact that it’s the relationship, not the person, that is the focus – as was the case with Wave. Groups (circles, hangouts, huddles, etc.) arise from the nature of the relationships, rather than being limited to the crass friending and following paradigms – which maybe made sense in the original university / college campus environment, or in the early-learning steps of social media, but are just too, well, crass for the real world.

Wave had it right because the “Waves” were emergent from the communication activity, not defined by groups of (yeuch!) friends. The only thing wrong with Wave was the difficulty of presenting that enormous power in a sufficiently usable UI – perhaps the social paradigm of the Google+ UI will work. Hopeful. (Sadly, TechCrunch appears to have a politically motivated agenda against it succeeding.)

Managing Complexity

Been a trend in the day job to look at complexity as a subject in itself. Whether Oil & Gas or Nuclear Power, the systems view seems to acknowledge complexity as an explicit variable to be addressed. Thanks to David Gurteen for the link to this piece by Nick Milton – knowledge management, whatever you believe that is (*), is part of the solution. Topical on the scale of human generations, in the post-Macondo, post-Fukushima context.

(Agree with David Gurteen’s observation that it would be interesting to hear Dave Snowden’s take – in the light of the simple BCG Grid, and given his extrapolation of the grid concept into the world of complexity.)

Sadly the New Scientist link appears broken – looking into that.

(* The Ron Young version, or the Euan Semple version. Being too well defined is counter-productive.)

Project Management Memetics

Leon sent me a link to this paper a couple of years ago, to which I responded “interesting” – he knows I’m interested in memes. I didn’t actually read beyond the title until today.

The essence of memes is that there is something “self-serving” about patterns of information (*1), independent of any rationally intended human purposes in using them. That is as true of (say) project management procedures and practices as it is of any rational processing of information. My agenda is that this is a problematic feature of management and governance in the most general sense – not just businesses and projects, but any decision-making-to-act process, knowledge-management practices, even science, the rational domain par excellence. So I have no doubt about the problems of failing to see the memetic aspect of project management activities – it’s of course where my concerns began, in Oil & Gas industry and Information Management projects 15 or 20 years ago, and the reason I’ve been blogging since blogging was invented … but this is not about me.

In fact none of this is new in management circles, just the new(ish) memetic language – and part of the problem now is that memetics itself is contentious to some people (*1). But even without memetics, the idea that decision-rationality = action-irrationality has been part of action-science management theories (e.g. Argyris / Brunsson et al.) and probably goes back further still, to (say) Mary Parker Follett – guru to the gurus in management.

In any “professional” management situation it is difficult (anathema) to suggest that doing a rational thing is the irrational (wrong) thing to do. You’re mad, surely. “Before we make this decision to act, we should study and agree upon this issue – right?” Wrong. Act and experience the outcomes (with “care”, in the knowledge of the issue). It’s been called analysis-paralysis for years, but it’s not just “analysis”, it’s following any rational, objective process that delays action, because it is the action that provides experience. Experience is worth more than theory, in practice.

Performing rational (project) management analyses, modelling and management decision-making processes tends to lead to more (project) management activities – i.e. self-serving – rather than to achieving the value-adding goals of the enterprise or project. (IT / IM projects, particularly new, integrated business and/or government (civil or defense) systems, are often legendary in terms of project failure, however those failures are post-rationalized after the fact. Not surprisingly, there are newer “agile” IT project management processes that force the action-and-feedback cycle milestones.)

(*1) Patterns of information, known as memes because they are copied (not the other way around), come in many levels: patterns (upon patterns) upon patterns of information (statically defined), and patterns (upon patterns) of their (dynamic) relations, procedures, patterns of use, communication and processing. Because genes – the biological analogue of memes – are based on 4 bases (*2) and n chromosomes in any given species (*3), there is a popular misconception that genetic copying in biological reproduction is well defined in terms of atomically discrete “digital” genes, whereas memes are somehow more woolly – anything from a single word representing an identifiable concept to the whole idea of ideas, concepts, interpretations, representations, even internet crazes, fashions and cultural patterns (even whole religions and cultures), etc. Many people baulk at the idea that “cultural units” (memes) can be considered as discretely as “biological units” (genes). Now, reducing things to discrete objects (genes or memes, or anything else) is part of a wider issue, but genes and memes – their definitions, and the processes and patterns involved in their transmission and reproduction – are equally complex and ultimately flaky, yet equally useful in describing the processes involved: information-processing processes, both (*4). The analogy is in fact a very good one. It’s about what IS copied and communicated, not prescriptive about what that should be, or how it might be represented when communicated and processed. Naturally, simpler patterns of information (memes or genes) – patterns that are simpler to represent – are communicated, processed (and replicated) more easily, so unsurprisingly discrete objects are much more “popular” than complex patterns of information – another self-serving aspect. Simple ideas rule, but often simple may be dumb. (A toy simulation of that replication bias is sketched after these footnotes.)

(*2) Even the 4 DNA / RNA bases are not in any sense absolute. They just happen to be the basis of the most prevalent and most studied organic biological forms. Other biochemical possibilities exist. And of course even in R/DNA based life, there are many other non-R/DNA cell structures involved in the processes too. Doesn’t change the essential pragmatic truth of genetic reproduction.

(*3) And even the definition of a discrete species is highly context dependent and controversial when it comes down to it. Different definitions are accepted for different practical purposes.

(*4) Objective reductionism is full of contentious topics when it comes to more subjective things like free will and consciousness, but this is true even at the most fundamental levels of physics. Arguments in these topics need to be conducted extremely carefully – avoiding “misplaced objectivity” and “greedy reductionism” – more self-serving memes.
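As a throwaway illustration of footnote (*1)’s point that simpler patterns replicate more easily, here is a toy simulation – every name and number in it is invented, and it claims nothing beyond the bias it demonstrates: copies succeed less often as a pattern’s complexity rises, so the simple patterns come to dominate regardless of any intrinsic merit.

```python
import random

# A minimal sketch, on made-up numbers, of footnote (*1)'s replication bias:
# copy fidelity falls as a pattern's complexity rises, so simpler patterns
# come to dominate the population regardless of any intrinsic merit.

random.seed(1)

# Hypothetical patterns: (name, complexity in 0..1)
population = [("slogan", 0.1), ("checklist", 0.4), ("nuanced-model", 0.9)] * 10

def copy_succeeds(complexity):
    """A copy attempt succeeds with probability falling as complexity rises."""
    return random.random() < 1.0 - 0.8 * complexity

for generation in range(10):
    next_gen = []
    for name, complexity in population:
        next_gen.append((name, complexity))        # the original persists
        if copy_succeeds(complexity):
            next_gen.append((name, complexity))    # a faithful copy is added
    # hold the population roughly constant, sampling at random
    population = random.sample(next_gen, min(len(next_gen), 30))

counts = {}
for name, _ in population:
    counts[name] = counts.get(name, 0) + 1
print(counts)  # typically the low-complexity "slogan" ends up most numerous
```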

[Need to come back and link to the implied sources throughout.]

[Post Note: Existentialism and Evolutionary Psychology – Heidegger, Foucault, Dennett and many more in Jon Whitty’s project management presentations. A man after my own heart.]

More on Macondo

I’ve now had time to read the whole US Commission report on the BP Deepwater Horizon disaster in the Gulf of Mexico – including the discussion sections I had deliberately not read earlier, so as not to be influenced when I published my initial conclusions. It is ever clearer:

“Most, if not all, of the failures at Macondo can be traced back to underlying failures of management and communication. Better management of decision-making processes within BP and other companies, better communication within and between BP and its contractors, and effective training of key engineering and rig personnel would have prevented the Macondo incident.”

My emphasis this time on their positive use of “would” – i.e. without doubt. My own agenda here is to pick up those communication and decision-making aspects of business management systems, but as an engineer in the downstream business, and as a human, you have to feel for the guys who made the mistakes and struggled with their consequences, in many cases to their deaths.

It’s a long time since BP was a “British” company, and any finger-pointing between BP, Halliburton and Transocean is unhelpful. Creditable, though, to notice lines in the official (US) report like

“As BP’s own report agrees …”

compared to

“Halliburton has to date provided nothing … “

or

“Halliburton should have …”

My point is that the responsibility is shared across the industry (as the report concludes), and I see BP taking its share.

I make that point because I observed earlier that there is a hairy-arsed “wild-catting” culture present at the sharp end of this industry, with a US frontier-freedoms mentality wherever in the world the operation is. Any sophisticated business managing such operations – however good BP is – would be unlikely to change that “by design”, and in fact should think hard before attempting to do so.

Remember, this was one of the largest, newest and most sophisticated rigs in the world. There is a recommendation about the control and monitoring systems in use, particularly during the fateful period when the “kick” had already started and the fatal blow-out was on its way:

“Why did the crew miss or misinterpret these signals? One possible reason is that they had done a number of things that confounded their ability to interpret [the] signals …

In the future, the instrumentation and displays used for well monitoring must be improved. There is no apparent reason why more sophisticated, automated alarms and algorithms cannot be built into the display system to alert the driller and mudlogger when anomalies arise. These individuals sit for 12 hours at a time in front of these displays. In light of the potential consequences, it is no longer acceptable to rely on a system that requires the right person to be looking at the right data at the right time, and then to understand its significance in spite of simultaneous activities and other monitoring responsibilities.”

Hard to argue with that? But it is very important to distinguish decision-making from decision-support. You (we all) are relying on a tremendous amount of experience and judgement, not to mention risk-taking balls, at the upstream sharp end of the business, drilling into the unknown. There will be blood? Hopefully not, but it is part of the risk. There are some clear management and control-system safety-critical steps in all these processes, which need to be treated as such, with fail-safes in place, but we need to be careful not to (try to) automate all risk out of the system. People are highly ingenious at bypassing systems that prevent them doing their job. Applying controls in the wrong places can counter-intuitively increase the risks. We need systems that support people doing their jobs, not take them out of the loop entirely. There is good reason why the human eye is brought to bear on these processes. Proper risk assessment is one thing, but knowing when to do it and what to do with the result needs focus.
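For what it’s worth, the kind of decision-support alarm the report calls for need not be exotic. Below is a minimal sketch under invented assumptions – the `WellSnapshot` structure, field names and thresholds are hypothetical, not taken from the report or from any real mudlogging system – which flags the classic kick indicators (returns exceeding pumped flow, unexplained pit gain, flow with the pumps off) and leaves the judgement, and the decision to act, with the driller.

```python
from dataclasses import dataclass

@dataclass
class WellSnapshot:
    """One sample of (hypothetical) well-monitoring data."""
    flow_in_gpm: float    # mud pumped down-hole
    flow_out_gpm: float   # returns measured at surface
    pit_gain_bbl: float   # active pit volume gain over the interval
    pumps_on: bool

# Illustrative thresholds only - real values are well- and rig-specific.
FLOW_IMBALANCE_GPM = 25.0
PIT_GAIN_BBL = 10.0

def kick_alerts(s: WellSnapshot) -> list:
    """Return advisory alerts; the decision to act stays with the crew."""
    alerts = []
    if s.flow_out_gpm - s.flow_in_gpm > FLOW_IMBALANCE_GPM:
        alerts.append("Returns exceed pumped flow - possible influx (kick).")
    if s.pit_gain_bbl > PIT_GAIN_BBL:
        alerts.append("Unexplained pit volume gain - check for influx.")
    if not s.pumps_on and s.flow_out_gpm > 0:
        alerts.append("Flow with pumps off - well may be flowing.")
    return alerts

# Example: an anomalous snapshot raises alerts, but shuts nothing in itself.
print(kick_alerts(WellSnapshot(flow_in_gpm=400.0, flow_out_gpm=440.0,
                               pit_gain_bbl=12.0, pumps_on=True)))
```

The point of the sketch is the division of labour: the system surfaces the anomaly; the person decides.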

There are a number of other things also borne out by the report.

If you’ve never actually experienced a disaster first hand, it is difficult to appreciate that one is actually taking place; denial is naturally human – the hope for anything but that. By definition, the safer the industry in general, the fewer participants have the necessary experience. The captain of the Titanic comes to mind. That is why drills and simulations of the worst-case risks are so important to take seriously. The point is so important it makes it into the summary paragraph above.

Integrity & pressure testing is something of which I have considerable experience. Such testing inevitably occurs late in the process – as early as possible, naturally, but nevertheless towards the end of the job. Inevitably, failing such a test can therefore have great business delay, cost and rework consequences, and all the attendant contractual responsibility wrangling that might entail. So, paradoxically, the integrity / pressure test point is when you most want any failure to occur. Such tests may be potentially destructive by design, and if something is going to fail, this is precisely when we need it to happen – when the health and safety risk is lowest and the business value risk almost at its peak. You need to be looking for failure here. It takes balls to fail a pressure / integrity test, and the people & processes here need real authority and independence from the business productivity roles. I already mentioned the need to acknowledge safety criticality in the levels of surveillance and regulation imposed from outside the working team. Again, the report (and BP’s own actions since its own investigation) recognizes this issue well. There really should have been (almost literally) alarm bells ringing before this test process even started. It could hardly have been more critical.

From the most significant failure point to an incidental one, though both are examples of the communication of information for decision-making cited in the summary paragraph: the confusion about whether or not the specified centralizers had actually been delivered and were available as the correct type (design class), affecting the decision as to the centralizer arrangement actually deployed. There are several ironies in that inconclusive chain of decisions, which provided the unfortunate quote used as the headline in the report.

“Who cares. It’s done … we’ll probably be fine …”

Supply-chain confusion about the type of materials actually delivered and available. How hard can it be for supplied items to be marked, and systems informed, with their true class (type)? One for the information modelling and class library aspects of the ISO 15926 day job.
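To make the point concrete, the check itself is trivial once the class information exists – the hard part is capturing and communicating it. A minimal sketch, with invented tag numbers and class names (nothing here is taken from ISO 15926 or any real class library): compare the class declared on delivery against the class specified in the order, and flag mismatches before anything gets used.

```python
# Hypothetical example: specified vs delivered class (type) per tagged item.
purchase_order = {
    "ITEM-001": "Class A",
    "ITEM-002": "Class A",
}

delivery_note = {
    "ITEM-001": "Class A",
    "ITEM-002": "Class B",   # wrong class supplied
    "ITEM-003": None,        # class not recorded at all
}

def class_mismatches(ordered: dict, delivered: dict) -> list:
    """Flag items whose delivered class is missing or differs from the spec."""
    issues = []
    for tag, declared in delivered.items():
        specified = ordered.get(tag)
        if specified is None:
            issues.append(f"{tag}: delivered but not on the order")
        elif declared is None:
            issues.append(f"{tag}: class/type not recorded on delivery")
        elif declared != specified:
            issues.append(f"{tag}: ordered '{specified}', delivered '{declared}'")
    return issues

for issue in class_mismatches(purchase_order, delivery_note):
    print(issue)
```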

A380 Experience

Interesting (?) to fly in a Singapore Airlines A380 between the Qantas incident and the Singapore Airlines decision to change the engines. I’m not used to flying business class these days, but all I can say is that it was as quiet and smooth a flight as I’ve experienced, up front on the top deck of a Singapore A380.

Having survived, I can tell the tale.

The Impersonal Filter Bubble

The dangers of web access being too personalized. Hat tip to Johnnie Moore.

Like most things, we need both in balance – totally open linking and personally (contextually) filtered channels. Clearly our personal filters need to be known to us personally … or they are impersonal filters. So now you know, if you didn’t already.

The idea that it inhibits active dissent seems entirely spurious. Criticism is all too easy. Anyone wanting to dissent actively needs the active intent to get off their ass, not to be handed dissent opportunities on a plate. That’s a good reason for filtering.

System Complexity Hits Home

Discovery News article on the complexity of computer systems in domestic cars, prompted by the recent Toyota recall. Hat tip to Donald Firesmith for the link on LinkedIn.

More lines of code than the F-22 / F-35 / B787 / A380 avionics systems.

I have experienced that complexity myself recently. Last year I bought a new car and was staggered to discover a 500-page manual explaining its operations, along with a 200-page companion manual for the GPS and radio systems. One of the new features touted was the much larger glove compartment, a size probably dictated by that of the required manuals.

And nobody reads the manual any more, anyway. It’s interesting to compare the modular replace-vs-repair consequences, as “the system” gets this complex, with (say) Crawford’s messages in “The Case for Working with your Hands“. Will humans ever really “buy” the loss of control, the detachment from the real?

Strategic Direction

A post from Anecdote about the Values, Direction, Identity and Purpose of an organization. I’m interested in how Values and Purpose are captured right now, but I thought the arrow diagram that separates Strategy (direction) from Strategic Goals / Targets / Plans was really useful. So many people confuse a strategy with a plan.

Separating Functions

What is interesting in this latest post-Deepwater Horizon BP story is not just the creation of an independent safety group with teeth (which I’m not sure about – being seen to do something decisive, I guess), but more importantly the re-organization of E&P into three separate operations: Exploration, Drilling and Production.

I mentioned in my earlier post on BP’s accident report [the second post-note] that there must be some cultural hangover between wild-catters and producers in terms of who takes what kinds of risks to get their respective jobs done. BP takes corporate responsibility for the whole, but behaviour patterns within the whole are complex and culturally conditioned by local history. Separating the areas may allow greater focus on the systemic problems of each.