If I Only Had a Brain — how to become the wisest in Oz

Last week, a random tweet by James Grandage prompted a chain of thought. My response was to suggest that he already had what he was looking for: a brain.

On reflection, however, it appears that James was seeking what many firms want — a brain for the whole organisation. Such a brain would be able to create and recall institutional memories, to process the sensations gathered by its ears and eyes, and to use those sensations to engage with other organisations (or people) and their brains.

In the name of knowledge management, many organisations have created databases and repositories that are intended to operate as brains as far as the technology will allow. Unfortunately, their actual performance often falls somewhat short of this promise. Why might this be?

One answer is suggested by the experience of the Scarecrow in L. Frank Baum’s The Wonderful Wizard of Oz. You will recall that he accompanied Dorothy on her journey to Oz in order to ask the Wizard for a brain, because that was what he wanted above all else. As they travel down the Yellow Brick Road, the Scarecrow shows by his actions that in fact he has a brain, and can use it. When they get to Oz, he is recognised as the wisest man there.

Many law firms are on a similar journey. They labour in the belief that all they need to complete themselves is a know-how system, or database, or whatever terminology they use to describe their brain. In reality, they have one — distributed amongst their people — which they often use to spectacular effect. (For examples, see the FT’s report on Innovative Lawyers, which highlights a range of activities — very few (if any) of which depend on the existence of a KM system.)

Often, however, brains (whether individual or organisational) are used spectacularly poorly. I suspect that this is partly why KM databases fail so often: people just use them badly — they don’t consult them, or they don’t volunteer their insights to them. (There are other, better, reasons, but I want to concentrate on this one for now.)

How actively do people use their own brains to reflect and learn from their experiences? Or to seek information or insight that challenges what they think they know? I must confess that I see little of this. (I try to do it myself, but I am sure I have blind spots where I accept a partial view of reality, rather than continuing to seek a better truth.) I am sure this critique and creativity happens, but for most people it is concentrated in areas where they are already experts. For lawyers, that is their area of legal expertise — not the work that goes on around them to support the firm in other ways.

As an example of this, consider the know-how system. Whilst the research I linked to above (and again here) dates from 2007, I still see people advocating such repositories as the cure-all for law firms’ knowledge ailments. At the very least, they ought surely to recognise that there is a contrary view and argue against it?

Another example that comes up repeatedly is the assertion that creative thought depends on using one’s right brain, rather than the analytical left brain. However, this depends on an understanding of neuroscience that was undermined twelve years ago. The origin of the left-right brain model was the research of Roger Sperry, who was awarded the Nobel Prize in 1981. Despite the attractiveness of this model (especially to a range of management authors), neuroscience, like all the sciences, does not stand still — all theories are challengeable.

The watershed year was 1998, when Brenda Milner, Larry Squire, and Eric Kandel published a breakthrough article in the journal Neuron, “Cognitive Neuroscience and the Study of Memory.” Kandel won the Nobel Prize two years later for his contribution to this work. Since then, neuroscientists have ceased to accept Sperry’s two-sided brain. The new model of the brain is “intelligent memory,” in which analysis and intuition work together in the mind in all modes of thought. There is no left brain; there is no right. There is only learning and recall, in various combinations, throughout the entire brain.

Despite the fact that this new model is just as easy to understand, people still fall back on the discredited left-right brain model. Part of the reason, I think, is that they don’t see it as their responsibility to keep up with developments in neuroscience. But surely using 30-year-old ideas about how the brain works brings a responsibility to check every now and then that those ideas are still current.

Something similar happens with urban legends. Here’s a classic KM legend: Stewart Brand on the New College roof beams.

It’s a good story, but not strictly true. In fact, the beams had been replaced with pitch pine during the 18th century, the plantation from which the oak came was not planted until after the hall was originally built, and forestry practice is such that oak is often available for such a use.

It is not the case that these oaks were kept for the express purpose of replacing the Hall ceiling. It is standard woodland management to grow stands of mixed broadleaf trees (oaks, for example) interplanted with hazel and ash. The hazel and ash are coppiced approximately every 20-25 years to yield poles. The oaks, however, are left to grow on and eventually, after 150 years or more, they yield large pieces for major construction work such as beams, knees, etc.

If we rely too heavily on documents and ideas that are familiar (and comfortable), we run the risk of selling ourselves short. As Simon Bostock has recently pointed out, there is almost invariably more interesting stuff in what we have not written down than in what we have captured (or identified as ‘lost knowledge’). Referring to another KM story (that NASA have lost the knowledge needed to get to the moon again), he points out that what was really lost was not the documentation, but the less tangible stuff:

This means, basically, that even if NASA had managed to keep track of the ‘critical blueprints’, they would have been stuffed. Design trade-offs are the stuff of tacit knowledge. Which usually lives inside stories, networks, snippets of shoptalk, chance sneaky peeks at a colleague’s notes, bitter disputes and rivalries…

In knowledge terms, we’re about to live through another Black Death, another NASA-sized readjustment.

Smart organisations will recognise this in advance and avoid the archaeological dig at the junkyard, the museum and the old-folk’s home.

Archaeology is interesting, and can shed light on past and present activities, but we don’t use Grecian urns to keep food in any more. We use new stuff. The new stuff (whatever it might be) should be our continuing focus. That’s how we should use our brains, and how those supporting effective knowledge use should encourage brain-use in their organisations.

1 thought on “If I Only Had a Brain — how to become the wisest in Oz”

  1. There’s an issue here with the adversarial language of the 2.0 crowd. Anything that’s not ‘agile’ enough has been labelled as a ‘relic of command and control’. And therefore it’s wrong, wrong, wrong, I tell you.

    I’m guessing our smarter and wiser kids will discover that a lot of the old ways we did things weren’t so much about command and control or Dilbertism as they were cognitively useful.

    That report I wrote which nobody read except me? Cognitively useful. Those policy manuals which everybody subverts, even the HR idiots who wrote them? Cognitively useful.

    The worst thing is, it’s people *like* me who’re responsible for this baby-bathwater situation. For example, we all *know* rote learning is stupid. But then you go to Japan and you see how their rote learning habits contribute enormously to their attentional skills.

    We keep talking about ‘better’ when we’d be better off focusing on trade-offs. The inverse of the ‘Conservation of Complexity’ law is the ‘Conservation of Simplicity’; we’re tweaking and tuning more than boosting performance.

    Most organisations carry some ‘luxury’ staff. Why don’t more orgs have historians? Peter Drucker said the point of marketing was to make selling superfluous. The point of historians is to make archaeology superfluous?
