Transplanting practices between organisations

It is time to revisit the ‘best practices’ meme. Over the past few months I have been struck by the way the term is sometimes used in an all-encompassing way, without its scope necessarily being clarified.


One relatively recent post of this type, “Innovation Builds on Best Practice”, was written by Tom Young of Knoco, and refers to their intriguing Bird Island exercise. Over the last ten years, Knoco have been running workshops in which participants build a tower with a given set of materials, then improve their designs following a number of KM interventions. The decade of experience has been documented in a set of ‘best practices’ which are used as part of the exercise. As the exercise progresses, tower heights increase significantly, and the maximum heights have also grown over the ten-year period. (There is a longer account of the exercise in the April 2009 issue of Inside Knowledge magazine.)

Tom defines ‘best practice’ by reference to work done with BP:

A recognised way of [raising productivity or quality level across the board] is to identify a good example of how to do it and replicate that in other locations. We used the term ‘good practice’ in the BP Operations Excellence programme. After we had identified several ‘good practices’, we developed from them, the ‘best practice’. It was only after the ‘best practice’ was identified (and agreed by the practitioners) that it was rolled out and all plants encouraged to implement that method. After all if there was an agreed ‘best practice’ to do an activity, why would you not want to use it? Learning was captured on an ongoing basis and the ‘best practice’ updated periodically.

If I understand him correctly, Tom is comparing performance in an activity, process or task in one part of an organisation with the same activity, process or task elsewhere in the same organisation. In this context, I can see that practices may well be comparable and replicable across silos. (Although, to answer his rhetorical question, I can easily envisage situations where the context may well require a ‘best practice’ to be ignored. Offshore oil extraction will be very different in the different climatic conditions of the Gulf of Mexico and the North Sea.)

However, greater problems arise in attempts to transfer ‘best practice’ between organisations, or even between different processes or activities within the same organisation.

More years ago than it is comfortable to recall, I studied Comparative Law. (I even taught it briefly at a later stage.) One of the key readings was an article by Otto Kahn-Freund, “On Use and Misuse of Comparative Law” (1974) 37 Modern Law Review 1. (The article is not online, but I found a very good summary of its key points, together with a later piece by Gunther Teubner.) Kahn-Freund’s argument is that a law or legal principle cannot be separated from the culture or society that created it, and so even when there is a common objective, transplanting the law from elsewhere will rarely work. There is a useful example in the criminal law. The way in which criminal investigations and prosecutions proceed varies wildly between countries. It would make little sense to take a rule of evidence from the adversarial system used in England and transplant it into the French inquisitorial system. William Twining has elaborated considerably on this argument in an interesting lecture given in 2000 (PDF).

The problem that I have with much of the ‘best practice’ discourse is that it often strays into assertions or assumptions that such practices can readily be transplanted. However, like the law, such transplants will often be rejected.

The other aspect of Tom Young’s post that, frankly, confuses me is his treatment of innovation. Here’s an extended quote.

Now I hear some mention words like ‘innovation’ and ‘creativity’. Perhaps you are thinking that the use of best practice will inhibit innovation and creativity. For me this is where context is vital.

In some situations, you don’t want innovation or creativity, you just want it done in a standard, consistent fashion.

If you are running a chemical plant, you don’t want the operator to innovate. If you are manufacturing microchips, you don’t want the technicians to innovate. If you are launching a new product into a target market, you perhaps don’t want innovation but standardisation. If you are decommissioning a nuclear power plant, perhaps you don’t want innovation during the work phase.

I am comfortable with this so far. Where things are working well, we should carry on. However, there is always room for improvement, even in simple systems.

Innovation should be built on current best practice. One of the key lessons from the Knoco Bird Island exercise is that if you ask people to do something, they will frequently start based on their own experience. When you illustrate the current best practice that has been achieved by several hundred people before them, they are frequently overwhelmed as to how poor their achievement was compared to what has already been established.

Where appropriate, give them the best practice and ask them to innovate from there. For example, if by the introduction of AARs the time to change filters has been reduced from 240 hours per screen to 75 hours and a best practice created illustrating how this is achieved, innovate from the best practice figure of 75 hours, not the previous figure of 240 hours, but only if it is safe to do so. In some instances innovation must be done in a test area, ideas thought out, prototypes created and tested before the agreed modification is installed in the main plant.

My problem here is that I don’t think Tom is describing innovation. These are improvements in existing processes, rather than adaptations to new scenarios where adherence to the current way of doing things would be counter-productive. In a comment on Tom’s post, Rex Lee refers to kaizen. This is something that is often associated with Toyota. To be sure, the lean production processes in Toyota’s main automotive division are partly responsible for its continuing viability. However, another critical aspect is the way in which the company has diversified into other areas such as prefabricated housing, which it has been building since the mid-1970s. This response to crisis is an innovation, and goes beyond process improvement. Toyota encourages both through its well-documented suggestion system.

Going back to the Bird Island exercise, it is certainly correct that no sensible business would expect people to embark on tasks or activities without guidance as to the ways in which they have successfully been done before. However, if the business needs a different way to achieve the same outcome, or a different outcome altogether, getting better at doing the same thing isn’t going to cut it.

Now and then

A couple of days ago, Patrick Lambe posted a really thoughtful piece considering the implications of heightened awareness from the new generation of social software tools as opposed to the traditional virtues of long-term information storage and access. If you haven’t read it, do so now. (Come back when you have finished.)


The essence of Patrick’s piece is that when we focus our attention on the here and now (through Twitter or enterprise micro-blogging, for example), we forget to pay attention to the historically valuable information that has been archived away. This is not a problem with technology. He points to interesting research on academics’ use of electronic resources and their citation patterns.

How would online access influence knowledge discovery and use? One of the researcher’s hypotheses was that “online provision increases the distinct number of articles cited and decreases the citation concentration for recent articles, but hastens convergence to canonical classics in the more distant past.”

In fact, the opposite effect was observed.

As deeper backfiles became available, more recent articles were referenced; as more articles became available, fewer were cited and citations became more concentrated within fewer articles. These changes likely mean that the shift from browsing in print to searching online facilitates avoidance of older and less relevant literature. Moreover, hyperlinking through an online archive puts experts in touch with consensus about what is the most important prior work—what work is broadly discussed and referenced. … If online researchers can more easily find prevailing opinion, they are more likely to follow it, leading to more citations referencing fewer articles. … By enabling scientists to quickly reach and converge with prevailing opinion, electronic journals hasten scientific consensus. But haste may cost more than the subscription to an online archive: Findings and ideas that do not become consensus quickly will be forgotten quickly.

Now this thinning out of long term memory (and the side effect of instant forgettability for recent work that does not attract fast consensus) is observed here in the relatively slow moving field of scholarly research. But I think there’s already evidence (and Scoble seems to sense this) that exactly the same effects occur when people and organisations in general get too-fast and too-easy access to other people’s views and ideas. It’s a psychosocial thing. We can see this in the fascination with ecologies of attention, from Tom Davenport to Chris Ward to Seth Godin. We can also see it in the poverty of attention that enterprise 2.0 pundits give to long term organisational memory and recordkeeping, in the longer term memory lapses in organisations that I have blogged about here in the past few weeks…

Jack Vinson adds another perspective on this behaviour in a post responding to Patrick’s.

I see another distinction here.  The “newer” technologies are generally about user-engagement and creation, whereas the “slower” methods are more focused on control and management activities much more so than the creation.  Seen in this light, these technologies and processes spring from the situation where writing things down was a time-consuming process.  You wanted to have it right, if you went to that much effort.  Unfortunately, the phrase “Document management is where knowledge goes to die” springs to mind.

In knowledge management, we are trying to combine the interesting knowledge that flows between people in natural conversation as well as the “hard knowledge” of documented and proven ideas and concepts.  KM has shown that technology just can’t do everything (yet?) that humans can do.  As Patrick says, technology has been a huge distraction to knowledge management.

I think Jack’s last comment is essential. What we do is a balance between the current flow and the frozen past. What I find fascinating is that until now we have had few tools to help us with the flow, whereas the databases, archives, taxonomies and repositories of traditional KM and information management have dominated the field. I think Patrick sounds an important warning bell. We should not ignore it. But our reaction shouldn’t be to reverse away from the interesting opportunities that new technologies offer.

It’s a question (yet again) of focus. Patrick opens his post with a complaint of Robert Scoble’s.

On April 19th, 2009 I asked about Mountain Bikes once on Twitter. Hundreds of people answered on both Twitter and FriendFeed. On Twitter? Try to bundle up all the answers and post them here in my comments. You can’t. They are effectively gone forever. All that knowledge is inaccessible. Yes, the FriendFeed thread remains, but it only contains answers that were done on FriendFeed and in that thread. There were others, but those other answers are now gone and can’t be found.

Yes, Twitter’s policy of deleting old tweets is poor, but even if they archived everything the value of that archive would be minimal. Much of what I see on Twitter is related to the here and now. It is the ideal place to ask the question, “I’m looking at buying a mountain bike. For $1,000 to $1,500 what would you recommend?” That was Scoble’s question, and it is time-bound. Cycle manufacturers change their offerings on a seasonal and annual basis. The cost of those cycles also changes regularly. The answer to that question would be different in six months’ time. Why worry about storing that in an archive?

Knowledge in law firms is a curious blend of the old and the new. Sometimes the law that we deal with dates back hundreds of years. It is often essential to know how a concept has been developed over an extended period by the courts. The answer to the question “what is the current position on limitations of liability in long-term IT contracts?” is a combination of historic research going back to cases from previous centuries and up-to-the-minute insight from last week’s negotiations on a major outsourcing project for a client. It is a real combination of archived information and current knowledge. We have databases and law books to help us with the archived information. What we have been lacking until recently is an effective way of making sure that everyone has access to the current thinking. As firms become bigger and more scattered (across the globe, in some cases), making people aware of what is happening across the firm has become increasingly difficult.

Patrick’s conclusion is characteristically well expressed.

So while at the level of technology adoption and use, there is evidence that a rush toward the fast and easy end of the spectrum places heavy stresses on collective memory and reflection, at the same time, interstitial knowledge can also maintain and connect the knowledge that makes up memory. Bipolarity simply doesn’t work. We have to figure out how to see and manage our tools and our activities to satisfy a balance of knowledge needs across the entire spectrum, and take a debate about technology and turn it into a dialogue about practices. We need to return balance to the force.

That balance must be at the heart of all that we do. And the point of balance will depend very much on the demands of our businesses as well as our interest in shiny new toys. Patrick is right to draw our attention to the risks attendant on current awareness, but memory isn’t necessarily all it is cracked up to be. We should apply the same critical eye to everything that comes before us — how does this information (or class of information) help me with the problems that I need to solve? The answer will depend heavily on your organisational needs.

First, think…

I wasn’t at the Reboot Britain conference today, but there were some valuable nuggets in the twitterstream for the #rebootbritain hashtag. Of these, Lee Bryant’s reference to Howard Rheingold’s closing keynote resonated most for me.

@hreingold triage skills vital to new world of flow

The most common challenge I hear from people about social software, Enterprise 2.0, whatever you want to call it, is that it looks interesting, but they are busy enough as it is, and surely something should be done about information overload. “Where do you find the time to do all this?” I can point to examples where these technologies can save them time (using a wiki instead of e-mail, for example), but these are often seen as problematic for one reason or another.


What Lee has spotted in Howard’s keynote is that people are being faced with a new challenge in life and work, and it probably frightens them.

Up until now, much of the information we need (as well as a huge amount that we don’t need) has been selected by someone else. Whether it is stories in a newspaper, TV programmes on the favourite channel or information within an organisation, someone has undertaken the task of choosing what the audience sees. As a result, we often have to live with things we don’t want. For example, I have little interest in most sports, so all newspapers have a sports section that is too long for my needs. Our tolerance for this redundancy is incredible. But we still resist changing it for a situation in which we can guarantee to see just what we want (and more of it).

According to Wikipedia (and this chimes with other accounts that I have read, so I trust it for now), triage was formalised as a means of dealing with large volumes of battlefield casualties in the First World War. One approach to medical emergencies might be to treat them as they arise, irrespective of their chances of survival. However, doing this is likely to lead to pointless treatment of hopeless cases and to a failure to treat those with a chance of survival in time. The result is a waste of resources and a higher-than-necessary death rate. Triage means that immediate treatment can be focused on those whose chances of survival are not negligible and where urgency is most important. Triage in medical emergencies is now a highly developed technique, with incredibly effective results. (However much it may be resented by the walking wounded who are inevitably kept waiting in hospital accident & emergency departments.)

What would triage mean for information consumption? In the first place, it means no filtering before triage. One of the causes of information overload is that traditional selectors (the TV scheduler or news editor) inevitably pay no attention to the personal needs or interests of the audience. How could they? So, unlike the A&E department, we cannot rely on a triage nurse to make our choices for us. Rule zero, then, is that everyone does their own triage.

One of the key things about hospital or battlefield triage is that we don’t waste time with it if there is a clear life-saving need. So rule one of information triage is that anything life-threatening for the organisation or for ourselves needs immediate attention.

After that, we can sit down calmly to review and classify information as it comes in. Rule two: only two questions need to be asked. These are: “is this important to me in my role?” and “does this need attention now, or will its message still be fresh later?”

Taking the answers to these questions together, we should be able to assess the importance and timeliness of anything that comes up. Anything that is time-bound and important needs attention now. Anything that is important but can wait should be set aside for later attention. Anything that is not relevant must be junked.

The final stage isn’t strictly triage, although it might correspond to a medical decision about who treats a patient. Having decided that a piece of information or an information flow is worthy of attention, we need to decide what to do with it. That is rule three: don’t just read it, do something with it. If information is important, it should prompt action, filing, or onward communication. What form each of those takes is not a question for now, but there is no point paying attention to something if you or your organisation immediately loses the benefit of that attention.

Information triage is just like medical triage in that it puts thought before action. That is potentially a huge change if people have been accustomed to taking in pre-digested information flows without any thought and either acting immediately or not acting at all.
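For the programmers among us, the rules above amount to a small decision procedure. Here is a minimal sketch; the `Item` model and the labels it returns are my own illustration, not anything from a real system:

```python
from dataclasses import dataclass

@dataclass
class Item:
    """One piece of incoming information (a hypothetical model)."""
    life_threatening: bool  # rule one: a threat to you or the organisation?
    important: bool         # rule two, question 1: important to my role?
    time_bound: bool        # rule two, question 2: stale if left until later?

def triage(item: Item) -> str:
    """Classify one item. Rule zero: each person runs this for themselves."""
    if item.life_threatening:
        return "act immediately"      # rule one: emergencies skip triage
    if item.important and item.time_bound:
        return "attend to it now"     # important, and only fresh now
    if item.important:
        return "set aside for later"  # important, but it will keep
    return "junk"                     # not relevant: discard it
    # Rule three applies to anything kept: act on it, file it, or pass it on.
```

Whatever shape the real flow takes, the point is the ordering: the life-threatening check comes first, and relevance is tested before timeliness ever matters.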

That’s all off the top of my head. Have I missed anything?

Good, better … best?

A while ago, I promised Mary Abraham a summary of my thoughts on “best practice”, which grew into quite a long draft, and then WordPress and I conspired to lose it. Rather than try to re-create it, I have started again. Undoubtedly, the lost version was much better.

In my comment on Mary’s blog, I mentioned that I have tried to avoid using the phrase “best practice”. I have identified three contexts in which my unease about the term arises. These can be summarised as compliance, comparison and complacency.

Compliance
Last week I visited another firm to share our thinking on a project that we are (separately) engaged in. One of the people on the other side of the table was their Head of Best Practice. I found this job title intriguing, as her main role was to ensure that the firm complied with its regulatory obligations. It is fair to say that mere compliance was not her aim in performing that role — she was thinking imaginatively about how their lawyers could work more effectively as well. However, in this organisation (and possibly many others) the term “best practice” is equated with compliance with some external standards.

In my mind, “compliance” suggests doing something because you have to, even if you think that it is not necessarily the best thing to do for the business. Sometimes it means doing the minimum necessary to avoid the disapproval of a regulator. That doesn’t sound like the best practice possible.

Comparison
Sometimes the appeal to best practice is a veiled request to emulate other businesses. As Matt Elliott puts it, this stifles innovation:

I’ve been seeing a lot of the Best Practices Guy lately. If you’ve been in the work world long enough, you’re probably familiar with this person: he or she is the one at any and every meeting whose only real contribution to the discussion is to harp on the need to look at “best practices.” Before we can do anything, Best Practices Guy argues, we need to determine what everyone else is doing.

And then, presumably, we’ll just copy them. Because that’s how profits are made.

In the interests of balance, I should point out that Dennis McDonald has issued a riposte:

It’s hard to argue with that. Simply adopting how someone else does something — without thinking long and hard about similarities and differences — would be stupid.

Still, there may be instances where adopting another organization’s best practices might make good sense. There may also be situations where doing so actually stimulates innovation. The trick is to be able to understand how the “best practice” relates to your own organization’s unique needs.

On the whole, my experience is that very few people who look to emulate someone else’s best practice are actually doing the analysis that Dennis advocates.

Complacency
Interestingly, Matt Elliott suggests that the Best Practices Guy is fearful:

The problem with the type of person I’m describing is that he or she is often motivated almost entirely by fear. It’s not so much research they crave, but safety. If we just do what someone else has done (and succeeded with) we thus have no risk of failure.

Dave Snowden confirms that this safety is misguided in a short note referring to Six Sigma: “Systems that eliminate failure, eliminate innovation.”

I think the real problem with using best practice to create comfort is that it is always backward-looking. In my comment on Mary’s blog, I referred to a post by Derek Wenmoth about the use of “Next Practice” as an alternative.

At worst, the best-practice approach leads to “doing things right rather than doing the right things”. As cited in the presentation: Best Practice asks “What is working?”, while Next Practice asks “What could work – more powerfully?”

Rather than focusing on what has worked for a small number of others in the past to create a universal recipe for action, next practice (as I understand it) focuses on what might be possible in the future, drawing on the widest possible range of stimuli, and requiring a high degree of imaginative thought in applying those stimuli to current problems. As the Innovation Unit puts it in the context of the UK educational system:

Best Practice looks at and promotes leading educational activity for the benefit of the education system as it currently exists. Next Practice works with outstanding practitioners and other interested groups to try to take us beyond the current system into new territory, both in terms of school-based educational activity and in terms of the systems needed to nurture and develop such activity.

That sounds much more interesting to me than the safe best practice.

Finally, a word or two from Ovid (Ex Ponto, II 2 31-32):

Tuta petant alii: fortuna miserrima tuta est;
nam timor euentu deterioris abest.

(Let others seek what is safe: safe is the worst of fortune; for the fear of any worse event is taken away.)

By trying to avoid worse events we are driven to create things that are better. A complacent feeling of safety cannot help us do that.