Knowledge what (management or capability)?

Earlier this week, David Griffiths published a thought-provoking post summarising the current state of organisational knowledge management. He highlighted some real concerns, but his conclusion is a positive one.

The past needs to be forgiven, treated as a learning experience. The future is about Knowledge Capability. This requires a change in mind set. Knowledge Capability is not about managing a resource. Knowledge Capability is about embedding, developing, sharing and, most importantly, activating a resource by better coordinating emergent conditions.

This means expanding understanding, influence and integration in such a way so as to enable people to develop as sense makers, problem solvers, decision-makers, collaborators, managers and leaders.
People can choose to ignore the inevitable, but, to remain relevant, this is the future. Knowledge Management may be dying a slow death, but Knowledge Capability is alive and kicking!

The history of KM is littered with dreary battles about nomenclature. I have no intention of starting another one. (And nor, I think, has David.) I am, however, torn between redefining an existing term and inventing a new one. In the end, both demand a similar effort.


I think the best one-line definition of ‘knowledge management’ is Nick Milton’s:

“Knowledge Management” is “Management with a focus on Knowledge”.

Management is what we do to make organisations work; to make them prosper and succeed. And if we don’t manage with knowledge in mind, then they won’t prosper and succeed to the same extent.

This gives a clear message to deliver when discussing ‘KM’ or ‘knowledge management’. Often, however, people hearing the term bring their own (mis)understanding to it. Typically, they start from an analogy with information, document or records management, where the phrase refers to a thing (assumed to be close to knowledge) that needs to be managed. As a result, those arguing for a more nuanced meaning must first wrest it away from simplistic conceptions of classifying, storing, retrieving and organising. Worse, those activities tend not to be highly valued by organisations, and that colours the perception of the value that might flow from intelligent knowledge management.

By contrast, organisations also manage with people in mind, or money, or customers and clients. The functions responsible (HR, Finance, and Sales/Marketing) tend not to use the term ‘management’ at all, and people have a much clearer sense of the value those functions provide for the business — often to the point that they are represented directly at the most senior level.

Those other business disciplines are better developed and better understood than knowledge management, which makes this an ideal opportunity to reframe people’s understanding of our work. Referring to ‘knowledge capability’ can help to do this — it is much clearer about what will change as a result. It also refocuses attention away from the possibility of managing knowledge and towards making changes in organisational practices and people’s behaviour.

In its early days, knowledge management grew out of information management and technology, so it is not surprising that those fields still affect the way people perceive the discipline. Newer influences — such as psychology and other behavioural sciences, organisational design, and strategic management — have given KM a role that should place it much closer to the heart of the organisation and the way people work.

This video by Patrick Lambe illustrates that journey. He describes a project to create a set of competencies that is clearly distinct from earlier descriptions rooted in information management.

[vimeo 86395864 w=508]

The focus in Patrick’s work was on people and how they worked, whereas the TFPL material he describes is rooted in the way knowledge and information might be treated. (That material appears no longer to be available from TFPL.)

For me, that is the heart of the way modern knowledge activities should be appreciated. They prioritise how people work towards improving the success of their organisation — using knowledge. The older approach (which can still be seen in some places) thinks first about knowledge and what might be done with it, rather than organisations or people.

 If you want to move your firm towards better use of knowledge, please get in touch.

Positioning — what is this thing called KM?

One of the most fruitless recurring activities in the knowledge management world is the irregular call to ‘define KM.’ I have touched on this here before, but today is slightly different.

Matt Moore has produced an intriguing list of the things people call knowledge management. It is an eclectic mixture that doesn’t always sit well together, but there are some themes. The predominant group of descriptions refers to knowledge in some way (typically attached to verbs like transfer, sharing, retention, exchange, development or enablement). The next largest group is ‘social’, closely followed by ‘information’ and ‘learning’, then ‘collaboration’, ‘best practice’ and ‘innovation’.

Overall, the list strikes me as being very dependent on organisational context — a business that depends heavily on marketing products to customers is more likely to react well to “Multi-channel, digital information sharing strategy and customer intelligence capability” than to “Knowledge and Process Management.” And KM teams may change their focus over time — either in reaction to changes in the wider business or to take advantage of new tools and technologies that might be seen to further the cause. That would explain the proliferation of ‘social’ titles, for example.

Coincidentally, last week Nick Milton provided an interesting template to help organisations think about their approach to knowledge management. His blog post, “Knowledge of process, knowledge of product, knowledge of customer”, suggests that organisations locate themselves between the three extremes of a triad — process, product, customer — depending on where their efforts are (or should be) focussed.

I like Nick’s approach (even though he underplays the extent to which law firms need to be aware of client needs), but it proceeds on an assumption that organisations are self-aware enough to say honestly where they are. In my experience, few have that awareness. Instead they hoodwink themselves with aspirational assertions about their goals and the value they provide. One of the things that Cognitive Edge techniques can do for a firm is to help them avoid entrained patterns of thinking and unlock the real business culture and aspirations driving knowledge needs.

Once it is understood what (and who) the business is for, and where it is heading, the choice of knowledge activities (and perhaps their name) will flow from that.

If you are interested in exploring these techniques, I can help — get in touch.

Knowledge and information are different (no doubt about that)

In one of those internet coincidences, I have encountered (or in some instances re-encountered) a number of assertions today that we need to distinguish knowledge management from information management. Largely for my own benefit, I have synthesised them in the following post.

David Gurteen’s regular newsletter contained the first pointer, to a blog post by Stephen Bounds.

I don’t agree that Information Management should be primarily backwards looking. The use of BI tools like Cognos et al are squarely IM but they are just as useful for forecasting as analysis. More generally, effective IM should always be done with a view to enabling KM process improvements.

I define the difference in this way: Knowledge Management is practised through activities that support better decision-making. IM is practised by improving the systems that store, capture, transmit etc information.

In this sense, a librarian neatly captures both sides of the coin. The act of building and making a library catalogue available is covered by IM. But the transaction by which a person can approach a librarian and leave with a relevant set of data to make a better decision is covered by KM.

Stephen’s post builds on a comment he made to a blog post of Nick Milton’s, in which Nick gives vent to a self-confessed rant:

If, as many people claim, Knowledge Management is “getting the right information to the right people at the right time” then what on earth do they think Information Management is?

Management of X is not concerned with delivery of Y.

Interestingly, although I have had similar experiences to Nick’s of people muddling knowledge and information, many of the results from the linked Google search use the quoted phrase to highlight the same error. One of the clearest of those rejections is provided by Joe Firestone in one of a series of posts exploring US Governmental Knowledge Management.

If to do KM, we must understand problem seeking, recognition, and formulation, and knowledge production (problem solving), in order to know what is “knowledge,” and what is “just information,” then why not simply recognize that a First generation KM program based on “Getting the right knowledge . . . “ is not a clean alternative that allows one to forget about problems, problem solving, and innovation, but that since it also requires knowledge of these things, we may as well pursue a version of Second Generation KM that seeks to enhance not only “Getting the right knowledge . . . “, but also how we make that “right knowledge,” in the first place.

And as long as we’re at it, let’s also make that distinction between “doing” and “managing” that is at the very basis of the field of Management, and say KM is not primarily about Knowledge Managers “making knowledge” or “Getting the right knowledge to the right person at the right time,” but rather is primarily about enhancing the ways in which knowledge workers do these things. If we do that, we in KM won’t be stepping all over the turf of other managers, who, from a point of view distinguishing managing “knowledge processing,” from “doing knowledge processing,” are some of the primary knowledge workers part of whose job it is to actually make and integrate knowledge into organizations.

Independently, and most recently, John Bordeaux has revisited an aspect of his critique of KM in the US Department of Defense: specifically, what is the difference between Information Management and Knowledge Management? His answer:

The difference between IM and KM is the difference between a recipe and a chef, a map of London and a London cabbie, a book and its author.  Information is in technology domain, and I include books (themselves a technology) in that description.  Digitizing, subjecting to semantic analysis, etc., are things we do to information.  It is folly to ever call it knowledge, because that is the domain of the brain.  And knowledge is an emergent property of a decision maker – experiential, emotional framing of our mental patterns applied to circumstance and events. It propels us through decision and action, and is utterly individual, intimate and impossible to decompose because of the nature of cognitive processing.  Of course, I speak here of individual knowledge.

John’s position is especially interesting for his assertion that knowledge is distinct from information in part because of its location. If I understand him correctly, once knowledge is captured, stored, or manipulated outside the brain, it ceases to be knowledge — it is information.

This makes sense to me, but it is at odds (I think) with Joe Firestone’s position, as expressed in a paper elsewhere: “My Road to Knowledge Management through Data Warehousing” (pdf).

[T]he desire to get beyond “arid IT-based” concerns and to take the human-side of decision support into account, is about a view of KM that sees knowledge as subjective and personal in character, largely “tacit” or “implicit”, and as distinct from codified expressions, which are really not knowledge, but only information. Knowledge is frequently viewed as “justified true belief” in this approach, a definition that has been the dominant one in philosophy since Plato, but which has been under vigorous attack since at least the 1930s. People who take this road to KM, view it as primarily an applied social science discipline, whose role is to “enable” better knowledge creation and sharing by facilitating the “conversion” of tacit and implicit knowledge to codified expressions.

The problem with this road to KM is that (a) in viewing knowledge as “justified true belief” it makes it dependent on the “knower” and therefore basically subjective. And (b) in restricting knowledge to beliefs in the mind, it neglects the role of management in providing a framework of rules and technology for testing and evaluating codified expressions or knowledge claims and thereby creating a basis for producing objective knowledge. In a number of other places, I’ve specified two types of knowledge found in organizations: surviving beliefs and surviving knowledge claims. In restricting attention to facilitating expressing surviving beliefs alone, this road to KM misses one of its major objectives: to enhance Knowledge Production and, in this way, indirectly improve the quality of surviving knowledge claims used in future decisions.

I am not sure that I understand Joe’s position completely, especially as his comprehension of the philosophical foundations far exceeds mine. However, the final sentence of the first paragraph above appears not to fit John Bordeaux’s position, although I think the first part of the paragraph does fit. I also struggle with the second paragraph. Even if one can separate knowledge from the ‘knower’, there remains the possibility that what is known depends on the context. As Nick Milton puts it in a comment on his original post:

I could give you a whole stack of information about the rocks below the North Sea – seismic sections, maps, core samples – but could you make an effective decision about where to site an oil well?

I think this comes down to a practical problem. Capturing what is known in an objective sense would also require capturing enough context to make it comprehensible to anyone at any point in the future. How much effort would that take, and at what point would it be more economical just to ask the relevant person (or even to start again from scratch)?

First, think…

I wasn’t at the Reboot Britain conference today, but there were some valuable nuggets in the twitterstream for the #rebootbritain hashtag. Of these, Lee Bryant’s reference to Howard Rheingold’s closing keynote resonated most for me.

@hreingold triage skills vital to new world of flow

The most common challenge I hear from people about social software, Enterprise 2.0, or whatever you want to call it, is that it looks interesting, but they are busy enough as it is: can’t we do something about information overload? “Where do you find the time to do all this?” I can point to examples where these technologies save time (using a wiki rather than e-mail, for example), but those examples are often seen as problematic for one reason or another.


What Lee has spotted in Howard’s keynote is that people are being faced with a new challenge in life and work, and it probably frightens them.

Up until now, much of the information we need (as well as a huge amount that we don’t) has been selected by someone else. Whether it is the stories in a newspaper, the programmes on our favourite TV channel or the information circulating within an organisation, someone has undertaken the task of choosing what the audience sees. As a result, we often have to live with things we don’t want: I have little interest in most sports, for example, so every newspaper has a sports section that is too long for my needs. Our tolerance for this redundancy is remarkable, yet we still resist exchanging it for a situation in which we are guaranteed to see just what we want (and more of it).

According to Wikipedia (and this chimes with other accounts I have read, so I trust it for now), triage was formalised as a means of dealing with large volumes of battlefield casualties in the First World War. One approach to medical emergencies might be to treat them as they arise, irrespective of the patient’s chances of survival. However, doing this is likely to lead to pointless treatment of hopeless cases and to a failure to treat in time those who could have survived. The result is a waste of resources and a higher than necessary death rate. Triage means that immediate treatment can be focused on those whose chances of survival are not negligible and for whom urgency matters most. Triage in medical emergencies is now a highly developed technique, with incredibly effective results (however much it may be resented by the walking wounded who are inevitably kept waiting in hospital accident & emergency departments).

What would triage mean for information consumption? In the first place, it means no filtering before triage. One of the causes of information overload is that traditional selectors (the TV scheduler or news editor) inevitably pay no attention to the personal needs or interests of the audience. How could they? So, unlike the A&E department, we cannot rely on a triage nurse to make our choices for us. Rule zero, then, is that everyone does their own triage.

One of the key things about hospital or battlefield triage is that we don’t waste time with it if there is a clear life-saving need. So rule one of information triage is that anything life-threatening for the organisation or for ourselves needs immediate attention.

After that, we can sit down calmly to review and classify information as it comes in. Rule two: only two questions need to be asked: “is this important to me in my role?” and “does this need attention now, or will its message still be fresh later?”

Taking the answers to these questions together, we should be able to assess the importance and timeliness of anything that comes up. Anything that is time-bound and important needs attention now. Anything that can wait and is not relevant must be junked.

The final stage isn’t strictly triage, although it might correspond to a medical decision about who treats a patient. Having decided that a piece of information or an information flow is worthy of attention, we need to decide what to do with it. That is rule three: don’t just read it, do something with it. If information is important, it should prompt action, filing, or onward communication. What form each of those takes is not a question for now, but there is no point paying attention to something if you or your organisation immediately loses the benefit of that attention.
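Purely to test my own thinking, here is a minimal sketch of those rules in Python. The field names are mine, and so is the decision to park the two combinations the rules above don’t cover (important but not urgent, urgent but not relevant); rule three happens after triage, so it appears only as a comment.

```python
from dataclasses import dataclass
from enum import Enum

class Outcome(Enum):
    ATTEND_NOW = "deal with it now"
    PARK = "park it for a calmer review"
    JUNK = "discard it"

@dataclass
class Item:
    summary: str
    life_threatening: bool     # rule one: existential for me or the organisation?
    important_to_role: bool    # rule two, question one
    needs_attention_now: bool  # rule two, question two

def triage(item: Item) -> Outcome:
    # Rule one: anything life-threatening skips the queue entirely.
    if item.life_threatening:
        return Outcome.ATTEND_NOW
    # Rule two: importance and timeliness decide the rest.
    if item.important_to_role and item.needs_attention_now:
        return Outcome.ATTEND_NOW
    if not item.important_to_role and not item.needs_attention_now:
        return Outcome.JUNK
    # The rules above only cover two of the four combinations; parking the
    # other two for later review is my assumption, not part of the original.
    return Outcome.PARK

# Rule three applies after triage: whatever earns attention must then be
# acted on, filed, or passed on, not merely read.
print(triage(Item("client alert on a new regulation", False, True, True)))
```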

Information triage is just like medical triage in that it puts thought before action. That is potentially a huge change for people who have been accustomed to taking in pre-digested information flows without any thought and either acting immediately or not acting at all.

That’s all off the top of my head. Have I missed anything?

Your boom is not my boom

I am currently reading Generation Blend: Managing Across the Technology Age Gap. (There will be a review when I have finished it.) In the first chapter there is a graph of the birth rate in the United States which brought home to me how much our unarticulated assumptions matter.

Here is the graph (taken from Wikipedia):

This shows a clear increase in the birth rate between 1946 and 1962 (known as the Baby Boom), followed by a slump between 1963 and 1980 (Generation X) and a rise again between 1981 and 2000 (Generation Y, or the Millennials). Compare this with the birth rate in the UK, illustrated in the graph below (drawn using figures from the Office for National Statistics).

[Chart: UK annual births, drawn from ONS figures]

I have shaded three areas in the UK chart, marking years after the 1930s in which the birth rate rose significantly above the norm. The peaks in 1944-49 and 1957-1972 exceeded the mean for the century (just over 700,000 births per annum), apart from a slight dip below it in 1945. I have marked another bulge between 1986 and 1996, but the birth rate in those years remained below the century mean: the peak year is 1990, with 706,140 births, about 2,000 below the mean. By comparison, the US birth rate over the same period exceeded that of some years of its post-war baby boom.

For me, this difference between the US and UK is striking. It means that we need to be careful when using terms like “baby boom” and when assessing the impact of generational change in the workplace. As the US Bureau of Labor Statistics noted in 1985 (“Recent trends in unemployment and the labor force, 10 countries”):

In North America, birth rates peaked in the late 1950’s. In Western Europe, however, the peak occurred in the early to mid-1960’s, which coincided with the tapering off of North American birth rates. In Australia and Japan, the peak was reached much later, in the 1970’s.

In the United States and Canada, the children born during the baby boom reached working age in the early 1970’s, whereas those in Western European countries reached working age nearly 10 years later, during a period of generally declining economic growth. For Australia and Japan, the entry of the baby-boom generation is just beginning or yet to come.

There are two significant implications for the workplace. The first is that the UK baby boomers will be retiring ten years after those in the US. As a result, whereas the US needs to cope now with the “yawning gap in skills, experience, leadership, knowledge, and experience” (as Generation Blend puts it) that the loss of this cohort will bring, UK businesses have another decade to work out how to respond. The second is that the United States can almost count on its Millennials to replace the Baby Boomers, given the similarity in their birth rates. In the UK we do not have that luxury — there are not enough people in Generation Y to fill the places of the retiring generation. The dynamic between the generations is therefore almost certainly different in the UK from that in the US, and those of us on this side of the Atlantic need to be conscious of this when taking our cue from American studies and commentary.

Social software in law firms

About ten days ago, I attended a law firm breakfast meeting hosted by Headshift, the social software consultancy. Penny Edwards has blogged about the event and posted the presentation on Slideshare. It was a really interesting meeting and discussion, and well worth the very early start I had to make to get there from Manchester.

The presentation focuses on the value that social software can bring to law firms in the area of current awareness, which is a really interesting use-case. I think there is a lot that lawyers can do with social software, but it will take a while to wean them off Word and Outlook. (That isn’t to say that those tools do not have their place, but we know they are used sub-optimally.) On the other hand, information professionals in law firms are crying out for better ways of managing client and legal updates and research. Once they are up and running with new tools such as the ones demonstrated by Headshift, I think the lawyers will quickly come to understand the ways in which they can work better than Word or Outlook.

Following the presentation, Penny demonstrated some work that Headshift have done for Dewey & LeBoeuf. This integrates a wiki (Confluence, I think, although Headshift also work with Socialtext) with an enterprise RSS service (NewsGator). The main virtue of this work, as far as I could see, was the simplicity with which the elements were fitted together. Obviously we couldn’t see how they integrated with Dewey’s existing intranet, but I could see how they could slot in quite seamlessly.
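We didn’t see the plumbing, but the general shape of that kind of integration is easy to imagine. The sketch below is purely illustrative: it pulls items from an RSS feed with the feedparser library and posts a digest page to a wiki through a made-up REST endpoint. The URL, token and payload format are my placeholders, not NewsGator’s or Confluence’s actual APIs.

```python
# Illustrative only: feed-to-wiki digest. The wiki endpoint and payload
# shape below are hypothetical placeholders, not a real product API.
import feedparser  # widely used RSS/Atom parser
import requests

FEED_URL = "https://example.com/legal-updates.rss"   # placeholder feed
WIKI_API_URL = "https://wiki.example.com/api/pages"  # hypothetical endpoint
WIKI_TOKEN = "replace-me"                            # hypothetical credential

def build_digest(feed_url: str, limit: int = 10) -> str:
    """Render the latest feed entries as a simple bulleted digest."""
    feed = feedparser.parse(feed_url)
    lines = [f"* [{entry.title}]({entry.link})" for entry in feed.entries[:limit]]
    return "\n".join(lines)

def publish_to_wiki(title: str, body: str) -> None:
    """Create or update a wiki page holding the digest (hypothetical API)."""
    response = requests.post(
        WIKI_API_URL,
        headers={"Authorization": f"Bearer {WIKI_TOKEN}"},
        json={"title": title, "body": body},
        timeout=30,
    )
    response.raise_for_status()

if __name__ == "__main__":
    publish_to_wiki("Current awareness digest", build_digest(FEED_URL))
```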

As with most of these events, though, the really interesting part was the discussion. Fired up by the presentation and demonstration, people asked questions round the table, and these carried on even after the formal part of the meeting was over. One of the comments that really stuck in my mind came from Lars Plougmann. He reckoned (without having been able to test it) that the participation dynamic is different when social software comes inside the firewall.

The now-traditional assertion about wikis is that usage breaks down in three ways: 90% of people read but do not contribute; 9% contribute from time to time; and 1% participate heavily — accounting for most of the material. As far as I can find out, there is nothing to suggest conclusively that Lars’s view is accurate. (His hope is reflected by others, though.) But what are the consequences if the 90-9-1 rule does hold true for enterprise wikis?

If we construe it strictly, this usage profile means that no wiki can succeed if it serves fewer than 100 people (since a fraction of a person would otherwise be required). Some enterprise wikis will cover a much smaller group than that (such as a client-focused knowledge-sharing wiki where the client team numbers only 50 lawyers or so). However, if a single person were to support more than one wiki, their efforts could sustain 99 people overall. This leads me to the (I think inexorable) conclusion that we should focus our wiki efforts on areas where there are keen contributors, rather than those where we could see a significant RoI but no obvious wiki leaders. This appears a little counter-intuitive, and would need some nifty footwork to convince Ricky Revenue.
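To check that arithmetic, here is a rough sketch of the strict reading in Python. The function name, the sample team sizes and the idea of rounding the heavy-contributor quota up to a whole person are my own embellishments, not anything Lars or the 90-9-1 literature prescribes.

```python
import math

def ninety_nine_one(members: int) -> dict:
    """Split a wiki's membership according to a literal 90-9-1 reading."""
    return {
        "readers (90%)": members * 0.90,
        "occasional contributors (9%)": members * 0.09,
        "heavy contributors (1%)": members * 0.01,
    }

# A 50-lawyer client team yields only half a heavy contributor on a strict
# reading: the "fraction of a person" problem for wikis under 100 people.
print(ninety_nine_one(50))

# One enthusiast anchoring two such wikis supplies the heavy-contributor
# quota for roughly 100 people in total.
team_sizes = [50, 50]
quota = math.ceil(sum(size * 0.01 for size in team_sizes))
print(sum(team_sizes), "people served by", quota, "heavy contributor(s)")
```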

In all, then, a thought-provoking morning and a welcome distraction. Many thanks to Penny and Lars!