Power isn’t everything

I have been reacquainting myself with some of the materials science reading that I did as part of my Physics studies over 30 years ago. My brain is too far removed from the maths to deal with the more technically complex stuff, but there is a classic pair of books by J. E. Gordon that are easily accessible to the lay reader: The New Science of Strong Materials, or Why You Don’t Fall Through the Floor and Structures, or Why Things Don’t Fall Down. Reading the latter, I was struck by some of the insights in the chapter on shear and torsion, more from a historical perspective than an engineering one.

Gordon reflects on the development of the aeroplane, and remarks that some aspects of the new aeronautical engineering were easier to tackle than others. 

The aeroplane was developed from an impossible object into a serious military weapon in something like ten years. This was achieved almost without benefit of science. The aircraft pioneers were often gifted amateurs and great sportsmen, but very few of them had much theoretical knowledge. Like modern car enthusiasts, they were generally more interested in their noisy and unreliable engines than they were in the supporting structure, about which they knew little and cared less. Naturally, if you hot the engine up sufficiently, you can get almost any aeroplane into the air. Whether it stays there depends upon problems of control and stability and structural strength which are conceptually difficult. (p.259)

He then goes on to tell the story of the German monoplane, the Fokker D8, which initially had an unfortunate habit of losing its wings when pulling out of a dive. As a result, the Germans could not capitalise on its obvious speed advantage over the British and French biplanes. Only once Fokker had analysed the effect of the relevant forces on the wings did he realise that the loads imposed on the plane were causing the wings to twist in a way that could not be controlled by the pilots. Once the design of the wings was changed so that they no longer twisted, the D8 served its purpose much more effectively.

Gordon makes a similar observation with regard to automobile development.

The pre-war vintage cars were sometimes magnificent objects, but, like vintage aircraft, they suffered from having had more attention paid to the engine than to the structure of the frame or chassis. (p.270)

Reading this, I wondered whether organisational KM efforts have had similar shortcomings. Certainly, in many businesses, the KM specialists proceed by trial and error, rather than careful scientific study. There is also a tendency (driven in part by the need for big strong metrics and RoI) to focus on things like repositories and databases. Are these the powerful engines of KM, destined to shake apart when faced with conceptually difficult structural challenges? I suspect they may be.

Instead of concentrating on raw power, we need to work out what our KM activities actually do to the structure of the organisation, and how they affect the parts different people play in making the business a success. In doing that, we may find that small changes make a significant difference. It is not an easy task, but it is a worthwhile one.

Model KM — iTunes or Spotify?

The current craze amongst the UK musical digerati is a service based in Sweden called Spotify. Simply put, it is a legal way to play any music you could imagine (although there are the usual absences — The Beatles and Led Zeppelin, for example) without buying a copy of the CD or downloading a permanent digital version. Previously, when I thought about subscription-based music services, I was not sure what the attraction would be. I like to have my music with me wherever I am — in the car, on the train, walking to work. On reflection, however, that is just a habit that I developed when I first acquired an iPod. Prior to that, my music was something that belonged at home or (in limited quantities) in the car. 

Spotify has changed my outlook. I still have all my music to carry with me, but I also have a vast collection at my disposal when I am at the computer. As a result, I have renewed my acquaintance with music that I would never have bought in permanent form. It is like having a radio station with an amazingly large and personal playlist. I can also decide whether I like something enough to buy it to add to the portable permanent collection.

So why the KM connection in the title? It occurred to me that a personal collection of music (often housed in iTunes) could be likened to an organisational knowledge repository. By comparison, access to a remote and almost unimaginably large database of music, if replicated in the knowledge context, could be so many things. For law firms there are online information and knowledge services, but they tend to be structured in the way that the service provider dictates. Spotify imposes no structure. Consider this “Spotification” of a domestic CD collection. It is effectively a visual mapping of a CD collection onto Spotify links. The physical asset (a CD) is used as a metaphor for a virtual one. At a simpler level, users can create playlists of the music they like (just as in iTunes, though the feature is more limited at present).

Another important distinction between iTunes and Spotify is that I don’t have to worry about uploading new content to Spotify. Someone else does that (just as they do in an online legal information service). So we can think of Spotify in the KM context as a place where anything one might want to hear (know about) is available, as opposed to the place where one can only hear (find out about) what one already knows. Surely that is a better position to be in?

There is also a social feature to Spotify. Playlists can be personal (here is all the music I like) or collaborative (let’s share all the tracks of a certain type that we like). That feature could be replicated in know-how systems as a form of joint research (here are the useful resources on a topic of mutual interest). As yet there is no tagging in the system, and it is tied to the internet and to computers (although I gather there is an iPhone client in the works, and rumour has it that a limited offline capability may also be forthcoming). The future looks interesting for music and for KM, but I wonder what will ultimately deal the mortal wound to internal knowledge repositories.

Social = people = personal first

I have been thinking recently about the power of social software at work — prompted in part by my post earlier in the week, but also by news that Cogenz, an enterprise social bookmarking tool, is now available as an on-premises version at a strikingly reasonable cost. (This may not be new news, but I only heard of it this week.) I have also been pondering the 800lb gorilla in this room: SharePoint. After this cogitation, I have come to the view that successful enterprise social software has to put the enterprise last. This is a reversal of the traditional paradigm of business computing.

Since the birth of LEO nearly 58 years ago, computers have been part of business. By and large, their role has been to automate, speed up, replicate, organise, make more efficient, or otherwise affect work activities. That is, their primary impact has been on things that people would not do unless there was a business reason for them to be done. As a by-product in later years, people started to use business-related software to manage domestic or private activities (writing letters, making party invitations or balancing household accounts, for example), but these tended to be peripheral. During this time, if they had a home computer at all, people would expect technology at work to be ahead of what they had at home.

Over the past 5-10 years the balance between personal and business technology has changed completely. Driven by (a) the spread of internet connectivity (especially wi-fi) into the home, (b) the need to support other digital technologies (cameras, music players, gaming devices, for example), and (c) increased functionality and connectivity in small-format devices (mainly mobile phones), it is now frequently the case that people’s domestic technology outstrips that provided to them at work. Alongside this change in the hardware balance of power (and for similar reasons), software has also become much more focused on enhancing the things that people might want to do for themselves, rather than for a salary.

These changes are part and parcel of Web 2.0, social software, social networking — call it what you will. Those tools work because they serve an individual need before they do anything else. A couple of examples by way of illustration.

  • Delicious works in the first instance because it helps people store pointers to web pages that they find useful. Because that storage takes place independently of the computer the user sits at, it is ideal for people who access the internet from a variety of locations (home, work, a public library, and so on). Better than that, delicious allows people to start to classify these pointers, or at least tag them with useful aides-memoire. Both of those things — location-independent storage and tagging — mean that delicious is already more useful than the alternative (browser-based bookmarks). The final piece in the jigsaw — sharing of bookmarks — is just the icing on the cake. The social aspect only comes into play once personal needs are satisfied.
  • Flickr has a similar dynamic. As digital camera use spreads, people start to need different ways of showing pictures to their friends and families. It is rare that people will print all of their holiday snaps so that they can take them to work and make their colleagues jealous. Instead, they can upload them to the website and share the link. After a while, having uploaded hundreds or thousands of pictures, finding the right ones becomes difficult. But flickr offers the possibility of tagging individual pictures or grouping them in sets. That organisation makes it much easier to share them with the right people. But it also means that other people’s pictures can be discovered because they have used the same tags. Like delicious, the social aspect — sharing, commenting on, and collecting other people’s pictures — comes after the personal.

Unlike the telephone, or e-mail, which depend for their efficacy on network effects, these social tools have value at a non-networked, private, personal level. Unsurprisingly, the early adopters of those first communications technologies were large organisations. If nothing else, they were able to create small network effects internally or between each other. For example, universities were early users of e-mail because it sat well with traditional inter-institution academic collaboration. By contrast, businesses and other organisations have typically lagged behind individuals in the adoption of Web 2.0 tools. (To be clear, individuals at work may well be early users of these tools, but their employers tend to see the light much later.)

As a general rule of thumb then, technologies supporting new types of social interaction tend to be proved by use in a non-commercial context and by providing real personal value ahead of any network effect. Sometimes this doesn’t quite work out. Twitter, for example, provides little personal value without the network effect. However, I think the fact that there is such a low barrier to entry to the Twitter network explains that. It also came late to the social party, and so it could piggyback on existing networks. Sometimes the social element doesn’t have a particularly great impact. Many people on flickr do not use the full range of tools (commenting, tagging, etc). I use LibraryThing primarily as a catalogue of my books, to make sure that I don’t duplicate them. There is a social side to the service, but I haven’t really engaged with it. That does not diminish the utility of the site for me or for anyone else using it.

This week’s McKinsey Quarterly report on “Six ways to make Web 2.0 work” includes a similar point:

2. The best uses come from users—but they require help to scale. In earlier IT campaigns, identifying and prioritizing the applications that would generate the greatest business value was relatively easy. These applications focused primarily on improving the effectiveness and efficiency of known business processes within functional silos (for example, supply-chain-management software to improve coordination across the network). By contrast, our research shows the applications that drive the most value through participatory technologies often aren’t those that management expects.

Efforts go awry when organizations try to dictate their preferred uses of the technologies—a strategy that fits applications designed specifically to improve the performance of known processes—rather than observing what works and then scaling it up. When management chooses the wrong uses, organizations often don’t regroup by switching to applications that might be successful.

In practice, I suspect this means that corporate information is less likely to lead to social interactions (even inside the firewall) than personal content is (such as collections of links, and views expressed in blogs). People are more likely to appreciate the value of other people’s personal content than anonymous material, no matter how relevant the latter is supposed to be to their work. More importantly, when someone appreciates the value of being able to create their own content by using a tool or system provided by their employer, they are more likely to support and promote the use of that tool or system amongst their colleagues. That way success lies.

But what of existing corporate systems? Can they have social elements successfully grafted onto them? This question is most commonly asked of SharePoint; as Andrew Gent puts it, “Is SharePoint the Lotus Notes of the 21st Century?” He starts with praise.

The result is a very powerful collaboration, simple document management, and web space management system. It didn’t hurt that V2 of the team collaboration portion of the product (known at the time as Windows SharePoint Services) was “free” for most enterprise Office customers. SharePoint essentially invented a market segment which until that point had been occupied by “integrated” combinations of large and/or complex product sets. Just as Lotus Notes did 20 years ago.

Another similarity is the limitations of the basic architectural design of the product. All products have what could be called a “design center” — a focal point — an ideal business problem that the product tries to solve. The design center defines the core architectural goals of the product. SharePoint’s design center is flexible collaborative functionality centered around light-weight document management and customizable portals.

And the fact is SharePoint’s design center hit a bull’s eye. The need for easy-to-use collaboration spaces and web sites that don’t require web programming — that work well with Microsoft Office and the Microsoft security model — has been a big hit inside corporations. As a salesman for a competing product once told me, his job is not so much selling their own product, but explaining why customers shouldn’t use SharePoint.

But then things get ugly:

SharePoint is designed with flexibility at the space or site level. It allows individuals to take responsibility for managing their own sites and collections of sites. But if — from a corporate or even a divisional level — you want to manage the larger collection, SharePoint becomes resistant — almost belligerent — to control.

The inability to create even simple relationships between lists in different spaces (beyond simple filtered aggregation) without programming is the first sign of strain in SharePoint’s design. Then there are site columns. Site columns let you — ostensibly — define common metadata for multiple lists or libraries. However, you cannot enforce the use of site columns and site columns only work within a single site collection. There is no metadata control across multiple site collections. In other words, simplified control within the sites leads to lack of control at the macro level.

These are all just symptoms of a larger systemic issue: SharePoint is designed around the site. In Version 3 (also known as MOSS 2007) site collections have been introduced to provide some limited amount of cross-site control. But the underlying design principles of SharePoint (ie. user control and customization) work against control at the higher level.

So there is a fundamental reason why SharePoint will not be able to move from the merely collaborative to the genuinely social. It is driven by the need to support existing business structures and pre-defined designs. SharePoint uses cannot be emergent, a key feature of Enterprise 2.0 tools (as explained by Andrew McAfee and Rob Salkowitz), because they need to be planned from the outset. In David Weinberger’s terms, the filtering takes place on the way in, not on the way out. As JP Rangaswami suggests, filtering on the way out provides opportunities for more interesting knowledge management.

1. In order to filter on the way in, we need to have filters, filters which can act as anchors and frames and thereby corrupt the flow of information. We’ve learnt a lot about anchors and frames and their effect on predilections and prejudices and decision-making. With David’s first principle, we reduce the risk of this bias entering our classification processes too early.

2. I think it was economist Mihaly Polanyi  who talked about things that we know we know, things that we know we don’t know and things that we don’t know we don’t know. Again, filtering on the way in prevents us gathering the things that we don’t know we don’t know.

3. The act of filtering is itself considered necessary to solve a scale problem. We can’t process infinite volumes of things. But maybe now it’s okay to be a digital squirrel, given the trends in the costs of storage. [Sometimes I wonder why we ever delete things, since we can now store snapshots every time something changes. We need never throw away information]. Filtering on the way out becomes something that happens in a natural-selection way, based on people using some element of information, tagging it, collaboratively filtering it.

Thanks to Euan Semple, we do at least know that Microsoft’s heart is in the right place:

…the highlight so far for me of FASTForward ’09 has been getting to know Christian Finn, director for SharePoint product management at Microsoft. Christian is a really nice guy who has been going out of his way to spend time with the bloggers from the FASTForward blog and myself getting his head around the social computing world we all get so excited about.

I am interested to see how this engagement works out for Microsoft and for us. Especially because I think one of the underpinnings of the Microsoft/Apple dichotomy is the two companies’ different approaches to the corporate and the personal. Apple has always been more focused on the personal, while Microsoft concentrated on enterprise needs. This nearly killed Apple in the years when “personal computers” were really more likely to be desktop enterprise systems. Apple has made a comeback on the back of increased personalisation of technology. Can Microsoft work out how that is done?

With a little help from my friends

Knowledge management activities in UK law firms depend very heavily on people power — UK firms rely more on Professional Support Lawyers (PSLs) than their US and continental European counterparts do. Despite this, the recent Knowledge Management in Law Firms conference had a noticeable technology focus. I’m afraid I set the tone in the first session with a couple of case studies on KM/IT implementations, but in my defence I did concentrate on the people issues rather than the technology. After that we had many screenshots of systems, mashups, search tools, RSS feeds, blogs, wikis and more. All the time we kept telling ourselves that KM wasn’t all about technology, but I wonder whether the historically divergent US and UK law firm traditions are moving closer together. We are using more technology and they are using more PSLs (or KM lawyers).

And then the final question silenced us all. One of the two search engine suppliers at the conference mentioned that they were accustomed to hosting conferences with the IT directors of their main customers — to find out what keeps them up at night and to gather information to drive development of their products. Coming at the end of a panel discussion focusing on how we meet client needs for KM support, I think many of us expected this statement to be followed by a suggestion that law firms might do something along similar lines for their top clients. But no — instead the question was whether the suppliers of the IT tools that we had all been discussing for the previous two days should be speaking to us instead of our IT directors. And, more pointedly, how did we feel about our project spend being controlled by someone who did not necessarily know (or at least understand) the strategic objectives underpinning our KM projects?

The supplementary question was probably a bit provocative. I hope most IT directors do understand and buy into their firm’s KM strategies. However, there is a bit of truth in the assumption behind it. KM projects have to fight for IT time and resources along with everything else that the firm needs — from recurrent and inevitable hardware replacement to big infrastructural projects or change driven by other parts of the firm. How do we feel about that?

Actually, is that the right question? Like the lawyer-client relationship, the IT/KM relationship is just that — a relationship. In order to prevent it becoming dysfunctional (or to rectify it if a breakdown has already happened), I think it is helpful to remember two key points. Neither of them refers directly to how we feel. The two points are these:

  1. If something is wrong in a relationship, you cannot change it by focusing on someone else’s behaviour. The only behaviour you can guarantee to change is your own.
  2. The changes you make will have most impact if you understand what preoccupies the other person and play to it.

Let’s elaborate these two points, using the IT-KM relationship as an example.

It’s not you, it’s me…

One of the things that we often forget to take account of in our relationships is that what is important to us is not necessarily a priority for the other person. Just as our jobs give us a full workload and many challenges, those whose services we need to call on are equally burdened. If we are lucky, they may respond well to a simple plea for attention, but this is most likely when our needs are already important to them. If a simple plea does not work, it will not be any more successful if it is just repeated more loudly. The toddler having a tantrum in the supermarket because they have been refused the sweets they demanded has yet to learn this lesson.

If we change our approach, we may be more successful in getting attention and changed behaviour on the other side of the relationship. If our needs are not a priority for someone else, we might be able to get what we want by framing our request so that it appeals to them more. A demand for more IT resource for KM is likely to fall on deaf ears, but a suggestion that IT and KM (perhaps together with BD) might develop products for knowledge sharing with clients (for example) is likely to command more interest. That would allow IT to demonstrate alignment to the firm’s strategic objectives. This is a similar (although more finessed) approach to that adopted by the teenager who argues that use of the family car would give them a safer return from a late party than waiting for a night bus.

It is rare in a relationship that any difficulties are due solely to the behaviour of one party. There is usually a balance of responsibilities. If we accept that, and consciously change our own behaviour, we can swing the balance in our favour.

What do you want?

Bearing all this in mind, what should KM people know about their IT colleagues? What are the pressure points for technology in law firms? It is difficult to generalise — firms and culture differ — but here are some suggestions. Think scalability, robustness and support.

Scalability: What are the implications of your proposed KM solution for more than a handful of users? OK, you can knock up a quick blog or wiki installation on your home PC, but how does that compare to a platform to support the needs of a thousand or more users? Does your ‘free’ software actually come with significant costs when scaled up beyond a handful of users?

Robustness: Law firms are not unique in needing high levels of IT security, but that does not mean that the demands of a resilient technology platform should be minimised. It takes time and effort to keep a system running 24/7. At the moment, you may be comfortable that your new system does not need that kind of resilience, but you probably want it to integrate with existing security systems so that users do not have to log in afresh. Likewise, IT will need to be comfortable that no harm is done to the existing critical systems.

Support: Are the technologies that your favoured solution depends on known or unknown within your IT team? It is easy to underestimate the challenges involved in supporting new things. Once your new system takes hold, your less technically-savvy colleagues will expect the same levels of personal support that they currently get for the firm’s established systems. Behind the scenes, your apparently simple blogging platform (for example) is probably actually quite complex. Without an established body of knowledge in the IT team, supporting that platform is expensive — either in training or external consultancy. Whose budget is that coming from?

Bearing those concerns in mind, it becomes easier to understand the IT professionals’ exasperation at comments like those of silicon.com’s resident devil’s advocate, the Naked CIO, when s/he refers to IT’s weasel words. This comment is particularly telling:

But the part of this article that us foot troops are most likely to disagree with is the idea that we are scared to tell the real story. Not scared, but fed up. Fed up with being told that we are making it deliberately complicated. Fed up with our words being distorted by those that don’t understand our jobs. Fed up with our senior managers not having the courage to fight our corner after those distortions.
It takes two to tango. If colleagues in other functions were prepared to treat IT with respect, long suffering troops wouldn’t be driven to evasive tactics. We obfuscate because non-IT colleagues are getting worse in their assumptions about what is and isn’t a simple problem in IT. “I’ve knocked something up in Access, how hard could it be to make it work for 1000 concurrent users in a distributed environment with no performance issues?” People don’t challenge how hard it is to construct a major building or manufacture a car. That’s because those things are tangible. They can see that it’s difficult. IT is almost invisible, so otherwise sensible people somehow equate invisible to simple “because I can imagine how to do it in my head”.
Until we find a way to address the almost wilful lack of trust and understanding of IT in non-IT colleagues, this situation will worsen.

So the ball is back in our court. Trust and understand your IT colleagues — cooperation and effective collaboration will follow.

(Having said all that, I still have no idea why Neil Richards’s experience of IT projects in a bank was so different from his previous life in a law firm.)

Going with the flow

I had a number of discussions with people last week that brought to mind Michael Idinopulos’s description of the relationship between wikis and work.

Wikis can be used for many different activities, which fall into two broad categories:

  • In-the-Flow wikis enable people to do their day-to-day work in the wiki itself. These wikis are typically replacing email, virtual team rooms, and project management systems.
  • Above-the-Flow wikis invite users to step out of the daily flow of work and reflect, codify, and share something about what they do. These wikis are typically replacing knowledge management systems (or creating knowledge management systems for the first time).

When wiki champions complain that it’s hard to get people to use wikis, they’re usually thinking of above-the-flow wikis. Modeled on  Wikipedia, these wikis typically aspire to capture knowledge and insights that people collect in the course of their work. That’s a hard thing to get people to do.

Michael’s assertion is true for know-how systems generally. Too many appear to have been built on the Field of Dreams principle (“if you build it, they will come”). Even if that isn’t true, my impression (derived from reading articles and listening to conference presentations) is that many firms spend inordinate amounts of time devising elaborate systems of incentives and cajoling their people to contribute to the know-how database. Why do they have to do this? Because the capture and collection of knowledge is above the flow.

More importantly, the availability of rewards for know-how is not likely to make participation in the knowledge process part of the flow. In fact, I suspect that rewards make it easier for people to opt out of that process (or to become free-riders on the labours of those who support it). The challenge is not to get people to break out of their flow so that they engage with KM systems, but to change the flow so that those systems become a natural part of the daily routine.

That is easier said than done, of course. There is an upside, though. In essence, this is what we need  to do in order to create a knowledge culture. Expressed in those terms the problem becomes much more tangible and therefore potentially more manageable. If someone is asked to create or change the knowledge culture, the task looks (and probably is) insuperable. If it is defined as moving certain activities into mainstream processes, the task can be broken into smaller pieces that are less daunting.

The classic example of knowledge systems that fit in the flow must be the case of the Xerox photocopier engineers. As part of work developing an expert system for photocopier fault diagnosis, Xerox discovered that their engineers were less than impressed with the system, because the hardest problems they encountered were not already documented and were therefore outside the scope of the expert system.

We decided to spend more time observing what technicians actually did day to day. We started with US technicians, accompanying them on their service calls. Most of the time, they would look at the machine, talk to the customer, and know exactly what to do to put it in good working order. Occasionally, they ran into a problem that they hadn’t seen before and for which there was no documented answer. They would try to solve these problems based on their knowledge of the machine. This often worked, but sometimes they were stuck. They might call on a buddy for ideas, using their two-way radios, or turn to the experts—former technicians now serving as field engineers—who were part of the escalation process. When they solved unusual problems, they would often tell stories about these successes at meetings with their coworkers. The stories, now part of the community, could then be used at similar gatherings and further modified or elaborated.

The story-telling referred to actually took place informally as part of the daily routine. As these technicians travelled around their areas, they would meet their colleagues in coffee-shops or rest-stops. Their conversations — part of the daily flow — would probably cover the usual range of topics, as well as the tips they had learnt or developed to deal with previously unforeseen copier faults. In order to make the most of these interactions, the system that Xerox developed (named Eureka) had to expand on them, and keep the knowledge-sharing process in the flow. As Tom Ruddy, their director of knowledge management for worldwide customer services, put it:

When people hear about Eureka, they always want to see the software. But it’s really the environment that we are creating. We realized early on that technology wasn’t the solution — that if we didn’t work on the behavioral side of the equation, it wouldn’t be successful. We concentrated on understanding what would make people want to share solutions and take their personal time to enter stuff into the system.

Coincidentally, one of the hottest blog posts of the weekend touches on a similar cultural problem: how do you get people to share knowledge using blogs? Tim Leberecht’s solution: mandatory employee blogs. As Doug Cornelius points out, there is a strong tide of opinion against this choice. In part, this tide is driven by the desire to reflect people’s normal work patterns and habits in knowledge sharing and collaboration. In part, I think there is also a recognition that blogging is not yet mature enough to become a part of those work patterns and habits. (In a separate post, Doug refers to some work done by Forrester which suggests that only 13% of consumers would participate in social media to the extent of creating content, which covers blogging as well as uploading video to YouTube.)

Pollyanna-like, I have a tendency to see opportunities in problems. I don’t want to make blogging mandatory, but I am sure that there are people who would happily blog but don’t currently share knowledge, just as there are people who would like an easier way to share knowledge as part of their work. I want to bring those people together (and then throw a few non-sharing non-bloggers into the mix) to see what kind of culture we can create. At the same time, I want to find as many ways as possible of bringing knowledge into the work-flow. (It looks like Mary Abraham and I have similar views.)

Beyond the Golden Rule

I am still catching up with unread blogs, but I want to add something to Mary Abraham’s commendation of the Golden Rule as the key to collaboration. As the Wikipedia entry on the Rule suggests (at the moment), it can be the cause of problems when there are differences in values or interests:

Shaw’s comment about differing tastes suggests that if your values are not shared with others, the way you want to be treated will not be the way they want to be treated. For example, it has been said that a sadist is just a masochist who follows the golden rule. Another often used example of this inconsistency is that of the man walking into a bar looking for a fight. It could also be used by a seducer to suggest that he should kiss an object of his affection because he wants that person to kiss him.

I was not alone in admiring the late Jon Postel, perhaps the quietest genius behind the creation and early management of the Internet. One of his lasting legacies, sometimes forgotten in the rush for innovation, is found in the heart of one of the basic definitions of the Internet’s Transmission Control Protocol, RFC 793:

TCP implementations will follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others.

I have found this a useful general principle for human communications too, even though I sometimes forget it myself. The wheels of collaboration run much more smoothly when one resists enforcing rules against others, whilst maintaining one’s own obedience to the same rules.
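
For the technically minded, the robustness principle translates into code very directly. Here is a minimal sketch in Python, with entirely hypothetical names and no connection to TCP itself: a function that is strict about what it emits, paired with one that is tolerant about what it accepts.

    # A minimal illustration of Postel's robustness principle:
    # be conservative in what you produce, liberal in what you accept.
    # (Hypothetical example; not taken from any particular protocol.)

    def emit_flag(value: bool) -> str:
        """Conservative output: always the same canonical form."""
        return "true" if value else "false"

    def parse_flag(text: str) -> bool:
        """Liberal input: tolerate common variants produced by others."""
        normalised = text.strip().lower()
        if normalised in {"true", "t", "yes", "y", "1", "on"}:
            return True
        if normalised in {"false", "f", "no", "n", "0", "off"}:
            return False
        raise ValueError(f"cannot interpret {text!r} as a flag")

    print(emit_flag(True))       # -> "true"
    print(parse_flag("  YES "))  # -> True

The asymmetry is the whole point: we hold ourselves to the strictest version of the rules, while giving others as much latitude as we reasonably can.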

Lowering the sharing threshold

A common meme in knowledge management is that “people don’t share knowledge.” Here are a few examples:

The non-sharing statement is usually coupled with a set of purported justifications, and may also include a solution. However, I am not sure that the basic proposition is correct. In my experience, people are naturally willing to share what they know, but other factors sometimes intervene. Some of those factors have their roots in professional habits, others in workplace politics. One of the core tasks of knowledge management is to investigate those justifications and to demonstrate their falsity. If this is correct, non-sharing is a symptom, rather than the disease itself.

In a speech entitled “Gin, Television, and Social Surplus” at the Web 2.0 Expo (video | transcript), Clay Shirky identifies another obstacle to sharing: mother’s ruin, or rather its modern equivalent, television. In case this seems facile, consider Shirky’s argument. Referring to the argument of an unnamed historian, he proposes that just as excessive gin consumption was the way that British society coped with the societal and cultural rupture caused by the Industrial Revolution, with an eventual outpouring of civic energy when we sobered up, so we have dealt with the post-war lifestyle revolution by excessive consumption of television.

If I had to pick the critical technology for the 20th century, the bit of social lubricant without which the wheels would’ve come off the whole enterprise, I’d say it was the sitcom. Starting with the Second World War a whole series of things happened–rising GDP per capita, rising educational attainment, rising life expectancy and, critically, a rising number of people who were working five-day work weeks. For the first time, society forced onto an enormous number of its citizens the requirement to manage something they had never had to manage before–free time.

And what did we do with that free time? Well, mostly we spent it watching TV.

We did that for decades. We watched I Love Lucy. We watched Gilligan’s Island. We watch Malcolm in the Middle. We watch Desperate Housewives. Desperate Housewives essentially functioned as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat.

In this analysis, people are beginning to realise that instead of sinking time into television-watching, they could be doing other things — editing Wikipedia, making videos for YouTube, writing and commenting on blogs, and so on.

So how big is that surplus? So if you take Wikipedia as a kind of unit, all of Wikipedia, the whole project–every page, every edit, every talk page, every line of code, in every language that Wikipedia exists in–that represents something like the cumulation of 100 million hours of human thought. I worked this out with Martin Wattenberg at IBM; it’s a back-of-the-envelope calculation, but it’s the right order of magnitude, about 100 million hours of thought.

And television watching? Two hundred billion hours, in the U.S. alone, every year. Put another way, now that we have a unit, that’s 2,000 Wikipedia projects a year spent watching television. Or put still another way, in the U.S., we spend 100 million hours every weekend, just watching the ads. This is a pretty big surplus. People asking, “Where do they find the time?” when they’re looking at things like Wikipedia don’t understand how tiny that entire project is…
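
Shirky’s orders of magnitude are easy to check. A back-of-the-envelope calculation in Python, using only the figures quoted above:

    # Back-of-the-envelope check of the figures quoted above.
    wikipedia_hours = 100_000_000            # ~100 million hours of cumulative thought
    us_tv_hours_per_year = 200_000_000_000   # ~200 billion hours of US television per year

    projects_per_year = us_tv_hours_per_year / wikipedia_hours
    print(projects_per_year)  # 2000.0 -> roughly 2,000 Wikipedia-sized projects every year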

There is another part to the jigsaw. It is not enough to realise that there is another way of spending this time — the activation energy to engage in this alternative has to be sufficiently low. That is the power of these social technologies — they lower the threshold of participation, and they draw people in:

I’m willing to raise that to a general principle. It’s better to do something than to do nothing. Even lolcats, even cute pictures of kittens made even cuter with the addition of cute captions, hold out an invitation to participation. When you see a lolcat, one of the things it says to the viewer is, “If you have some sans-serif fonts on your computer, you can play this game, too.” And that’s [sic] message–I can do that, too–is a big change.

Not surprisingly, not everyone understands this.

This is something that people in the media world don’t understand. Media in the 20th century was run as a single race–consumption. How much can we produce? How much can you consume? Can we produce more and you’ll consume more? And the answer to that question has generally been yes. But media is actually a triathlon, it’s three different events. People like to consume, but they also like to produce, and they like to share.

[My emphasis.]

In my mind, this raises a challenge for people involved in knowledge management. Putting aside other excuses for not sharing knowledge (which we can deal with separately), it is inevitable that a range of displacement activities have grown up in businesses to create the illusion of busyness and thereby make it possible for people to argue that they have no time to share their knowledge. Here are three off the top of my head:

  • Meetings
  • E-mails
  • Self-justifying reports

Each of these can serve a useful purpose (just as gin and television have their place). Often, however, the production and consumption of meetings, e-mails and reports generates vanishingly small amounts of value for the enterprise. (Probably on a par with watching repeats of Friends.) At work, in blogs and on mailing lists, more and more people are declaring themselves to be fed up with these value-minimal activities. If we make it easier to share, collaborate, and engage meaningfully with our colleagues, then I think it will only take a small push to tip people into these new forms of interaction.

Web 2.0: life-enhancing technology

A quick post to draw attention to Scott Berkun’s report from the Web 2.0 expo. Here’s the bit that deserves memorialisation: 

The unspoken nugget / explanation / marketing line that might get me jazzed is this:

We have always been collaborative. Always been social. It’s in our genes and it’s what we have evolved to do well. Good technologies enhance our natural abilities, give us useful artificial ones, and help us to get more of what we want from life. Web 2.0 and social media make the process of collaboration and developing relationships more fun, efficient, powerful and meaningful.

Ok. Now we’re talking. With a statement like this I can walk the halls of the expo, or converse with the greatest web 2.0 pundit, and have a straight conversation. Will this get me more of what I want from life? More of what my customers want from me, or vice-versa? I can make tangible arguments about what I want or my customers need and sort some decisions out. But note that the statement above is devoid of hyperbole like revolution, ground breaking, disruptive or transformative, things that are entirely subjective. If you identify a real problem well enough, you never need those words: the people who have those problems will naturally find what you do revolutionary if you really solve their problems.

 ‘Nuff said.

Critical thinking about KM

Three thought-provoking KM-related articles have recently come to my attention, so I thought it might be useful to bring them together. Two of them embody a critical approach to the discipline, whilst the third is more mainstream (but can be read in a different way).

Those who participate in the actKM mailing list will know that Joe Firestone has strongly-held and coherent views about the definition of knowledge management. He argues passionately that there should be a definition and that the definition should be based on the philosophy of Karl Popper.

I have found that Joe’s writing has become clearer the more I read of it. I am not sure if this is because I understand his ideas better or because his expression of those ideas has improved. If it is the latter (which is admittedly less likely), then I highly recommend his latest article, “On doing knowledge management” in the current issue of the journal Knowledge Management Research & Practice. It is available to download for a limited time.

The article takes as its starting point a dispute on the actKM list about the meaning of knowledge management. More specifically, it is driven by Joe’s frustration at the lack of agreement on what KM is:

The problem of lack of agreement on what KM is, suggests four possibilities:

  • people can be doing KM and calling it KM;
  • people can be doing KM and calling it something else;
  • people can be doing non-KM and calling it KM; or
  • people can be doing non-KM and calling it non-KM.

These possibilities exist from whatever point of view KM is defined. The first and fourth represent no problem if one wants to evaluate KM, but clearly, without agreement on what KM is, the second and third introduce serious problems in any evaluation of KM’s impact or effectiveness. And the more frequently these possibilities occur, the greater the error introduced into KM’s track record, regardless of the truth of impact models developed to assess the impact of instances of the first possibility.

How frequently do the second and third possibilities occur? Clearly, the more there is disagreement about what KM is, the more second and third possibilities exist, and the more any track record evaluating KM, either formal or informal, will be distorted and misleading in telling us what percentage of KM efforts are successes.

Joe goes on to explain his view of KM, which is that it is the top tier in a three-tier model of business processes. (The diagram here should help in understanding these, and there is another article by Joe and Mark McElroy that explains them in more detail.) The three tiers are operational business processes (the basic work done to create the outputs we expect of the business), knowledge processes (which seek to fill the epistemic gaps preventing effective participation in operational business processes), and knowledge management (which aims to fill the systemic gaps that obstruct effective knowledge processing). More succinctly, Joe defines knowledge management as “the set of activities and/or processes that seeks to change the organization’s present pattern of knowledge processing to enhance both it and its knowledge outcomes.”

The article goes on to define and give examples of two different approaches to knowledge management. Joe argues that all KM interventions can be classified into these two types. Once we understand this, he claims, it is easier to determine which interventions are successful and which are not. I am still grappling with this aspect of his work, but I have found the three-tier model very useful in explaining to my colleagues (in the KM team and elsewhere) exactly what it is that we do (and, just as importantly, what everyone’s responsibilities are).

In the same edition of Knowledge Management Research and Practice, there is an article by Daniel G. Andriessen: “Stuff or love? How metaphors direct our efforts to manage knowledge in organisations”. This sets out to examine how our understanding of some basic concepts is moulded by assumptions we make about the context in which we think of those concepts. (There is also a conference paper on his website covering the same ground in more detail.)

When one uses abstract concepts without giving examples or stories to illustrate what one means, people will impose their own understanding of the concepts and so misunderstandings can arise. We deal on a daily basis with abstract (and contested) nouns like “knowledge”, “learning”, “information” and “development”, so what we do is always open to misunderstanding.

One way in which people manage abstract concepts is to liken them to other things — either physical objects or other abstractions that are more familiar and easier to comprehend. The choices they make at the outset will determine some of the conclusions that they come to. Andriessen’s article outlines a small piece of research into the use of different metaphors for knowledge, and the impact those metaphors have on people’s views of valuable knowledge activities.

In the research, a group was asked to think of knowledge as water, and then as love. Using each of the metaphors, they were guided through a set of exercises designed to extract their views on what their organisation should do about knowledge. The end result was striking.

I asked the participants to identify a number of problems related to KM in their organisation and think of a number of solutions. However, I asked them to do this using a particular metaphor for knowledge. First I asked them to do this using the KNOWLEDGE AS WATER metaphor. This resulted in a number of problems and solutions…. [M]ost of these are in line with [a] mechanistic approach to KM….

Then I asked them to do the same, but this time using a metaphor that is much more in line with an Eastern view of knowledge. I asked them to discuss problems and solutions regarding knowledge while thinking of KNOWLEDGE AS LOVE. What happened was quite remarkable. The topic of conversations changed completely. Suddenly their conversations were about relationships within the organisation, trust, passion in work, the gap between their tasks and their personal aspirations, etc.

The third article, “Putting Ideas to Work” by Thomas H. Davenport, Laurence Prusak, and Bruce Strong was published as an insert to the Wall Street Journal earlier this week and is also available from the MIT Sloan Management Review. It highlights that KM is not a single thread of activity — it has to encompass knowledge creation, knowledge sharing and knowledge dissemination — and that it needs to depend more on solutions that are not based on technology. As Ron Friedmann points out, this is not particularly controversial in the KM community (although it might come as news to organisations who think that KM is done when the know-how database is installed). However, Ron also points to a more subtle conclusion:

I’ve frequently written about legal KM morphing into practice support. As I read this article, it suggests that corporate KM is being absorbed by the building blocks of other functions. Sounds like a similar theme to me, only one that is not articulated.

This is an interesting theme, and I hope to be able to dwell on it at some point in the future, perhaps building on some of the insights in the other two articles.

Document management and collaboration

James Dellow has neatly summarised a discussion about the relative merits of wikis and document management in law firms. Reading both reminded me that I owe a former co-conspirator my view on document management systems as a tool for collaboration. I hope what follows will suffice.

Like most law firms, we have a document management system (DMS). It was adopted some time ago as a replacement for personal network folders. Compared to what it replaced, the principal benefits of the DMS for us are:

  • capture of key pieces of metadata at the point of document creation or storage
  • an effective audit trail and versioning capability
  • ease of search

We are now in the throes of a project to change the way the DMS works so that documents are presented to users in a set of ‘workspaces’ and folders, rather than as a mass of undifferentiated records. (The suppliers call this mode ‘matter-centric’, which is fine for the lawyers, but not especially meaningful for business services people or for work which is not a formal client matter, so we are referring only to electronic workspaces.)

The outcome of this work will be to enhance the potential for collaborative document creation and editing. In the old network folder model, a document clearly ‘belonged’ to the person whose filespace it was stored in. Without an effective search mechanism it was often difficult to find documents without guidance from their author, and practically impossible to discover interesting (and useful) information serendipitously. This changed radically when we first implemented the DMS. As documents were stored in a common space, people were more inclined to work jointly on them. The openness of that space also meant that people could see much more clearly what was going on around them. However, there were still cultural and technical obstacles to deep collaboration.

In order to protect the integrity of documents, the DMS locks them while they are being edited. This, naturally, means that they need to be unlocked before being available for editing by anyone else (read-only viewing is still possible). Because of the variety of ways in which people access documents — at the desktop, through a web client, or checked out to a laptop for offline working — it is often the case that documents are unavailable for editing for significant amounts of time. This technical issue leads people to revert to thinking of documents as ‘belonging’ to their first author.

As the DMS holds all of our documents, it is essential to be able to apply some form of document-level security. The document creator can restrict access (to view and/or edit) to individuals or groups, but typically our people have come to use this setting in a much less granular way — it boils down to a simple choice between complete openness (document open to all to view and edit) and absolute secrecy (where the document is effectively invisible to everyone else). The middle setting — read-only for everyone but the author — is also used widely by people who have discovered that the way the DMS locks documents when they are opened can lead to them being locked out of their own documents. As a result, although the default system setting is for openness, many people have chosen more restrictive settings that limit the information capacity and collaboration potential of the DMS.
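
To make that behaviour concrete, here is a very rough sketch in Python of the locking and access model described above. The classes and names are hypothetical; our actual DMS is a commercial product and certainly works differently under the hood.

    # A rough, hypothetical model of the locking and access behaviour described above.
    # Not our actual DMS: just an illustration of why documents end up feeling 'owned'.
    from enum import Enum
    from typing import Optional

    class Access(Enum):
        OPEN = "everyone may view and edit"
        READ_ONLY = "everyone may view, only the author may edit"
        PRIVATE = "invisible to everyone except the author"

    class Document:
        def __init__(self, author: str, access: Access = Access.OPEN):
            self.author = author
            self.access = access
            self.locked_by: Optional[str] = None  # set while the document is checked out

        def check_out(self, user: str) -> None:
            """Lock the document for editing, if this user may edit it at all."""
            if self.access is Access.PRIVATE and user != self.author:
                raise PermissionError("this document is not visible to other users")
            if self.access is Access.READ_ONLY and user != self.author:
                raise PermissionError("read-only for everyone except the author")
            if self.locked_by is not None:
                raise PermissionError(f"locked by {self.locked_by} until checked back in")
            self.locked_by = user

        def check_in(self, user: str) -> None:
            """Release the lock so that someone else can edit."""
            if self.locked_by != user:
                raise PermissionError("only the user holding the lock can check it in")
            self.locked_by = None

Anyone who has been locked out of their own document by a forgotten check-out will recognise why the more restrictive settings are so popular.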

For these reasons, at least, I am not convinced that a formal DMS facilitates collaboration particularly well. For law firms, the features offered by a DMS to protect business-critical documents are likely to be more important than full-blown collaboration. However, there are other documents where a more informal sharing of responsibility is appropriate.

In an environment where the robustness and wizardry of a full-blown DMS is less important than facilitating collaboration, such as for academic writing, I think a wiki probably suffices. My experience of collaboration in an academic context is limited to co-authoring with just one other person at a time. However, even this small-scale sharing of responsibility is different in nature to the collaboration I see in a law firm. I imagine that scientific papers with six or seven authors will be different again. In academic collaboration, the audit trail tracking who has read and printed a document is less significant than a record capturing each and every edit, whether minor or not. I would have welcomed being able to use a wiki page to facilitate my co-authoring activities, in preference to Microsoft Word. I am not sure what value a DMS would bring to academic collaborations that a wiki could not offer. Culturally, and technically, the wiki appears to be better suited to the flexibility of academic relationships.

Effectively, I think the reason for this is that the object of a DMS is different from that of a wiki. As its name suggests, the DMS is all about documents (which are containers for content). I think a wiki is less about manipulating documents, and more about the content itself (and, in part, the human and information relationships expressed by that content).