2009 Prediction: There Will Be Pronouncement That Unnecessary Interruptions and Information Overload Tops $1 Trillion ($1,000,000,000,000)

February 17, 2009 at 5:05 pm | Posted in Attention Management, etiquette, Information Work, interruption science, knowledge management | 10 Comments

Commentators and average folk alike were aghast as the amount of the financial bailout crept towards the $1 trillion mark.  But as Congress backed away into sub-$800bn territory (for now), another cost is likely to be announced that beats them to this lofty mark: the cost of information overload.

These Basex figures get quoted a lot in the press and, while I do believe that many people and organizations suffer from information overload, I’m not buying into attempts to quantify it, and certainly not at a price tag of $1,000,000,000,000.  In fact, I think there’s long-term harm in trying to get people to act by shocking them with inflated numbers.  Just look at knowledge management.  KM was a real issue and a worthy cause too, before it was done in by money-losing attempts to recover the huge dollar estimates of its inefficiencies.

How do I know this pronouncement is coming?  I’ve been following their stats on “unnecessary interruptions” for some time.  They went from $588 billion in 2005 (for interruptions without the “unnecessary” tag) to $650 billion in 2007 (you’d think the number would decrease when just the unnecessary ones are counted, but it jumped up instead).  In December, they posted a blog entry saying

According to our latest research Information Overload costs the U.S. economy a minimum of $900 billion per year in lowered employee productivity and reduced innovation.  Despite its heft, this is a fairly conservative number and reflects the loss of 25% of the knowledge worker’s day to the problem.  The total could be as high as $1 trillion.

I’ll have to examine that new research more.  How do the interruption and information overload numbers intersect?  Are they separate (totaling $1.6 trillion?), or are interruptions part of information overload (which makes sense, but then why is the umbrella number smaller than the 28% of the worker’s day previously quoted for interruptions)?

If the $1 trillion figure is anything like the $650bn number, I’m not going to buy it.  I haven’t seen full disclosure of their methodology for workplace interruptions, but from what I could glean there were potentially several techniques used to generate a large number:

1. By lumping in social interactions and distractions with interruptions

Just lump all time-wasting annoyances, distractions, and socializing in with the more scientific-sounding “interruptions” and you’ll get a pretty big number. Or better yet, don’t define interruption in any strict sense and survey takers will do the lumping for you. You’ll be able to lump 28% of the average information worker’s day into this category.

2. By counting all costs and no benefits (quote “total cost” instead of “net cost”)

How do you lose $10,000 at blackjack while walking out with the same $100 you went in with?  Simple, just tally up all the losing hands and ignore the winning hands.  Play 200 hands at $100 per hand, win half and lose half, and you’ll come out even.  But that means you lost $10,000 (the total of the 100 losing hands)!
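
The arithmetic is easy to sketch. This toy snippet (hypothetical numbers) shows how a break-even session still produces a scary “total cost” if you only tally the losing hands:

```python
# Toy session: 200 even-money hands at $100, half won and half lost.
BET = 100
HANDS = 200

results = [BET] * (HANDS // 2) + [-BET] * (HANDS // 2)

net = sum(results)                                # what actually left your wallet
total_losses = -sum(r for r in results if r < 0)  # the headline "total cost"

print(net)           # 0 -- you walked out even
print(total_losses)  # 10000 -- yet the losing hands "cost" $10,000
```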

If you don’t like the blackjack analogy, then plug in your own one-side-of-the-coin analogy.  How about totaling up just the expenses on a large company’s income statement without subtracting them from revenue, and being shocked at how big the number is and at how much even a small improvement in it would appear to be worth?

One non-Basex study I saw asked how many interruptions people had, then assumed 50% of them were unnecessary based on other research.  Fine, but then interruptions as a whole average out, don’t they?  You can still optimize – a company that’s at break-even can always reduce costs – but the size of the total cost pool is not the issue then.  The study counted all the losing hands (calling them “unnecessary”) and ignored the winners.  The implication is that you can keep all the necessary ones and chip away at the unnecessary ones, but who gets to judge an interruption as “unnecessary”?

3. By ignoring closed-loop analysis

Here’s a surefire way to double the $10,000 in losses I quoted in #2.  Just interview everyone at the table (you and the dealer) and add up all their losses.  Since the half I lost was $10,000 and the half the dealer lost was $10,000, that’s $20,000 in total losses at that table.  But we both came away even!

Basex went to some effort to quantify “unnecessary” as not urgent, not important, could have been done another way, etc.  But if you ask individuals this instead of both sides of each transaction, you’re just interviewing the dealer and the player about their blackjack losses and forgetting that quite often one wins when the other loses.  Almost every model I can think of for interruptions (see interruption patterns) results in one of the parties losing on the deal, so pretty much every interruption will be counted as unnecessary by someone; without closed-loop analysis, almost every interruption will get incorrectly totaled.

You need to do closed-loop analysis – treating each interruption as an interaction between the interrupter and those interrupted and determining, as a whole, whether it was useful to the organization.  Most interruptions are useful to someone – otherwise why would they do it?  (I propose only a small proportion are careless etiquette transgressions.)  If it’s a matter of self-important timing on the part of the interrupter, consider whether there is ever really a “good” time you could push these interruptions to.
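
A toy model makes the difference concrete. The numbers below are invented for illustration; the point is that per-party surveys count every loss while closed-loop accounting nets out each transaction:

```python
# Toy model: each interruption is a transaction between two parties,
# (minutes gained by the interrupter, minutes lost by the interrupted).
# All numbers are invented for illustration.
interruptions = [
    (10, -4),   # quick answer saved the asker 10 minutes, cost the expert 4
    (3, -8),    # marginal question: small gain, larger cost
    (15, -5),   # big win for the asker
    (2, -12),   # probably should have been an email
]

# Per-party survey: each side reports only its own losses, so every
# negative entry gets tallied as "unnecessary" cost.
survey_cost = -sum(cost for _, cost in interruptions)

# Closed-loop analysis: net value of each whole transaction.
net_value = sum(gain + cost for gain, cost in interruptions)

print(survey_cost)  # 29 minutes of reported "cost"
print(net_value)    # 1 minute of net gain to the organization
```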

4. By playing loose with the definition of “unnecessary”

Reversing a question can help validate it.  In this case, ask the question from the other side to see if you get the same answer.  Ask each survey taker how many times they interrupted someone else that day and how many of those were unnecessary.  If the interrupter thinks it was necessary, shouldn’t a conservative estimate give them the benefit of the doubt?  I predict the difference in results between the question that yields $900bn and this one would be enormous.  Only a small portion would be because the interrupter forgets they interrupted someone – the rest is the inaccuracy of the methodology.
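
As a sketch of the asymmetry (the judgments are made up, purely to show the mechanics), the same set of interruptions yields very different “unnecessary” counts depending on which side you survey:

```python
# The same ten interruptions, judged by each side of the transaction.
# Tuples are (interrupter says it was necessary, interrupted agrees);
# the judgments are hypothetical.
events = [
    (True, True), (True, False), (True, False), (True, True), (True, False),
    (True, True), (True, False), (True, False), (True, True), (True, False),
]

# Original framing: count whatever the interrupted party flags.
flagged_by_recipient = sum(not agrees for _, agrees in events)

# Reversed question: give the interrupter the benefit of the doubt.
flagged_by_sender = sum(not says for says, _ in events)

print(flagged_by_recipient)  # 6 of 10 flagged "unnecessary"
print(flagged_by_sender)     # 0 -- every interrupter thought they had a reason
```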

In common parlance, any unnecessary activity interrupts a necessary one you’re working on.  Have to stop working on your coding to go to a stupid meeting?  That meeting interrupted your coding.  That’s one hour of interruption plus 15 minutes to get back to what you were doing.  Decide to take a break and look at email, and then get sidetracked by a dumb one?  The email “interrupted” you unnecessarily.  If you want to let survey takers count all unnecessary activities as “unnecessary interruptions,” that’s fine, but throwing interruption technology and etiquette solutions at the general problem of business inefficiency is like throwing a pebble at a wall to knock it down.  The survey definitions and the solutions have to use the same definition of “unnecessary”.

5. By comparing against perfect short-term productivity instead of long-term sustainable productivity

Yes, people take breaks and, being social creatures, they often interrupt others to do it with them.  People need breaks.  Even the best runners have to pace themselves for a marathon.  I calculated that optimal performance for the best marathon runner is obtained by running at only about half speed.  What if you spend a bunch of time and effort getting people to eliminate certain time-wasting habits, and they just re-fill that time with other habits because they need or want that time?  It may be worth figuring that out before throwing a lot of time and money away.

So you’ve figured out by now that I don’t buy the $900 billion number, and I certainly won’t when it hits a trillion.  Maybe the surprising part, if you don’t regularly read my blog, is that I’m very much a believer that attention management is a useful approach and that organizations and individuals can take real steps to manage their attention better (for enterprises see my Enterprise Attention Management conceptual architecture; for individuals my Personal Attention Management tips).  But I also believe in having an accurate picture of costs and benefits.

Another techno-cultural topic I believe in is knowledge management.  KM’s basic tenets were sound: that knowledge (or at least “information,” if you don’t want to sound too pompous about it) is an asset just like a factory or an employee and needs to be managed as such.  But KM became a dirty word after a few years of consultants exaggerating the size of the problem and what could be done about it.  It’s taken about a decade for KM to get back on its feet, and only now under new names so as not to alarm those burned by KM before.  I don’t want this to happen to attention management and information overload too.  It’s a real problem, but a complex one that is impossible to pin a real number on.  And it has real solutions that can help when recognized problems exist – if you don’t promise too much.

Note: This version has been updated due to a helpful comment from Mark Worth pointing out the shift from quoting “unnecessary interruptions” to “information overload”.


Mind Mapping: Is It Finally Going to Take Off?

November 12, 2007 at 9:53 am | Posted in knowledge management | 3 Comments

Mind mapping, concept mapping, and brainstorming tools have been around a long time, and the question has always been whether they will ever take off.  I’ve been using them for a while now in my research, first FreeMind and then Mindjet MindManager Pro (after getting an eval copy).  My research process involves reading a whole lot from many different sources and angles on a topic until a mental map of the topic and a conceptual model that I’d like to convey start forming.  That’s when I start the mind map.  Then, as I continue to read, I decide whether each new piece of information is already accounted for in my model, is an enhancement I can add on, or points out a flaw, in which case I need to fix it.  I know I’m ready to start writing when I read several articles in a row with no changes needed to my map, since everything in them is already accounted for in my conceptual model.
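
As a sketch, that stopping rule could look something like this (the Model class and concept sets are stand-ins; in practice deciding whether a source is “accounted for” is a human judgment):

```python
# Stand-in model: real "accounting for" a source is a human judgment,
# approximated here as simple set containment.
STABLE_STREAK = 3  # how many no-change articles in a row mean "ready"

class Model:
    def __init__(self):
        self.concepts = set()

    def accounts_for(self, article):
        return article <= self.concepts

    def update(self, article):
        self.concepts |= article  # add the enhancement or fix the flaw

def ready_after(articles):
    """Index of the article at which the model stabilized, or None."""
    model, streak = Model(), 0
    for i, article in enumerate(articles):
        if model.accounts_for(article):
            streak += 1
            if streak >= STABLE_STREAK:
                return i
        else:
            model.update(article)
            streak = 0
    return None

articles = [{"a"}, {"b"}, {"a", "b"}, {"a"}, {"b"}, {"a", "b"}]
print(ready_after(articles))  # 4 -- three fitting articles in a row by then
```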

These tools occupy a comfortable middle ground between two extremes of structure.  On one extreme, people are used to dumping unstructured information into emails, documents, or presentation slides that later get copy/pasted into some semblance of order.  The larger and more complex the topic, the more difficult this becomes and, accordingly, the more likely the author is to avoid testing out another way of organizing the information.  On the other extreme are structured content creation tools (e.g., XMetal, ArborText) and database forms that require rigid planning and adherence to structure.  But it’s difficult to get people to think in a structured enough fashion for DITA or XML tools.

Mind and concept mapping tools allow the user to enter information in a very unstructured manner, but organize it later.  And that’s really the point – adding a step to the process that allows one to digest the information that has been accumulated and look for patterns.  I think there’s so much emphasis on getting things done that the content planning process gets overlooked.   People spend a lot of time accumulating information – from RSS feeds, from emails, from web pages, from Google searches – but not enough time taking a step back to reflect on what they’ve accumulated.  What does this information tell you?  How do you try out different hypotheses to see which connect up the most information? 

Your standard productivity suite can provide some relief. In Microsoft Office, OneNote allows little content pieces to be maintained separately, organized with tabs, and dragged around, which is a little better.  And Visio 2003 had a brainstorming template that drew mind maps, although its inflexibility with page size when entering large maps made it of limited use.

As useful as brainstorming tools can be, I don’t think they are for everyone or every situation.  They are mostly useful for discovering patterns and innovating.  Putting your grocery list in them is a waste of time since there’s no creativity involved.  Unless you own a chain of different ethnic restaurants and are trying to do menu planning.

The real fun will begin when these tools get collaborative.  I don’t mean tracking changes on them, but treating them like wikis, where someone’s body of knowledge (not just the information, but their way of organizing, categorizing, and drawing conclusions from it) can be posted to a Myspace page, blog, or workspace, and others can build on it, adding new examples that confirm a view, pointing out others that contradict the model, or showing how new ways of organizing the model make it clearer.

Accordingly, these tools will have to get better at enabling multiple conceptual models to be applied to the same data points, saved, and quickly switched between.  In preparing for a speech, for example, it’s common to want to flip between a chronological and a topical narrative until you’ve decided on the best choice, but changes to the nodes themselves should appear in either organizational scheme.
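
A sketch of what that implies for the data model (hypothetical node and view names): the content lives in one place, and each saved organization is just an ordering over it, so an edit shows up in every view:

```python
# Hypothetical map: content nodes stored once, with each saved view
# being just an ordering over the same node ids.
nodes = {
    1: "Intro anecdote",
    2: "Market data",
    3: "Recommendation",
}

views = {
    "chronological": [1, 2, 3],
    "topical": [2, 1, 3],
}

nodes[2] = "Market data (updated Q3)"  # edit the node once...

for name, order in views.items():
    print(name, [nodes[i] for i in order])  # ...every narrative reflects it
```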

Still, I think there’s a bright future for mind and conceptual mapping tools and I encourage anyone who hasn’t tried one of these tools to take one for a spin.

Turning Informal Networks into Formal Ones

October 17, 2007 at 2:36 pm | Posted in collaboration, knowledge management, social software | Leave a comment

McKinsey has just authored another article in what could be considered a series on collaboration in organizations. The first two (“Mapping the Value of Employee Collaboration” and “Competitive Advantage From Better Interactions”) were very good at articulating the value of collaboration from a point of view (and a source) that business executives can understand. I use them frequently when setting the stage for the usefulness of collaboration technology at an infrastructure level (in other words, not applied to a specific one-off project).

But I’m not as thrilled about the next in the series. The new article is about formal networks, but is called “Harnessing the power of informal employee networks”. The change in nomenclature comes about because the authors’ answer to how to harness the power of informal networks is to make them formal.

After describing how prevalent, necessary, and useful informal networks are, the article describes how they can be made more useful by providing a leader (“coordinator” is a better term since the person has no managerial authority), a budget, evaluation measures, and IT resources (wikis, document libraries, and the like).

There’s no doubt in my mind that the type of formal network they describe is better than the matrix org chart they say it replaces. And for companies that have no similar mechanism in place (centers of excellence, communities, sponsored “birds of a feather” get-togethers), a formal network would certainly be an improvement.

But I think a formal network is just one type of network or community and that the authors should have been more careful to acknowledge that and provide tips for differentiating which networks are appropriate for formalizing and which aren’t.

I also felt the article did not address several tricky issues related to the formation and subsequent growth of the network such as:

  • Does one wait for networks to self-organize and then (at what point?) jump in and pave the cowpaths by making it formal?
  • How comfortable can members be about leaving the network if they do not see the value?
  • How do people outside the network petition to join?
  • Given how the article demonstrates the importance of a “linchpin” in a network, how does the network retain resiliency in the case that the leader/linchpin is a jerk?
  • In what cases should an informal network be left alone and not formalized?

Overall I still believe the article to be a good one. It’s based on a lot of research, whereas my observations are based on a 12-page summary, personal experience, and gut feel, so I may not be capturing the whole essence of their points. And I do think formal networks are a very valuable form of network. Just not the only form of network.

iWow! iPhone Kills Dozens of iTrees and Ships in its Own iBox

August 15, 2007 at 11:39 am | Posted in knowledge management, Mobile and pervasive computing, Web 2.0 | 2 Comments

If you haven’t seen the video of a woman opening her 300 page iPhone bill, check out the article and link here.  I’ll admit – I’m not currently a fancy phone kinda person, so you won’t see me commenting a lot on the mobile industry unless I get assigned that as a research topic. However, brand management and information management are passions of mine and in those terms I consider this a minor disaster.

From my view it’s a cautionary tale in 3 ways:

  • The potential for collateral damage to brand image from partnerships. Manufacturers of products endorsed by athletes have often had to deal with this type of problem. In fact, it has become so prevalent that some companies and event sponsors have decided the risk of collateral damage outweighs the benefit and now avoid such spokespeople. Now it’s Apple’s turn. Apple has earned a strong brand image that associates it with being sleek, streamlined, innovative (not tied to legacy), in touch with young people, and hip. But its relationship with AT&T has resulted in a brand management issue that is getting heavy exposure (including on CNN Headline News) and will associate “Apple” and “iPhone” with something non-sleek, tied to an old way of doing things, unhip, and abhorrent to the values of many young people.
  • The Web 2.0 generation has massively greater power to embarrass large organizations than previous generations. Accordingly, large organizations need to allocate budgets massively greater than those of a generation ago to mitigate this risk through continuous monitoring of legacy and Web 2.0 communication channels as well as a general PR contingency plan for unpredictable disasters.
  • Old information dissemination practices must be reviewed in light of new information demands. When the only thing a cellphone did was make calls (and expensive ones at that), a paper itemized bill made sense. Text messages are far more numerous (an astounding 30,000 for Justine), so the same format is practically useless.  Even if you were interested in the information on those pages, you would have great difficulty finding and using it.

And to those people who say it’s her fault for not selecting e-bill: you should have to opt in to a bill that may require being shipped in a box, not opt out. And I don’t think one would reasonably expect that paper billing would result in a few redwoods’ worth of itemization.

What Do Knowledge Management and Bell Bottoms Have In Common?

May 7, 2007 at 12:21 pm | Posted in collaboration, knowledge management | Leave a comment

They are both back in style. A Bain survey of 1,221 international executives found:

Of course, KM definitions are always fuzzy and this one includes collaboration and innovation as well.  Still, it wasn’t that long ago that executives needed a cootie shot if someone mentioned knowledge management in a meeting.   

Lessons for Social Software and Shaping Organizational Culture

May 2, 2007 at 12:15 pm | Posted in collaboration, knowledge management, social software, Web 2.0 | 1 Comment

Set the way-back machine for 1999, long before “Web 2.0” became a buzzword. I think it’s useful to take what Kanter says as a sociologist studying corporate interaction and apply it to the social software trends we are seeing in 2007. I think she deeply understands the value of networks and innovation, but she asserts the need for actively organized networks. I would be interested to hear her opinions on self-organizing networks, which are much more visible today (I hesitate to say “prevalent” since actual statistics are hard to find) than they were 8 years ago.

This is an excerpt, but I recommend reading the whole article: An Interview with Rosabeth Moss Kanter

S&B: A lot of what you’re describing seems to do with the inability of many companies to transfer knowledge. Even when the hierarchies within companies are knocked down, it sounds like islands are created in their place. And the islands aren’t linked; each has its own path forward. Whoever is developing the food product, for instance, is not talking to the people who are actually consuming it. Is it true that this is something you have to attack, and if so, how do you do that?

ROSABETH MOSS KANTER: It’s very tough to attack when you’re huge and the problem cuts across big distances, even though information technology now theoretically makes it possible to go those distances. The companies that I talk to – many of whom now have “officers for knowledge management” or “chief learning officers” – still say that the human tendency not to share information is getting in the way, even as there is more and more information to share and the need to share it is increasing. The problems here, though, are not really ones of technology. They’re problems of how people communicate and collaborate, and they are also a factor of the amount of work people have; today’s workload itself inhibits sharing something with somebody in another country in another unit in another function.

And so the companies that have been successful at using networks to share knowledge have developed some rules of thumb. For one, they’ve found that knowledge-sharing works best when there’s a regular face-to-face encounter, every quarter or so. In between, people can communicate electronically or by telephone, but the face-to-face component is crucial.

The networks that work better are also actively managed; they’re not just spontaneously self-organizing. Somebody must feel responsible for whether communication is occurring on these issues across the divide. They also work better when people get something out of it that directly benefits the work they do; no one has time for altruism.

Du Pont has made a kind of art of networking because it had somebody who wanted to be the champion of networks. At one point, the company estimated that it had about 400 knowledge networks out of central R&D alone. Some were groups of people who formed a kind of resource network on a particular technical topic, like abrasion or adhesion. Some involved best-practice sharing, where participants identified a project they could do together. Some were task-oriented networks that were given a particular assignment to improve or fix something and would draw more people into accomplishing that task. The plant-maintenance network, for example, had the task of taking out a lot of costs. Eventually, more than 600 people got involved in pieces of that network, and they ended up taking out several hundred million dollars a year.

But one other comment about that. Networks based on things like best-practice sharing tend to fall apart quickly because there’s no real continuing path. It’s basically just getting together to pass along the practice. So these things come and go too.

S&B: So how do you keep a network going?

ROSABETH MOSS KANTER: Well, first of all, not all need to be kept going. Some have their purpose and then die. But even for them you need to create roles in your organizations for people to serve as a kind of ambassador or diplomat whose job is to work with certain customers or for a certain product line and travel from place to place sprinkling seeds of knowledge. “Oh, yes, I was just in Chicago. And here’s what they’ve come up with that you can take to Hong Kong.” Here the technology infrastructure by itself isn’t enough; you need the human infrastructure.

On this point, I was really struck by a recent change at British Telecom. One of the people interested in knowledge management there declared that I.T., information technology, should now be called I.F., information flow. This person is much less interested in technology than in making sure that actual communication occurs between people.

This is one reason the face-to-face factor is so important. One of my globally connected models is an emerging company that’s been growing quickly by acquisition and that has worked hard on this factor. It has a weekly management call that involves people from every office for an hour. The phone call is scheduled at a reasonable time for Tokyo and London and New York, and in each place it’s held in a big conference room with a big screen. The call is backed up by data transmission, so everybody can immediately see the same numbers being presented. Everybody involved knows that they have to make time for this call because it is an occasion to get knowledge quickly from one place to another. This has become part of the way of life at this company. It pulls people together and it keeps them moving.

A Social (Software) Handshake: BEA, Meet Web 2.0; Web 2.0, Meet BEA …

April 9, 2007 at 5:17 pm | Posted in BEA, collaboration, knowledge management, portals, social software, Web 2.0 | 1 Comment

internetnews.com posted an article about BEA’s foray into wiki building, project workspaces, knowledge directory, search, and social networking (“The O’Reilly Factor: BEA Ushers in Web 2.0 Products“). I attended a BEA analyst summit and got a chance to see where they are going with this and came away impressed at the effort, but unsure if there’s enough there to steal a significant chunk of market share rather than just keeping their existing customers happy.

First, a quick summary of what they showed. There are 3 products they added to the AquaLogic line: Pages, Ensemble, and Pathways. For those of you who have been following BEA’s efforts for a while, these were codenamed builder, runner, and graffiti. These will be released in the second half of 2007. Other related existing technologies include AquaLogic Interaction Collaboration, AquaLogic Interaction (the old Plumtree portal), and WebLogic Portal.

  • Pages is kind of a wiki builder or blog builder. That’s a key distinction – rather than being a blog or wiki, Pages is a tool to help people build wiki or blog-enabled applications.
  • Ensemble is a composite application / mashup tool
  • Pathways is BEA’s foray into activity analytics, social tagging/bookmarking, search, and people/expertise location.

First impressions? I am impressed with the scope of what BEA has done, particularly since it was created in house and not just by purchasing one of the myriad small vendors in these spaces. By building rather than buying, the products are more unique and more tied to BEA’s value prop. For example, providing tools to help developers create high-end wikis with data connections and application functionality extends BEA’s target market into the social software space.

BEA certainly has challenges ahead of it, though. To start with, while organizations may mess with individual technologies at a grassroots level, ultimately they are tied into a platform decision. I don’t expect organizations to adopt a new platform or throw out their old one due to any one feature, of course. What you could see is a slew of connected features and a roadmap causing an organization to adopt a platform for an isolated use case (a new partner site, for example) or shift internal market share for a company with multiple platforms.

So if the other platforms were standing still, there could be a shift towards BEA due to their overall social software direction. But they aren’t. IBM came out swinging at Lotusphere with Quickr and Connections. Microsoft has attracted significant attention with SharePoint (WSS and MOSS). Neither addresses wikis or blogs that well, but templates will improve. BEA may have eked out a lead on actual released features, but the leapfrogging will continue.

I also want to mention that I had a nomenclature issue that I share with much of the hype around empowering the user and the Time magazine person of the year being “you”. BEA stated a position that there is a movement from group-centric to user-centric software. Certainly users are becoming more empowered as they post their lives on YouTube, their every moment on Twitter, and their personal personas on Facebook. But from an enterprise point of view, personalization already targeted personal views of information. And while the new technologies help end users post up information, the real key to all this is community-centric software. Static intranets often had hardcoded groups based on organizational hierarchy. Breaking down hierarchy and uncovering the real social networks that information workers live in, in all their fluid, tacit, and inter-related forms, is where the real value is. It’s not wrong to say things are moving from group-centric to user-centric computing – it’s just missing the next step of the metamorphosis.

I will be talking more to BEA soon to dig deeper than I could there, and will post up any additional details I find.

How to Stifle the Wisdom of the Masses

March 28, 2007 at 11:37 am | Posted in knowledge management, social software | Leave a comment

Great blog post from Don Cohen about the importance of letting innovation flourish at all levels of the organization and the tendency of business to stifle it to save face and avoid admitting there are things that don’t go according to plan. He writes about a consulting company where employees took it upon themselves to create a knowledge harvesting system and even won a big contract because of it, but the executives didn’t want the case study published.

I think the fiction he wanted to maintain—that all decisions are carefully deliberated at the top and carried out by those below, that nothing happens by accident—is a damaging one. To the extent that leaders tend to believe it, it stops them from seeking and learning from the innovative ideas and practices that bubble up in odd corners of their organizations. To the extent that they present themselves as the sole source of company wisdom, they stifle the creativity of the people who work for them. (Why bother if leaders won’t listen and then take credit for ideas that survive in spite of their opposition?)

By way of contrast, I think of a story from the early days of Hewlett-Packard. David Packard responded to an engineer who had disregarded an order to stop working on technology that turned into a successful product by calling a meeting of engineers and presenting him with a medal for “extraordinary contempt and defiance beyond the normal call of engineering duty.”

This organization was very lucky that a few consultants took it upon themselves to bring the wisdom of the masses forward. I doubt they will be lucky enough to have it happen again.

The KM Business Case: Assessing the situation

March 12, 2007 at 4:32 pm | Posted in business case, knowledge management | 4 Comments

When talking to clients about business cases for portals, content management, collaboration, or intranets, I have found that they jump too quickly into their spreadsheets or lists of benefits.  Before the number crunching, it’s worth taking a step back to look at what is really being asked for. The intention of this list is to help business case authors be aware of the catches and pitfalls, know the questions to ask of their peers and financial folks, and lay out the territory so they can avoid unnecessary work and find the path of least resistance to gaining approval.

Some questions to ask to assess the situation before diving into the business case:

Why me? Did every initiative have to have a business case? Did e-mail?

Business Case vs. Business Plan: Are you producing a business case (presented to a decision maker who will judge whether to proceed with the recommended direction) or a business plan (written after a business case is accepted, containing details the judge probably does not need to know, such as phasing, schedule, staffing plan, and project plan)?

Business case vs. ROI: A business case is not necessarily numerical. Providing a spreadsheet when the boss expected a textual business case can sink your project, or at least waste a lot of time.

ROI, NPV, IRR, payback, TTV: If you’re doing quantitative analysis, do you have a choice of method? Are the time period and discount rate specified? If not, this just becomes an exercise in picking the most favorable parameters you can get away with.
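To make the point concrete, here is a minimal sketch of how much the choice of discount rate and time period alone can swing an NPV result. All cash flows and rates are hypothetical illustrations, not figures from any real business case:

```python
# Sketch: the same project looks good or bad depending on the
# discount rate and time horizon chosen for the NPV analysis.
# All numbers below are hypothetical.

def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

outlay, annual_benefit = -500_000, 150_000

# Generous parameters: 5% discount rate, 5-year horizon -> positive NPV
favorable = npv(0.05, [outlay] + [annual_benefit] * 5)

# Strict parameters: 15% discount rate, 3-year horizon -> negative NPV
strict = npv(0.15, [outlay] + [annual_benefit] * 3)

print(f"NPV (5%, 5 yrs):  {favorable:,.0f}")
print(f"NPV (15%, 3 yrs): {strict:,.0f}")
```

With the parameters unpinned, the same project can be argued either way, which is exactly why having the time period and discount rate specified up front matters.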

Conceptual vs. Concrete: How fuzzy ("collaboration in the workplace") or concrete (a specific product) is your initiative? Do you have the option of how to scope it? Business cases can be written at several levels: technology, program/project, organizational, and conceptual.

Project vs. infrastructure: A project has a discrete beginning, middle, and end. Infrastructure is ongoing. Quantitative analysis such as ROI was made for discrete projects and is difficult to apply to infrastructure where one does not know exactly all the ways in which the enabling infrastructure will be used. Also, the infrastructure simply makes other assets more efficient (people, systems), rather than directly providing benefit itself, so determining what percentage of improvement is due to the boost provided by the infrastructure is like trying to determine how many milliseconds faster a runner is because of the vitamins she is taking. One common way around this problem is to “projectize” the infrastructure: to take one major and immediate way in which the infrastructure will be used and justify its entire cost based on that project, treating the rest as free icing on the cake (“And we’ll get to use it for other things too …”).

Hard vs. soft: You can think of both costs and benefits as sitting on a 1-10 hardness scale like the Mohs scale of mineral hardness. Provable, measurable impact on a company's profit through increased revenue or decreased expenses, accurately attributable to a direct cause, is a diamond (10). And it's equally rare and valuable. On the other hand, "It will save every salesperson 5 minutes per day" is talc (1): simple, easy, and soft. Helping to decrease printing expenses by providing information online is maybe orthoclase feldspar (a 6; OK, I looked it up on Wikipedia). No non-investment business could function if it required a rating of 10 on the benefits and costs of everything it did. Some faith (faith = (10 – hardness rating) * 10%) is generally involved.

Net benefit vs. status quo: The status quo – the cost of doing nothing and proceeding as always without the infrastructure project – is generally, and incorrectly, assumed to have a cost and benefit of $0. This math ignores the possibility that the status quo itself has costs and benefits, particularly if the environment is changing in some way.
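A short sketch of the arithmetic, with entirely hypothetical figures: if the cost of doing nothing grows as the organization scales, the project's net cost against that baseline can look very different from its sticker price.

```python
# Sketch: comparing a project against a non-zero status quo.
# All figures are hypothetical illustrations.

# Status quo: manual processes whose cost grows 10% a year.
status_quo_costs = [200_000 * 1.10 ** t for t in range(5)]

# Project: $300k up front, plus a flat $100k/yr to run.
project_costs = [300_000 + 100_000] + [100_000] * 4

# Naive view: the project "costs" its full price tag.
naive_cost = sum(project_costs)

# Net view: what the project costs relative to doing nothing.
net_cost = sum(p - s for p, s in zip(project_costs, status_quo_costs))

print(f"Naive project cost: {naive_cost:,.0f}")
print(f"Net vs status quo:  {net_cost:,.0f}")
```

Here the net figure is negative (the project saves money over five years), even though the naive view shows only an $800k expense.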

Prioritization vs. financial testing: The effect of the project on the company's time and money, and on the attention of your team and those touched by the project (including the end users), needs to be prioritized against other potential projects. On occasion, KM project owners clear many financial hurdles only to find their project rejected anyway because other projects were deemed more important or worthy of the effort.

Lottery vs. followup: Is the business case treated as a lottery where winning results in a pile of cash showing up at the door? Or does the eye of the finance department continue to watch you after you win approval? You should have to show you actually used the resources allocated for the project in question and got the benefits you promised.

The KM Business Case: 50,000 Foot View

February 28, 2007 at 2:47 pm | Posted in business case, knowledge management | Leave a comment

I’ve been an analyst since 1998, focusing most of that time on various knowledge management technologies like portal, web content management, and intranets.  One client inquiry that has been consistent over time and between technologies is the need to prove value for knowledge infrastructure.  Sometimes this takes the form of financial analysis (ROI, NPV, time to payoff), and sometimes it is around metrics (how can I show improvement or prove in a year or two that it was worthwhile).

Overall, there are two reasons owners of collaboration infrastructure start work on a business case: because they have to and because they know it's good for them. There's an in-between option: it behooves them to do it now because there's a good chance they'll be asked for it in the future. Even if not explicitly asked for, a business case should be an integral part of any collaboration plan or strategy to validate that the technology is aligned with business goals and objectives.

The journey is the destination when it comes to business cases. When the business case creation process is seen as valuable in its own right rather than just a hurdle to clear before work starts, it can be used to steer the direction of the project and form the basis for an ongoing dialogue with the business.

At a high level, the business cases for technologies that fit under the KM umbrella are very similar (portal, web content management, attention management, intranets, collaboration, search, etc.). Note that I will not distinguish between a business case and a business justification; in my experience the differences are very minor and lead to the same steps.

I’ve talked to 100+ organizations about their business cases over the past 7 years, worked in detail on a few, read through many case studies, and seen many different approaches that worked at different companies. It all comes down to:

1) starting with a worthy project

2) understanding the situation – the why, what, and how

3) building the business case by selecting the most useful methods, out of the many available, for the specific situation, and the line items to apply those methods to

4) presenting the business case (in multiple formats)

5) keeping the business case in mind while executing the initiative

6) following up once the initiative is in place

That represents my high level view of this issue. I’ll post more thoughts in an ongoing fashion. I’ll focus on #2 since I think that’s where most of the misunderstandings occur and business case production often goes astray.

