Productivity Future Vision

April 3, 2009 at 3:37 pm | Posted in Attention Management, collaboration, communication, Fun, Information Work, presence, social software, usability, User experience | 6 Comments

Peter O’Kelly’s blog pointed me to a Productivity Future Vision video from Microsoft Office Labs.  Highly recommended.

A few observations:

  • Airline seats in the future will be wider and have more legroom, even when you aren’t in first class (the seats on the plane don’t look like big, puffy, overwrought first class seats).  Furthermore, they will be clean and not have potato chips from the previous occupant smeared on them.
  • People will use their electronics calmly and be nice to each other.  People in the video seem to calmly make a few gestures, then relax and smile.  It seems that productivity expectations in the future have remained about constant with today rather than increasing along with the improvement in the capabilities of their applications.  The time saved through their more productive interfaces has been returned to the worker to allow them to stop and smell the roses instead of their employers and clients demanding more from them. This will allow people in the future to relax and use their new wondrous equipment in serene happiness.
  • Devices get thinner and more translucent.  But while you may think holding remotes that are as thin as a piece of glass and typing on hard, flat surfaces would be uncomfortable, they will actually be pleasantly ergonomic because people in the future will have dainty hands and features.  There seem to be no obese, elderly, overly tall, or overly short future workers.
  • There is no need for paper in the future, so working environments remain clean and clutter free.  Come to think of it, there seems to be no need for food, conference SWAG, books, printers, or desk lights either.  This explains the lack of garbage cans in the rooms shown in the video.
  • Office workers will not create content anymore, such as typing long streams of text or slaving over the graphics in the beautiful interfaces they use.  They simply do a few manipulations to content that already exists.  Presumably a new underclass of information workers (I’ll call them “information morlocks”) slave away underground crafting detailed content that the surface dwellers can then use through simple, intuitive, tap-exhale-and-smile interfaces.

OK, a few serious observations:

  • I like the thought leadership I see here.
  • The basis for some truly wondrous technology exists today, such as the machine translation, digital ink, mobile phone projectors, and OLED displays shown in the video.
  • Interfaces with touch and gestures can be much more natural than keyboards and mice.
  • Collaborative workspaces can be made more natural and incorporate many other useful technologies.

While innovative user interfaces and display devices have great gee-whiz factor, I’m really looking forward to much more mundane improvements in productivity.  To name just a few:

  • I want to see content created in easy-to-use tools that scale to the needs of the user and produce output that is easily componentized, tagged, reused, and reassembled.
  • I want to see contextual meaning captured, and guidance provided through integrated ontology and machine memory during authoring, to enable better translation and localization.
  • I’d like to see powerful and consistent reviewing and commenting features across all productivity tools that can discover implicit collaborative authoring processes through observation. 
  • I’d like to see rich presence information that improves productivity by inserting routing and channel switching to messages that determine the most appropriate way to deliver a message while taking both sender’s and receiver’s contextual preferences into account.
  • I’d like to see operating systems and productivity applications working together to create more interruptible environments that fit the time-sliced, interruptible nature of the workplace. They would allow better bookmarking that can save and retrieve the exact state of applications and their relations to each other, easing the burden of remembering what activities were in motion during interruptions and reducing the time to resume work.  Snapshots of window layouts and application states would allow easy, instantaneous switching between multiple workstreams.
  • I’d like to see wearable electronics that utilize personal rich presence, mobile technology, and social networking profiles to alert people to others in their vicinity that share interests (or other programmable searches) and are open to serendipitous conversation.
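The presence-aware message routing in the wish list above can be sketched in a few lines. This is a hypothetical sketch only; the channel names, presence states, and function are invented for illustration, not any vendor's API.

```python
# Hypothetical sketch of presence-aware message routing: pick a delivery
# channel from the sender's urgency and the receiver's current context.
# All names here are illustrative inventions.

def route_message(urgency, receiver_presence, receiver_prefs):
    """Return the channel that best fits both parties' context."""
    if receiver_presence == "in-meeting":
        # Never interrupt a meeting unless the sender marked it urgent.
        return "sms" if urgency == "high" else "email"
    if receiver_presence == "online":
        return "im" if urgency in ("high", "normal") else "email"
    # Offline: fall back to the receiver's preferred store-and-forward channel.
    return receiver_prefs.get("offline_channel", "email")

print(route_message("high", "in-meeting", {}))   # sms
print(route_message("low", "online", {}))        # email
print(route_message("normal", "offline", {"offline_channel": "voicemail"}))
```

A real implementation would of course weigh many more signals (calendar, device, history), but the core is just this kind of two-sided lookup.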

I might be waiting a while …


A Personalized View of the Microsoft Office Ribbon Bar UI

September 30, 2008 at 11:07 am | Posted in Information Work, Office, usability, User experience | 2 Comments

The Wall St. Journal had a section on the Technology Innovation Awards yesterday (9/29/08), which included a trends section called “The Latest Buzz On …” on page R2.  In it, user interface guru Jakob Nielsen praises ribbon bars and, in particular, the ribbons in Microsoft Office 2007 (like those in PowerPoint 2007 below).  I’m going to disagree with Jakob here, and it isn’t the first time.  I’ve been diving into Office 2007 more extensively lately and am not a fan of the new UI.

PowerPoint ribbon (screenshot)

As a UI, it seems to have sacrificed personalization for context.  By context I mean the drawing tools appear at the top when you click on a drawing object and otherwise remain hidden so as not to distract you.  That’s nice.  But the toolbar used to adapt to who I am, not just what I’m doing.  If I was a user of the indent feature, it would show up on the toolbar, and if I didn’t use it, it would eventually disappear since there isn’t room for every icon.  If I wanted to have the review toolbar float near the text and keep other options at the top of the screen to fit my personal work style, I could.  In fact, I could move any toolbar to float or dock on any side of the screen.  Now toolbars can only appear at the top, and you can only customize the Quick Access Toolbar, which is permanently docked.

Besides that, there are still many items that seem to be randomly placed.  There is only so much screen real estate on the ribbon to lay out commands and have them attractively grouped, so certain commands couldn’t fit in their optimal spot.  Does “research” belong under Review?  Doesn’t turning on “snap to grid” in PowerPoint belong under some menu option – any menu – rather than having to right click in the workspace?

There is nothing under the “home” tab that one would guess should be categorized under “home”.  Is “home” a function, process, or task I do like insert, review, or view?  Why would I expect to change fonts, styles, and bullets under “home”?  Didn’t “paste” make more sense under “edit” (its old place) rather than “home” or “insert”?

I know that any UI design, particularly that of a complex system such as Office, is a choice between the lesser of evils.  Everyone thinks differently, and I’m sure Microsoft did extensive research to ask people where they would look for things; my brain just isn’t on the same wavelength.  But that’s why I think personalization is so important.  You can never get it just right, so allowing the system to have dynamic last-used, first-shown buttons and movable toolbars helps each user adjust.  Sure, training and support can be a little tougher when icons can move, but I think that problem is minimal compared to everyday use.  And I know you can do anything to the ribbons programmatically, but the old UI struck a better balance for the experienced user who doesn’t want to dig into code or buy a 3rd-party product – like the ones that put the interface back to what it was before.
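The "dynamic last-used, first-shown" behavior I'm arguing for is simple enough to sketch. This is a minimal illustration of the idea, not how Office actually implemented its adaptive menus: rank commands by frequency and recency of use, and let the long tail sink out of view.

```python
# A minimal sketch of a self-personalizing toolbar: commands the user
# actually uses float to the visible slots; unused ones drop away.

from collections import Counter

class AdaptiveToolbar:
    def __init__(self, commands, visible_slots=5):
        self.commands = commands
        self.visible_slots = visible_slots
        self.usage = Counter()     # command -> times used
        self.recency = {}          # command -> last-use tick
        self.tick = 0

    def use(self, command):
        self.tick += 1
        self.usage[command] += 1
        self.recency[command] = self.tick

    def visible(self):
        # Rank by (frequency, recency); ties keep the default order.
        ranked = sorted(self.commands,
                        key=lambda c: (self.usage[c], self.recency.get(c, 0)),
                        reverse=True)
        return ranked[:self.visible_slots]

tb = AdaptiveToolbar(["bold", "italic", "indent", "paste", "font", "review", "macro"], 3)
for cmd in ["paste", "paste", "indent", "bold", "paste"]:
    tb.use(cmd)
print(tb.visible())  # ['paste', 'bold', 'indent']
```

The old Office menus did roughly this kind of bookkeeping per user; the point is that a few counters per command are all it takes.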

Enterprise Communication Meets the World of Warcraft

April 10, 2008 at 8:36 am | Posted in communication, Gaming, presence, usability, User experience, virtual worlds | 2 Comments

I’m working on my Enterprise Virtual Worlds presentation and was filling in some detail on communication in game-oriented virtual worlds that I would like to share here as well. 

Enterprises are wise to look to gaming from time to time due to trends in:

  • Outside-in technology: how consumer technologies such as blogs and wikis increasingly find their way into enterprises
  • Emergent gameplay: the use of gaming technology in ways the original designer hadn’t intended
  • User experience lessons: UE improvements tend to filter from the competitive gaming market to generalized applications.  Gaming is an optional activity, so UE has to be at a high level when you want users to pay you to use your systems rather than the other way around.

Communication is interesting to explore since the number of communication channels that enterprises use (and every information worker must now attend to) has increased a great deal over the past five years to include instant messaging, presence, websites, and blogs.  Getting enterprises used to the idea of “channels” and how to manage and select between them has taken some time and some pain.

I was quite impressed when all the methods of communication in World of Warcraft (which was released in November of 2004) were laid out. WoW communication is strikingly similar to (and maybe more efficient than) enterprise communication technology in many areas.

It includes:

  • Channels: Players can subscribe to communication channels such as /trade to receive ongoing chat on the channel, or unsubscribe.  Another example is in EVE Online, which has a “newbie” channel that can put new players in touch with others taking their first steps, but can be turned off once the player is more confident.
  • Chat modes (IM): The variety of built-in IM modes goes beyond most enterprise IM implementations, which rely on groups.  They are: /say (vicinity), /party (your group only), /guild (your broader community), /yell (all in larger region), /whisper (one person)
  • Presence: Friends can be selected and you are made aware when they come online/offline, and location is displayed (a feature still on the cutting edge in the enterprise)
  • Mail: Consists of normal mail, packages, and COD packages. The inbox is visited at WoW Postal Service facilities, which has the pleasant effect of isolating the player trying to accomplish objectives from the stream of email since they only check it periodically when they visit town.  Also, since email costs money to send (a few copper pieces), there is practically no spam
  • Emotes: There are over 100 emotes such as /wave, /thank, /cheer, /dance, etc. It is amazing how fluid the use of emotes gets in the real game, such that they do not feel like a conscious effort to be funny, but rather a natural way of expressing oneself in group situations. 
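The scoped chat modes above are worth a small sketch, because the mechanism is so simple: each mode is just a different audience function over the sender's context. The data layout here is invented for illustration, not WoW's actual implementation.

```python
# Illustrative sketch of WoW-style scoped chat: the chat mode determines
# the audience, computed from the sender's position, party, and guild.
# Positions are 1-D here just to keep the distance check trivial.

def audience(mode, sender, world):
    players = world["players"]          # name -> {"pos", "party", "guild"}
    me = players[sender]
    if mode == "/say":                  # nearby players only
        return [p for p, d in players.items()
                if abs(d["pos"] - me["pos"]) <= 10 and p != sender]
    if mode == "/yell":                 # much larger radius
        return [p for p, d in players.items()
                if abs(d["pos"] - me["pos"]) <= 100 and p != sender]
    if mode == "/party":
        return [p for p, d in players.items()
                if d["party"] == me["party"] and p != sender]
    if mode == "/guild":
        return [p for p, d in players.items()
                if d["guild"] == me["guild"] and p != sender]
    raise ValueError(f"unknown chat mode: {mode}")

world = {"players": {
    "A": {"pos": 0,  "party": 1, "guild": "x"},
    "B": {"pos": 5,  "party": 1, "guild": "y"},
    "C": {"pos": 50, "party": 2, "guild": "x"},
}}
print(audience("/say", "A", world))    # ['B']
print(audience("/guild", "A", world))  # ['C']
```

Contrast that with enterprise IM, where "audience" usually means a manually maintained buddy list or distribution group rather than something derived from context.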

Social Software: Think “Baking Social Interaction In”, not “Blogs, Wikis, etc.”

March 24, 2008 at 7:58 am | Posted in social software, usability | 1 Comment

With all the talk about technologies associated with Web 2.0 and social software such as wikis, blogs, and ratings systems it sometimes helps to take a step back to remember how it is the underlying concept of social software, not just the technologies often associated with it, that is important. And that concept can be applied even without fancy new technology.

The most hip, Web 2.0-anointed technologies can be used in ways that have nothing to do with social software. For example, a blog mechanism could be set up by an organization to allow a single executive an easier way to post announcements with the commenting feature turned off. This would not be social software.

And the other side of the coin is that old technologies can be used to build social software even though they don’t have ready-made components to build in or a fancy meme. For an example of building social software without Web 2.0, I’d like to introduce you to a circa 1990 OS/2 1.3 system called the Cost Tracker.

The Cost Tracker is the first system I ever wrote as a full-time corporate programmer. I was working at a large financial services company during the days when mainframe costs were the largest portion of the IT budget and expensive CPU time made every runaway process and abend a hit to the bottom line. Once a month IT would receive an inch thick printout (green bar paper, holes on the side, fixed space font of course) with all the mainframe jobs that ran, their CPU and DASD (disk) usage, cost, and 15 more columns of stats. The stack would be divided up into sections for each manager and circulated through inbaskets for perusal.

I worked in a central IT unit tasked with executive information systems and internal IT systems. My task was to replace that paper-based system with an easily accessible, graphical system that would make it easier to see the costs, compare actual to planned expenditures, and locate the root cause of costly overruns.

The system I developed looked like this:
IPF Cost Tracker

Note: This is just a mock-up of the system. The data is all faked, but similar to what was there.

For the longest time, I thought the unique value was the left half of the screen. It was an early example of a drill-down business intelligence system which allowed the data to be rolled up to the highest level (departments), and then the user could drill down by clicking on bars or pie slices into more and more detail (each manager in a department, each system for a manager, etc.) until you got a raw spreadsheet-like window with a narrow, manageable slice of the 10,000 rows of raw data. Plan is shown with slashed bars and anything over plan is in red. Now you’d use a BI system or maybe a fancy spreadsheet to do this. But that wasn’t an option in 1990. I got to write all the charting routines from scratch in C and had a blast doing it. I even got to speak on drill-down systems and linked lists as a guest lecturer at Indiana University.
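The drill-down mechanism itself is easy to sketch. This is an illustrative reconstruction in Python, not the original C code, and the rows are invented just like the mock-up's data: raw job-level cost rows roll up by (department, manager, system), and each click narrows the filter one level.

```python
# Sketch of the drill-down rollup: aggregate raw cost rows at any level
# of the department -> manager -> system hierarchy, and filter down as
# the user "clicks" into a bar or pie slice. Data is invented.

from collections import defaultdict

rows = [
    ("Ops", "Smith", "Billing", 1200.0),
    ("Ops", "Smith", "Ledger",   300.0),
    ("Ops", "Jones", "Payroll",  900.0),
    ("Dev", "Lee",   "Testing",  450.0),
]

def rollup(rows, level):
    """Total cost per node at a level: 0=dept, 1=manager, 2=system."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[:level + 1]] += row[3]
    return dict(totals)

def drill(rows, **filters):
    """Narrow the raw rows; mimics clicking a chart element."""
    idx = {"dept": 0, "manager": 1, "system": 2}
    return [r for r in rows
            if all(r[idx[k]] == v for k, v in filters.items())]

print(rollup(rows, 0))                     # department totals
print(rollup(drill(rows, dept="Ops"), 1))  # per-manager totals within Ops
```

Each chart in the Cost Tracker was essentially one `rollup` call over one `drill` result; the hard part in 1990 was drawing it, not computing it.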

But now I think the unique part was really the right half. This is just an edit window where anyone could enter comments that were attached to the specific graph on the left side, so that anyone who drilled to that node or looked at historical data would always have the explanation right there for reference. I wish I remembered whose idea it was to do that – maybe mine, maybe my manager’s or CIO’s. The tendency, even today, would be to consider this a number crunching problem, and the system would be “done” when it allowed you to drill into the data.

To think up this design required lifting one’s head up from what seems like a quantitative system to understand the social process that went along with the numbers. And that social process went like this: In the old paper-based system, when the numbers came out the managers would dig in first to find out how they did that month and prepare explanations if there were any major cost overruns. Then the directors would look at the numbers and ask managers that were over plan what happened. Then the CIO would do the same of the directors. These conversations were highly inefficient, taking place over team meetings or through email, at different times of the month, and without a good way to track explanations over time.

Going beyond analyzing the intricacies of the raw job-level cost data I pulled down from the mainframe to understanding this social process allowed the system to bind the quantitative and the social together. The simple addition of an edit window for each node in the data allowed explanations to be stored in a common form, ready and accessible by directors and the CIO, and available over time in a way that conversations and email could never be.
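The "edit window per node" idea amounts to keying annotations by the same drill-down path used for the charts. A minimal sketch (names and data invented for illustration):

```python
# Sketch of per-node annotations: comments are stored against the same
# hierarchy path the charts use, so an explanation a manager enters is
# visible to anyone who later drills to that node.

comments = {}   # node path (tuple) -> list of (author, text)

def annotate(path, author, text):
    comments.setdefault(path, []).append((author, text))

def explain(path):
    """Everything ever said about this node, oldest first."""
    return comments.get(path, [])

annotate(("Ops", "Smith"), "smith",
         "Billing overrun: month-end reruns after the abend on the 28th.")
print(explain(("Ops", "Smith")))
print(explain(("Dev", "Lee")))   # [] -- no explanation needed
```

That one extra dictionary is the whole "social" half of the system: the numbers and the conversation about the numbers finally live in the same place.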

I see this now as a good example of how social software does not relate to fancy Web 2.0 product categories, but is the powerful idea that understanding and building social processes into software greatly improves the value of these systems by acknowledging and enhancing the interpersonal nature of modern business.

.what? Non-Roman URL Suffix Trial Begins Today

October 15, 2007 at 11:09 am | Posted in Attention Management, Globalization, usability, User experience | 5 Comments

According to ICANN:

The Internet Corporation for Assigned Names and Numbers will launch an evaluation of Internationalized Domain Names next week that will allow Internet users to test top-level domains in 11 languages.

“This evaluation represents ICANN’s most important step so far towards the full implementation of Internationalized Domain Names. This will be one of the biggest changes to the Internet since it was created,” said Dr Paul Twomey, ICANN’s President and CEO. “ICANN needs the assistance of users and application developers to make this evaluation a success. When the evaluation pages come online next week, we need everyone to get in there and see how the addresses display and see how links to IDNs work in their programs. In short, we need them to get in and push it to its limits.”

The evaluation is made possible by today’s insertion into the root of the 11 versions of .test, which means they are alongside other top-level domains like .net, .com, .info, .uk, and .de at the core of the Internet.

Next Monday, 15 October 2007, Internet users around the globe will be able to access wiki pages with the domain name example.test in 11 test languages — Arabic, Persian, Chinese (simplified and traditional), Russian, Hindi, Greek, Korean, Yiddish, Japanese and Tamil.

While it may seem like knowing just enough English to type “.com” is not a problem, the issue is twofold. First, writers of languages with non-Roman alphabets may not have an English keyboard that can type “.com”. They could always copy and paste it from other content when needed, but that brings me to the second point: they shouldn’t have to. The content on the Internet is not owned by the U.S. (even if ICANN is) and being able to use addresses in other alphabets has a great deal of symbolic meaning.
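For the curious, the mechanics behind IDNs can be seen with Python's built-in "idna" codec (which implements the older IDNA 2003 rules): a label containing non-ASCII characters is mapped to an ASCII "punycode" form that the existing Roman-only DNS can carry, and mapped back for display.

```python
# Demonstration of IDN encoding using Python's built-in "idna" codec
# (IDNA 2003). The non-ASCII domain travels through DNS in its ASCII
# punycode form, prefixed with "xn--".

label = "münchen.de"
ascii_form = label.encode("idna")
print(ascii_form)                  # b'xn--mnchen-3ya.de'
print(ascii_form.decode("idna"))   # münchen.de
```

So the user types an address in their own alphabet, and only the wire format is ASCII; the top-level-domain trial described above extends this to the suffix itself.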

I’m currently researching and writing a paper on globalization due out around January. You’d have to be living under a rock to not understand the impact that globalization is having on the demographics of Internet usage and, accordingly, the web technologies, processes, and cultural sensitivity needed to support them. But the recent statistics were still surprising.

The fall of the Iron Curtain (generally considered to be 1989) began a change in market forces that is still being felt in global businesses. For machine translation, SDL reports that Eastern bloc countries account for seven out of the top 10 fastest growing languages for its translation modules in 2007 (Source: http://www.sdl.com/en/events/news-PR/Eastern-Europe-and-China-dominate-2007-translation-trends.asp). Internet World Stats reports that English is by far the most common language on the Internet (with 365 million users versus 184 million for #2 Chinese), but there has been massive growth between 2000 and 2007 for Arabic (+941%), Portuguese (+525%), and Chinese (+470%). The rest of the world’s languages (outside the top 10) still represent 15% of all internet users and had 440% growth from 2000 to 2007.

In terms of usage, Internet usage outside of North America dwarfs usage within it (see table below), although North America has the highest Internet penetration (69%).

Why ICANN picked Yiddish as one of the 11 languages baffles me a bit, though.  Couldn’t they have picked something more common, like Hebrew?  Oy vey.

Placeholding Approaches

September 27, 2007 at 1:22 pm | Posted in Attention Management, interruption science, usability | Leave a comment

I wrote yesterday about the value of placeholding as an interruption management technology.  I believe placeholding is an important but rarely mentioned benefit of virtual and web/online desktops (see “Technology Review: Computer in the Cloud” and “Is the World Ready for a Web-Based Desktop?” ). 

An online desktop lets you use a desktop, like the Windows desktop, from within your browser.  A good list of online desktops is in the Wikipedia article on “web desktops”.  Usually web desktops are touted for their portability, manageability, and low cost, although they are not widely used.  I consider the web desktops out there today more like proofs of concept.  Obviously issues with performance, available software, and offline usage will be difficult hurdles to get over.  But the part I want to concentrate on is the ability that some of them have (Peepel for example) to have tabs on the side of the desktop that let the user immediately switch between desktops with all window placement, icons, and application states intact.  And since these desktops are online, they are persistent as well, so they don’t disappear when you reboot.  Some virtual desktop applications provide similar functionality for installed desktop managers (Windows, OS X, BeOS, etc.) but I haven’t looked into them.

This minor-sounding bit of functionality (not even mentioned in the front-page highlights for Peepel) becomes quite powerful when you start using it.  You can name tabs for activities or states of mind (“house hunting”, “news”, “family budgeting”). When you’re working on house hunting and feel bored or stuck and want to distract yourself, you can click over to news and not be bothered by a desktop littered with your uncompleted task.  Then, when you get interrupted while following a thread of a few stories to answer a quick budgeting question, you can abandon the news and not worry about mentally juggling where you left everything.  It is like scratching an itch you didn’t know you had.
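The persistent-tab idea is conceptually tiny, which is part of the point. A sketch of the core mechanism (class and field names invented, not any real web desktop's API): each named workstream serializes its full window layout, and switching tabs swaps the whole state back in one step.

```python
# Sketch of persistent desktop tabs: each named workstream stores its
# serialized window layout, so switching tabs restores placement and
# state in one step, and the state survives a "reboot" because it is
# persisted (here, just as JSON strings).

import json

class Workspaces:
    def __init__(self):
        self.tabs = {}      # tab name -> serialized window layout
        self.current = None

    def save(self, name, windows):
        self.tabs[name] = json.dumps(windows)

    def switch(self, name):
        self.current = name
        return json.loads(self.tabs[name])   # restore placement + state

ws = Workspaces()
ws.save("house hunting", [{"app": "browser", "url": "listings", "scroll": 340}])
ws.save("news", [{"app": "browser", "url": "headlines", "scroll": 0}])
print(ws.switch("house hunting")[0]["scroll"])   # 340
```

Everything else about a web desktop is incidental to this; the placeholding value is in that one save/restore round trip.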

Adobe Acrobat has a very sophisticated system for bookmarking.  However, its use case is more around the digital equivalent of placing those yellow sticky flags throughout the pages of a book for later reference rather than just keeping your place as you read through the document.  If it were meant for placeholding, there would be only one special “place” bookmark; it would automatically store the current place in the doc when you close it, stay in one place for a long time, or hit a simple “update placeholder” function key, and it would return to that point when you reopen the document.  Adobe Reader does not offer this functionality, although I believe it should.

I’ve heard Vista has some placeholding technology and plan to try it out soon.  This functionality requires active involvement from the developer of the application as well – it can’t be handled by the OS alone, so I expect mixed results.  Microsoft’s GroupBar research project allowed the user to save working state by taking snapshots of windows, groups, or the desktop and restore their state later.  Also, the Microsoft TaskGallery research project is a fancy 3D approach to managing multiple desktop states.  I’m hoping that once users get a taste of placeholding, even in a few apps, it clicks and they become more vocal about requesting it.

Placeholding: Doesn’t Cure Interruptions, But It Reduces Symptoms

September 26, 2007 at 8:21 am | Posted in Attention Management, interruption science, Office, usability | 1 Comment

I’ve become convinced that one of the most significant attentional technologies that software vendors could incorporate to accommodate interruptions is what I’d call placeholding.  Since “bookmarking” has come to mean pointers to specific entries rather than points anywhere within an entry I prefer the word “placeholding”.

Why do applications that allow you to move around large pieces of content (Microsoft Word and Excel, Adobe Reader, Microsoft Internet Explorer, Mozilla Firefox) always assume you want to start at the beginning when opening a document instead of where it was the last time you looked at it?  More than half the time I think I would want it opened to where I left off when I last closed it.  And if I didn’t, it’s easy enough to hit ctrl-home to get to the top, whereas it is impossible to start at the top and hit a key to get to where you left off.  At a minimum it could be a preference checkbox.  This placeholding includes where the cursor was as well as what state various toggle buttons and selections were in (such as that I was in boldface, red text, the highlighter was yellow, and I had just selected a region).

There are some technical issues to be worked around here.  Sometimes you don’t want to modify the file – a separate placeholder file, like a browser’s bookmark file, could accommodate this issue.  Sometimes there are multiple users on a PC or files get shared – storing the user name along with the place, like some cookies do, could fix that.  I don’t think the technical issues are a stopping point.  We tolerate other actions that don’t guess what we want to do correctly 100% of the time and have a lot less benefit.
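Both workarounds mentioned above (a sidecar file so the document stays untouched, and keying by user so shared PCs and shared files behave) fit in a few lines. A minimal sketch, with invented function names and an invented store location:

```python
# Sketch of sidecar placeholding: last positions live in a separate JSON
# file (the document itself is never modified), keyed by user name plus
# absolute file path (so shared PCs and shared files each keep their own
# place). Store location and names are illustrative only.

import json
import os
import tempfile

STORE = os.path.join(tempfile.gettempdir(), "placeholders_demo.json")

def _load():
    try:
        with open(STORE) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

def save_place(user, doc_path, position):
    data = _load()
    data[f"{user}::{os.path.abspath(doc_path)}"] = position
    with open(STORE, "w") as f:
        json.dump(data, f)

def last_place(user, doc_path, default=None):
    return _load().get(f"{user}::{os.path.abspath(doc_path)}", default)

save_place("alice", "report.docx", {"page": 12, "cursor": 8842})
save_place("bob", "report.docx", {"page": 1, "cursor": 0})   # same file, separate place
print(last_place("alice", "report.docx"))   # {'page': 12, 'cursor': 8842}
```

An application would call `save_place` on close (or idle) and `last_place` on open; the only real design decision is what goes in the position record.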

In “No Task Left Behind? Examining the Nature of Fragmented Work”, a paper by Gloria Mark, Victor M. Gonzalez, and Justin Harris from the University of California, Irvine, they talk about technology requirements for supporting fragmented work:

Two decades ago Bannon et al. suggested a set of requirements for information technology to support multitasking including providing fast task-switching and the easy retrieval of mental context. Our work expands these requirements for multi-tasking. We suggest three main directions for supporting multi-tasking behavior: 1) interruptions ideally should match the current working sphere in order to provide benefits instead of disruptions, 2) one should be able to easily and seamlessly switch between tasks, and 3) interrupted tasks should be easily recoverable by preserving the state of the task when it was interrupted and by providing cues for reorienting to the task.

It’s this third design criterion that I’m describing here – being able to preserve the state of an application or a set of applications.

But placeholding doesn’t seem to be high on request lists for new features, so vendors haven’t paid a lot of attention.  In the meantime I’ve gotten into the habit of a crude workaround while reviewing large Word documents: I type “[bookmark]” where I left off and save a new local version.

While there is some minor time benefit to having placeholding, I believe the primary benefit is psychological.  A standard work pattern for information workers is that during a day they become deeply nested in what they are doing.  Multiple browser windows, a spreadsheet or two, a custom app, and a document may all be open to various places, and the user becomes a juggler keeping all the balls circulating in their mind.  A system crash is the most extreme event that makes one realize how much they were juggling as they attempt to recreate their state upon rebooting.  Keeping these placeholders in memory hinders task-switching and increases the stress the user feels when being interrupted (or anticipating the potential of interruption).  Knowing that places are being held would not eliminate the need to retain mental context, but would reduce it by removing the burden to remember all the documents opened and places within those documents.  I hope that vendors do additional research into how users react to placeholding from an attention management, interruption science, and usability point of view.

Jakob Nielsen on Articles vs. Blog Postings

July 20, 2007 at 3:09 pm | Posted in Analyst biz, Blogs, usability | 3 Comments

I’ve been writing this blog for about 10 months (and 115 postings) now and have enjoyed the opportunity to participate in my small way in various debates in the internet community. I’ve been able to get feedback to ideas I’m working on, publish smaller pieces of content that don’t normally fit the heft or formal voice I use in my professional writing, and plug an event or report now and then. In all, it’s been a good experience.

So reading usability guru Jakob Nielsen’s recent screed against casual blogging (“Write Articles, Not Blog Postings (Jakob Nielsen’s Alertbox)“), I can’t help but feel he’s missed the point of this particular style of blogging (blogging technology can also be used to do other styles of blogs such as formal content publication and internal enterprise blogging).

He begins by relating a conversation he had with a “world leader in his field” about whether to blog.

… I recommended that he should instead invest his time in writing thorough articles that he published on a regular schedule. Given limited time, this means not spending the effort to post numerous short comments on ongoing blogosphere discussions.

I’d summarize the rest by saying Jakob describes how the wildly varying nature of most blogs (entries of varying level of quality, expertise, and depth) leads to a scattershot approach that sinks the writer below the thin upper crust of top experts in the field. Longer, in-depth, carefully written entries would be better since they would maintain the appearance of having the highest level of expertise.

That may be true if the goal is to be a good writer. But I think most bloggers want to be a good conversationalist. If you were trying to engage people at a dinner party I would not recommend you stand up, talk for thirty solid minutes in a properly formatted argument with numbered points and rebuttals for anticipated arguments, then sit down. If you were at a conference that would be appropriate. They are different forums. His comments seem to frame blogging as being about content when it’s really about community.

As for the variance in quality, expertise, and depth I think readers of blogs have different expectations than they do of a white paper, conference presentation, or academic thesis. In many cases, the reader simply wants to live in the head of the blogger to see what they find interesting and what they’ve been reading. That’s an attention management characteristic of new technologies such as social tagging/bookmarking as well – people pay more attention to content that people they respect are paying attention to.

Many bloggers just link to articles or provide minimal commentary on the topics of the day along with the links. Jakob dislikes this – “Blog postings will always be commodity content: there’s a limit to the value you can provide with a short comment on somebody else’s work.” But I think back to one of my first posts in 2006 where I talked about David Foster Wallace’s writing style: “The point of Wallace’s writing style, to me, is that the value of his content is the unique structure he superimposes on it. More than most other writers, Wallace really gives you a feeling of not just what he knows and thinks, but how he is thinking about it.” That is what is going on in many blogs as well. Even if a blogger is just linking to information, he provides value by the structure imposed on it – what is selected.

Maybe a Little Thing Called “Personalization” Would Help?

May 16, 2007 at 11:23 am | Posted in Intranet, portals, usability, User experience | 1 Comment

I was happy to see the Wall St. Journal had a 3-page section in the 5/14/07 issue on “Business Solutions: Building a More Useful Intranet”. But I was disappointed to see the total absence of any mention of the importance of trimming down the information presented based on the user’s profile. The “Portal” word was invoked once, but not explained or associated with “contextual delivery”. Personalization – not mentioned.

There is an interview with Kara Coyne from Nielsen Norman Group. I had the pleasure of sitting down with Kara a few years ago and discussing how I thought that the human factors industry was not keeping up with the times by continuing to treat screen design as a static home page issue – as if one was laying out a newspaper ad. But here she is, on page R6, responding to a statement about clutter by saying

What upper managers often don’t understand is, the more items that there are, the less likely it is that users are going to see your item. If you have too many choices, you’ll end up tuning them all out.

Right! That statement is the perfect lead-in for a technology that lines up information about who a specific user is (a profile compiled from the directory, HR information such as title and department, a skills inventory, heuristic analysis of their contributions and attention stream, and self-profiling or “opt in”) with metadata about the content, to determine what would be important to that individual. It’s called personalization, and portals have been doing it for a very long time. You can make it as complicated as you want, but even in its simplest form it can have tremendous value by narrowing down the information to be sorted through.

The problem is not that there are too many items on the intranet or on the home page. The problem is that no effort has been made to determine which items are of value to a given user. 99%+ of the information on an intranet is useless to any one individual, so even simple filtering – just by department or job function – can have a huge impact. Why force all users to see the same thing when technology has existed for years to help winnow down the information? Search (which is mentioned in the section) is part of the picture (the “pull” part), but not a winning answer for home page design.
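The kind of profile-driven trimming described above can be sketched very simply. This is a minimal illustration, not any portal product's actual API; the profile fields, tags, and function names are all hypothetical.

```python
# Minimal sketch of profile-based personalization: each intranet item
# carries metadata tags, and a user's profile (department, role, opted-in
# interests) is matched against them to trim the list down.
# All names and fields here are illustrative assumptions.

def personalize(items, profile):
    """Return only the items whose metadata overlaps the user's profile."""
    profile_terms = {profile["department"], profile["role"], *profile.get("interests", [])}
    return [item for item in items if profile_terms & set(item["tags"])]

items = [
    {"title": "Q2 sales targets",    "tags": {"sales", "executive"}},
    {"title": "Call scripts update", "tags": {"call-center"}},
    {"title": "Holiday schedule",    "tags": {"all-staff"}},
]

# A call-center agent's home page shows two relevant items, not all three:
agent = {"department": "call-center", "role": "agent", "interests": ["all-staff"]}
print([i["title"] for i in personalize(items, agent)])
# → ['Call scripts update', 'Holiday schedule']
```

Even this crude tag-overlap rule demonstrates the point: the filtering logic is trivial; the value comes from having a profile and content metadata to match against at all.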

The American Electric Power example they give (winner of a Nielsen Norman Group award) is once again a demonstration of an advertising-like devotion to examining “the” homepage as a static work of art. Any design review, in my opinion, should be dynamic. It should start by asking what the main types of users are, and then ask to see what the homepage looks like for each type (the executive’s homepage, a call center agent’s homepage, an IT developer’s homepage, etc.). The design has to serve the function, and only by knowing what various users need from a website can you determine whether the design is appropriate. No single design will meet the majority of needs of even the top 5 categories of users, so why spend all that time on the usability of a non-optimized page? The AEP page looks beautiful in a generic way, but why is it forcing an information worker to click and dig for their information instead of caring a little bit about who they are and bringing it to them? Maybe it does, but that’s not apparent from the web page shown.

Also, on a separate peeve, I’ve spoken many times about the value of collaboration when used in context. The AEP site has a button at the top called “The Agora” that’s described as “a new area where employees can meet and collaborate”. Does the user need to go to a special area to collaborate? Does that mean the rest of the intranet is not for collaborating? I can’t tell from the picture or text, but I’ve seen that quite often when I’ve done design reviews. If someone’s job is to track sales leads, for example, collaboration should be used in context – right there next to the sales lead system and displaying collaborative discussions and documents relating to the sales lead being examined. Information workers rarely stop what they are doing to say “Gee, I think I’ll go collaborate for a while and then get back to work”. Collaboration can have its own home page and entry point too, but contextual collaboration is where I see collaboration having the most value.
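The contextual-collaboration idea above amounts to keying collaboration items to the business record the worker is already viewing, rather than routing them to a separate destination. A rough sketch, where the data model and function names are assumptions rather than any real product's API:

```python
# Illustrative sketch of contextual collaboration: discussions are attached
# to a context id (here, a sales lead), so rendering the lead's page can
# pull its related discussions in place instead of sending the user off to
# a separate "collaboration area". Hypothetical data model, not a real API.

discussions = [
    {"context": "lead-42", "text": "Customer asked for a revised quote."},
    {"context": "lead-42", "text": "Demo scheduled for Friday."},
    {"context": "lead-77", "text": "Waiting on legal review."},
]

def discussions_for(context_id):
    """Return the collaboration items attached to one business record."""
    return [d["text"] for d in discussions if d["context"] == context_id]

# Rendering the page for sales lead 42 pulls its discussions alongside it:
print(discussions_for("lead-42"))
# → ['Customer asked for a revised quote.', 'Demo scheduled for Friday.']
```

The same store can still back a standalone collaboration home page; the point is that the context key makes in-place display possible.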

Security Trimmed UI: Great for Reining in Precocious Users, Bad for Me

February 5, 2007 at 2:01 pm | Posted in Microsoft SharePoint, portals, usability | Leave a comment

I’d like to go back to a topic that’s been near and dear to my heart for about 15 years now – user interface design. I did quite a bit of work on usability in OS/2 GUIs, and again on web usability in the early portal days, when it seemed UI design hadn’t caught up with the need for dynamic, personalized sites (it still hasn’t). Well, my rant today isn’t about UI design for portals, but about the security-trimmed interfaces that are all the rage these days. There seems to be an increasing number of applications whose interfaces make me feel like a precocious child being shielded from the dangerous consequences of my inquisitiveness.

Microsoft has taken the plunge, most noticeably for me in SharePoint. I’m trying to find the audience creation functionality in Microsoft Office SharePoint Server 2007, and the problem seems to be that my server doesn’t have that functionality enabled. But I can’t tell for sure. Searching for specific step-by-step instructions has proven futile, except for one streaming video of a presentation in which someone demonstrates it, the camera focusing (or trying to) on the screen behind the speaker.

I had a similar problem with an audio waveform editor that I own. The manual kept referring to some great pitch-fixing functionality. It made launching the editing screen seem so easy that they didn’t even describe the process beyond saying “After launching the pitch editor …”. I spent half an hour searching for it (right-click on the waveform? a button in another wave-editing screen?) to no avail. Finally, after an email to support, I was told it’s only in the “pro” version of the product (this isn’t mentioned anywhere in the manual – it must have been a last-minute marketing decision to make it a premium feature), and therefore the interface was trimmed not to show it to me.

I understand the usefulness of a security- or rights-trimmed interface. It can help both the app owner (keep users from asking about features they don’t have) and the end user (avoid confusion by simplifying the interface). But I believe it’s being overdone, without due consideration for the negatives (it’s hard to tell definitively that an option is not present, so one keeps searching).

UI designers should:

1. Allow opting in or out of a trimmed UI. Consider an option for the end user to select an “advanced mode” that shows all options, with those not selectable grayed out.
2. Make sure manuals and online help are accurate. They need to show exactly how to launch the functionality (the exact menu or button and where it’s found) and explain why it may not be visible.
3. Weigh the negatives of a trimmed UI as well as the positives, and act accordingly. For applications where precociousness is not as prevalent or dangerous, consider just graying out unavailable options.
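Point 1 can be sketched in a few lines: a trimmed menu silently drops options the user lacks rights to, while an "advanced mode" lists everything and merely disables the unavailable entries, so the user can stop hunting for them. The option names and function below are hypothetical, not drawn from any real product.

```python
# Sketch of trimmed vs. advanced-mode menu rendering. In trimmed mode,
# options the user lacks rights to simply vanish; in advanced mode they
# remain visible but are marked disabled (i.e., grayed out).
# All option names here are illustrative assumptions.

def render_menu(all_options, user_rights, advanced_mode=False):
    """Return (label, enabled) pairs; trimmed mode drops disabled items."""
    menu = []
    for option in all_options:
        enabled = option in user_rights
        if enabled or advanced_mode:
            menu.append((option, enabled))
    return menu

options = ["Edit page", "Create audience", "Manage permissions"]
rights = {"Edit page"}

print(render_menu(options, rights))
# → [('Edit page', True)]  -- the other two features are invisible
print(render_menu(options, rights, advanced_mode=True))
# → all three appear, two marked disabled, so their existence is apparent
```

The advanced-mode output is what resolves the frustration described above: a grayed-out "Create audience" entry tells the user the feature exists but isn't enabled for them, ending the search.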

Blog at WordPress.com.
Entries and comments feeds.