Bulletin of the American Society for Information Science and Technology, Vol. 29, No. 6, August/September 2003

Knowledge Compass: Opening Windows, Punching Holes in Stovepipes, Forming Communities, Connecting People to People
by Jane K. Starnes

Jane K. Starnes is senior information specialist at the Intel Library. She can be reached at jane.k.starnes@intel.com.

Intel Corporation is a Fortune 100 company with over 80,000 employees located at multiple sites all over the world. As with all such organizations, Intel has divisions created by organizational structure, time differences among sites and geographic separation. These become barriers to sharing information within the organization. Add to that the fact that Intel does not have a central document repository or comprehensive intranet index, and it all adds up to so much information, so little time.

Looking for a Solution

We looked for a solution to these barriers that would let employees with critical problems discover the best expertise within the organization and facilitate the exchange of that expertise. We wanted to capture the outcome for reuse. And we knew we would need tools to analyze the results so that we could demonstrate the effort was worthwhile.

The steps in the process were as follows:

  • The Technology Team and the Business Team collected business requirements and translated them into product requirements.
  • Twelve products were evaluated, and four were selected for testing.
  • One product was selected and a pilot project was kicked off in October 2001. The system was named Knowledge Compass (KC). The pilot
    • grew from 500 users to about 1300 in six months;
    • showed savings of $1,000,000;
    • reduced training time, product cycle time and business processing time; and
    • ended in August 2003 with management commitment to “take it to the enterprise” (a statement indicating that a lot of funding and staffing would be allocated at a time when both were being cut throughout the company).

What is Knowledge Compass?

Knowledge Compass (KC) is a system for locating “experts,” asking questions and capturing the answers for reuse. The database is built in two ways: (1) experts pre-load documents and links to resources that already exist and (2) people ask questions, and the answers are kept, graded and sometimes recorded as “best practices.” KC is a customized implementation of the product from AskMe, Inc. (www.askme.com).
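To make the two-part structure concrete, the sketch below (in Python) shows the kinds of records it implies. The class and field names are assumptions for illustration, not AskMe's actual data model.

```python
# Illustrative record types for the two ways the database is built:
# (1) pre-loaded resources and (2) questions with graded answers.
# All names here are assumptions, not the AskMe product's actual schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Resource:
    """A document or link that an expert pre-loads into a category."""
    title: str
    url: str
    category: str

@dataclass
class Answer:
    """An expert's answer; graded by the asker, sometimes promoted."""
    expert: str
    text: str
    rating: Optional[int] = None   # one to four stars, set by the requestor
    best_practice: bool = False    # requires category moderator approval

@dataclass
class Question:
    asker: str
    category: str
    text: str
    answers: List[Answer] = field(default_factory=list)
```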

How Did We Do That?

In tight budget times, this project won high-level management support and major funding. The following were among the keys to success:

  • A Management Review Committee watched over the project from beginning to end.
  • The project was tightly managed, and time spent was carefully justified and accounted for.
  • Product requirements were developed in conjunction with customers, and the selected product met most of the criteria.
  • The product was carefully marketed in a way that demonstrated its special niche in the flow of information.
  • Users were trained on how to use the product; there were several levels of training, tailored to the type of interaction the individual was expected to have with the product (moderator, expert or knowledge seeker).
  • Evaluation was continuous and very numbers-based. How many users? Repeat users? How many questions posted? How many answers?
  • The vendor was committed to making our installation successful.

Customers were recruited in two ways:

    1. “Followers” were signed up when their managers decided to join the pilot. The implementation team followed a repeatable process, focusing on the key behavior changes needed and the reuse of documents and processes.
    2. The “infection model” encouraged word-of-mouth advertising. Enthusiastic users were given model presentations to showcase their new tool to co-workers and encourage them to sign up.

A Tour of Knowledge Compass

Figure 1 illustrates the Start screen of Knowledge Compass. Customers can browse through the vocabulary tree or enter keywords in the search box at the top right. Users can designate certain categories as favorites, and the system also adds frequently visited categories to the favorites list.
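The automatic-favorites behavior can be pictured with a small sketch like the one below; the visit threshold and in-memory storage are assumptions, since the cutoff the product actually used is not documented here.

```python
# A minimal sketch of the automatic-favorites behavior: once a user
# has visited a category often enough, it joins their favorites list.
from collections import Counter

FAVORITE_THRESHOLD = 5  # assumed value, for illustration only

class FavoritesTracker:
    def __init__(self):
        self.visits = Counter()
        self.favorites = set()

    def record_visit(self, category: str) -> None:
        """Called whenever the user opens a category page."""
        self.visits[category] += 1
        if self.visits[category] >= FAVORITE_THRESHOLD:
            self.favorites.add(category)

    def designate_favorite(self, category: str) -> None:
        """Users can also mark favorites explicitly."""
        self.favorites.add(category)
```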

Searches can also be initiated from the e-mail system, using an easily installed plug-in. The results are returned in an e-mail message containing links back into the KC product.

Figure 2 shows a typical category page. (Note: As the taxonomy consultant, I have administrator rights in all categories. Some of the options showing on this page are only available to administrators.)

Customers can review the experts’ profiles to find the ones who may be best able to answer questions. They can also review all the questions and answers that have already been posted, as well as links to resources that may contain answers and other useful information.

When a question (see Figure 3) is asked, an e-mail notice is sent to the appropriate experts with a link to the question in KC. The experts can set a maximum number of pending questions they’ll accept, mark themselves out of the office or otherwise control their workload.

When an answer is posted, the requestor receives an e-mail with a link to the answer in KC. The requestor rates the answer with one to four stars, grading the usefulness of the answer. Average ratings for each expert are shown in their profiles and reported to their category moderators. The answer can be nominated as a Best Practice; the category moderator approves best practices.
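Taken together, the two paragraphs above describe a simple workflow, sketched below: questions are routed only to experts who are in the office and under their pending-question cap, and ratings feed each expert's average. The function names, default cap and e-mail stub are illustrative assumptions, not the product's internals.

```python
# A sketch of the question-and-answer cycle, under the assumptions above.
from dataclasses import dataclass, field
from typing import List, Optional

def send_email(to: str, body: str) -> None:
    """Placeholder for the real e-mail notification hook."""
    print(f"to {to}: {body}")

@dataclass
class Expert:
    name: str
    email: str
    max_pending: int = 10          # expert-chosen cap on open questions
    out_of_office: bool = False
    pending: int = 0
    ratings: List[int] = field(default_factory=list)

    def available(self) -> bool:
        return not self.out_of_office and self.pending < self.max_pending

    def average_rating(self) -> Optional[float]:
        """Shown in the profile and reported to the category moderator."""
        return sum(self.ratings) / len(self.ratings) if self.ratings else None

def route_question(question_url: str, experts: List[Expert]) -> List[Expert]:
    """Notify the appropriate experts, respecting their workload controls."""
    notified = [e for e in experts if e.available()]
    for e in notified:
        e.pending += 1
        send_email(e.email, f"New question for you: {question_url}")
    return notified

def rate_answer(expert: Expert, stars: int) -> None:
    """The requestor grades the answer with one to four stars."""
    if not 1 <= stars <= 4:
        raise ValueError("rating must be one to four stars")
    expert.ratings.append(stars)
    expert.pending = max(0, expert.pending - 1)
```

In the real product, of course, routing and grading were handled internally; the sketch only makes the workload and rating rules explicit.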

Who Is an Expert?

Category moderators select experts, or experts sign themselves up. Experts’ profiles allow them to define the scope of their expertise (for instance, “I can write Excel macros, don’t know pivot tables”) so that requestors can direct questions to selected experts.

Figure 4 shows an example of the basic profile that every expert enters.

The profile page also shows the summary statistics for that individual. The rating of answers is a self-policing mechanism – an expert who gets consistently low ratings should withdraw. Category moderators also monitor the answers and ratings.

How Knowledge Compass Was Deployed

Work teams of two to four people were formed. The Business Engagement team was the largest; its members developed training materials, conducted “sales presentations” and worked with customers on implementation. The Technical Support team consisted of two people plus contract support from our internal programming support group. The Management Review Committee comprised the senior IT managers.

Once we got the commitment to expand the use of Knowledge Compass to the entire company, we knew that we would need to work on the taxonomy. We developed a small “backbone” taxonomy into which all other categories would fit hierarchically. Manufacturing, marketing, product design, research and development, sales, security and safety, and software development were among the categories in the backbone taxonomy.
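Conceptually, the backbone works like the sketch below: a small, fixed top level under which each customer group hangs its own categories. Only the backbone names come from our taxonomy; the child categories in the example are invented.

```python
# A sketch of the backbone idea: a fixed top level into which all
# other categories fit hierarchically.
backbone = {
    "Manufacturing": {},
    "Marketing": {},
    "Product Design": {},
    "Research and Development": {},
    "Sales": {},
    "Security and Safety": {},
    "Software Development": {},
}

def attach_category(parent_path, name, tree=backbone):
    """Insert a new group category under an existing node."""
    node = tree
    for part in parent_path:
        node = node[part]
    node[name] = {}

# Invented examples of activity-focused categories a group might add:
attach_category(["Software Development"], "Writing Excel Macros")
attach_category(["Manufacturing"], "Clean Room Procedures")
```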

We interviewed more than 40 people from all levels and all organizations, then proposed several different models for the taxonomy. A focus group selected an activity-focused vocabulary. We thought that focusing on the activity or task, rather than the organization, would encourage more sharing across organizations. A manual and a training class on how to use and expand the taxonomy were developed, and every new customer group received that training.

Preparation for the Big Expansion

During the pilot, a few customers got a lot of attention from the team. We knew that if we were going to take this product to many more customers, we would need the customers to carry more of the load. We developed checklists to help customers prepare their groups to participate, provided model timelines and gave each group’s product advocates responsibilities such as giving presentations to recruit experts. We supplied PowerPoint presentations and encouraged the advocates to personalize them for their groups.

Training for product advocates, category moderators and experts was offered at appropriate points along the timeline, using PowerPoint presentations delivered through an online collaboration tool. The members of the implementation team were “certified” for each level of the training so that we could be sure consistent messages were being delivered. The training materials are now being converted to Web-based instructional classes that can be self-paced rather than instructor-led.

Definitions of Roles for Knowledge Compass

The project encompasses the following roles:

  • The implementation manager is a member of the KC team who helps the customer team set project schedules and provides or arranges training at each step of the timeline.
  • The taxonomy consultant is a part-time member of the KC team and works with the group to select the categories in which they will participate and helps to create new ones. The taxonomy consultant is a member of the library staff.
  • The product champion is the individual from the customer group who advocates Knowledge Compass within the group and convinces management to participate.
  • The category moderator(s) works on the vocabulary for the group, requiring familiarity with the work of the group. Moderators recruit experts and monitor group usage of the categories to make sure the groups are using the tools appropriately. They are also responsible for monitoring the quality of the content their experts are providing. For most customer groups, the product champion and category moderator were the same person, but the system allows for each category to have its own moderator, so some groups have several moderators.
  • Experts are customers who sign up to answer questions in a category and to identify resources, such as FAQs, Web links and departmental documents, to link to categories.
  • Knowledge seekers are all other users of Knowledge Compass – those who search the database for best practices and/or post questions.

How Do We Know It’s Useful?

Table 1 shows the kind of information we tracked.

Tracking usage statistics

  • Number of users
  • Questions posted
  • Answers posted (may be multiple per question)
  • Average time to answer
  • Average rating of answers

Activities

  • Searches
  • Browses
  • Questions
  • Answers
  • Resources
  • FAQs
  • Best Practices

Asking for ROI (return on investment) information

  • Did this save you time?  How much?
  • Did this save you money? How much?
  • Results are tallied and reported to the department manager

Table 1: Categories of information gathered to measure performance.
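For a rough sense of the arithmetic behind Table 1, the sketch below tallies the same measures from hypothetical records. The record layout and field names are assumptions; only the metrics themselves come from our tracking.

```python
# A rough illustration of how the Table 1 measures might be tallied.
from statistics import mean

def usage_report(questions):
    """questions: list of dicts with 'posted_at' (datetime) and 'answers',
    each answer a dict with 'answered_at' (datetime) and optional 'rating'."""
    all_answers = [a for q in questions for a in q["answers"]]
    hours = [(q["answers"][0]["answered_at"] - q["posted_at"]).total_seconds() / 3600
             for q in questions if q["answers"]]
    ratings = [a["rating"] for a in all_answers if a.get("rating")]
    return {
        "questions_posted": len(questions),
        "answers_posted": len(all_answers),   # may be multiple per question
        "avg_hours_to_first_answer": mean(hours) if hours else None,
        "avg_rating": mean(ratings) if ratings else None,
    }

def roi_tally(survey_responses):
    """Sum self-reported savings for the department manager's report."""
    return {
        "hours_saved": sum(r.get("hours_saved", 0) for r in survey_responses),
        "dollars_saved": sum(r.get("dollars_saved", 0) for r in survey_responses),
    }
```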

Insights Gained from the Experience

With a project of this size, many lessons emerge to help guide future efforts. The following are among the insights we gained:

  • “Experts” is an off-putting term for some users.
  • Using total users as the sole criterion for success led to
    • rushing customers through training,
    • not spending enough time on vocabulary development and
    • not leaving experts enough time to adequately collect resources.
  • Encouraging information sharing is a big culture change, and it needs more time.
  • Management asks how to value and reward information sharing, and we’re still inventing answers.
  • Individuals want to know how management will value their effort to participate.
  • The urge toward “faster, better, cheaper” is pushing us toward less hands-on customer involvement, which may make quality control harder.

Additional Information

For additional information about this effort, please see Knowledge Compass: Capturing and Providing Business Information (www.intel.com/eBusiness/pdf/it/wp030601.pdf) and AskMe Corporation (www.askmecorp.com). All opinions are those of the author and should not be interpreted as official Intel policy.

