Thursday, March 22, 2012

Structured Content Management

Organizations of all sizes are beginning to realize how content and its reuse across the enterprise can improve productivity. The need for change is driven by the desire to better manage information assets (documents, creative ideas, illustrations, charts, graphics, multimedia, etc.) and eliminate costly processes that fail to facilitate the effective and consistent re-use of content.

Content reuse can take a variety of forms. The most common reuse scenario is dynamically updating multiple web pages when content is added or removed from a site. There are also content reuse opportunities across multiple web sites, as in the case of co-branding and syndication. Content reuse is critical and often complex when supporting print and web publishing. Perhaps the biggest impact of content reuse is in efficient multilingual publishing.

To reuse content, it must be structured. Structured content simply means that the information is stored in a format that defines and describes the content. Extensible Markup Language (XML) is a simple and effective format for creating and managing information. Using XML you can describe the content that you are managing, so a headline will actually be defined as a headline, and likewise for a price, a product description, a caption, etc.
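As a rough sketch of what this looks like, a product record might be tagged as below (the element names are hypothetical, not from any particular standard), and any XML library can then retrieve each piece of content by name:

```python
import xml.etree.ElementTree as ET

# Hypothetical structured product record: each piece of content is
# labeled by what it *is*, not by how it should look.
doc = """
<product>
  <headline>All-Weather Hiking Boot</headline>
  <description>A waterproof boot for year-round trail use.</description>
  <price currency="USD">129.95</price>
</product>
"""

root = ET.fromstring(doc)
print(root.find("headline").text)          # -> All-Weather Hiking Boot
print(root.find("price").get("currency"))  # -> USD
```

Because the price is tagged as a price, rather than as, say, bold red text, every system that consumes this record can find and reuse it reliably.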

Although structuring takes some planning, the benefit is enormous. You can easily re-use text and media for a variety of purposes. You can create publications quickly because images and text are easy to find and put together. Updating your publications is easier because you only need to make changes in one place, and it updates everywhere the content is used. Managing structured content happens in an XML-based content management system (CMS).

Structuring content offers great benefits, including:
  • making content more retrievable and re-usable;
  • reducing costs and complexity of translation;
  • enforcing authoring, style, and branding guidelines;
  • improving information interchange.
XML is the industry standard format for structuring content. It is very easy to work with and is easy to migrate to other formats. Graphics, video, Word documents, PDFs, and other files are wrapped in XML to provide structure and metadata that make the files easy to find and manage. XML was explicitly designed to represent hierarchical models of content.
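For non-XML assets, the wrapping might look like the following sketch, where the file name and metadata fields are invented for illustration: the binary file stays where it is, and the XML wrapper carries the metadata that makes it findable and manageable.

```python
import xml.etree.ElementTree as ET

# Hypothetical metadata wrapper for a binary asset. The image itself
# stays on disk (or in a repository); the XML carries the metadata.
asset = ET.Element("asset", type="image", format="png")
ET.SubElement(asset, "title").text = "Q3 sales chart"
ET.SubElement(asset, "file").text = "charts/q3-sales.png"
ET.SubElement(asset, "keywords").text = "sales, quarterly, chart"

wrapped = ET.tostring(asset, encoding="unicode")
print(wrapped)
```

A search tool can now index the title and keywords without needing to understand the PNG file at all.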

There are four basic parts critical to structuring information:
  • defining content types;
  • identifying rules of content hierarchy;
  • creating modular content units;
  • applying standards consistently.
Defining Content Types

When you begin to analyze your existing documentation and future requirements, think about your content according to its informational type rather than its format. Procedures, topics, facts, terms, definitions, prices, product numbers, and product descriptions are common information types.

As you continue to analyze the content you create, you will likely discover that many content types are reusable. For instance, you may discover that there is no reason that your product description should be any different regardless of where it is published.

Identifying Rules of Content Hierarchy

The most significant way that structured documents differ from unstructured ones is that structured documents include rules. These rules formalize the order in which text, graphics, and tables may be entered into a document by an author. For example, in an unstructured document, a paragraph has specific formatting - font, size, and spacing. In a structured document, this same paragraph also has an exterior wrapper that governs the elements that are allowed to appear before and after it. The elements' rules are defined in a document type definition (DTD) or schema.
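Python's standard library parser does not validate against a DTD, but the idea of hierarchy rules can be sketched with a hand-rolled check; the element names and the allowed-children table below are hypothetical stand-ins for a real DTD content model:

```python
import xml.etree.ElementTree as ET

# Hypothetical content model: which child elements each element may
# contain, in the spirit of a DTD rule like
#   <!ELEMENT chapter (title, synopsis, section+)>
ALLOWED_CHILDREN = {
    "chapter": {"title", "synopsis", "section"},
    "section": {"title", "para"},
    "title": set(),
    "synopsis": set(),
    "para": set(),
}

def violations(element):
    """Return (parent, child) tag pairs that break the content model."""
    bad = []
    for child in element:
        if child.tag not in ALLOWED_CHILDREN.get(element.tag, set()):
            bad.append((element.tag, child.tag))
        bad.extend(violations(child))
    return bad

doc = ET.fromstring(
    "<chapter><title>Intro</title><synopsis>Overview.</synopsis>"
    "<section><title>Scope</title><para>Details.</para></section>"
    "<para>Stray paragraph.</para></chapter>"
)
print(violations(doc))  # -> [('chapter', 'para')]
```

A real DTD or schema validator enforces these rules (and ordering and cardinality as well) inside the authoring tool, so the stray paragraph is flagged the moment an author tries to type it.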

Structured content management implies moving away from formatting cues to signal such relationships within a document and instead working with information rules. This is where the power of the information model comes from, but it is also the source of difficulties in change management, because it changes the ways authors are used to working with a CMS.

Creating Modular Content Units

Structured content management requires that you begin to look at the content you create as separate, identifiable chunks of information that can be reassembled differently depending on audience, purpose, or delivery method. This is an intention-based analysis, not an academic exercise. How, where, and when you intend to re-use that content should drive your modularity.

These chunks of information, once identified and tagged, can be reassembled (reused/repurposed) in other information products. They can even be reused in a different order. Modular content from a source document could be reused in a marketing brochure, user manual, and customer-facing web site.

Applying Standards Consistently

At a subconscious level, you may understand the importance of following internal standards, branding guidelines, and formalized structure. But, it is human nature to continue to find reasons to override templates or alter the format "just this one time."

Breaking the rules is not allowed when it comes to structured authoring. Reuse is only possible when your information is consistently structured. Imagine how useless a phone directory would be if the data entry clerks at the phone company were allowed to enter information in any order they chose. Some clerks might use the first field for the first name, others for the last name. And what if some of the listings appeared as first name, last name, instead of last name, first name, ordered alphabetically?

Of course, most enterprise content is not as highly structured as a phone book. But if your goal is to reuse content, it must be structured consistently. If adhering to a particular document standard seems painful, re-examine whether the content is really as structured as you think, or change your expectations about how much information can be re-used and easily shared. Document models can be made easier and more flexible, but with a cost in the downstream utility of the information.

XML Building Blocks

You have identified your content types, chunked them into modular components intended for re-use, established the relationships among those chunks, and decided that you can live with them in a componentized fashion to the extent that your team will follow that structure consistently.

These concepts are important in structured XML content management:
  • Elements
  • Attributes
  • DTDs and Schemas
Elements

The basic unit of information is called an element. Elements can be text, graphics, tables, or even containers for other elements. In short, everything is an element.

When you create an information model, you define a document hierarchy. A hierarchy specifies the order in which elements are allowed to be used in a particular information product.

For example, for a set of user documentation, a ChapterTitle always begins a chapter, followed by a synopsis and a bulleted list of topics in the chapter. Elements are powerful tools that allow you to create structured content appropriate for reuse.

Attributes

XML elements can be extended to contain more information than just a label. Elements can have attributes, which carry additional information about each element. For example, a chapter element can have optional attributes for the author and the author's university affiliation. These attributes make it possible to find all instances of a specific author or university.

Because you can classify information based on attributes, you can create new information products from your source content that you would otherwise have to cobble together manually.

Documentation authors have long benefited from adding attributes to the elements of content they create, allowing readers to use "help" applications and user guides more intelligently. Attributes can help indicate in which information products an element should appear, and in which languages. For example, some elements should be present on a web site, but may not be appropriate for a printed guide; others should appear in the Spanish version of a document, but not in the Portuguese.

Attributes make content smart enough to know where to go. For example, elements and attributes can be harnessed to create dynamic content for web-based information products, based on the personal preferences of your users.
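A minimal sketch of this kind of attribute-driven selection, with invented attribute names (`output` and `lang`), might look like this:

```python
import xml.etree.ElementTree as ET

# Hypothetical conditional-publishing example: each note carries
# attributes saying in which output channels and languages it belongs.
doc = ET.fromstring("""
<notes>
  <note output="web print" lang="en">Keep firmware up to date.</note>
  <note output="web" lang="en">Click Help for a live chat.</note>
  <note output="web print" lang="es">Mantenga el firmware actualizado.</note>
</notes>
""")

def select(root, output, lang):
    """Pick the notes tagged for a given output channel and language."""
    return [n.text for n in root.iter("note")
            if output in n.get("output", "").split() and n.get("lang") == lang]

print(select(doc, "print", "en"))  # -> ['Keep firmware up to date.']
```

The same source content yields the English print guide, the English web site, or the Spanish edition, simply by changing the selection criteria rather than the content.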

DTDs and Schemas

You define the structure of an information product in a document type definition (DTD) or a schema. A schema, unlike a DTD, is an actual XML document, but both are used to define information models. Both provide considerable modeling power and can help facilitate content reuse and multi-channel publishing.

Tuesday, March 20, 2012

Wiki Applications

Wiki is a Hawaiian word meaning "fast" or "quick". A wiki application is collaborative software that runs a wiki. A wiki is a web site whose content users can add, modify, or delete via a web browser, using a simplified markup language or a rich-text editor. A wiki allows users to create and collaboratively edit web pages via a web browser. Examples of wiki use include community websites, corporate intranets, knowledge management systems, and notetaking.

Main features of Wiki:

  • Wiki users can edit any page or create new pages within the wiki web site, using only a plain web browser without any extra add-ons. 
  • Wiki promotes meaningful topic associations between different pages by making page link creation almost intuitively easy and showing whether an intended target page exists or not. 
  • Wiki is not a carefully crafted site for casual visitors. Instead, it seeks to involve the visitor in an ongoing process of creation and collaboration that constantly changes the web site landscape.

Wiki enables users to write documents collaboratively, using a simple markup language and a web browser. A single page in a wiki website is referred to as a wiki page, while the entire collection of pages, which are usually well interconnected by hyperlinks, is the wiki. Wiki is essentially a database for creating, browsing, and searching through information. A wiki allows for non-linear, evolving, complex and networked text, argument and interaction.

A defining characteristic of wiki technology is the ease with which pages can be created and updated. Most wikis keep a record of changes made to wiki pages; often, every version of the page is stored. This means that authors can revert to an older version of the page, should it be necessary because a mistake has been made.

Within the text of most pages there are usually a large number of hypertext links to other pages. Wikis generally provide one or more ways to categorize or tag pages to support the maintenance of such index pages. Users can create any number of index or table-of-contents pages, with hierarchical categorization or whatever form of organization they like. Most wikis have a backlink feature, which displays all pages that link to a given page. It is typical in a wiki to create links to pages that do not yet exist, as a way to invite others to share what they know about a subject new to the wiki.

Wikis are generally designed with the philosophy of making it easy to correct mistakes, rather than making it difficult to make them. Thus, while wikis are very open, they provide a means to verify the validity of recent additions to the body of pages. The most prominent, on almost every wiki, is the Recent Changes page: a list of recent edits, or of edits made within a given time frame.

From the change log, other functions are accessible in most wikis: the revision history shows previous page versions and the diff feature highlights the changes between two revisions. Using the revision history, an editor can view and restore a previous version of the article. The diff feature can be used to decide whether or not this is necessary. A regular wiki user can view the diff of an edit listed on the "Recent Changes" page and, if it is an unacceptable edit, consult the history, restoring a previous revision.

In case unacceptable edits are missed on the "Recent Changes" page, some wiki engines provide additional content control: a page, or a set of pages, can be monitored to ensure that it keeps its quality. A person willing to maintain pages is notified of modifications to those pages, allowing him or her to verify the validity of new edits quickly.

Some wikis also implement "patrolled revisions," in which editors with the requisite credentials can mark some edits as not vandalism. A "flagged revisions" system can prevent edits from going live until they have been reviewed.

Most wikis offer at least a title search, and sometimes a full-text search. The scalability of the search depends on whether the wiki engine uses a database.

There are essentially three types of usage for wikis: public wikis with a potentially large community of readers and editors, enterprise wikis for data management by corporations and other organizations, and personal wikis, meant to be used by a single person to manage notes and usually run on a desktop. Some wiki applications are specifically geared for one of these usage types, while others can be used for all three but contain functionality, either in the core or through plugins, that helps with one or more of the usage types.

Public Wikis

Public wikis are wikis that can be read by anyone, and usually (though not always) they can be edited by anyone as well, sometimes after a required registration. Among public wikis, MediaWiki is the major application. It is used for the most popular public wiki, Wikipedia, as well as for the most popular wiki farm, Wikia, and it is the most popular application in use on other public wikis as well. Other wiki engines used regularly for public wikis include MoinMoin and PmWiki, along with many others.

Enterprise Wikis

An enterprise wiki application is used in a corporate (or organizational) context, especially to enhance internal knowledge sharing, with a greater emphasis on features like access control, integration with other software, and document management.

Many wiki applications are marketed as enterprise solutions, including Confluence, Socialtext, Jive Engage, SamePage, and Traction TeamPage. In addition, some open source wiki applications describe themselves as enterprise solutions, including Foswiki, which calls itself a "free and open source enterprise collaboration platform", and TWiki, which calls itself an "Open Source Enterprise Wiki". Other open source wiki applications, though they do not specifically describe themselves as enterprise solutions, have marketing materials geared for enterprise users, like Tiki Wiki CMS Groupware and MediaWiki. Many other wiki applications have also been used within enterprises.

Within organizations, wikis may either add to or replace centrally managed content management systems. Their decentralized nature allows them to disseminate needed information across an organization more rapidly and more cheaply than a centrally controlled knowledge repository. Wikis can be used for document management, project management, customer relationship management, enterprise resource planning, and many other kinds of data management.

Features of wikis specifically helpful to a corporation include:

  • Connecting information. Wikis let users build up knowledge bases from quick, easy-to-create pages containing links to other corporate information systems, such as people directories and content management systems. 
  • Avoiding e-mail overload. Wikis allow all relevant information to be shared by people working on a given project. Only the wiki users interested in a given project need look at its associated wiki pages, in contrast to high-traffic mailing lists which may burden subscribers with many messages, regardless of relevance to particular subscribers. It is also very useful for the project manager to have all the communication stored in one place, which allows them to link the responsibility for every action taken to a particular team member. 
  • Organizing information. Wikis allow users to structure new and existing information. As with content, the structure of data is sometimes also editable by users. 
  • Building consensus. Wikis allow authors who hold differing views to express them in a structured way on the same page. This feature is very useful when writing documentation, preparing presentations and so on. 
  • Access rights, roles. Users can be forbidden from viewing and/or editing given pages, depending on their department or role within the organization. 
  • Knowledge management with comprehensive searches. This includes document and project management, as well as using a wiki as a knowledge repository useful during times of employee turnover, retirement and so on.

Personal Wikis

Applications specifically designed for running personal wikis include NotePub, Pimki, and Tomboy. Other, more general, wiki applications have components geared for individual users, including MoinMoin, which offers a Desktop Edition.

Wiki applications can include features that come with traditional content management systems, such as calendars, task lists, blogs, and discussion forums. All of these can either be stored via versioned wiki pages or simply be a separate piece of functionality. Applications that support blogs with wiki-style editing and versioning are sometimes known as "bliki" software. Tiki Wiki CMS Groupware is an example of wiki software that is designed to support such features at its core. Many of the enterprise wiki applications, such as TWiki, Confluence and SharePoint, also support such features, as do open source applications like MediaWiki and XWiki, via plugins.

Wiki software can let users store data via the wiki, in a way that can be exported via the Semantic Web, or queried internally within the wiki. A wiki that allows such annotation is known as a semantic wiki. The current best-known semantic wiki software is Semantic MediaWiki, a plugin to MediaWiki.

Friday, March 16, 2012

SharePoint User Interface

In my last post on SharePoint architecture, I mentioned that there are several parts in SharePoint architecture: farms, web applications, site collections, service applications, administration, and security. The user interface is represented by site collections. In today's post, I am going to describe the SharePoint user interface.

A site collection is a top-level SharePoint site which contains child sites organized in a hierarchy. When you create a site at the root of a web application, you create a site collection. In other words, a SharePoint site collection is a hierarchical set of sites that can be managed together. Sites within a site collection have common features, such as shared permissions, galleries for templates, content types, and web parts, and they often share a common navigation. Creation of site collections is usually performed by a system administrator.

A site collection has a hierarchical structure. Once a site collection has been created, the next step is to create a site. A site is a single SharePoint site within a site collection. Creation of sites can be delegated to users of a site collection. A site can inherit permissions and navigation structure from its parent site collection, or these can be specified and managed independently.

There are times when it is appropriate to create an entire site collection, and there are times when it makes more sense to create a single site. For instance, if you have many projects that fit within a larger context, it makes sense to create a single site collection for that context, and create sites to manage each project. For example, it makes sense for the engineering department to have a separate site collection from the legal department. An engineering department might have one site collection and use that site collection to house multiple sites, one site per engineering project.

A site may contain sub-sites, and those sub-sites may contain further sub-sites. Sites can be created from scratch, but they can also be created from pre-defined templates that provide previously set-up functionality. Sites have navigation, themes/branding, custom permissions, and workflows, and they can be configured or customized in a number of ways.

If the site is used for document management, the next level below the site in the SharePoint hierarchy consists of libraries and lists. A SharePoint site is a collection of lists, libraries, pages, and web parts.

A list is a collection of similar items. A list contains columns that define the item metadata. Each item stored in a list shares the same metadata. For instance, you can have a list of links called "my links", where each item has a URL, a name, and a description.

Lists resemble database tables in structure and behavior. Lists support various field or data types, and can have triggers that react to list events such as creating, updating, or deleting items. In addition, lists can be configured to filter, sort, or group items based on item data or properties.

Lists can be based on list templates, such as calendars, contact lists, tasks, announcements. You can create multiple lists based on a single list template. Lists can include workflows.

A library contains documents. In a library, each document is an item with supporting metadata. Each item in the library refers to a file that is stored in the SharePoint database. A library is a location on a site where you can create, collect, update, and manage files.

You can customize libraries in several ways. You can control how documents are viewed, tracked, managed, and created. You can track versions, including how many and which type of versions, and you can limit who can see documents before they are approved. You can use workflows to collaborate on documents in libraries. You can specify information management policies to manage the handling and expiration of documents within libraries.

There are several types of libraries:

Document library is used for all types of documents.

Picture library is used to store digital pictures or graphics. Although pictures can be stored in other types of libraries, picture libraries have several advantages. For example, from a picture library you can view pictures in a slide show, download pictures to your computer, and edit pictures with graphics programs. Consider creating a picture library if your team reuses many graphics, such as logos and corporate images, or if you want to store pictures of team events or product launches.

Wiki page library is used to create a collection of connected wiki pages. A wiki enables multiple people to gather routine information in a format that is easy to create and modify. You can add to your library wiki pages that contain pictures, tables, hyperlinks, and internal links. For example, if your team creates a wiki site for a project, the site can store tips and tricks in a series of pages that connect to each other.

Form library is used to store and manage electronic forms.

Reports library helps organizations create, manage, and share information contained in business data web parts, key performance indicator (KPI) web parts, and Excel web access web parts used for business intelligence analytics. The Records Center site template has a reports library by default, but anyone who can create document libraries within a site collection can create a report library. The reports library includes a version history for each report, and archives previous versions. Users can create new versions of reports for special events or milestones, and later revert to a previous report.

Translation management library helps organizations create, store, and manage translated documents by providing both views and specific features that facilitate the manual document translation process. The translation management library is designed specifically to store documents and their translations. The library tracks the relationship between a source document and its translations, and it groups all of these documents together to make them easy to find. Additionally, the library can be configured with a special translation management workflow that is designed to help manage the manual document translation process.

Data connection library is used to centrally publish connection files to make it easy for users to find and use the data sources they need. Data connection files are easy to create and update, and solution designers can easily reuse them from within the Microsoft Office system client applications.

Slide library is used to share individual slides from a presentation, reuse slides, track the history of a slide, compile individual slides into a presentation, and receive notifications when a slide in a presentation has changed. Users can publish slides to a Slide Library from PowerPoint.

SharePoint has three primary page content types: wiki pages, web part pages, and publishing pages. The default page type is a wiki page, which enables free-form editing based on the ribbon toolbar. It is possible to insert web parts into any page type.

Web parts are sections that can be inserted into pages in SharePoint sites. Their typical uses are displaying content defined in the web part's settings, displaying items from lists and libraries, and providing access to features in SharePoint.

A SharePoint site can also be created as a wiki site, a blog site, a project management site, etc. I described wiki sites in one of my previous posts on SharePoint. In future posts, I will describe other site types.

Tuesday, March 13, 2012

Business Analysis - Use Cases

In my last post on business analysis, I described user stories and gave a comparison between user stories and use cases. In today's post, I am going to describe use cases.

A use case (a case in the use of a system) is a list of steps, typically defining interactions between a role known as an "actor" and a system to achieve a goal. The actor can be a human or an external system.

There is no standard way to write a use case, and different formats work well in different cases. There are several templates that can be used for a use case.

Simple Use Case

Title: goal the use case is trying to achieve
Main Success Scenario: numbered list of steps
Step: a simple statement of the interaction between the actor and a system
Extensions: separately numbered lists, one per Extension
Extension: a condition that results in different interactions from the main success scenario. An extension from main step 3 is numbered 3a, an extension from main step 4 is numbered 4a, etc.

Complete Use Case

Title: an active-verb goal phrase that names the goal of the primary actor
Primary Actor
Goal in Context
Scope
Level
Stakeholders and Interests
Precondition
Minimal Guarantees
Success Guarantees
Trigger
Main Success Scenario
Extensions
Technology and Data Variations List
Related Information

Casual Use Case

Title: goal
Primary Actor
Scope
Level
Story: the body of the use case is simply a paragraph or two of text, informally describing what happens

Example of a Simple Use Case

Use Case Name: Place Order

Actors: Shopper, Fulfillment System, Billing System

Use Case Description: after the user has selected items to purchase, the user orders the items. The user provides the payment and shipping information. The system responds with a confirmation of the order and a tracking number that the user can use to check on the order status in the future. The system also provides the user with an estimated delivery date for the order, which includes all selected items. The user may already have an account with the company with billing and shipping information.

Actors

A use case defines the interactions between external actors and the system under consideration to accomplish a goal. Actors must be able to make decisions, but they do not have to be human. An actor might be a person, a company or organization, a computer program, or a computer system (hardware, software, or both). The term "actor" is frequently interchanged with the term "user".

Actors are always stakeholders, but many stakeholders are not actors, since they never interact directly with the system, even though they have the right to care how the system behaves. For example, the owners of the system, the company's board of directors, and regulatory bodies such as the Internal Revenue Service and the Department of Insurance could all be stakeholders but are unlikely to be actors.

Similarly, a person using a system may be represented as different actors because he is playing different roles. For example, user "Joe" could be playing the role of a Customer when using an Automated Teller Machine to withdraw cash from his own account, or playing the role of a Bank Teller when using the system to restock the cash drawer on behalf of the bank.

Actors often work on behalf of someone else. An actor might be named "sales rep for the customer" or "clerk for the marketing department" to capture the fact that the user of the system is acting for someone else. This tells the project that the user interface and security clearances should be designed for the sales rep and clerk, but that the customer and marketing department are the roles concerned about the results.

A stakeholder may play both an active and an inactive role. For example, a Consumer is both a "mass-market purchaser" (not interacting with the system) and a User (an actor, actively interacting with the purchased product). In turn, a User is both a "normal operator" (an actor using the system for its intended purpose) and a "functional beneficiary" (a stakeholder who benefits from the use of the system). For example, when user "Joe" withdraws cash from his account, he is operating the Automated Teller Machine and obtaining a result on his own behalf.

Look for actors among the stakeholders of a system, the primary and supporting (secondary) actors of a use case, the system under design, and finally among the "internal actors", namely the components of the system under design.

The relationships between all the use cases and actors are represented in a Use Case Diagram.

A use case diagram is a graphical representation of a user's interaction with the system; it depicts the specifications of a use case. A use case diagram can portray the different types of users of a system and the various ways that they interact with the system. This diagram is typically used in conjunction with the textual use case and is often accompanied by other types of diagrams as well.

While a use case itself might drill into a lot of detail about every possibility, a use case diagram can help provide a higher-level view of the system. They provide a simplified and graphical representation of what the system must actually do.

Limitations

Limitations of Use cases include:

  • Use cases are not well suited to capturing non-interaction based requirements of a system (such as algorithm or mathematical requirements) or non-functional requirements (such as platform, performance, timing, or safety-critical aspects). These are better specified declaratively elsewhere. 
  • Use case templates do not automatically ensure clarity. Clarity depends on the skill of the writer(s). 
  • Use cases are complex to write and to understand, for both end users and developers. 
  • As there are no fully standard definitions of use cases, each project must form its own interpretation. 
  • Some use case relationships, such as extensions, are ambiguous in interpretation and can be difficult for stakeholders to understand. 
  • In Agile methodology, simpler user stories are preferred to use cases. 
  • Use case developers often find it difficult to determine the level of user interface (UI) dependency to incorporate in a use case. While use case theory suggests that UI should not be reflected in use cases, it can be awkward to remove this aspect of design, as it makes the use cases difficult to visualize. In software engineering, this difficulty is resolved by applying requirements traceability, for example with a traceability matrix. 
  • Use cases can be over-emphasized. For example, system design can be driven too literally by use cases, or use cases can be used to the exclusion of other potentially valuable requirements analysis techniques. 
  • Use cases are a starting point for test design, but since each test needs its own success criteria, use cases may need to be modified to provide separate post-conditions for each path.

Monday, March 12, 2012

Business Analysis, User Stories, and Agile Methodology

In my last post on business analysis in content management, I mentioned that methods used in business analysis include focus groups, documentation analysis, surveys, user task analysis, process mapping, stakeholder interviews, use cases, user stories, personas, storyboards, etc. In today's post, I am going to describe the user story method.

A user story is a short, simple description of a feature told from the perspective of the person who desires the new system capability, usually a user or customer of the system. It is one or a few sentences in everyday or business language that capture what a user does or needs to do as part of his or her job.

User stories are used in Agile software development methodologies as the basis for defining the functions that a business system must provide. A user story captures the "who", "what", and "why" of a requirement in a simple, concise way, often limited in detail by what can be hand-written on a small paper notecard. User stories are written by or for a business user as that user's primary way to influence the functionality of the system being developed.

User stories are a quick way of handling users' requirements without having to create formalized requirement documents and without performing administrative tasks related to maintaining them. The intention of the user story is to be able to respond faster and with less overhead to rapidly changing business requirements.

During the process of collecting user requirements, a business analyst gets together with a users' representative to identify the specific needs of the business and create user stories.

As the customer conceives the user stories, the business analyst writes them down on note cards, each with a name and a description that the customer has formulated. If the business analyst and the user find that a user story is lacking in some way (too large, complicated, or imprecise), it is rewritten until it is satisfactory, often using the INVEST guidelines.

The INVEST guidelines were created by Bill Wake to describe the characteristics of a good user story:

  • I (Independent): the user story should be self-contained, with no inherent dependency on another user story. 
  • N (Negotiable): user stories can always be changed and rewritten. 
  • V (Valuable): a user story must deliver value to the end user. 
  • E (Estimable): you must always be able to estimate the size of a user story. 
  • S (Sized appropriately, or Small): user stories should not be so big as to become impossible to plan, task, and prioritize with a reasonable level of certainty. 
  • T (Testable): the user story or its related description must provide the necessary information to make testing possible.
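The INVEST review can be pictured as a simple checklist walk-through. The sketch below is illustrative only: the criteria come from Wake's guidelines, but the data structure and function names are hypothetical, not part of any standard tool.

```python
# Hypothetical INVEST checklist: criteria names from Bill Wake's guidelines,
# structure and function names invented for illustration.

INVEST_CRITERIA = {
    "Independent": "Is the story self-contained, with no inherent dependency on another story?",
    "Negotiable": "Can the story still be changed and rewritten?",
    "Valuable": "Does the story deliver value to the end user?",
    "Estimable": "Can the team estimate the size of the story?",
    "Small": "Is the story small enough to plan, task, and prioritize with confidence?",
    "Testable": "Does the story provide enough information to make testing possible?",
}

def review_story(answers):
    """Return the INVEST criteria a story fails, given yes/no answers keyed by criterion."""
    return [criterion for criterion in INVEST_CRITERIA if not answers.get(criterion, False)]

# A story the team cannot yet estimate fails the "Estimable" check
# and goes back for rewriting.
answers = {
    "Independent": True, "Negotiable": True, "Valuable": True,
    "Estimable": False, "Small": True, "Testable": True,
}
print(review_story(answers))  # ['Estimable']
```

If the returned list is non-empty, the analyst and the customer rewrite the story and run through the questions again.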

User stories are generally not made too definite once they have been written down, because requirements tend to change during the development period.

User stories generally follow one of these templates:

As a "role", I want "goal" so that "benefit".
As a "user" I want to "do something" so that "I can accomplish goal".
As a "user" I want "function" so that "value".

There is also a shorter version:

As a "role", I want "goal/benefit"

For example:

"As the CTO, I want the system to use our existing orders database rather than create a new one so that we don’t have one more database to maintain."
"As a user, I want the site to be available 99.9% of the time I try to access it so that I don’t get frustrated and find another site to use."
"As an estimator, I want to see all items I need to estimate this season so that I can see the volume of my work."
"As a user, I want to search for my customers by their first and last names."
"As a user closing the application, I want to be prompted to save if I have made any change in my data since the last save."
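The templates above can be sketched as a tiny data structure. This is a minimal illustration, not part of any agile tool; the class and field names are my own.

```python
# A minimal sketch of the "As a <role>, I want <goal> so that <benefit>"
# template as a data structure. Class and field names are illustrative.
from dataclasses import dataclass

@dataclass
class UserStory:
    role: str
    goal: str
    benefit: str = ""   # optional: omitting it yields the shorter template

    def __str__(self):
        text = f"As a {self.role}, I want {self.goal}"
        if self.benefit:
            text += f" so that {self.benefit}"
        return text + "."

# The shorter template: no benefit clause.
story = UserStory(role="user", goal="to search for my customers by their first and last names")
print(story)  # As a user, I want to search for my customers by their first and last names.
```

In practice the card text matters far less than the conversation it triggers, but a uniform template keeps a backlog scannable.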

User stories are often written on index cards or sticky notes, stored in a shoe box, and arranged on walls or tables to facilitate planning and discussion. As such, they strongly shift the focus from writing about features to discussing them. In fact, these discussions are more important than whatever text is written.

As a central part of many agile development methodologies, user stories define what is to be built in a project. User stories are prioritized by the customer or product owner to indicate which are most important; developers then break them down into tasks and estimate them.

One of the benefits of user stories is that they can be written at varying levels of detail. We can write user stories that cover large amounts of functionality; these large user stories are generally known as epics. Here is an example epic from a desktop backup product:

"As a user, I can backup my entire hard drive."

Because an epic is generally too large for an agile team to complete in one iteration, it is split into multiple smaller stories before it is worked on. The epic above could be split into dozens (or possibly hundreds) of stories, including these two:

"As a power user, I can specify files or folders to backup based on file size, date created, and date modified."
"As a user, I can indicate folders not to backup so that my backup drive isn’t filled up with things I don’t need saved."

Agile projects use a product backlog which is a prioritized list of the functionality to be developed in a product or service. Although product backlog items can be whatever the team desires, user stories have emerged as the best and most popular form of product backlog item.

While a product backlog can be thought of as a replacement for the requirements document of a traditional project, it is important to remember that the written part of a user story (“As a user, I want…”) is incomplete until the discussions about that story occur. It is often best to think of the written part as a pointer to the real requirement. A user story could point to a diagram depicting a workflow, a spreadsheet showing how to perform a calculation, or any other artifact the product owner or team desires.

Every user story must at some point have one or more acceptance tests attached, allowing the developer to test when the user story is done and also allowing the user to validate it. Without a precise formulation of the requirements, prolonged nonconstructive arguments may arise when the product is to be delivered.
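As a sketch of what such acceptance tests might look like for the "prompted to save" story quoted earlier, here is a hypothetical example; the Document class is an invented stand-in for a real application, and only the pass/fail criteria come from the story.

```python
# Acceptance tests for: "As a user closing the application, I want to be
# prompted to save if I have made any change in my data since the last save."
# The Document class is a hypothetical stand-in for the real application.

class Document:
    def __init__(self):
        self.dirty = False          # True when there are unsaved changes

    def edit(self, text):
        self.dirty = True           # any change marks the document dirty

    def save(self):
        self.dirty = False          # saving clears the dirty flag

    def should_prompt_on_close(self):
        return self.dirty           # prompt only when unsaved changes exist

# Acceptance test 1: edits since the last save trigger the prompt.
doc = Document()
doc.edit("new text")
assert doc.should_prompt_on_close()

# Acceptance test 2: no prompt when everything has been saved.
doc.save()
assert not doc.should_prompt_on_close()
print("acceptance tests passed")
```

Each path through the story (edited vs. saved) gets its own assertion, which is exactly why stories may need separate post-conditions per path.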

Agile methodology favors face-to-face discussion over comprehensive documentation, and quick adaptation to change over following a fixed plan. User stories achieve this by:

  • Being very short. They represent small chunks of business value that can be implemented in a period of days to weeks. 
  • Allowing a developer and a user to discuss requirements throughout the project. 
  • Needing very little maintenance. 
  • Only being considered at the time of use. 
  • Maintaining a close user contact. 
  • Allowing projects to be broken into small increments. 
  • Being suited to projects where the requirements are volatile or poorly understood. Iterations of discovery drive the refinement process. 
  • Making it easier to estimate development effort. 
  • Requiring close customer contact throughout the project so that the most valued parts of the project get implemented. 

Some of the limitations of user stories in agile methodologies:

  • They can be difficult to scale to large projects. 
  • They are conversation starters rather than complete specifications, and rely on follow-up discussion to capture detail.

While both user stories and use cases serve the purpose to capture specific user requirements, there are major differences between them:

User Stories:

  • Provide a small-scale and easy-to-use presentation of information. They are generally formulated in the everyday language of the user and contain little detail, thus remaining open to interpretation; they should help the reader understand what the software should accomplish. 
  • Can be accompanied by acceptance testing procedures to clarify behavior where stories appear ambiguous. 

Use Cases:

  • Describe a process and its steps in detail, and may be worded in terms of a formal model. A use case is intended to provide sufficient detail for it to be understood on its own; it has been described as “a generalized description of a set of interactions between the system and one or more actors, where an actor is either a user or another system”. 
  • May be delivered in a stand-alone document.

Friday, March 9, 2012

Case Study - Wind River - Twiki Information Architecture

In the "Case Studies" series of my posts, I describe projects that I worked on and lessons learned from them. In this post, I am going to describe a project to re-structure the content and information architecture of a Twiki-based content management system at Wind River.

Wind River is a software engineering company that used TWiki as its content management system. TWiki is a Perl-based structured wiki application, typically used to run a collaboration platform, content management system, knowledge base, or team portal.

The content organization and information architecture in Twiki made it difficult for users to find the information they were looking for, and hence difficult to find, re-use, and update content.

This also discouraged users from adding new content, which created areas where no documentation existed; instead of being shared, knowledge was stored on personal computers, where it risked being lost because it was not backed up.

There was a lot of obsolete content because no content owners had been formally identified and no retention schedule had been set up. Collaborative work on documents and projects was accomplished by users sending links to Twiki pages; without these links it was very difficult to find information. There was no information governance in place, so content management processes were very sporadic.

The task was to re-organize the content structure and information architecture of the system and to set up information governance to solve these problems.

I strongly believe in user-centered design, so I performed a user study. I identified stakeholders within each Wind River team and created a questionnaire to collect user requirements for the system re-organization and to surface usability issues.

Based on these requirements, I re-organized the content structure and information architecture of the system. The key to this re-organization was keeping the structure simple and intuitive. I simplified navigation, created intuitive labels, made sure that no page held too much information or forced users to scroll through very long pages, verified that each page had a correct breadcrumb, and created a taxonomy of webs (the building blocks of Twiki). Based on this taxonomy, I re-organized the location of documents. I also enhanced the system search.

For each content type, document owners were identified and a retention schedule was set up, with a function to flag content reaching its expiration date. This flag function sent an email notification to the content administrator that a certain document had reached its expiration date, allowing the administrator to contact the document owner for a decision on what should be done with the document: review and update, move to an archive, or delete.
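The flagging logic can be sketched in a few lines, assuming documents are records with an owner and an expiration date. The field names and sample data below are invented for illustration; the real system sent email alerts rather than printing.

```python
# Sketch of a retention-expiration scan: find documents past their
# expiration date and report which owners to contact. Records and field
# names are illustrative; the real system emailed the content administrator.
from datetime import date

documents = [
    {"title": "Build guide", "owner": "alice", "expires": date(2012, 1, 15)},
    {"title": "Release plan", "owner": "bob", "expires": date(2012, 6, 30)},
]

def expired_documents(docs, today):
    """Return the documents whose retention period has elapsed as of `today`."""
    return [d for d in docs if d["expires"] <= today]

for doc in expired_documents(documents, today=date(2012, 3, 9)):
    # In the real system this triggered an email alert; the administrator
    # then asked the owner to review, archive, or delete the document.
    print(f'Flag "{doc["title"]}" (owner: {doc["owner"]}) for review')
```

The point of the design is that a human owner, not the system, makes the final review/archive/delete decision.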

User acceptance testing of the system was performed. Users were satisfied with the system's new information architecture and indicated that it became much easier to find information.

The system with new content structure and information architecture was deployed.

Information governance was set up. Group and individual training was conducted on an ongoing basis.

The project was a success; company management and users were very cooperative in making it so. It increased efficiency and productivity, and thus saved Wind River costs, because employees no longer wasted time searching for documents or recreating documents that already existed.

Lessons learned

1. User-centered design is paramount to project success. When you design and build a system based on users’ requirements, they are going to use it. Users have a sense of ownership of the system, which provides an excellent starting point: they know that the system you are building will be what they need.

2. Top-down support is critical to project success. Management support is a huge factor in encouraging employees to use the system and in setting up and enforcing information governance procedures.

3. Assuring users from the very beginning that they would not be left alone with the system secured their cooperation.

4. User acceptance testing helped encourage employees to start using the system. When they participate in this process, it gives them a feeling of ownership of the system.

5. Ongoing training after the system deployment with the new content structure and information architecture made user adoption smooth.

Thursday, March 8, 2012

Content Management Systems Reviews - SDL Tridion

SDL Tridion is a leading web content management system. The solution enables organizations to deliver a consistent, interactive, and highly targeted customer experience, in multiple languages, across multiple web sites and channels, including email, mobile sites, and print.

In addition to content creation, management, translation, delivery and archiving solutions, SDL Tridion provides brand management, targeted communication, multi-channel delivery and visitor interaction capabilities.

SDL Tridion integrates well with third-party software packages such as Google Enterprise Search and online metrics suites like WebTrends and Omniture.

Content Entities

SDL Tridion’s famous Building Blocks represent the core of this WCM solution. Building Blocks are Component and Page Templates, Component Presentations, Components, Pages and Schemas.

Content authors usually create Components based on certain Schemas (which define permissible content fields), followed by combining those Components with appropriate Component Templates, thus creating Component Presentations.

A Page may contain several Component Presentations. Every Page requires an appropriate Page Template in order to be rendered properly.

There are two types of Components: Components storing text and links, and Multimedia Components handling binaries (.jpg, .gif, .swf, .doc, etc.).

Pages are created in Structure Groups (i.e., website sections) within a corresponding Publication (i.e., a website).

The system comes with a set of default Page and Component Templates. Customized Page and Component Templates will need to be developed to fit the layout and code of your particular web site.

Metadata can usually be handled either on the Structure Group/Page level, or through embeddable metadata Schemas.

From the development standpoint, all Schemas and Templates are created in a development environment and then transferred, using a tool called Content Porter, to test and production environments, where they become available to end users.

Content Versioning

Versioning is quite simple and serves two purposes: viewing History and comparing versions. The History feature provides data on all modifications to an item (user names and timestamps included). Using the Compare tool, you can compare any two versions of an item and see the exact changes highlighted in various colors.

Workflow

The basic workflow is Ready for Editing – Ready for Publishing, with the ability to modify and add other levels of approval.

You can create and customize your workflow in MS Visio with a Tridion plug-in.

There are additional modules that allow for inline and e-mail notifications. The standard, out-of-the-box version includes commentary capabilities. Tridion’s Event System can also be used to introduce a number of automatic (vs. manual) activities in the content management process.

Workflow is natively integrated into the GUI and Microsoft Outlook, allowing users to perform various activities from their inbox.

SDL Tridion offers the following products:

Content Manager (core product)

SDL Tridion’s Content Manager Solution uses proprietary BluePrinting technology that allows organizations to reuse and adapt content, layout, and other functionality for different pages and web sites. This ensures that companies can address target audiences in different regions around the world in a "same but different" manner. The technology enables the user to maintain centralized control of brand and message while allowing for local differentiation. This technology makes sharing complex content over multiple web sites fast and easy while maintaining centralized control and brand consistency.

BluePrinting technology has proven its value to organizations that have any of the following requirements:

  • globalized websites;
  • multi-website management;
  • translation management;
  • brand management;
  • target audience marketing;
  • multi-channel marketing.

Digital asset repository is managed through Multimedia Components that can store major binary extensions, which can be configured per your requirements.

Content Manager Explorer (main user interface to Content Manager) - provides rich functionality for basic to advanced content management tasks within a browser.

SiteEdit

Provides key contributors with a WYSIWYG, browser-based interface and a collaborative environment for many online communication tasks. It is easy to use, ensuring lower training costs and easy adoption. There is version comparison and roll-back to previous versions. It allows users to perform simple actions like re-arranging content blocks, adding new content, editing existing content, and spell-checking.

WebDAV Connector (Windows Explorer access to SDL Tridion WCM and content)

Provides access to SDL Tridion WCM content using Windows Explorer. Contributors can add, edit, delete, and use content in the same way they would use the Windows file system, using the most appropriate desktop application for the task they need to perform.

Word Connector (Microsoft Word-based word processing tool for content creation)

For occasional content contributors who need to create simple text for the organization’s Web site in the word processing tool they know best, Microsoft Word. Authors can open, edit, and create structured XML content using Word and save this content directly to Content Manager.

Multilingual Support

Multilingual support is provided by Translation Manager, which enables companies to configure their translation needs within their existing BluePrint structure. This includes defining both the source and target languages of typical translation processes. Users can set up Translation Manager for a certain publication and specify target languages, as well as use workflow to specify the translation process.

For each translation process users can set up criteria, such as which translation agency to use. When properly configured, users can initiate a translation in one or all Child Publications using Translation Manager, which will prompt a translation job creation at a chosen agency.

Translation jobs can be seen in the GUI right below the Publishing Queue. From the translation jobs list users can send the job to the TMS (Translation Management System) and check the status of the translation.

Content Delivery Architecture

Content Delivery is mainly based on two products: Presentation Servers and Dynamic Content Broker. In addition, there are other modules that are part of Content Delivery, including Dynamic Linking, Personalization and Profiling, Content Distributor and Content Deployer.

The separation of Content Delivery from Content Management ensures that only approved content goes outside the firewall. Delivery of both dynamic and static content is possible, using either the Dynamic Content Broker or the Presentation Server approach.

Dynamic Content Broker assembles pages containing dynamic content based on configurable queries. The Content Broker’s API is public and can be used to retrieve published content from custom-built applications. It allows organizations to deliver online content based on page context, queries, and visitor profiles, and provides the ability to strike the best balance between static and dynamic web site content.

On the static side, Content Delivery System (CDS) distributes published content from the Content Manager to Presentation Servers, where it is stored in either a file system or a relational database.

Presentation Servers provide storage management, link management and cache management to manage large, complex and high-performance environments. They also manage Dynamic Linking feature responsible for resolving and rendering of links between published content items.

All environments, with the possible exception of test/acceptance, require a dedicated machine.

Content Porter

Content Porter is a Windows client used to import and export code, content, and other items from one environment to another (up and down the chain). Content Porter can connect to any OLEDB or ODBC data source. It ensures a structured quality-control process for all online content and allows organizations to transfer any type of content managed in Content Manager between different environments.

Archive Manager

Automates web site archiving processes, providing the capability to retrieve an archived web page or entire site for a specific date, time, and visitor profile, and to view these pages with the original content and layout. Enables an organization to comply with regulatory requirements and record all versions of web site pages.

Business Connector

Integrates with other applications, thereby allowing companies to include information stored in external systems such as product catalogs and inventories.

Content Distributor

For organizations with an international infrastructure that need to ensure reliable, scheduled content distribution to all web servers. It ensures content consistency and compliance through many different secure transport providers.

Personalization and Profiling

The Personalization and Profiling module (formerly called WAI) is aimed at creating personalized web sites and web pages catering to specific audiences. It works with both explicit and implicit profiles, ensuring that content is personalized based on information that visitors have provided.

Communications Statistics

The Tridion Communication Statistics module provides in-site views of how your content is doing. Users can track how their content is performing by viewing web pages with stats graphs on them in a staging environment.

Monday, March 5, 2012

Content Management Systems Reviews - Documentum - CenterStage

In my last post on Documentum, I described one of its collaboration products - eRoom. In today's post, I am going to describe another collaboration product of Documentum - CenterStage.

CenterStage delivers the benefits of enterprise content management, advanced search, and collaboration tools on a single architecture.

It allows users to:

  • manage and visually organize project, team, and corporate work information; 
  • launch projects with space and content templates; 
  • work with team members on documents in public and private workspaces; 
  • find information wherever it resides; 
  • gain access to this information from anywhere.

CenterStage provides public and team workspaces where contributors can share and exchange ideas and activities. There is cross-project visibility and awareness for easy management of simultaneous projects. Users can establish roles and permissions for their team members, set workspace and content policies, and use templates to ensure best practices.

The CenterStage community model brings together people and content with similar interests or objectives. A community member can become aware of many contributors and their activities across different workspaces.

CenterStage provides a wide range of collaboration tools: wikis, blogs, inline authoring, discussion threads, tagging, and ratings. These tools allow “collaboration with context” – the ability to see inter-related items and information in one view.

Standalone tables

CenterStage provides standalone data tables to facilitate the management of content collections. A structured arrangement of related content organized into fields, columns, and data tables provides a way for teams to manage information such as lists of planned projects, contact information, and action items. Data tables organize information into a series of related entries, making them useful for managing items as simple as to-do lists or as complex as inventory tracking data. With standalone data tables, information is easy to manage, track, and update.
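Conceptually, such a data table behaves like a list of records with named fields that can be filtered and updated. The sketch below is generic Python for illustration only, not the CenterStage API; the sample rows and function name are invented.

```python
# A conceptual illustration of a standalone data table as a list of records
# with named fields. Generic Python, not the CenterStage API; sample data
# and function names are invented.

action_items = [
    {"item": "Draft launch page", "owner": "dana", "status": "open"},
    {"item": "Review taxonomy", "owner": "lee", "status": "done"},
    {"item": "Migrate old wiki pages", "owner": "dana", "status": "open"},
]

def filter_rows(table, **criteria):
    """Return the rows whose fields match all the given criteria."""
    return [row for row in table if all(row.get(k) == v for k, v in criteria.items())]

# Filtering mirrors how a team would pull one member's open action items.
open_for_dana = filter_rows(action_items, owner="dana", status="open")
print([row["item"] for row in open_for_dana])
```

The value of the table form is that each entry carries the same named fields, so tracking and updating stays consistent across the collection.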

Search and discovery

CenterStage includes an advanced search feature. With one query, users can search an unlimited number of information repositories such as SharePoint, file shares, email archives, ERP systems, and ECM systems. The results are filtered, merged, and organized into logical groupings so that users can navigate to the most relevant search results.

The search and discovery features are further enhanced by different viewing capabilities, providing the ability to determine the contents of any file or folder without opening it. In addition to the regular thumbnail view, users can hover the mouse over any file to display its metadata or employ the slides view to grasp the actual content of any file. Search and discovery enables rapid access to the most relevant content by extracting metadata such as company name, place, and topic, and then dynamically filtering the results.

Governance, risk, and compliance

Documentum Content Server is the back end of the CenterStage solution and addresses the need to ensure that all information complies with regulations, legal stipulations, and best practices. The content server provides a rich set of content management services and a comprehensive infrastructure for all content applications, so implementation and administration are simplified. It also provides the scalability, robust functionality, and service orientation needed for global enterprise deployments.

With the content server, companies can store and manage all types of content, including HTML and XML, graphics, multimedia, and traditional documents created with desktop applications. All content can be safely managed and archived. There is a full audit trail at all stages of content creation, approval, and use, while information retention and disposal policies are enforced. This in turn ensures optimum network performance by eliminating isolated pockets of data and content stranded across the enterprise.

Other benefits

Branch office caching services – provide quick access to any type of content regardless of bandwidth constraints or network latency.

Content storage services – allocate content across storage tiers based on its changing value and access requirements.

Media transformation services – transform and manage content such as images, video, and other rich media.

Information rights management – secure information no matter where it travels to maintain control over that information.

Transactional content management – accelerate business processes such as invoice processing or case management.

Web content management – manage the content, underlying structure, and publishing process for web sites and portals.

CenterStage Key Features


  • Team and community workspaces 
  • Wiki, blogs, discussion forums, tagging, and RSS feeds 
  • Organization of structured content 
  • Standalone data tables to manage content collections 
  • Workspace and page templates 
  • Lifecycles 
  • Component-based UI and composition model 
  • Advanced search and discovery 
  • Policy-based configuration 
  • Federated search 
  • Access control 
  • Guided navigation 
  • Content analytics

Thursday, March 1, 2012

Knowledge Centered Support

Knowledge Centered Support (KCS) is a methodology and a set of practices and processes that focus on knowledge as a key asset of the customer/technical support organization. Unlike the traditional add-on process of knowledge engineering, KCS is an integral part of day-to-day operation in support centers. KCS becomes the way people solve problems and creates knowledge as a by-product of problem solving.

While KCS is enabled by technology, KCS is primarily about people. People are the source of knowledge. KCS has proven that the best people to capture and maintain support knowledge are the people who create and use it every day - the support analysts.

Goals of KCS are:

1. Create content as a by-product of solving problems within the incident management and problem management processes. As support analysts capture information related to an incident, they create knowledge that can be reused within the support process by other support analysts, as well as by customers with access to a self-service knowledge base.

2. Evolve content based on demand and usage. As people interact with the knowledge base within the incident management process, they should review knowledge before delivering it to a customer. If they discover the need to correct or enhance the knowledge, they fix it at that time, or flag it for someone else to fix if they do not have the required access to the knowledge base. Under this model, knowledge is evolved just-in-time, based on demand, instead of just-in-case. This lowers the cost of knowledge management.

3. Develop a knowledge base from the organization's collective experience to date. New knowledge captured within the incident management process reflects a single interaction and has not been validated or verified beyond the initial incident. Knowledge in this less-trusted state is referred to as Draft knowledge.

It is not until reuse occurs that trust increases. At some point the knowledge is marked as trusted and either approved for internal use or published for self-service. The knowledge base under the KCS methodology includes knowledge at different states of trust and visibility. This collective-experience approach challenges the traditional thinking that all knowledge in a knowledge base must be perfect, validated, and highly trusted.

4. Reward learning, collaboration, sharing and improving. The culture of the organization should change to recognize the value of an individual based on the knowledge they share that enables the organization to be more effective and efficient.

KCS breaks through the limitations of current support strategies and enables support organizations to deliver greater value with more efficiency. The secret? Capitalizing on what they already have - knowledge. This increased value is created and managed by capturing the collective experience of solving problems and answering questions, making it reusable, and evolving it to reflect organizational-level knowledge.

KCS takes teamwork to a new level. The organization must shift to a perspective that sees knowledge as an asset owned and maintained by the team, not by an individual or a small group of dedicated content creators. The focus of the team is to capture and improve the collective knowledge, not only to solve individual customer issues, but also to improve organizational learning.

For optimum performance, KCS practices and the tools that support them must be integrated with other support and business systems, including incident management, change management, and service level management processes and systems.

Companies that have implemented KCS in both their internal and external support organizations report dramatic improvements in incident resolution, training times, customer satisfaction, and analyst job satisfaction. As a result, they are realizing substantial savings in operating costs while also seeing improvements in service levels.