Wednesday, January 18, 2012

Content Management Systems Reviews - TeamSite

In my previous posts, I described component content management and the systems used to manage component content.

A component content management system (CCMS) is a content management system that manages content at a granular or component level rather than at the document level. There are a few of them on today's market, including Interwoven, Documentum, AuthorIT, DocZone, Vasont, SiberLogic, Trisoft, Astoria, and Tridion. In this post, I am going to describe Interwoven TeamSite.

TeamSite was developed by Interwoven, which was acquired by Autonomy, which in turn was acquired by HP. Autonomy TeamSite is a market-leading content management system for content authoring, site design and layout, content targeting, advanced analytics, workflows, and archiving.

Interwoven TeamSite® is the industry’s most advanced Web Content Management System. For enterprises, TeamSite powers corporate and ecommerce web sites, employee intranets, support portals, marketing microsites, and extranets as well as e-mail, wireless, and print.

Product Architecture

TeamSite is built on a high-performance repository that stores and manages all asset types, including file-system content, database content, XML, digital and brand assets, media files, documents, and application code.

TeamSite supports multi-channel content strategy. By exposing and re-purposing content across multiple customer, partner, and employee touchpoints, companies get more value from their investments in content development and deliver more consistent messages.

Features

TeamSite provides a set of application services to manage content and automate business processes, including:

  • flexible, easy to manage workflows for automating critical business processes; 
  • library services for creating, browsing, searching, transforming, and viewing assets; 
  • parallel development support; 
  • templating capabilities; 
  • version control, access control, and search services; 
  • metadata management services; 
  • archival services; 
  • multi-stage deployment and provisioning; 
  • taxonomy and navigation management.

Multi-site management and archiving

TeamSite has a branching and collaborative development model. This approach enables organizations to manage multiple sites — corporate sites, micro-sites, intranets, extranets, support sites, portals, or multi-channel publishing initiatives — all from the same infrastructure. TeamSite maintains archived copies of individual assets, as well as snapshots of whole sites, so users can compare, track, or roll back individual assets or applications, site sections, whole websites, or documents.

Parallel, collaborative content development

TeamSite’s extensive parallel development capabilities are ideal for companies that require frequent changes to multiple sites, want to retain control, and need to deliver high-quality experiences. With parallel development, separate teams can work independently on different projects without additional software or hardware costs. Collaboration capabilities allow users to simultaneously make changes to content, data, and code on the same site. Users visually review and merge changes, then test the entire website as if it were live.

Administration and security

TeamSite has robust security with granular permission management. Sophisticated roles and rules features make it easy for non-technical business users to manage multiple initiatives and sites within specific, predefined boundaries. Using TeamSite to delegate administration and contribution reduces demand for resources and enables companies to get accurate, relevant content to the web faster.

Ease-of-use

Designed for all types of users, TeamSite provides the industry’s most intuitive, simple, and customizable user experience. The platform is easy enough to use for business professionals and other non-technical content contributors, yet powerful enough for project managers, site designers, developers, and administrators.

TeamSite includes:

ContentCenter Standard — a clear, easy portal-like interface with wizard-driven, point-and-click content contribution features for non-technical business users; interacts with business content, workflows, and forms.

ContentCenter Professional — an advanced interface for power users, project managers, and administrators, who can perform the most advanced content management with just a few clicks. Project managers can quickly and easily track and manage the content publishing lifecycle. Administrators can take advantage of in-context administrative functions to set up environments and monitor content contribution.

FormsPublisher — provides structured, form-based content authoring for non-technical business users. FormsPublisher is most often used by frequent content contributors as an interface for authoring content types such as press releases, announcements, or events, as well as content that is likely to be repurposed across multiple channels (such as Web, wireless, and print) and/or multiple sites.

SitePublisher — a WYSIWYG content contribution interface with a reusable component-based architecture that enables powerful drag-and-drop assembly for efficient page management, as well as in-context editing for point-and-click page editing. “Smart” templates, out-of-the-box site functionality, and point-and-click navigation management also help organizations quickly create dynamic web sites, and then easily customize their presentation, content, and functionality to meet changing business needs.

Point-and-click customization

By easily making customizations directly on the page, with point-and-click simplicity, page owners can quickly modify content and its look and feel to fit changing business requirements.

In-context review and edit

Authorized team members can easily modify content in the context of any Web page they are browsing—without having to go to a separate content management interface.

Portal Integration

Interwoven includes connectors for leading enterprise portal applications including SAP, BEA, and IBM. The connectors are designed to deliver content, code, and metadata to portal repositories and enable business users to manage content directly from within the portal interface. Customers can also integrate TeamSite with leading enterprise applications software from vendors, including Siebel, Peoplesoft, and Oracle.

Platform extensions

TeamSite platform extensions ensure seamless integration across the enterprise. TeamSite is fully compatible with all Interwoven Web solutions, including:

Interwoven LiveSite - a content delivery engine that provides delivery of dynamic, targeted, and interactive content.

Interwoven ReportCenter - allows site administrators to track and report all content management and publishing activities.

Interwoven Content Transformation Services - allow business users to automatically transform documents into PDFs and HTML—prior to publishing.

Interwoven MediaBin Digital Asset Management Server - for rich media asset management.

Interwoven WorkSite Collaborative Document Management Server - for document security, storage, collaboration, and retention.

Interwoven MetaTagger - content intelligence services for automated metadata extraction and recommendation.

Interwoven OpenDeploy - content distribution and publishing services - for multi-stage content and code provisioning.

Key Features


  • Advanced information management - Provides extensive search services so people find content quickly; associates content with editing, testing, reuse, auditing, and publication processes.
  • Reusable component-based architecture - Enables rapid site rollout with reuse of content and site functionality for creating and managing Web pages and entire sites without additional coding.
  • Drag-and-drop layout - Lay out sites, templates, and pages by dropping components onto pages, dragging them into place, and making powerful customizations with a point-and-click interface.
  • Navigation management - Easily manage site navigation through a drag-and-drop, point-and-click interface.
  • Inheritable templates - Sophisticated templates automatically distribute updates and permissions across sites.
  • Metadata management services.
  • Visual annotate - Supports easy commenting and page mark-up directly from the browser.
  • Visual publish - Empowers content contributors to access and edit content directly from within the Web browser.
  • E-mail interface - Supports in-line, actionable content management tasks such as approve, edit, tag, and preview, directly from Microsoft Outlook.
  • Front Office - Enables content authoring, submission, and publishing directly from Microsoft Office.
  • Administration user interface (UI) localization - Full internationalization and localization of the administrative UI for English, French, German, Italian, Japanese, Spanish, and Traditional and Simplified Chinese, all from a single instance.
  • Optimized workflow with an easy, drag-and-drop visual modeler - Extends a combination of flexible, rigid, or fully customized creation, review, notification, and publishing processes, supporting quality controls and business process automation of multiple online and offline initiatives.
  • XML object store - Manages XML content and varying object models for element-level collaboration, reuse, management, version control, and search.
  • Security access control - Provides a configurable asset locking model with SSL/LDAP, Microsoft Active Directory, and CA SiteMinder® support, and native operating system file and directory permissions support.
  • Central control and delegated administration - Allows centralized control of the entire platform with flexible groups and roles-based delegated administration.
  • Granular permissions - Simplifies control of activation, restriction, and access rights for any role or user, for each function, folder, or asset within the system.

Monday, January 16, 2012

Component Content Management

In my last post, I described how DITA is used in dynamic content management. I will continue the subject of dynamic content management in this post.

DITA was conceived as a model for improving reuse through topic-oriented modularization of content. Instead of creating new content or copying and pasting information which may or may not be current and authoritative, organizations manage a repository of content assets – or DITA topics – that can be centrally managed, maintained and reused across the enterprise. This helps to accelerate the creation and maintenance of documents and other deliverables and to ensure the quality and consistency of the content organizations publish.

Dynamic content management is also called component content management. It is also called single source publishing. DITA is its foundation. A component content management system (CCMS) is used for managing component content. A component content management system (CCMS) is a content management system that manages content at a granular or component level rather than at the document level. Examples of such systems are Interwoven, Documentum, AuthorIT, DocZone, Vasont, SiberLogic, Trisoft, Astoria, Tridion.

What exactly is a component? Each component represents a single topic, concept or asset (e.g., image, table, product description). Components can be as large as a chapter or as small as a definition or even a word. Components in multiple content assemblies can be viewed as components or as traditional documents. Reuse allows the core component to be edited and maintained in one place, and then be assembled into thousands of documents where it is needed.

Each component is only stored one time in the content management system, providing a single, trusted source of content. These components are then reused (rather than copied and pasted) within a document or across multiple documents. This ensures that content is consistent across the entire documentation set. Each component has its own lifecycle (owner, version, approval, use) and can be tracked individually or as part of an assembly.
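To make the single-source model concrete, here is a minimal Python sketch (the class names, fields, and sample components are invented for illustration, not taken from any particular CCMS) showing components stored once and assembled into multiple documents:

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """A single reusable content component with its own lifecycle."""
    component_id: str
    body: str
    owner: str
    version: int = 1
    approved: bool = False

@dataclass
class Document:
    """A document is just an ordered assembly of component references."""
    title: str
    component_ids: list = field(default_factory=list)

# The repository stores each component exactly once.
repository = {
    "warn-voltage": Component("warn-voltage", "Disconnect power before servicing.", "tech-pubs"),
    "intro-widget": Component("intro-widget", "The widget converts AC to DC.", "engineering"),
}

user_guide = Document("User Guide", ["intro-widget", "warn-voltage"])
service_manual = Document("Service Manual", ["warn-voltage"])

def render(doc, repo):
    """Assemble a document by pulling the single source of each component."""
    return "\n".join(repo[cid].body for cid in doc.component_ids)

# Editing the component in one place updates every document that reuses it.
repository["warn-voltage"].body = "Disconnect power and wait 60 seconds before servicing."
repository["warn-voltage"].version += 1

print(render(user_guide, repository))
print(render(service_manual, repository))
```

The point of the sketch is that the warning text is edited and versioned once, and both deliverables pick up the change at assembly time.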

Component Content Management can be regarded as an overall process for originating, managing, and publishing content right across the enterprise and to any output.

Component content management provides significant benefits and cost savings over traditional document authoring and maintenance methods. Some of these are:

  • greater consistency and accuracy;
  • reduced maintenance costs;
  • reduced delivery costs;
  • reduced translation costs.

And more specifically:

  • Faster time to market because authors spend far less time creating and recreating the same content, reviewers spend less time reviewing, translators spend less time translating. Publishing to print, Help, and Web formats is fully automated. This is achieved by controlling standards, eliminating duplication, and effectively managing creation, localization, and publishing of content.
  • Efficient use of resources: by eliminating repetitive creation and maintenance, more of your resources can be devoted to improving the quality of the content and adding value to your documentation.
  • Slashed translation costs: content is translated only once no matter how often it is reused. Translators only ever work on new or changed source content, so you don’t pay for them to handle unchanged text. Real projects have shown reductions in translation word count in excess of 30%.
  • Improved quality and usability of content: through easy definition and enforcement of standards you can guarantee consistent documentation structure and formatting, increasing readability and usability. Using single-source content ensures 100% consistency wherever it appears.
  • Improved workplace satisfaction: free authors from tedious, time-consuming tasks such as formatting and repetitive updates, so they can concentrate on creating and improving content. Reviewers gain by reviewing content only once, regardless of the number of end deliverables. Writers save 95% of the time they usually spend formatting content.
  • Increased customer satisfaction: consistent, accurate documentation of all types means fewer calls to customer support, because you are providing the right information, at the right time, in the right format.

Generating content takes time and money. As such, content should be treated as the valuable business asset that it is. To get maximum value from your content, you should be able to do a number of things:

  • You should be able to re-use content across documents without copying, so that you can write it once, and maintain it in a single place no matter how many times you have used it.
  • You should be able to use content created for one purpose equally well in other contexts and for other purposes.
  • You should be able to translate re-used content once and have it automatically reflected anywhere it is used.
  • You should be able to publish to print, help, and web outputs without having to modify or make different versions of your content.

These measures provide the potential for increasing the quality and consistency of your documentation, for reducing the cost and time involved in producing it, and for gaining more value from every piece of content that you create.

In my future posts, I will describe component content management systems.

Saturday, January 14, 2012

DITA and Dynamic Content Management

In my previous post on DITA, I mentioned that DITA, Darwin Information Typing Architecture, is an XML-based architecture for authoring, producing, and delivering information. In this post, I am going to describe more details about DITA and how it is used in content management.

At the heart of DITA, representing the generic building block of a topic-oriented information architecture, is an XML document type definition (DTD) called the topic DTD. The point of the XML-based Darwin Information Typing Architecture (DITA) is to create modular technical documents that are easy to reuse with varied display and delivery mechanisms.

Main features of the DITA architecture

As the "Architecture" part of DITA's name suggests, DITA has unifying features that serve to organize and integrate information:

Topic orientation. The highest standard structure in DITA is the topic. Any higher structure than a topic is usually part of the processing context for a topic, such as a print-organizing structure or the navigation for a set of topics.

Reuse. A principal goal for DITA has been to reduce the practice of copying content from one place to another as a way of reusing content. Reuse within DITA occurs on two levels:

Topic reuse. Because of the non-nesting structure of topics, a topic can be reused in any topic-like context.

Content reuse. DITA provides each element with a conref attribute that can point to any other equivalent element in the same or any other topic.
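As a rough illustration of how conref-style reuse can be resolved at build time, the following sketch uses Python's standard xml.etree.ElementTree on a simplified, DITA-like fragment; the topic content and the resolution logic are illustrative only, not the behavior of any specific DITA toolkit:

```python
import xml.etree.ElementTree as ET

# Two simplified, DITA-like topics; the element in "reused" pulls its content
# from "source" through a conref attribute of the form "topic-id/element-id".
xml_doc = """
<topics>
  <topic id="source">
    <p id="safety">Disconnect power before servicing the unit.</p>
  </topic>
  <topic id="reused">
    <p conref="source/safety"/>
  </topic>
</topics>
"""

root = ET.fromstring(xml_doc)

# Index every element that carries an id, keyed by "topic-id/element-id".
index = {}
for topic in root.findall("topic"):
    for elem in topic.iter():
        if elem is not topic and "id" in elem.attrib:
            index[f"{topic.get('id')}/{elem.get('id')}"] = elem

# Resolve conrefs by copying text (and any children) from the referenced element.
for elem in root.iter():
    target_key = elem.get("conref")
    if target_key and target_key in index:
        source = index[target_key]
        elem.text = source.text
        elem.extend(list(source))

print(ET.tostring(root.find(".//topic[@id='reused']"), encoding="unicode"))
```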

Specialization. Any DITA element can be extended into a new element.

Topic specialization. Applied to topic structures, specialization is a natural way to extend the generic topic into new information types (or infotypes), which in turn can be extended into more specific instantiations of information structures. For example, a recipe, a material safety data sheet, and an encyclopedia article are all potential derivations from a common reference topic.

Domain specialization. Using the same specialization principle, the element vocabulary within a generic topic can be extended by introducing elements that reflect a particular information domain served by those topics. For example, a keyword can be extended as a unit of weight in a recipe, as a part name in a hardware reference, or as a variable in a programming reference.

Property-based processing. The DITA model provides metadata and attributes that can be used to associate or filter the content of DITA topics with applications such as content management systems, search engines, etc.
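A minimal sketch of this kind of property-based filtering, using an invented audience attribute and simplified markup rather than a real DITAVAL filter file:

```python
import xml.etree.ElementTree as ET

topic = ET.fromstring("""
<topic id="install">
  <p>Place the unit on a flat surface.</p>
  <p audience="administrator">Edit the service configuration file.</p>
  <p audience="end-user">Ask your administrator to complete setup.</p>
</topic>
""")

def filter_topic(root, audience):
    """Drop elements whose audience attribute does not match the target audience."""
    to_remove = [
        (parent, child)
        for parent in root.iter()
        for child in list(parent)
        if child.get("audience") not in (None, audience)
    ]
    for parent, child in to_remove:
        parent.remove(child)
    return root

# Publishing for end users keeps untagged content plus end-user-only content.
print(ET.tostring(filter_topic(topic, "end-user"), encoding="unicode"))
```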

Extensive metadata to make topics easier to find. The DITA model for metadata supports the standard categories for the Dublin Core Metadata Initiative. In addition, the DITA metadata enables many different content management approaches to be applied to its content.

Universal properties. Most elements in the topic DTD contain a set of universal attributes that enable the elements to be used for selection, filtering, content referencing, and multi-language support.

Taking advantage of existing tags and tools. Rather than being a radical departure from the familiar, DITA builds on well-accepted sets of tags and can be used with standard XML tools.

Leveraging popular language subsets. The core elements in DITA's topic DTD borrow from HTML and XHTML, using familiar element names like p, ol, ul, and dl within an HTML-like topic structure. In fact, DITA topics can be written, like HTML, for rendering directly in a browser.

Leveraging popular and well-supported tools. The XML processing model is widely supported by a number of vendors and translates well to the design features of the XSLT and CSS stylesheet languages defined by the World Wide Web Consortium and supported in many transformation tools, editors, and browsers.

Typed topics are easily managed within content management systems as reusable, stand-alone units of information. For example, selected topics can be gathered, arranged, and processed within a delivery context to provide a variety of deliverables to varied audiences. These deliverables might be a booklet, a web site, a specification, etc.

At the center of these content management systems are fundamental XML technologies for creating modular content, managing it as discrete chunks, and publishing it in an organized fashion. These are the basic technologies for "one source, one output" applications, sometimes referred to as Single Source Publishing (SSP) systems.

These capabilities can be pictured as three concentric rings. The innermost ring contains capabilities that are needed even when using a dedicated word processor or layout tool, including editing, rendering, and some limited content storage capabilities. In the middle ring are the technologies that enable single-sourcing content components for reuse in multiple outputs. They include a more robust content management environment, often with workflow management tools, as well as multi-channel formatting and delivery capabilities and structured editing tools. The outermost ring includes the technologies for smart content applications.

It is good to note that smart content solutions rely on structured editing, component management, and multi-channel delivery as foundational capabilities, augmented with content enrichment, topic component assembly, and social publishing capabilities across a distributed network.

Content Enrichment/Metadata Management: Once a descriptive metadata taxonomy is created or adopted, its use for content enrichment will depend on tools for analyzing and/or applying the metadata. These can be manual dialogs, automated scripts and crawlers, or a combination of approaches. Automated scripts can be created to interrogate the content to determine what it is about and to extract key information for use as metadata. Automated tools are efficient and scalable, but generally do not apply metadata with the same accuracy as manual processes. Manual processes, while ensuring better enrichment, are labor intensive and not scalable for large volumes of content. A combination of manual and automated processes and tools is the most likely approach in a smart content environment. Taxonomies may be extensible over time and can require administrative tools for editorial control and term management.
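As a toy illustration of the automated approach (real enrichment pipelines use entity extraction and taxonomy matching, which this does not attempt), a script can propose candidate metadata terms by matching document text against a controlled vocabulary:

```python
import re
from collections import Counter

# An invented fragment of a controlled vocabulary / taxonomy.
CONTROLLED_VOCABULARY = {"installation", "safety", "maintenance", "warranty", "firmware"}

def suggest_metadata(text, vocabulary, max_terms=3):
    """Suggest metadata terms: vocabulary words ranked by frequency in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w in vocabulary)
    return [term for term, _ in counts.most_common(max_terms)]

document = (
    "This chapter covers installation and safety. Follow the safety "
    "instructions before installation, and check the firmware version."
)

print(suggest_metadata(document, CONTROLLED_VOCABULARY))
# ['installation', 'safety', 'firmware']
```

A human reviewer would typically confirm or correct the suggested terms, which is the combined manual/automated approach described above.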

Component Discovery/Assembly: Once data has been enriched, tools for searching and selecting content based on the enrichment criteria will enable more precise discovery and access. Search mechanisms can use metadata to improve search results compared to full text searching. Information architects and content managers can use search to discover what content exists, and what still needs to be developed to proactively manage and monitor the content. These same discovery and search capabilities can be used to automatically create delivery maps and dynamically assemble content organized using them.

Distributed Collaboration/Social Publishing: Componentized information lends itself to a more granular update and maintenance process, enabling several users to simultaneously access topics that may appear in a single deliverable form to reduce schedules. Subject matter experts, both remote and local, may be included in review and content creation processes at key steps. Users of the information may want to "self-organize" the content of greatest interest to them, and even augment or comment upon specific topics. A distributed social publishing capability will enable a broader range of contributors to participate in the creation, review and updating of content in new ways.

Federated Content Management/Access: Smart content solutions can integrate content without duplicating it in multiple places, rather accessing it across the network in the original storage repository. This federated content approach requires the repositories to have integration capabilities to access content stored in other systems, platforms, and environments. A federated system architecture will rely on interoperability standards (such as CMIS), system agnostic expressions of data models (such as XML Schemas), and a robust network infrastructure (such as the Internet).
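A highly simplified sketch of federated access: run the same query against several repositories over HTTP and merge the results. The endpoint URLs and the JSON response shape are placeholders, not a real CMIS binding; a production implementation would use the query services each repository actually exposes:

```python
import requests

# Placeholder repository endpoints; in practice these would be the query
# services (for example CMIS bindings) exposed by each content repository.
REPOSITORIES = {
    "teamsite": "https://cms.example.com/api/search",
    "fileshare-index": "https://index.example.com/api/search",
}

def federated_search(term):
    """Run the same query against every repository and merge the hits."""
    merged = []
    for name, url in REPOSITORIES.items():
        try:
            response = requests.get(url, params={"q": term}, timeout=10)
            response.raise_for_status()
            for hit in response.json().get("results", []):
                hit["repository"] = name  # remember where the content lives
                merged.append(hit)
        except requests.RequestException as error:
            print(f"Skipping {name}: {error}")
    # Content stays in its source repository; only result metadata is merged.
    return sorted(merged, key=lambda hit: hit.get("score", 0), reverse=True)

if __name__ == "__main__":
    for hit in federated_search("retention schedule"):
        print(hit["repository"], hit.get("title"))
```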

These capabilities address a broader range of business activity and therefore fulfill more business requirements than single-source content solutions. Assessing your ability to implement these capabilities is essential in evaluating your organization's readiness for a smart content solution.

Thursday, January 12, 2012

Information Governance Uncovered

Content management is useless without information governance; in fact, without governance it hardly exists. You cannot successfully manage content if you don't have policies and procedures in place to govern it.

In my previous post on information governance, I mentioned that information governance is a set of structures, policies, procedures, processes, and controls implemented to manage information at an enterprise level, supporting an organization's immediate and future regulatory, legal, risk, environmental, and operational requirements. Let's look closely at these structures, policies, procedures, processes, and controls.

Content Types

Before uploading content into your CMS, you need to determine what types of content you have now and/or will have in the future. Knowing this will help you to set up the taxonomy and metadata accordingly and to make sure that your CMS contains documents that you would expect to find there. Uploading documents that are not included in your content types list would create havoc in your CMS. If you have multiple systems in place, such as document control systems, CRM, ERP, etc., you need to determine what documents each system will contain and what interactions these systems will have with each other.
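One way to make this concrete is to keep the agreed content types in a small registry and reject uploads that do not declare one of them. The sketch below is illustrative; the type names and required metadata fields are invented:

```python
# Invented list of agreed content types and the metadata each one requires.
CONTENT_TYPES = {
    "policy":    {"required_metadata": ["document_owner", "review_date"]},
    "procedure": {"required_metadata": ["document_owner", "department"]},
    "record":    {"required_metadata": ["document_owner", "retention_class"]},
}

def validate_upload(content_type, metadata):
    """Return a list of problems; an empty list means the upload is acceptable."""
    problems = []
    if content_type not in CONTENT_TYPES:
        problems.append(f"'{content_type}' is not an approved content type")
        return problems
    for field in CONTENT_TYPES[content_type]["required_metadata"]:
        if not metadata.get(field):
            problems.append(f"missing required metadata field '{field}'")
    return problems

print(validate_upload("procedure", {"document_owner": "QA"}))
# ["missing required metadata field 'department'"]
print(validate_upload("memo", {}))
# ["'memo' is not an approved content type"]
```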

Document Owners

You have a content management system (CMS) in place and you have uploaded your documents there. And you keep uploading them. At some point in time, you are going to have thousands of documents in your CMS. They can't just stay there indefinitely without somebody taking some action on them, such as updating them, moving them to the archive, deleting them, etc. In addition, when users look at these documents, they may have questions about the content of these documents. Somebody has to own these documents. Each document should have a document owner; it could be a group name. The document owner should be entered into the document metadata. This will allow users and the CMS administrator to find the owners of documents.

Retention Schedule

If you upload documents into your CMS and do nothing about them, the time will come when your search for documents will retrieve obsolete documents, which will flood the system. In order to prevent this from happening, a retention schedule needs to be put in place. Determine the document types for your content. For each document type, determine the period of time during which this document type is current and active.

For each document type, set up a workflow which would trigger an email to the administrator when a document of this type has reached its expiration date. This workflow is based on the expiration date of the document type. Upon receiving this email, the content administrator should contact the document owner, who would make a decision whether this document needs to be reviewed and updated, moved to the archive, or deleted. If the document is reviewed and updated, reset the workflow for the next period of time.
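A minimal sketch of the trigger described above: scan document records for elapsed retention periods and notify the administrator. The retention periods and document records are invented, and a print statement stands in for the workflow email:

```python
from datetime import date, timedelta

# Invented retention periods per document type, in days.
RETENTION_DAYS = {"policy": 365, "procedure": 730, "work_instruction": 365}

documents = [
    {"title": "Calibration SOP", "type": "procedure", "published": date(2010, 1, 15), "owner": "QA"},
    {"title": "Travel Policy", "type": "policy", "published": date(2011, 6, 1), "owner": "HR"},
]

def expired_documents(docs, today=None):
    """Yield documents whose retention period has elapsed."""
    today = today or date.today()
    for doc in docs:
        expires = doc["published"] + timedelta(days=RETENTION_DAYS[doc["type"]])
        if expires <= today:
            yield doc, expires

for doc, expires in expired_documents(documents, today=date(2012, 1, 18)):
    # In a real CMS this would trigger a workflow email to the administrator,
    # who contacts the document owner to review, archive, or delete.
    print(f"REVIEW NEEDED: '{doc['title']}' (owner: {doc['owner']}) expired {expires}")
```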

Change Management and Control

When you set up taxonomy, metadata, naming conventions, system functions, etc., they should not be changed arbitrarily. There should be a procedure in place for change management. Changes like this should go through a workflow and be approved. In addition, user feedback should be obtained.

Naming Conventions

For each document type, you need to establish naming conventions. Users should be able to get a very good idea of what a document is about without opening it. This saves users a lot of time.
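For example, a convention such as department-type-short title-version can be checked automatically before a document is accepted; the pattern below illustrates one possible convention and is not a standard:

```python
import re

# Illustrative convention: DEPT-TYPE-short-title-vNN, e.g. "QA-SOP-equipment-calibration-v03"
NAMING_PATTERN = re.compile(r"^[A-Z]{2,5}-(SOP|POL|WI|FRM)-[a-z0-9-]+-v\d{2}$")

def check_name(filename):
    """Return True if the document name follows the agreed convention."""
    return bool(NAMING_PATTERN.match(filename))

print(check_name("QA-SOP-equipment-calibration-v03"))  # True
print(check_name("final version (2).docx"))            # False
```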

Approval and Publishing Documents

Set up a workflow for content approval and publishing. Users upload documents and populate metadata. They may upload documents in the wrong place or enter incorrect metadata. These errors would impact the search and browse functions. The administrator should check to make sure that the document has been uploaded to the correct place and that the correct metadata has been entered. Errors should be corrected before the document is published.

Time for Check-out Documents

In a CMS environment, users need to check out documents in order to edit them. They may check out documents and leave them checked out for a long time. While documents are checked out, other users can't edit them. Set up a policy with the time frame for which users can have documents checked out to them. Monitor checked-out documents. If you see a checked-out document whose check-out time frame has expired, contact the user and request that the document be checked in. You can also check in the document yourself or discard the check-out.
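A small sketch of the monitoring step: list documents whose check-out has exceeded the agreed time frame so the administrator can follow up. The records and the 14-day limit are invented:

```python
from datetime import date, timedelta

CHECKOUT_LIMIT = timedelta(days=14)  # invented policy: two weeks maximum

checkouts = [
    {"document": "Validation Plan", "user": "jsmith", "checked_out": date(2011, 12, 20)},
    {"document": "Training Matrix", "user": "mlee", "checked_out": date(2012, 1, 14)},
]

def overdue(records, today=None):
    """Return check-outs that have exceeded the allowed time frame."""
    today = today or date.today()
    return [c for c in records if today - c["checked_out"] > CHECKOUT_LIMIT]

for item in overdue(checkouts, today=date(2012, 1, 18)):
    # The administrator contacts the user, then checks the document in
    # or discards the check-out if there is no response.
    print(f"Overdue: {item['document']} held by {item['user']} since {item['checked_out']}")
```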

Permissions

Determine your user groups. For each document type, determine which permissions your user groups should have, e.g. read and write or read only. It is highly recommended that only the CMS administrator have full permissions to the system. Users should not be allowed to change the system.

These are the main and most important procedures. Depending on your specific requirements, you may have other procedures in place. Set them up from the very beginning of your CMS deployment and educate your users about them. Create these procedures with user feedback; when users have contributed to them, they are more likely to accept and follow them.

Tuesday, January 10, 2012

Enterprise Search

Enterprise search is the practice of identifying and enabling specific content across the entire organization to be indexed, searched, and displayed to authorized users.

How does Enterprise Search affect your organization? When individuals within your company are searching for documents, are they finding the correct and most current information? When customers visit your website, are they locating the products and services with ease? These are essential questions to make sure your company maximizes its profitability.

Business intelligence and enterprise search solutions allow both your employees and customers to locate the information they need to make the most informed decisions. Universal Enterprise Search is the key way in which companies organize their information and allow individuals to locate exactly what they are looking for.

Companies may have a few managed content repositories. These could be content management systems, knowledge base applications, wiki applications, CRM, ERP, etc. They are usually isolated from one another. Companies also may have unmanaged content repositories such as network drives. These repositories grow fast, resulting in disconnected enterprise knowledge assets.

In a situation like this, it would be a good idea to have a search tool that would allow users to search all these repositories at the same time. This is how the term "Universal Search" was born. Another term that was born is "Content Management Interoperability Services" (CMIS), a standard that offers a way to connect some of those repositories. Companies are looking into Enterprise Search (ES) as a shortcut to their findability problems.

Enterprise search is not an alternative to ECM systems. Even though there are search applications that allow the search across different platforms, enterprise search alone is not a solution to the findability problem. While enterprise search provides great value, it does not mean that you can give up integrating your repositories or leave your content on network drives.

Whit Andrews, Gartner VP, distinguished analyst, and author of Gartner's 2009 "Magic Quadrant for Information Access Technology" report (Gartner includes enterprise search as a part of "Information Access Technology") stated that with the right enterprise search you can get search that is very effective, but that isn't the same as 100% success. He also acknowledged that you cannot simply buy, install, and walk away from powerful ES systems. Both ES and ECM systems require diligence and governance.

Leslie Owens, a Forrester Research analyst, is also skeptical about the likelihood of an ES quick fix, although she too is a strong proponent of targeted search applications.

In addition, out-of-the-box search may get you part of the way to findability, but it generally works best when enhanced with supporting semantic information, such as controlled vocabularies.

Every content repository should have two access points - search and browse. Users are going to use the Search access point if they know exactly what they are looking for. They are going to enter keywords in the search field and retrieve results. If they retrieve too many results, the presence of metadata will allow them to limit the search to specific results.

If users don't know what they are looking for, they are going to browse the content to get some ideas of what is available in the content repository. At some point during browsing, they may switch to search.

Therefore, having search as the only access point is not enough to ensure findability of content. Metadata and taxonomy are invaluable tools to ensure that the search and browse access points work properly. An intuitive taxonomy greatly helps the browse access point. A successful taxonomy is one that is neither too deep nor too wide: users should be able to get to content in as few clicks as possible, and yet the list of documents should not be too long. Users are not going to scroll through long lists of documents.
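A toy sketch of how metadata lets users narrow a keyword search, as described above; the documents and metadata fields are invented:

```python
documents = [
    {"title": "Pump installation guide", "department": "Engineering", "doc_type": "guide", "year": 2011},
    {"title": "Pump maintenance SOP", "department": "Quality", "doc_type": "procedure", "year": 2012},
    {"title": "Pump datasheet", "department": "Engineering", "doc_type": "specification", "year": 2010},
]

def search(keyword, **facets):
    """Full-text match on the title, then narrow by any metadata facet supplied."""
    hits = [d for d in documents if keyword.lower() in d["title"].lower()]
    for field, value in facets.items():
        hits = [d for d in hits if d.get(field) == value]
    return hits

print(search("pump"))                            # three results: too many to scan
print(search("pump", department="Engineering"))  # narrowed by a metadata facet
```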

Naming conventions are extremely important. If users can figure out the contents of a document from its name without opening it, it will greatly speed up the time the user will spend browsing for documents.

Having information governance in place will make sure that the content resides in the correct place, that it has correct metadata, and that the content follows procedures to keep it accurate and up-to-date.

So is it ECM or ES? The answer is both.

In my future posts, I will review search applications and will further address search strategy. I will also review a few case studies in ECM and ES.

Monday, January 9, 2012

Consequences of GxP/GMP for Information Technology

In my last post, I described the GMP requirements for document control. In this post, I am going to describe the GMP requirements for information technology used in a GMP company.

For a drug to be produced in a GxP compliant manner, some specific information technology practices must be followed. Computer systems involved in the development, manufacture, and sale of regulated product must meet certain requirements such as:
  • secure logging: each system activity must be registered, in particular anything users of the system do that relates to research, development, and manufacturing. The logged information has to be secured appropriately so that it cannot be changed once logged, not even by an administrative user of the system;
  • auditing: an IT system must be able to provide conclusive evidence in litigation cases, to reconstruct the decisions and potential mistakes that were made in developing or manufacturing a medical device, drug, or other regulated product;
  • keeping archives: relevant audit information must be kept for a set period. In certain countries, archives must be kept for several decades. Archived information is still subject to the same requirements, but its only purpose is to provide trusted evidence in litigation cases;
  • accountability: every piece of audited information must have a known author who has signed into the system using an electronic signature. No actions are performed by anonymous individuals;
  • non-repudiation: audit information must be logged in a way that no user could claim that the information is invalid, e.g. by arguing that someone could have tampered with it. One way of assuring this is the use of digital signatures (one simple approach to tamper evidence is sketched after this list).
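The sketch below illustrates one common way to make a log tamper-evident, as mentioned in the list above: chain each entry to the previous one with a cryptographic hash so that any later modification breaks the chain. It is an illustration of the principle only, not a validated GxP implementation; a real system would add digital signatures, secure storage, and access controls:

```python
import hashlib
import json
from datetime import datetime, timezone

audit_log = []  # append-only list of log entries

def append_entry(user, action):
    """Append an entry whose hash covers the previous entry's hash (a hash chain)."""
    previous_hash = audit_log[-1]["hash"] if audit_log else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,
        "previous_hash": previous_hash,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    audit_log.append(entry)

def verify_chain(log):
    """Recompute every hash; any edited or removed entry breaks the chain."""
    previous_hash = "0" * 64
    for entry in log:
        if entry["previous_hash"] != previous_hash:
            return False
        payload = {k: v for k, v in entry.items() if k != "hash"}
        if hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest() != entry["hash"]:
            return False
        previous_hash = entry["hash"]
    return True

append_entry("jsmith", "approved batch record BR-1042")
append_entry("mlee", "updated SOP-017 to version 04")
print(verify_chain(audit_log))          # True
audit_log[0]["action"] = "tampered"     # simulate an after-the-fact change
print(verify_chain(audit_log))          # False
```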
GMP guidelines require that software programs must be validated by adequate and documented testing. Validation is defined as the documented act of demonstrating that a procedure, process, and activity will consistently lead to the expected results. The software validation guideline states: “The software development process should be sufficiently well planned, controlled, and documented to detect and correct unexpected results from software changes."

To validate software, it must be:
  • structured, documented, and evaluated as it is developed;
  • checked to make sure that it meets specifications;
  • adequately tested with the assigned hardware systems;
  • operated under varied conditions by the intended operators or persons of like training to assure that it will perform consistently and correctly.
It is important to note these requirements: since a document management system is required to control documents, that document management system must itself meet these information technology requirements.

Friday, January 6, 2012

GxP/GMP and Document Control

In the regulated environment, the document control is the cornerstone of the quality system. It is so important that if an external audit identifies deficiencies in the document control system, the entire organization can be shut down.

In my last post, I talked about the connection between ISO 9001 and document control. ISO 9001 is one example of the regulated environment. It is usually used in engineering types of companies. In food, drugs, medical devices, and cosmetics industries, GxP/GMP regulations are used. Today, I am going to talk about the connection between GxP/GMP and document control.

GxP is a general term for Good Practice quality guidelines and regulations. The titles of these good practice guidelines usually begin with "Good" and end in "Practice", with the specific practice descriptor in between. GxP represents the abbreviations of these titles, where x (a common symbol for a variable) represents the specific descriptor.

For example: Good Clinical Practice (GCP), Good Laboratory Practice (GLP), Good Manufacturing Practice (GMP), Good Safety Practice (GSP), and many others.

A "c" or "C" is sometimes added to the front of the acronym. The preceding "c" stands for "current." For example, cGMP is an acronym for "current Good Manufacturing Practice." The term GxP is only used in a casual manner, to refer in a general way to a collection of quality guidelines.

The purpose of the GxP quality guidelines is to ensure that a product is safe and meets its intended use. GxP guides quality manufacture in regulated industries such as food, drugs, medical devices, and cosmetics.

The most central aspects of GxP are traceability (the ability to reconstruct the development history of a drug or medical device) and accountability (the ability to resolve who has contributed what to the development and when).

GMP is the most well known example of a GxP.

Good Manufacturing Practices (GMP) are the practices and systems required to be adopted in pharmaceutical and medical device companies. GMP is guidance that outlines the aspects of production and testing that can impact the quality of a product.

Many countries have legislated that pharmaceutical and medical device companies must follow GMP procedures, and have created their own GMP guidelines that correspond with their legislation. Basic concepts of all of these guidelines remain more or less similar to the ultimate goals of safeguarding the health of the patient as well as producing good quality medicine, medical devices, or active pharmaceutical products.

In the U.S. a drug may be deemed adulterated if it passes all of the specifications tests but is found to be manufactured in a condition which violates current good manufacturing guidelines. Therefore, complying with GMP is a mandatory aspect in pharmaceutical and medical devices manufacturing.

Documentation is a critical tool for ensuring GxP/GMP compliance.

This is what GMP states about document control:

Each manufacturer shall establish and maintain procedures to control all documents that are required. The procedures shall provide for the following:

1. Document approval and distribution. Each manufacturer shall designate an individual(s) to review for adequacy and approve prior to issuance all documents. The approval, including the date and signature of the individual(s) approving the document, shall be documented. Documents shall be available at all locations for which they are designated, used, or otherwise necessary, and all obsolete documents shall be promptly removed from all points of use or otherwise prevented from unintended use.

2. Document changes. Changes to documents shall be reviewed and approved by an individual(s) in the same function or organization that performed the review and approval of original documents, unless specifically designated otherwise. Approved changes shall be communicated to the appropriate personnel in a timely manner. Each manufacturer shall maintain records of changes to documents. Change records shall include a description of the change, identification of the affected documents, the signature of the approving individual(s), the approval date, and when the change becomes effective.

These requirements are consistent with document control requirements stated in ISO 9001 which I described in my previous post.

The role of QA, in regards to the document system, is one of management and overview. QA ensures that all documents are maintained in a controlled fashion and that all procedures being used within a company are approved by the appropriate subject matter experts, are consistent with other documents, and are the most current version. One way that QA ensures this is by being the last signature on all approved documents. All documents (current, obsolete, and superseded), as well as the full history of the creation and revision of each document, are kept in Quality Assurance.

These are the steps of the document control procedure:

Creation

Any knowledgeable employee should be able to write or revise documents as needed.

Revising

When revising a document, the redline changes, along with a detailed justification of the changes, should be routed.

Routing

The document control function of QA is responsible for routing documents for review and approval. It is suggested that a pre-route be done to ensure that all affected parties are in agreement with the document before it is submitted to QA. There should be a documented process detailing how documents are submitted for review and approval.

A controlled form listing all the changes made to the document, justification for the changes, and a list of personnel who need to review the document needs to be routed along with the document. At a minimum the author’s manager, all affected department heads, and QA need to review the document. Other Subject Matter Experts can be included.

Approval

Once all affected parties have agreed to the changes, document control will prepare the document for approval. All changes will be incorporated into the document. For new documents the version # will be 00. For each revision of a document the version number will increase (01, 02, 03, etc). A master document will be routed for approval signatures.

Typically the approval signatures are the Author, the Department Head, and QA. QA must be the last signature on all documents. Usually the approval signatures only appear on the first page of the document. Once the master document has been signed, an effective date is stamped onto each page of the document. The effective date must be far enough in advance to allow for training on the document before it becomes effective (typically this is 5 days).
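A small sketch of the version numbering and effective-date rule described above; the five-day training window comes from the text, everything else is illustrative:

```python
from datetime import date, timedelta

TRAINING_WINDOW = timedelta(days=5)  # time allowed for training before a document is effective

def next_version(current_version=None):
    """New documents start at '00'; each revision increments the two-digit version."""
    if current_version is None:
        return "00"
    return f"{int(current_version) + 1:02d}"

def effective_date(signature_date):
    """The effective date is stamped far enough ahead to allow training."""
    return signature_date + TRAINING_WINDOW

print(next_version())                     # 00 (new document)
print(next_version("03"))                 # 04 (revision)
print(effective_date(date(2012, 1, 18)))  # 2012-01-23
```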

Distributing

On the effective date, copies of the signed master document are routed to the affected departments. The departments will remove the old version and replace it with the new version (for revised documents). If the document is new, there will be no replacement document to remove.

The old versions must be returned to document control. On a periodic basis document control personnel should audit the binders to determine if they contain the correct versions. Each document binder should contain a table of contents and only those documents that the department is responsible for. A full set of all approved documents should be in the QA department as well as in a central company location.

Archiving

Old revisions of documents will be stamped as superseded. No document revisions will be discarded or altered. A file will be maintained within QA that contains all the superseded documents and the signature approvals of personnel who agreed to the revisions.

Obsolete

If a document will no longer be used by any department in the company, it can become obsolete. The document must be stamped as Obsolete and all copies removed from all document binders. It is a good idea to place a notice in the document stating that the document has been made obsolete.

Good manufacturing practice (GMP) regulations require that all documentation be issued, managed and controlled using a document management system.

In my future posts, I will further describe GMP regulations pertaining to documentation and documentation management systems.

Thursday, January 5, 2012

ISO 9001 and Document Control

ISO 9001 specifies requirements for a Quality Management System (QMS) where an organization needs to demonstrate its ability to consistently provide product that meets customer and applicable statutory and regulatory requirements.

An organization is required to establish, document, implement, and maintain a quality management system and continually improve its effectiveness.

A cornerstone of the QMS is document control. Therefore, in order for an organization to meet ISO 9001 requirements, it must have a document control system in place. Auditors pay particular attention to document control.

Document control is an essential preventive measure ensuring that only approved, current documents are used throughout the organization. Inadvertent use of out-of-date or unapproved documents can have significant negative consequences on quality, costs, and customer satisfaction.

What Is a Controlled Document?

Let's define controlled documents. A controlled document is any document that is used to perform work, as opposed to one used only for reference. Furthermore, ISO 9001 states that documents required by the QMS should be controlled.

The QMS includes the following documents: statements of quality policy and quality objectives, quality manual, procedures, and records determined by the organization to be necessary to ensure the effective planning, operation, and control of its processes.

The format and structure of the quality policy, quality objectives, and quality manual is a decision for each organization, and will depend on the organization’s size, culture and complexity.

A small organization may find it appropriate to include the description of its entire QMS within a single manual, including all the documented procedures required by the standard. Large, multi-national organizations may need several manuals at the global, national or regional level, and a more complex hierarchy of documentation.

ISO 9001 specifically requires the organization to have documented procedures for the following six activities:

1. Control of documents

2. Control of records

3. Internal audit

4. Control of nonconforming product

5. Corrective action

6. Preventive action

Some organizations may find it convenient to combine the procedure for several activities into a single documented procedure (for example, corrective action and preventive action). Others may choose to document a given activity by using more than one documented procedure (for example, internal audits). Both are acceptable.

Some organizations (particularly larger organizations, or those with more complex processes) may require additional documented procedures (particularly those relating to product realization processes) to implement an effective QMS.

Other organizations may require additional procedures, but the size and/or culture of the organization could enable them to be effectively implemented without necessarily being documented. However, in order to demonstrate compliance with ISO 9001, the organization has to be able to provide objective evidence that its QMS has been effectively implemented.

The typical controlled document types include:
  • policies - the company’s position or intention for its operation;
  • procedures - responsibilities and processes for how the company operates to comply with its policies;
  • work and/or test instructions - step-by-step instructions for a specific job or task;
  • forms and records - recorded information demonstrating compliance with documented requirements;
  • drawings - those that are issued for work;
  • process maps, process flow charts, and/or process descriptions;
  • specifications;
  • internal communication;
  • production schedules;
  • approved supplier lists;
  • test and inspection plans;
  • quality plans.
The type and extent of the documentation will depend on the nature of the organization’s products and processes, the degree of formality of communication systems and the level of communication skills within the organization, and the organizational culture.

Document Control Procedure

ISO 9001 states that a procedure for document control should be established, which should include the following:

1. To approve documents for adequacy prior to issue.

Document approvals are mandatory and must be kept as a record as well. When determining who should approve a particular document, limit approvals to those with direct knowledge or responsibility for the document.

Approval signatures must be recorded prior to the release and use of the document. Approvals may be in the form of a written signature or a password-protected electronic approval record. The date of all approvals must precede the document's release date.

While not explicitly stated, this requirement also applies to temporary memos or postings that are used to communicate QMS or product-related requirements. Any temporary documents must be clearly identified, signed and dated. It is advisable to include an expiration date on temporary documents to ensure they are removed from use when intended.

2. To review and update as necessary and re-approve documents.

All documents must be reviewed periodically and updated and re-approved if needed. This review can be tied to a company's internal audit process, management review or scheduled on some periodic (annual) basis. A record of such reviews must be kept.

3. To ensure that changes and the current revision status of documents are identified.

When a document is updated, a record must be kept of the change, including the reasons for and nature of the change. In addition, current revision status must be maintained. This includes the current development stage (draft, review, approval, etc.) and the date or revision level (number or letter) identifying the current version of the document.

4. To ensure that relevant versions of applicable documents are available at points of use.

The storage and access of documents must easily allow individuals to find the appropriate version of a document to use where needed. Older versions of a document that are still needed (e.g. specifications for an older product) may remain active if necessary, but the revision level must be made clear.

You should consider where designated controlled locations of your documents will be established and whether short-term reference copies of controlled documents will be permitted. Typically, the easier it is for employees to access controlled copies when needed, the fewer times they will feel the need to use an uncontrolled copy of a document. Failing to ensure timely and convenient access to documents is frequently the source of high costs and repeated discrepancies.

5. To ensure that documents remain legible and readily identifiable.

The format and storage of documents must protect a document from being rendered unreadable due to wear or damage and ensure that every document can be clearly identified through a title, document number, or other suitable identification.

6. To ensure that documents of external origin determined by the organization to be necessary for the planning and operation of the quality management system are identified and their distribution controlled.

Documents that do not originate within the organization, but are necessary for ensuring quality and meeting customer requirements must also be controlled. These can include customer, supplier or industry documents. However, the extent of control is limited to clear identification and controlled distribution. A log or other record would suffice to track external documents.

7. To prevent the unintended use of obsolete documents, and to apply suitable identification to them if they are retained for any purpose.

Out-of-date documents or older versions of revised documents must be protected from unintentional use. This usually requires segregation or disposal of obsolete documents. Any obsolete documents that are kept for reference or other purposes must be clearly identified through markings, separate storage areas, or other means.

Controlled documents need to be clearly identified. Hard copy documents need to be stamped. Electronic documents need to be watermarked so that, when printed, they can be identified as controlled documents and a user knows to verify the electronic version prior to using the document.

Main Objectives

These are some of the main objectives of an organization’s documentation system:

1. Communication of Information

Documentation is viewed as a tool for information transmission and communication.

2. Evidence of conformity

Documentation is viewed as the provision of evidence that what was planned has actually been done.

3. Knowledge sharing

Documentation is used to disseminate and preserve the organization’s experiences. A typical example would be a technical specification, which can be used as a base for design and development of a new product.

Measuring Success

How can you measure the performance of your document control process? Here are some suggested metrics:

User satisfaction – Periodically survey your employees regarding the usability of your documentation. Use the results to improve the format of your documents and training of your authors.

Document errors – Track the number of document revisions due to information mistakes in your documentation. Results will often reveal weaknesses in your review and proofreading processes.

Up-to-date – Count the number of document revisions or audit discrepancies stemming from a document that is out-of-date. This will tell you whether your periodic document reviews or obsolete document provisions are effective.

Cycle time – Measure the time it takes a document to be developed or revised from initial draft to release. Work to improve the efficiency of your document control process as you would any other business process.

Cost – Consider tracking the costs associated with your documentation including developing, revising, storing, retrieving, distributing, filing, auditing, reviewing, approving, etc. Of these potential costs, document retrieval is often an expensive hidden cost generated when individuals must search endlessly for a document because of inadequate indexing, organization, storage or training.
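As an illustration of the cycle-time metric mentioned above, the snippet below computes the average number of days from initial draft to release over a set of invented document records:

```python
from datetime import date

revisions = [
    {"document": "SOP-001", "draft_started": date(2011, 10, 3), "released": date(2011, 11, 14)},
    {"document": "SOP-007", "draft_started": date(2011, 11, 1), "released": date(2011, 12, 20)},
    {"document": "POL-002", "draft_started": date(2011, 12, 5), "released": date(2012, 1, 16)},
]

# Cycle time per revision, in days, then the average across all revisions.
cycle_times = [(r["released"] - r["draft_started"]).days for r in revisions]
average_days = sum(cycle_times) / len(cycle_times)

print(f"Average document control cycle time: {average_days:.1f} days")
```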

Results of the performance measures of your document control process can help you determine how to drive continual improvements into your entire QMS.

In my next posts I will describe electronic systems that are used for document control.

Wednesday, January 4, 2012

SharePoint 2010 - Part 2

Yesterday, I started to describe the features of SharePoint 2010. Today, I will conclude the description of SharePoint 2010.

Enterprise Content Management

Managed Metadata Features

The new Managed Metadata service in SharePoint Server 2010 provides a set of features that enable organizations to manage taxonomies and metadata consistently across their SharePoint sites. With the new Managed Metadata service, you can publish and share content types across site collections and Web applications and use the Term Store to manage terms and taxonomies.

A taxonomy is a hierarchical organization of terms. Users can apply these terms to content on your site if you add the new managed metadata column to lists, libraries, or content types. Taxonomies and terms can be centrally managed within your organization, or you can integrate managed metadata with social tagging and enable users to suggest terms when they tag content.
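To show what a hierarchical term store boils down to, here is a generic Python sketch of a taxonomy tree and term lookup; it is not the SharePoint object model or API, just an illustration of the underlying data structure:

```python
class Term:
    """A node in a hierarchical taxonomy (term store)."""
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

    def path_to(self, target, trail=None):
        """Return the path of labels from this term down to the target term, if any."""
        trail = (trail or []) + [self.label]
        if self.label == target:
            return trail
        for child in self.children:
            found = child.path_to(target, trail)
            if found:
                return found
        return None

# Invented taxonomy fragment, managed centrally and reused across sites.
term_store = Term("Products", [
    Term("Hardware", [Term("Printers"), Term("Scanners")]),
    Term("Software", [Term("Operating Systems"), Term("Applications")]),
])

print(term_store.path_to("Scanners"))  # ['Products', 'Hardware', 'Scanners']
```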

Document Sets

SharePoint 2010 introduced document sets, which are a new content type that enables creating and managing work products that span multiple documents. Document sets are configured like other content types. They can be set up to include a set of default documents that people then customize when they create a new instance of a document set.

Document set features such as shared metadata, workflows, and versioning enable groups to manage the development of a work product or content set efficiently. A common example of a document set is a "pitch book" used by a team to group different document types together for a product promotion.

Document Center Site Template

The document center site template enables new document management features for a SharePoint site, including the new metadata-based navigation feature. With this feature you can browse content in a large list or library by using metadata rather than by folder location. Unique document IDs make content easy to find regardless of its location.
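The Document ID feature behind this works by stamping each document with a site-collection-unique identifier and exposing a redirect page that resolves it, so links keep working even after a document is moved. The following rough sketch builds such a permanent link; it assumes the out-of-the-box DocIdRedir.aspx redirect page and its ID query-string parameter, and the site URL and document ID shown are invented.

```python
from urllib.parse import quote

def doc_id_link(site_url: str, doc_id: str) -> str:
    """Build a permanent Document ID link for a SharePoint 2010 site collection.

    Assumes the standard DocIdRedir.aspx page, which looks up the ID and
    redirects the browser to the document's current location.
    """
    return f"{site_url.rstrip('/')}/_layouts/DocIdRedir.aspx?ID={quote(doc_id)}"

# Hypothetical site and document ID, for illustration only.
print(doc_id_link("http://intranet.contoso.com/sites/docs", "CONTOSO-123-45"))
```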

Records Management

SharePoint 2010 supports the management and discovery of content in place, without the need for a locked down repository for official records. Some of the new records management features include:
  • in-place records management that enables you to store records in place next to in-progress content;
  • retention policies that now support complex, multi-stage schedules and additional records management actions, such as "Send to a records archive" and "Declare as an in-place record";
  • for larger archives, an improved Records Center site with a hierarchical file plan, metadata-driven submission methods, and the ability to manage multiple site collections together as one large repository.
Web Content Management

New and improved Web Content Management features make it easier to publish Web pages and manage sites. In addition, SharePoint 2010 now includes support for streaming video.

The Web content authoring experience has been improved and simplified with the addition of the ribbon, which consolidates page commands and makes commands more task-based.

Improvements to export behavior as well as logging and reporting make content deployments easier.

Out-of-the-box Web Analytics features provide support for Traffic, Search, and Inventory analytics reports.

New support for rich media includes a new Asset Library with rich views and pickers, support for videos as a SharePoint content type, a streaming video infrastructure, and a skinnable Silverlight media player.

Large page libraries simplify the management of web sites with many pages.

Creating and managing different versions, or "variations", of publishing sites or pages is now handled by a queued operation that runs in the background, so users can continue working in SharePoint while the operation completes. A "View Changes" command has been added that lets you compare an older version of a web page with a more recent one; changes are highlighted in a report that supports side-by-side editing in the Rich Text Editor.

Search

New Search features in SharePoint 2010 make it easier to locate more relevant information and find colleagues quickly and efficiently. Improvements include a new results layout that refines information into categories, and includes better descriptions and metadata. In addition, people who are in your social circle will appear toward the top of your search results.

New features include:
  • refinement, which summarizes the results and lets you narrow them by categories such as site, author, or date;
  • pre-populated query suggestions, related search links, and acronym expansion;
  • ability to query for documents by using Boolean syntax (AND, OR, and NOT) and prefix wildcards (*), as shown in the sketch after this list;
  • ability to search SharePoint content from a computer running Windows 7;
  • improved "Did you mean?" to support more languages and terms within your enterprise.
By using search with the social computing and collaboration features in SharePoint 2010 you can:
  • search for a person by expertise to find someone who has the skills that match your needs;
  • use the phonetic name lookup to find similar sounding names (is it John or Jon?);
  • refine search results by using categories such as department or job title.
The search model uses the properties (or metadata) that you provide on documents. Search now also scans document content for key phrases, which compensates for missing or inaccurate properties and helps improve search relevance.

Site searches are automatically scoped to the current site and its subsites rather than all sites.

Business Intelligence

Excel Services

New and enhanced features include the following (a short usage sketch follows the list):
  • improved features for visual data analysis, such as enhanced conditional formatting, sparklines, and intuitive data exploration by using filters;
  • tight integration with PowerPivot for SharePoint, a new "self-service BI" capability from SQL Server Analysis Services;
  • the ability to analyze millions of records quickly and easily;
  • new formatting and editing capabilities that let you edit and format spreadsheets directly in the web browser, much as in Excel. You can now apply color, style, and size formatting to lines, borders, and numbers, and use the same background-color features as in Excel;
  • browser-based creation of new workbooks, and tables in workbooks.
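One way to reuse published workbook content programmatically is the Excel Services REST interface, which can return ranges and charts from a workbook as Atom, HTML, or images. The sketch below is only an illustration: it assumes an on-premises farm with NTLM authentication, the standard _vti_bin/ExcelRest.aspx endpoint, the third-party requests and requests_ntlm packages, and a workbook and chart at the invented paths shown.

```python
import requests
from requests_ntlm import HttpNtlmAuth  # assumes an NTLM-authenticated farm

# Hypothetical site, workbook, and chart names; adjust to your environment.
SITE = "http://intranet.contoso.com/sites/finance"
WORKBOOK = "Shared Documents/Book1.xlsx"
CHART = "Chart 1"

# Excel Services REST pattern: /_vti_bin/ExcelRest.aspx/<workbook path>/Model/<resource>
url = (f"{SITE}/_vti_bin/ExcelRest.aspx/{WORKBOOK}/Model/"
       f"Charts('{CHART}')?$format=image")

resp = requests.get(url, auth=HttpNtlmAuth(r"CONTOSO\jdoe", "password"))
resp.raise_for_status()

with open("chart.png", "wb") as f:
    f.write(resp.content)  # the chart rendered as a PNG image
```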
Rich Charts

The new Chart Web Part, based on Dundas data visualization technology, lets you add rich charts to SharePoint sites by using web-based configuration to connect charts to data from a variety of sources, such as SharePoint lists, external lists, Business Data Services, Excel Services, and other web parts.

PerformancePoint Services

PerformancePoint Services makes it easier to monitor and analyze performance against goals and make better business decisions:
  • create and use interactive dashboards with scorecards, reports (including reporting services and Excel services reports), and filters;
  • create and use scorecards that bring together data from multiple data sources (including Analysis Services, SQL server, SharePoint lists and Excel Services) to track and monitor your data;
  • use analytic reports to identify driving forces and root causes, and apply filters to personalize your reports;
  • integrate your business intelligence applications and information with other SharePoint features, such as collaboration and content management;
  • store, manage, and secure dashboards and dashboard items within SharePoint lists and libraries, giving you a single security and repository framework.
Enhanced scorecards make it easy for you to drill down and quickly access more detailed information. PerformancePoint scorecards also offer more flexible layout options, dynamic hierarchies, and calculated Key Performance Indicator (KPI) features. Using this enhanced functionality, you can now create custom metrics that use multiple data sources. You can also sort, filter, and view variances between actual and target values to help you identify concerns or risks.

Enhanced analytic reports support value filtering, new chart types, and server-based conditional formatting. The Decomposition Tree, a unique visualization and new report type available in PerformancePoint Services, enables you to quickly and visually break down higher-level values so you can understand the driving forces behind them.

Site Management and Customization

SharePoint 2010 includes a new permissions management tool that lets you view and adjust permission levels, including item-level permissions, for a particular user or group.

Permissions management is available from every site collection, site, list, or list item, so you can easily add or remove users or groups, change permission levels, break inheritance, and manage anonymous access.

SharePoint 2010 includes a new Themes Gallery from which you can select one of several predefined themes. You can also generate your own theme files from Microsoft PowerPoint and add them to the gallery for selection.

With the appropriate language packs installed, you can view settings pages, Help, and application content such as list titles and column names in your preferred language.

InfoPath can now be used to fully customize SharePoint’s list forms. You can change the look of the form, switch to multi-column layouts, break the form into sections, validate the information entered, pre-populate fields, and cause sections of the form to show and hide automatically. From any list, click the "Customize Form" command on the ribbon to launch InfoPath. After customizing the form, publish the form back to the SharePoint site to replace the default form.

All standard views of list items in SharePoint 2010 now use the customizable XSLT list view web part. From SharePoint Designer 2010, you can quickly apply custom styles to SharePoint’s list views and conditionally format rows based on their content.

SharePoint Designer 2010 can now be used to fully customize the Approval, Collect Feedback, and Collect Signatures workflows. Workflow capability has been expanded with new actions such as the rich pre-built approval actions.

Reusable workflows can be designed once, and then reused across multiple lists, document libraries, or content types. The SharePoint Designer 2010 user experience has been completely redesigned using the ribbon, creating an experience that’s simpler and more familiar to people who use Microsoft Office.

Thursday, December 29, 2011

SharePoint 2010 - Part 1

Today, I am going to describe more details about SharePoint 2010.

New user interface

The SharePoint 2010 user interface includes a new ribbon that helps you perform tasks quickly and in the context of your work. If you work with Microsoft Office 2007 applications such as Microsoft Word or Microsoft PowerPoint, you are already familiar with the ribbon.

Like the ribbon in these Office applications, the new ribbon in SharePoint 2010 is designed to help you quickly find the commands that you need to complete your tasks. Commands are organized in logical groups, displayed together under tabs. Each tab relates to a type of activity, such as working with a document in a document library or adding and formatting text on a page.

Collaboration

SharePoint 2010 includes a new co-authoring feature that allows several users to work simultaneously on the same document. For example, to review a document you can send a link to the document in a SharePoint library, and all of the reviewers can provide their feedback in the document simultaneously.

The calendar feature has been improved, so you can now schedule meetings and keep track of your schedule more easily. The improved calendar can display multiple SharePoint and Exchange calendars on a single page. You can easily add events to a calendar by clicking a date and entering the details for the event without leaving the calendar, and you can drag and drop items within a calendar. A new group calendar lets you schedule meetings with colleagues and book resources such as audio-visual equipment and meeting rooms.

Working with wiki pages is more streamlined. You can insert and format content directly on the page with the new Rich Text Editor. Browse for images or photos on your local computer or network and insert them into your site without leaving the page you are on.

Managing multiple items in SharePoint lists is more efficient. Now you can select multiple items in a list and click a button to perform the same action on all the items at the same time. For example, you can check in or check out several documents at the same time.

Blog creation and management benefit from improved authoring tools and new navigation. Use the new Rich Text Editor to author blog posts more easily and intuitively. Browse for images or photos on your local computer or network and insert them into your blog posts without leaving the page. Browse blog entries by month as well as by category, and see the number of posts for each month or category in real time. A new "Archive" link provides access to a view of all months since the blog’s inception and, within each month, posts are listed by category.

Social Computing

With the new features in SharePoint Server 2010 you can locate content and stay informed about people and areas of interest that matter most to you. New features include newsfeeds, social tagging, and ratings so that you can more easily keep track of your colleagues’ activities, as well as share relevant content.

Improvements to "My Sites" help you use your "My Sites" and profiles to share knowledge in your specific area with your colleagues. Adding interests and responsibilities to profiles makes it easier for colleagues to find each other through newsfeeds, ask and answer questions, and to connect in other ways.

You can use activity feeds on My Sites to follow your colleagues’ activities, stay informed of developments in areas that interest you, and connect with others who are looking for help in those areas. You can also receive recommendations for new colleagues or keywords to follow, so that you can expand your professional network and knowledge.

Mobile SharePoint

With SharePoint Web pages optimized for viewing on small devices, you can now view and work with documents, blogs, wikis, back-end business data, and sites from your mobile phone. You can use the mobile search experience to find people, contact information, and SharePoint content, and to locate data in custom databases. You can also subscribe to text message (SMS) alerts for changes to individual documents or to any SharePoint document library or list.

Offline access to site content

Microsoft SharePoint Workspace now enables you to work with SharePoint sites, libraries, and lists on your desktop while disconnected from your corporate network and then to synchronize your changes when you reconnect to your corporate network.

Major benefits of this offline and online integration include:
  • you can quickly view, add, edit and delete SharePoint library documents or list items while you are offline;
  • two-way synchronization between your computer and the network: updates made on your computer or on the network are applied automatically while you are connected;
  • content is automatically synchronized when you take your computer offline and then go back online;
  • you can use the new External List feature to work more efficiently with back-end business data—such as SQL Server databases and SAP—while you are offline.
Business Connectivity Services (BCS)

Business Connectivity Services (BCS) enables SharePoint integration with external data, including line-of-business applications. BCS builds on the Business Data Catalog (BDC) technology delivered in Microsoft Office SharePoint Server 2007. Use BCS to:
  • more easily define external content types—previously referred to as “entities”—by using SharePoint Designer’s visual interface, without using an XML editor;
  • connect to a wider range of data sources—relational databases, SAP, Web services, and custom applications—and interact with them in richer ways, including full create, read, update, and delete support;
  • use rich client extensions to build a SharePoint application and extend it to Office client applications such as SharePoint Workspace, Outlook and Word, so you can work with your external data offline;
  • view external back-end business data across server and client applications with no customization, including seamless business data integration with SharePoint lists.
I will describe more details about SharePoint 2010 in my next post.