Monday, December 28, 2020

Electronic Signature and Content Management


At this time of digital transformation, it is difficult to talk about content management without talking about electronic signatures. E-signatures make it possible to create digital workflows, help to maximize ROI from content management, and enhance productivity, compliance, security, and analytics.

Quite a few content management tools, such as SharePoint, Box, and other content management systems (CMS), include e-signature integration.

Electronic signatures, digital business, and content management are interdependent. Without e-signature capability, documents continue to be printed for signing, then photocopied, shipped, corrected, imaged back into the system, archived, and shredded. 90% of the time and cost of labor dedicated to managing paper can be saved by using e-signatures. There are also other benefits of using e-signatures such as faster decision making, shorter sales cycles, and improved customer experience.

In the last few years, financial services, insurance, healthcare, and government have embraced digital transformation. A major driver is compliance and risk. Many organizations are concerned about legal risk or they struggle with the constantly changing regulatory landscape in their industries, in part because manual processing is very prone to errors.

Rather than react to regulatory pressure with additional people, manual controls, and process complexity, organizations that adopt e-signatures have these benefits:

  • Leverage workflow rules to execute transactions correctly and consistently.
  • Capture a full audit trail and electronic evidence.
  • Minimize exposure to risk due to misplaced or lost documents.
  • Make the process of e-discovery easier, more reliable, and less expensive.
  • Demonstrate compliance and reduce legal risk through the ability to playback the exact process that was used to capture signatures.
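To make the audit-trail point concrete, here is a minimal sketch (in Python, with illustrative field names) of how signing events might be recorded in a hash chain so that any later alteration is detectable. Real e-signature products capture far richer evidence; this only illustrates the principle.

```python
import hashlib
import json

def append_event(trail, event):
    """Append an audit event, chaining a hash of the previous entry
    so any later tampering is detectable."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    record = {"event": event, "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    trail.append(record)
    return trail

def verify(trail):
    """Re-compute every hash; returns True only if no entry was altered."""
    prev_hash = "0" * 64
    for record in trail:
        body = {"event": record["event"], "prev_hash": record["prev_hash"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if record["prev_hash"] != prev_hash or record["hash"] != expected:
            return False
        prev_hash = record["hash"]
    return True

trail = []
append_event(trail, {"action": "document_presented", "signer": "a.smith"})
append_event(trail, {"action": "signature_captured", "signer": "a.smith"})
```

Because each record's hash covers the previous record's hash, editing any earlier event breaks every subsequent link, which is what makes the captured evidence trustworthy in an audit.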

Let's look at this example: the VP of compliance asks for transaction records from five years ago. How helpful would it be to quickly produce all signed records, in good order, and replay the entire web-based signing process for context?

According to Forrester Research, organizations and customers now recognize that e-signature is an important enabler of digital business.

Today, business is digital, and e-signature is a foundational technology enabling end-to-end digitization. Let's look at this example: a customer fills out an insurance application. When the package is ready to be signed, traditionally the process would revert to paper. Instead, the documents are handed off to the electronic signature solution, which manages every aspect of the e-sign process: notifying and authenticating signers, presenting documents for review, capturing intent, securing documents, collecting evidence, etc.

Once e-signed, the documents can be downloaded in PDF format and stored in any archiving system. The e-signature audit trail and its security travel seamlessly with the document, ensuring the record can be verified independently of the e-signature service.

A document-centric approach to embedding e-signatures within signed records allows for greater portability and easier long-term storage in a CMS solution. Additional metadata related to the e-sign transaction can be handed off to the CMS as well for analytics purposes.

Adopting electronic signatures is quick and easy and does not require IT or programming resources. For companies looking for a more integrated, automated workflow, e-signature plugins are available for SharePoint, Salesforce, and Box.

Organizations can quickly and easily enhance approval workflows with a more robust e-signature solution than a checkbox on an approval routing sheet, while also automating archival.

Thursday, July 30, 2020

Metadata Driven Solutions

Metadata is data that provides information about other data. Many distinct types of metadata exist, including descriptive metadata, structural metadata, administrative metadata, reference metadata, and statistical metadata.
  • Descriptive metadata is descriptive information about a resource. It is used for discovery and identification. It includes elements such as title, abstract, author, and keywords.
  • Structural metadata is metadata about containers of data and indicates how compound objects are put together, for example, how pages are ordered to form chapters. It describes the types, versions, relationships and other characteristics of digital materials.
  • Administrative metadata is information to help manage a resource, like resource type, permissions, and when and how it was created.
  • Reference metadata is information about the contents and quality of statistical data.
  • Statistical metadata, also called process data, may describe processes that collect, process, or produce statistical data.
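As a concrete illustration, the first three categories above might look like this for a single document (all field names and values are hypothetical):

```python
# A hypothetical metadata record for one digital document, grouping
# fields by the metadata categories described above.
document_metadata = {
    "descriptive": {            # discovery and identification
        "title": "Q3 Sales Report",
        "author": "J. Doe",
        "keywords": ["sales", "quarterly", "2020"],
    },
    "structural": {             # how the compound object is put together
        "format": "PDF",
        "pages": 42,
        "chapters": ["Summary", "Regional Results", "Outlook"],
    },
    "administrative": {         # management of the resource
        "created": "2020-07-15",
        "permissions": ["finance-team:read", "managers:read"],
        "resource_type": "report",
    },
}
```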
Properly managed, metadata is a powerful tool to make things happen. We can have processes and solutions that are driven by metadata.

In a metadata-driven application building process, instead of building the desired application directly, we define the application's specifications. These specifications are fed to an engine that builds the application for us using predefined rules.

Instead of building a package to create a dimension, for example, we can provide the dimension description (metadata) to a package generating engine. This engine is then responsible for creating the defined package. Once the package is executed, it will create and maintain the prescribed dimension.
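A minimal sketch of this idea, with all names and rules invented for illustration: a declarative dimension description is fed to a small generator that emits the DDL according to predefined rules (here, a surrogate-key rule).

```python
# A sketch of a metadata-driven generator: instead of hand-writing DDL
# for each dimension, we describe the dimension and let a small "engine"
# emit the SQL from predefined rules. All names are illustrative.

def generate_dimension_ddl(definition):
    """Build a CREATE TABLE statement from a dimension description."""
    # Rule enforced by the engine: every dimension gets a surrogate key.
    cols = [f"{definition['name']}_key INT PRIMARY KEY"]
    for attr in definition["attributes"]:
        cols.append(f"{attr['name']} {attr['type']}")
    return (
        f"CREATE TABLE dim_{definition['name']} (\n  "
        + ",\n  ".join(cols)
        + "\n);"
    )

customer_dim = {
    "name": "customer",
    "attributes": [
        {"name": "customer_name", "type": "VARCHAR(100)"},
        {"name": "region", "type": "VARCHAR(50)"},
    ],
}

print(generate_dimension_ddl(customer_dim))
```

Because the rule lives in the engine rather than in each hand-built package, every generated dimension automatically follows the same standard.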

Why is a metadata-driven approach so much more efficient than traditional methods?

  • Creating a definition of a process is much faster than creating the process itself. A metadata-driven approach builds the same asset in less time than traditional methods.
  • Quality standards are enforced. The rules engine becomes the gatekeeper by enforcing best practices.
  • The rules engine becomes a growing knowledge base which all processes benefit from.
  • Easily adapts to change and extension. Simply edit the definition and submit it to the engine for a build. Need to inject a custom process? No problem, create a package the old-fashioned way.
  • Enables agile data warehousing. Agile becomes possible due to greatly increased speed of development and reduced rework required by change.
The ongoing proliferation of devices joined with the distributed nature of data sources has created an indispensable role for metadata. Metadata provides knowledge such as location of the device and nature of the data, which facilitates integration of data regardless of its origin or structure.

Enterprises are incorporating data quality and data governance functions as part of data integration flows. Embedding these processes in the integration pipeline necessitates sharing of metadata between the integration tools, and the quality and governance tools.

Metadata also facilitates performance optimization in integration scenarios by providing information on the characteristics of underlying sources in support of dynamic optimization strategies.

In content management, a folder-less approach allows you to search for and access files however you want: by client, project type, date, status, or other criteria. It is completely dynamic, enabling you to organize and display information the way you need it, without the limitations of antiquated, static folder structures.

All you do is save the file and tag it with the properties you need, and you are done. No more wandering through complex, hard-to-navigate folder structures, trying to guess where a file was saved. With metadata, you just quickly describe what you are looking for.
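A small sketch of the idea: files live in one flat store, and any "view" is just a metadata query (the store and tags below are invented for illustration):

```python
# Folder-less retrieval: files sit in one flat store and are found by
# matching metadata tags rather than by folder path.
files = [
    {"name": "contract_v2.docx",  "client": "Acme", "status": "final", "type": "contract"},
    {"name": "proposal.pdf",      "client": "Acme", "status": "draft", "type": "proposal"},
    {"name": "contract_old.docx", "client": "Beta", "status": "final", "type": "contract"},
]

def find(store, **criteria):
    """Return every file whose metadata matches all given criteria."""
    return [f for f in store if all(f.get(k) == v for k, v in criteria.items())]

# The same store, organized dynamically by whatever view is needed:
acme_files = find(files, client="Acme")
final_contracts = find(files, type="contract", status="final")
```

The same file can appear in the "by client" view and the "by status" view at once, something a single static folder hierarchy cannot do.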

Metadata management software provides context and information for data assets stored across the enterprise. Metadata management tools include data catalogs: assemblages of data organized into datasets (e.g. searchable tables or other arrangements that facilitate exploration).

Wednesday, April 29, 2020

IT Systems Validation

GMP guidelines require that IT systems be validated by adequate, documented testing. This is required in all regulated industries: pharmaceuticals, medical devices, food and beverages, and cosmetics.

GMP Requirements - “The software development process should be sufficiently well planned, controlled, and documented to detect and correct unexpected results from software changes."

Validation is defined as the documented act of demonstrating that a procedure, process, and activity will consistently lead to the expected results. This is the formal testing to demonstrate that the software meets its specified requirements.

This is a documented process to verify that a computerized system does exactly what it is designed to do in a consistent and reproducible manner, that it will produce information or data that meets a set of defined requirements. If a system meets these requirements, it can be assumed that it is consistently performing in the way it was intended.

Validation helps to ensure that both new and existing computer systems consistently fulfill their intended purpose and produce accurate and reliable results that enable regulatory compliance, fulfillment of user requirements, and the ability to discern invalid and/or altered records.

Computer systems need to be examined to confirm that the systems will work in all situations.

Computer system validation is required when configuring a new system or making a change in a validated system (upgrades, patches, extensions, etc.).

Validation processes should be based on applicable regulations and guidance, best practices for the domain, and the characteristics of the system being validated.

To validate software, it must be:

  • structured, documented, and evaluated as it is developed;
  • checked to make sure that it meets specifications;
  • adequately tested with the assigned hardware systems;
  • operated under varied conditions by the intended operators or persons of like training to assure that it will perform consistently and correctly.

Why Do We Need IT Systems Validation?

Regulated industries have to adopt many compliance procedures to make sure their final product is safe for distribution or sale.
Regulatory agencies around the world require validation processes to confirm the accuracy and integrity of data in computerized systems in order to ensure product safety and effectiveness.

The process is used to make sure that the IT systems are completely transparent, robust and tamper proof because they directly impact public health and safety.

It is critical that these systems can be relied upon to produce data consistently and store those electronic data records so that they stand the test of time.

Validation is one of those compliance requirements and is part of the Quality Management System within regulated industries.

There are several examples of why software validation is important. We can look at our library of FDA Warning Letters to see more than 200 reasons to validate your software or systems.

There is a case study about Therac-25, a radiation therapy machine from the 1980s.

Due to programming issues, the machine could administer the wrong amount of radiation to patients (often as a huge overdose), which led to serious injuries and even death. Had there been software validation standards in place, these types of instances could have been identified and remediated prior to the treatment of patients.

Computer systems validation is serious and the FDA and other regulatory agencies do not take this lightly.

What benefits does validation deliver?
  • Accuracy – when test outcomes are routinely checked against predetermined expected results, the accuracy of computer systems within the manufacturing process can be relied upon.
  • Security – validation processes make clear when entries to the system have been altered.
  • Reliability – the process ensures that system outputs can be relied upon throughout the life cycle.
  • Consistency – it also ensures that the system output is consistent across its life cycle.
  • Optimization – following the process also means that computer systems can be more easily optimized. Optimization is a key feature of an effective and efficient manufacturing site.
When used as intended, systems validation can significantly increase process reliability and confidence, improve production results, and reduce operating expenses.
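As a simplified illustration of the accuracy and consistency points above, a validation test run pairs documented inputs with predetermined expected results and records pass/fail evidence. The function under test and the test cases below are invented for illustration; real validation protocols are far more extensive.

```python
# A simplified sketch of an automated validation check: each test case
# pairs a documented input with a predetermined expected result, and the
# run produces a pass/fail record suitable for a validation report.

def dosage_per_kg(total_mg, weight_kg):
    """System function under validation (example only)."""
    return round(total_mg / weight_kg, 2)

test_cases = [
    {"id": "TC-001", "input": (500, 70), "expected": 7.14},
    {"id": "TC-002", "input": (250, 50), "expected": 5.0},
]

def run_validation(cases):
    """Execute every case and record the documented evidence."""
    results = []
    for case in cases:
        actual = dosage_per_kg(*case["input"])
        results.append({
            "id": case["id"],
            "expected": case["expected"],
            "actual": actual,
            "pass": actual == case["expected"],
        })
    return results
```

The key point is that expected results are fixed before execution, so the run produces objective, repeatable evidence rather than an after-the-fact judgment.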

Saturday, March 28, 2020

Purpose of Document Control and its Role in Quality Assurance

GxP/GMP, GDocP, ISO 9000 and documentation

GxP stands for "Good Practice", referring to quality guidelines and regulations. The "x" stands for the various fields, for example Good Documentation Practice (GDocP), Good Financial Practice (GFP), and so on. There are many instances of these regulations. One instance of GxP is Good Manufacturing Practice (GMP).

GMP describes the Quality Management System (QMS) required for manufacturing, testing, and quality assurance in order to ensure that products are safe, pure, and effective. The ultimate goal of GMP is to enable companies to minimize or eliminate contamination and errors, protecting consumers from purchasing a product that is ineffective or even dangerous. GMP regulations are required in regulated industries such as food and beverages, pharmaceuticals, medical devices, and cosmetics.

GMP documentation requirements are aligned with Good Documentation Practice (GDocP). GDocP is the standard in the regulated industries by which documents are created and maintained. It is the systematic set of procedures of preparation, reviewing, approving, issuing, recording, storing, and archiving documents.

ISO 9000 is a set of standards dealing with the fundamentals of a Quality Management System (QMS), helping organizations ensure that they meet customers’ needs within the statutory and regulatory requirements related to a product or service. ISO 9001 deals with the requirements that organizations wishing to meet the standard must fulfil.

GxP/GMP, GDocP, ISO 9000 are about QMS where an organization needs to demonstrate its ability to consistently provide a product that meets customer and applicable statutory and regulatory requirements.

Documentation is the key to compliance with these regulations and ensures traceability of all development, manufacturing, and testing activities. Documentation provides the route for auditors to assess the overall quality of operations within a company and the final product. GMP, GDocP, and ISO 9000 are enforced by regulatory agencies. Auditors pay particular attention to documentation to make sure that it complies with these regulations.

Therefore, in order for an organization to meet these requirements, it must have documentation procedures in place. Documentation is a critical tool for ensuring this compliance.

Purpose of document control and its role in Quality Assurance (QA)

The primary purpose of document control is to ensure that only current documents, and not documents that have been superseded, are used to perform work, and that obsolete versions are removed. Document control also ensures that current documents are approved by people who are competent and responsible for the specific job, and that documents are distributed to the places where they are used.

Document control is an essential preventive measure ensuring that only approved, current documents are used throughout an organization. Inadvertent use of out-of-date or unapproved documents can have significant negative consequences on quality, costs, and customer satisfaction, and can even cause death.
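The core rule can be sketched in a few lines: only the current approved revision is ever handed out for use (the document IDs and statuses below are illustrative):

```python
# A sketch of the core document-control rule: work may only reference the
# current approved revision; drafts and superseded copies are withheld.
revisions = [
    {"doc": "SOP-101", "rev": 1, "status": "superseded"},
    {"doc": "SOP-101", "rev": 2, "status": "approved"},
    {"doc": "SOP-101", "rev": 3, "status": "draft"},
]

def current_revision(store, doc_id):
    """Return the highest approved revision, ignoring drafts and superseded copies."""
    approved = [r for r in store if r["doc"] == doc_id and r["status"] == "approved"]
    if not approved:
        raise LookupError(f"No approved revision of {doc_id} is available for use")
    return max(approved, key=lambda r: r["rev"])
```

In a real system, the superseded and draft copies are still retained for history, but they are never what a user retrieves when performing work.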

The role of QA with regard to the document control system is one of management and oversight.

QA ensures that all documents are maintained in a controlled fashion and that all controlled documents are approved by the appropriate subject matter experts, are consistent with other documents, and are the most current version.

One way that QA ensures this is by being the last signature on all approved documents. All documents - current, obsolete, superseded, as well as all the history on the creation and revision of the document should be kept in Quality Assurance.

Monday, December 30, 2019

Headless CMS - Contentful

In the last post, we described headless CMS. Headless CMS architecture is rising in popularity in the development world.

This model allows breakthrough user experiences, gives developers great flexibility to innovate, and helps site owners future-proof their builds by allowing them to refresh the design without re-implementing the whole CMS.

One such headless CMS is Contentful. The Contentful platform lets you create, manage, and distribute content to any platform. It gives you total freedom to create your own content model, so you can decide which content you want to manage.

With an uncluttered user interface, Contentful is an efficient tool for creating and managing your content online, either alone or in a team. You can assign custom roles and permissions to team members, add validations depending on the kind of content to be inserted, and add media such as images, documents, sounds, or video.

Contentful has a three-step process. First, you define a content model, independent of any presentation layer, that specifies what kind of content you want to manage. Second, you and other internal or external editors manage all of the content in an easy-to-use, interactive editing interface. Third, the content is served in a presentation-independent way.

Being presentation-layer agnostic is one of the strengths of Contentful because you will be able to reuse your content across any platform.

To create a web site, you will either have to code it yourself and load content from Contentful API or work with someone who can develop the web site for you. Contentful is the platform where you can update the content of your web site, a mobile app or any other platform that displays content.

Contentful runs on all browsers. Contentful offers the most powerful REST APIs and the only enterprise-grade in-production GraphQL API.

There are three steps you'll have to take in order to deliver content from Contentful to your apps and websites.

1. Create your Content Model

The Content Model is the first step to structuring your content properly. It consists of creating content types that will accept only certain types of data for entry. For example, when creating an interactive quiz, you will need to add something that is a question, multiple answers, an indicator of the correct answer and potentially an image. This can be set up in the content model, so you can then easily just add as many "Questions" to your Quiz as you want.
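Such a "Question" content type might be described roughly like this. The shape loosely follows Contentful's content type JSON, but the field IDs and types are simplified for illustration:

```python
# A sketch of the quiz "Question" content type described above, shaped
# after Contentful's content type JSON (details simplified; IDs invented).
question_content_type = {
    "name": "Question",
    "fields": [
        {"id": "questionText",  "name": "Question",       "type": "Text",    "required": True},
        {"id": "answers",       "name": "Answers",        "type": "Array",   "required": True},
        {"id": "correctAnswer", "name": "Correct answer", "type": "Integer", "required": True},
        {"id": "image",         "name": "Image",          "type": "Link",    "required": False},
    ],
}

# Entries that omit a required field would be rejected at entry time:
required_ids = [f["id"] for f in question_content_type["fields"] if f["required"]]
```

Once this type exists, editors can create as many "Question" entries as they like, and each one is guaranteed to carry the fields the quiz front end expects.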

2. Add Entries and Assets

Entries refer to the content itself. Entries could be blog posts, product features or ingredients of a recipe or any other content. These entries will depend on your previously created content model. In this phase you can also add assets like images, sounds, videos and many other files.

3. Deliver your content with the API


The delivery part of the content may or may not be left only to developers. In this step you set up API Keys that will determine which content will go to which platform. After the delivery is set up correctly, your content is then available for consumption as soon as you hit the “Publish” button.
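As a rough sketch, a client could request published entries from Contentful's Content Delivery API by building a URL like the one below. The space ID and token are placeholders, and the actual HTTP call is left to your client library of choice:

```python
# A sketch of requesting published entries from the Content Delivery API.
# SPACE_ID and CDA_TOKEN are placeholders, not real credentials.
SPACE_ID = "your_space_id"
CDA_TOKEN = "your_delivery_api_key"

def entries_url(content_type):
    """Build the delivery URL for all published entries of one content type."""
    return (
        f"https://cdn.contentful.com/spaces/{SPACE_ID}/environments/master"
        f"/entries?access_token={CDA_TOKEN}&content_type={content_type}"
    )

# e.g. requests.get(entries_url("question")).json() would return the
# published "question" entries as JSON for your app to render.
```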

We have over 18 years of experience with numerous CMSs, so we can help you with Contentful as well. Call us today for a free, no-obligation consultation.

Saturday, November 30, 2019

Headless CMS vs Decoupled CMS

A headless content management system, or headless CMS, is a back-end only content management system (CMS) built as a content repository that makes content accessible via a RESTful API for display on any device.

The term “headless” comes from the concept of chopping the “head” (the front end, i.e. the web site) off the “body” (the back end, i.e. the content repository).

Whereas a traditional CMS typically combines the content and presentation layers of a web site, a headless CMS is just the content component which focuses entirely on the administrative interface for content creators, the facilitation of content workflows and collaboration, and the organization of content into taxonomies.

It doesn’t include presentation layers, templates, site structure, or design, but rather stores its content in pure format and provides access to other components (e.g. delivery front ends, analytics tools, etc.) through stateless or loosely coupled APIs.

The headless CMS concept is born of the demands of the digital era and a business’s need to engage customers with personalized content via multiple channels at all stages of the customer journey. As the content in a headless CMS is considered “pure” (because it has no presentation layer attached), just one instance of it can be used for display on any device: web site, mobile, tablet, smart watch, etc.
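The "one pure content instance, many heads" idea can be sketched in a few lines: the same content record is rendered by independent, hypothetical presentation layers:

```python
# One "pure" content record, with no presentation attached...
article = {"title": "Summer Sale", "body": "All items 20% off this week."}

# ...rendered by whichever "head" each channel needs.
def web_head(entry):
    """Hypothetical web front end: wraps the content in HTML."""
    return f"<h1>{entry['title']}</h1><p>{entry['body']}</p>"

def voice_head(entry):
    """Hypothetical digital-assistant front end: plain spoken text."""
    return f"{entry['title']}. {entry['body']}"

html = web_head(article)      # delivered to the web site
speech = voice_head(article)  # delivered to a digital assistant
```

Editing `article` once updates every channel, because no channel owns the content.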

There is some confusion around what makes a headless CMS truly “headless”, as vendors use the term somewhat loosely to label their decoupled or hybrid CMS systems. But a true headless CMS is one that was built from the ground up to be API-first, not a full monolith CMS with APIs attached afterwards.

Cloud-first headless CMSs are those that were also built with a multi-tenant cloud model at their core and whose vendor promotes Software as a Service (SaaS), promising high availability, scalability, and full management of security, upgrades, hot fixes, etc. on behalf of clients.

Coupled CMS vs. Headless CMS

Most traditional (monolithic) CMS systems are “coupled”, meaning that the content management application (CMA) and the content delivery application (CDA) come together in a single application, making back-end user tools, content editing and taxonomy, web site design, and templates inseparable.

Coupled systems are useful for blogs and basic web sites as everything can be managed in one place. But this means that the CMS code is tightly connected to any custom code and templates, which means developers have to spend more time on installations, customization, upgrades, hot fixes, etc. and they cannot easily move their code to another CMS.

There is a lot of confusion around the differences between a decoupled CMS and a headless one because they have a lot in common.

A decoupled CMS separates the CMA and CDA environments, typically with content being created behind the firewall and then being synchronized and pushed to the delivery environment.

The main difference between a decoupled CMS and a headless CMS is that the decoupled architecture is active. It prepares content for presentation and then pushes it into the delivery environment, whereas a headless CMS is reactive. It sits idly until a request is sent for content.

Decoupled architecture allows for easier scalability and provides better security than coupled architecture, but it does not provide the same support for omni-channel delivery. Plus, there are multiple environments to manage, thus increasing infrastructure and maintenance costs.

Advantages of Headless CMS
  • Omnichannel readiness: the content created in a headless CMS is “pure” and can be re-purposed across multiple channels, including web site, mobile applications, digital assistant, virtual reality, smart watches, etc., in other words, anywhere and at any time through the customer journey.
  • Low operating costs: headless CMSs are usually cheaper to install and run than their monolith counterparts, especially as they are typically built on a cloud model where multi-tenant options keep the running costs low.
  • Reduces time to market: a headless CMS promotes an agile way of working because content creators and developers can work simultaneously, and projects can be finished faster.
  • Easy to use: traditional CMSs tend to be cumbersome and complex as vendors attempt to offer every available feature in one box. Headless systems focus on content management, keeping things simple for those who use it on a daily basis. The entire user experience can usually be managed from within one back end.
  • Flexibility: content editors can work in whichever headless CMS they like and developers can build any kind of front end they want in their preferred language (e.g. Ruby, PHP, Java, or Swift) and then simply integrate the two via APIs (like JSON or XML) over RESTful communication. This allows for polyglot programming where multiple programming paradigms can be used to deliver content to multiple channels, and enables a company to benefit from the latest developments in language frameworks, promoting a micro-services architecture.
  • Cloud Scalability: the content purity and stateless APIs of headless CMSs enable high scalability, especially as the architecture fully leverages the elasticity of a cloud platform.
  • System Security: since the content is typically provided through a high-performance Content Delivery Network (rather than directly from the database), the risk of distributed denial-of-service attacks (DDOS) is reduced.
Disadvantages of Headless CMS
  • Marketing challenges: marketers may end up relying more on developers in certain scenarios, e.g. creating a landing page with a custom layout.
  • Multiple services: managing multiple systems can be challenging and a team’s knowledge base must cover them all.
  • No channel-specific support: since pure headless CMSs don’t deal with the presentation layer, developers may have to create some functionality, such as web site navigation.
  • Content organization: as pure headless CMSs do not typically provide the concept of pages and site maps, content editors need to adapt to the fact that content is organized in its pure form, independently of the web site or other channel.
Headless CMS architecture is rising in popularity in the development world. This model allows breakthrough user experiences, gives developers great flexibility to innovate, and helps site owners future-proof their builds by allowing them to refresh the design without re-implementing the whole CMS.

In the following posts, we will look more into headless CMS and will describe specific headless CMS. Stay tuned.

Sunday, September 29, 2019

Knowledge Management Maturity

Regardless of how good its knowledge management is, every organization should take time to assess whether it enables the flow of knowledge across people and systems and to identify opportunities for improvement.

Most organizations go through some evaluation when they first initiate their KM programs. However, it is equally important to revisit that self-evaluation at key intervals, such as when participation in KM tools and approaches lags or when leaders want to capitalize on the success of an effective, but limited, KM implementation by expanding it organization-wide.

This post is about knowledge management strategy. In following posts, we will share details about the governance structures, processes, technologies, change management enablers, and measurement approaches associated with successful and sustainable KM implementations.

Focus on Value Creation

Start with a focus on value creation. When it comes to building KM capabilities within your organization, it’s important to focus on the organization's goals from the very beginning. According to analysis of the assessment data, organizations that acknowledge value creation as a major objective of KM have a significant advantage in setting clear goals and objectives for their KM efforts. 

Specifically, those organizations are nearly four times more likely to document their KM strategies and road-maps than similar organizations which are not focused on value creation and they are 15 times more likely to articulate formal business cases that lay out the expected benefits and impact of applying KM to business opportunities.

Organizations which start by understanding the relationship between the flow of knowledge and desired business outcomes and then work to design KM tools and approaches that will aid those outcomes are successful in their KM efforts.

Any KM initiative worth pursuing must generate business value in the form of increased revenue, faster cycle times, cost savings, enhanced quality or other tangible benefits. When value creation is acknowledged as the underlying goal of KM, the initiative is starting on the right foot.

By contrast, an organization that has not made the connection between KM and value creation is prone to start throwing tools and techniques at employees without thinking through how they will be used or what broader purpose they will serve. That kind of KM program tends to fade over time as users fail to perceive why they are being asked to share their knowledge or how the new tools will help them in their day-to-day work.

Define your strategy and road-map

Once your organization recognizes the relationship between KM and business value, the next step is to cement that relationship by building it into a formal KM strategy and road-map. Writing down exactly where your KM program is headed and how you intend to get there is very important. 

A solid strategy will accelerate knowledge management maturity by providing focus, alignment, and credibility throughout your KM journey. It will also guide conversations with the business stakeholders whose support and buy-in you need to win along the way.

Alignment between KM and enterprise strategy is important for many reasons, but most importantly because it helps you justify the ongoing time, energy, and money required to support and participate in KM tools and approaches. If senior leaders understand the link between KM and the big-picture business concerns that keep them up at night, securing support becomes much easier, even during downturns and business disruptions when funding for “nice to have” programs dries up.

Documenting a KM strategy and road-map is linked with an even more meaningful outcome: the ability to leverage knowledge assets for competitive advantage. Almost every modern organization wants to compete on knowledge: to put its collective know-how to work to get to market faster, deliver superior products and services and earn customer loyalty.

KM exhibits its benefits behind the scenes, and customers reap the rewards without distinguishing the role played by better, faster access to institutional knowledge. But regardless of whether customers see your superior KM processes or they just know they’re getting something better from you than from your rivals, the ability to leverage knowledge for competitive advantage is a goal worth striving for.

Estimate impact

The most powerful accelerator of KM maturity related to strategy development involves analyzing the financial and other benefits your organization can expect from implementing the proposed KM tools and approaches. 

Although that may entail estimating a hard-dollar return on initial KM investments, it does not have to. But regardless of the nature of the benefits on which an organization focuses, your KM team must get specific about the projected impact (on productivity, quality, safety or other key performance indicators) and articulate a set of measures that can be tracked to compare reality against the forecast.

Organizations that follow this strategy get it back in the form of reliable funding, leadership and business unit support, program resilience, and return on investment.

Financial analysis and documentation of benefits would greatly help the allocation of a KM budget. Even more impressively, organizations that document KM benefits are over five times more likely to procure flexible KM budgets that expand in response to increased demand for knowledge assets and competencies. This relationship is logical because leaders tend to be forthcoming with additional capital as needed if they feel confident that their funds will yield tangible results.

A clearly articulated business case and projection of value are also instrumental in engaging and retaining business unit support. There is no more crucial enabler of KM sustainability than solid business unit backing. Your KM core team can only accomplish so much on its own, and without the business dedicating resources and assigning people to support KM processes and approaches, KM’s scope is destined to remain limited.

Solid business unit support goes hand in hand with opportunities to expand and enhance the KM program, so it is not surprising that financial analysis and documentation of benefits are statistically linked to outcomes. 

For example, KM groups that perform the analysis are more likely to enhance KM capabilities across business units or disciplines and to expand focus from initial areas to other areas of the business. They are also more likely to be able to develop a formal business case for expanding KM to new domains based on predicted gains and impact to the organization.

The most compelling reason to perform financial analysis and documentation of benefits is its strong link to return on investment (ROI). Although many KM programs achieve success without measuring ROI, those that rely purely on anecdotal evidence and success stories to justify KM investments may find themselves on shaky ground if the business environment changes or a more skeptical CEO arrives. Some clear measure of business impact, whether ROI or another outcome in keeping with the goals laid out in the KM strategy and road-map, is required to ensure sustainable KM development over the long term.

Conducting financial analysis and documentation of benefits during KM strategy development is highly correlated with an ability to show that type of tangible result. KM programs reap what they sow, and those that establish clear milestones and measures of success upfront are much better positioned to substantiate claims of value down the road.

Stay tuned for further posts on knowledge management.

Monday, August 5, 2019

Knowledge Management for a Contact Center

Knowledge management has been helping organizations achieve higher efficiency and productivity for many years. It is especially important in a contact center.

When a customer calls, contact center agents rely heavily on a knowledge base to answer customer questions. Not having knowledge management in place can jeopardize the quality of customer service in a contact center.

The biggest hurdles to providing high-quality customer service are a lack of contact center knowledge, inconsistent answers across different contact channels, and the inability of self-service help centers to deliver the information customers need. The solution to these hurdles is unified, multi-channel knowledge management.

These are the main reasons why knowledge management is a crucial component in service organizations:

1. Multiple Contact Channels – customers contact organizations using multiple channels. It is very important to provide a single source of truth, so that contact center employees can provide consistent answers to customer questions across phone, email, chat, SMS, and social media. Having a central knowledge base accessible across channels eliminates silos of information that can lead to different answers to the same question.

2. Self-Service – the majority of customers prefer to find answers to questions on their own. Since there is no employee involved in this process, an easy-to-navigate knowledge base is essential to give customers a place to search for answers on their computers or mobile devices. It can also serve customers while employees are on holiday, sick leave, etc. Self-service also deflects customer calls, chats, and emails.

3. Issue Complexity – one side effect of the popularity of self-service is that the issues that do arrive in the contact center can be the most difficult and complex. Because of this, agents are unlikely to know the answers and will rely heavily on a knowledge base to find information. A good knowledge base that contains information across a wide variety of topics would be very helpful.

Even if an agent has never taken a certain type of call, he/she can resolve the issue with confidence with such a knowledge base handy. The ability to answer complex customer questions reduces repeat contacts caused by agents being unable to answer customers' questions and solve their problems.

4. Trusted Content – social content from forums and online communities can be a plentiful source of useful information, but customers can never be sure if the information is accurate. By promoting social content as part of a structured knowledge base, you can ensure that customers trust that the information is accurate and up to date.

5. Tools and Tactics – the tools and tactics that make a contact center workforce successful must also evolve. Contact center employees should be able to look up information rather than memorize it, and they usually rely heavily on a knowledge base to find answers for customers.

6. Pace of Change – when issues arise, up-to-date information is paramount. Weather issues, communication outages and software bugs can all generate an influx of calls demanding answers with the very latest information. A knowledge base gives employees a place to find the most current information on a frequently changing situation.

7. Speed of Answer – everyone is looking for shorter handling times. Customers are happy to get answers quickly, and organizations get the cost savings they require. However, shorter handling times are only valuable when the call is still resolved with complete, accurate information. A knowledge base provides a quick way to get reliable answers to even the most complex questions.

8. Any Agent, Any Call – specialized agents can cause frustration and inefficiency as customers get transferred from employee to employee to get an answer. When each agent can access the full breadth of information in a central knowledge base, there is less need to specialize agents for tier one calls. Transfers can be reduced, resulting in happier customers and a more efficient contact center.

9. Employee Engagement – it is important to provide the tools for employees to feel engaged, do their jobs well, and feel confident and motivated in their work. A comprehensive knowledge base is a very useful tool that empowers employees and enables them to answer a broad range of customer questions, even on topics they may not have encountered before.

10. Employee Turnover – employee turnover can be extremely costly. Each time a new employee is onboarded, weeks are spent training him/her on the vast array of information required to help customers. A knowledge base that contains the information needed to answer customer questions can significantly reduce training time, allowing trainers to focus on soft skills and customer engagement.

11. Employee Training – you can reduce employee training needs without compromising service quality by having an optimal knowledge base in place. Shorter training also cuts the productivity lost while employees are in training.
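The single-source-of-truth theme running through the points above can be illustrated with a minimal sketch: one shared knowledge store answering the same query whether it is asked by a phone agent, in chat, or through the self-service portal. The `KnowledgeBase` class, the article titles, and the naive keyword search below are hypothetical, purely for illustration, not any actual product's API.

```python
# Minimal sketch of a single shared knowledge base serving every contact
# channel (phone, chat, self-service). Articles and search logic are
# hypothetical examples for illustration only.
class KnowledgeBase:
    def __init__(self):
        self.articles = []  # each article: {"title", "body", "tags"}

    def add(self, title, body, tags):
        self.articles.append({"title": title, "body": body, "tags": set(tags)})

    def search(self, query):
        """Naive keyword search; the same call backs agents and self-service."""
        words = set(query.lower().split())
        return [a["title"] for a in self.articles
                if words & a["tags"]
                or any(w in a["body"].lower() for w in words)]

kb = KnowledgeBase()
kb.add("Reset your password",
       "Use the account page to reset a forgotten password.",
       ["password", "reset"])
kb.add("Refund policy",
       "Refunds are issued within 14 days of purchase.",
       ["refund"])

# An agent on the phone and the self-service portal hit the same store,
# so both channels return the same answer:
print(kb.search("refund"))  # ['Refund policy']
```

Because every channel queries the same store, a correction made to one article immediately propagates everywhere, which is exactly what eliminates the "different answers to the same question" problem.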

Providing a high level of customer service is not easy, but with successful knowledge management, you can empower your customers and employees with the information they need, when they need it. Since customer service issues will only grow in both number and complexity, now is the time to ensure your customer service operation is equipped with a knowledge management system. Done right, knowledge management can transform contact centers.

Galaxy Consulting has over 18 years of experience in knowledge management. Please contact us today for a free consultation.

Wednesday, July 31, 2019

Taxonomy Development, Management, and Governance

Taxonomies do not exist in isolation. They exist within the context of multiple business processes. Taxonomies can take many different forms and they serve a wide variety of purposes in different organizations. 

For an e-commerce company, a typical application is a customer-facing search and browse taxonomy that describes a product catalog, while for a company focused on research and development, a taxonomy could provide a detailed profile of a scientific domain for indexing research content. Website navigation, customer and employee profiling, inventory management, records management, writing, publishing and content management, and site search are other possible taxonomy applications.

Efficient taxonomy management is best facilitated by formally designating team members' levels of participation and responsibilities. Taxonomy management covers a broad range of activities, and the most efficient use of team resources is achieved when responsibilities are clearly defined.

Taxonomy operations are typically performed by personnel with specialized training in library science or information management. The tasks of taxonomy governance are performed by taxonomy administrators. It is important to develop taxonomy change management procedures while the taxonomy is being developed.

A well-governed taxonomy requires a time commitment from stakeholders. Participation in governance team activities is one manifestation of this but of greater significance is the impact that policies and procedures developed by the governance team have on stakeholders and business processes.

The size and precise makeup of taxonomy governance teams vary greatly depending on the size and complexity of both the organization and the taxonomy implementation. At one end of the spectrum a governance team might consist of a few individuals. In contrast, in an enterprise environment taxonomy governance might be one part of a larger data or IT governance organization made up of multiple teams.

It is also worth emphasizing that size is only one factor to consider when devising governance policies and allocating governance resources. For example, regulatory requirements vary widely across industries. It is completely appropriate for a business operating in a highly-regulated industry to dedicate a relatively higher proportion of resources to governance activities.

Governance efforts are more likely to fail because of human factors than technological ones. This means that a realistic assessment of organizational context is an important first step when creating a taxonomy governance team and setting expectations for taxonomy efforts. 

For example, significant disruptions to existing workflows typically result in poor compliance with governance policies. Identifying these potential pitfalls in advance is best accomplished by soliciting input from users at all levels of an organization. This is just one reason why the governance team must include representatives from all stakeholder groups, not just from leadership and project management.

In broad terms representatives from management and business groups, information technology, taxonomy management and taxonomy users come together on the governance team to serve as advocates for their respective groups.

Because of the wide range of potential applications, taxonomy management can be the responsibility of an equally wide range of groups. Information technology groups, user experience and web design groups, libraries, and a range of marketing and business groups are all potential homes for taxonomy management. A taxonomy governance team needs executive sponsors and management representatives who can provide high-level guidance and steer taxonomy efforts in a productive direction for the business as a whole.

All members of the taxonomy governance team should contribute to the creation of a high-level strategy, but final responsibility for it rests with executive sponsors and business decision makers.

Following are some of the important questions to answer during taxonomy development. Taxonomy implementation will be very different depending on the answers to these questions:
  • Given that most large organizations have multiple applications that use taxonomies, will a single, multipurpose enterprise taxonomy be created and maintained or will multiple specialized taxonomies be used?
  • How will different taxonomy applications be prioritized? Given multiple taxonomy users, how will resources be allocated and how will taxonomy projects be funded?
  • Will there be a central taxonomy management group?
  • How will taxonomy goals be defined and what metrics will be used to measure success?
  • How will new and emerging technologies and trends be evaluated and potentially incorporated?  
A taxonomy deployment impacts many different groups within an organization, which means that conflicts over priorities and resource allocation are not unusual. Awareness of potential conflicts and a transparent decision-making process help to minimize strife between stakeholders. Managing the relationships between stakeholders is the single most important task of leadership representatives on the governance team. Leadership representatives on the governance team should include both executive sponsors and business group personnel who can provide insight into business processes and business needs.

Technical support is crucial for successful taxonomy implementation and use. Strategic and business goals must be realistic given an organization’s technical capabilities and constraints. The primary role of taxonomy governance team representatives from technology implementation and support groups is to provide the expertise needed to ensure that business goals align with technical reality.

Taxonomy implementations range from a small number of terms applied through a web publishing platform and managed in a spreadsheet to highly specialized taxonomies consisting of thousands of terms and relationships that are managed with dedicated software and support dozens of consuming systems.

Obviously, the specific details have a significant effect on technical requirements. Many taxonomy management systems provide tools for workflow and governance modeling and enforcement. Alternatively, if the taxonomy is maintained and applied from within a content management system, then the governance team should determine an appropriate level of control and develop mechanisms to implement it.
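Whether a taxonomy lives in a spreadsheet or a dedicated management tool, it is at heart a hierarchy of terms with broader/narrower relationships. The following is a minimal sketch under that assumption; the `Term` class and the example labels are hypothetical, not any particular taxonomy product's API.

```python
# Minimal sketch of a taxonomy as a hierarchy of terms with parent/child
# (broader/narrower) relationships. Labels are hypothetical examples.
class Term:
    def __init__(self, label, parent=None):
        self.label = label
        self.parent = parent
        self.children = []
        if parent:
            parent.children.append(self)

    def path(self):
        """Full broader-to-narrower path for this term, e.g. for breadcrumbs."""
        node, labels = self, []
        while node:
            labels.append(node.label)
            node = node.parent
        return " > ".join(reversed(labels))

products = Term("Products")
laptops = Term("Laptops", parent=products)
gaming = Term("Gaming Laptops", parent=laptops)

print(gaming.path())  # Products > Laptops > Gaming Laptops
```

Even a toy model like this makes the governance questions concrete: who may add a child term, who approves a label change, and how consuming systems are notified when the hierarchy moves.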

It is important not to underestimate the work needed to integrate taxonomy management with consuming systems. The reality is that most organizations have a mix of consuming systems. Development resources are required in all of these scenarios, and input from technical stakeholders is needed when planning and prioritizing implementation and ongoing maintenance. At the beginning of a taxonomy implementation, technology discussions should focus on defining technical solutions based on business objectives.

Some of the questions technical stakeholders help to answer include:
  • Should existing processes and technology be adapted, or should new ones be built or bought?
  • Should taxonomy management tools be developed in-house or purchased from third parties?
  • What are the integration requirements for connecting taxonomy management with consuming systems?
As a taxonomy implementation matures, the technical emphasis shifts from implementation to ongoing maintenance and support, as is typical in the software life cycle.

Technology stakeholders are typically in-house staff, although it is not unusual for contractors to be part of the team, especially during tool development and implementation stages when the workload may be significantly higher.

Taxonomy management consists of the initial creation of taxonomies and related vocabularies and their maintenance over time. The responsibility of taxonomy management personnel is to execute policies created by the governance team, report to the governance team on taxonomy status and performance, and provide expert advice on taxonomy capabilities to inform decisions on future taxonomy development.

The tasks that are part of initial taxonomy development are quite different from those that are required during ongoing maintenance and administration. Those differences may require changes in emphasis on the part of the governance team, including team make-up and activities, depending on the stage of the taxonomy life cycle.

Taxonomy development should be driven by business requirements, working within organizational and technical constraints. Both requirements and constraints should be defined by the governance team, thus the taxonomy management representatives on the team must be sufficiently conversant in both business and technical issues to productively collaborate with team members from other disciplines. Next, execution of taxonomy development will require collaboration between taxonomists and subject matter experts to create vocabularies that represent relevant concepts using terminology that is accurate and meaningful to users.

Some of the questions that taxonomy management staff will answer for the governance team include:
  • What specific taxonomies are required to meet business needs?
  • Will these taxonomies need to be developed from scratch or can existing taxonomies be reused?
  • Are there vocabularies, organizing principles or other classification methods currently in use within the organization that can be harvested and reused?
  • Are there standard domain-specific taxonomies, thesauri, or ontologies that will satisfy the requirements, either as is or with modification?
  • Are implemented taxonomies meeting user and business needs?
  • What changes are needed to improve taxonomy performance?
Staff for both taxonomy development and administration can be either in-house or provided by a consultant. Staffing needs vary greatly between organizations and details of the taxonomy implementation should be considered carefully when staffing decisions are made. The initial development and implementation of specialized taxonomies can be a substantial amount of work and it is common to make use of consultants for this phase of the project.

However, the costs for long-term administration should not be underestimated. Costs rise when organizations do not anticipate staff and resources needed for taxonomy maintenance. More importantly, without maintenance, taxonomies will atrophy and the value they provide to the organization is greatly diminished. Taxonomy management representatives provide the governance team with accurate assessments of taxonomy status as well as short and long-term resource needs.

The list below describes the functional roles performed by a taxonomy governance team and lists the team members who are typically associated with a given role. The individuals fulfilling the roles will vary depending on the structure, management philosophy, and staffing model of the organization, so these descriptions should be considered general guidelines rather than specific job titles. It is also not uncommon for an individual on the team to play more than one role.

Executive Sponsors - provide strategic guidance, advocacy and support for taxonomy projects within the organization.

Business Decision Makers - identify business objectives, resolve cost/benefit issues and oversee resource allocation for taxonomy projects.

Technology Implementation and Support - develop and support taxonomy management tools or manage integration of third-party tools with relevant systems and organizational IT infrastructure.

Taxonomy Management - responsible for high- and low-level execution of taxonomy strategy and day-to-day taxonomy administration. May be an in-house team, an outside consultant or a mix.

Taxonomy Consumers - systems, groups, and individuals that use taxonomy in their day-to-day business operations. Typical consumers include content management, content strategy, user experience and web design, writing and publishing, site search, SEM and SEO, and business intelligence.

Subject Matter Experts - provide expert advice on intellectual domains, business processes, and other subject areas described by organizational taxonomies. Subject matter experts may or may not also be taxonomy consumers.

There is no universal taxonomy governance solution. Rather, effective governance achieves an important set of general goals while recognizing the unique features of an organization. Establishing a taxonomy governance team is very important.

Galaxy Consulting has 18 years of experience in taxonomy development, management, and governance. Please call us today for a free consultation.

Wednesday, May 29, 2019

Change Management and Content Management

Content Management and Change Management are connected. Change Management is needed for successful Content Management, and the two disciplines support each other.

Companies can benefit from the positive relationship between these two disciplines and from suitable processes around them, starting with content management.

Improved visibility and management of documents is particularly beneficial for change management. Employees across an organization can use the same, current documents with up-to-date facts and figures, and with an automated document management system, they can do it quickly, boosting the organization’s agility in times of change.

When Content Management Takes the Lead

With a reliable and efficient content management system, individual departments and change management teams can better:

  • Integrate siloed information and standardize operating procedures across the organization, thereby allowing everyone to pull from a single source of truth.
  • Communicate any changes quickly throughout the entire organization.
  • Increase product and process quality by ensuring employees have the right document at the right time.

By-products of these activities include improved decision-making and reduced possibility of errors, miscommunication, and regulatory actions through enforced compliance. In short, Content Management helps keep Change Management in control.

When Change Management Takes the Lead

How does change management help to keep content management processes in check? Whether change is driven by FDA, EMEA, or ISO regulations, or by competitive business forces, it is undeniably critical to operations. It does not matter whether the change being addressed is an internal change or a process change that must be efficiently and accurately documented to ensure adherence going forward. It must be kept in control, and to do so, it demands that other interrelated processes, including content management, be reliable at all times.

To effectively manage change, an organization must be agile. Bottlenecks to operational agility might include an inability to locate data, or outdated SOPs that expose the company to noncompliance or financial, operational, or legal risk. These bottlenecks might rest within the content management processes, rendering them unreliable. Change Management would help to resolve these problems.

An effective change management system will take charge and guide content management by starting document updates during the implementation of an approved change. This action:

  • Provides a comprehensive workflow for documenting change from the initial change request through to the approvals and implementation.
  • Reduces the risk of losing documents, or storing incomplete or unapproved documents.
  • Increases the available transparency of what is being documented.
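The workflow described above, from initial change request through approvals to implementation with document updates kicked off on approval, can be sketched as a simple state machine. This is a hedged illustration only; the status names and the `ChangeRequest` class are hypothetical, not an actual QMS API.

```python
# Hedged sketch of a change-request workflow that triggers document updates
# on approval. Status names and the linear model are illustrative assumptions.
ALLOWED = {
    "requested": ["under_review"],
    "under_review": ["approved", "rejected"],
    "approved": ["implemented"],
}

class ChangeRequest:
    def __init__(self, doc_id):
        self.doc_id = doc_id
        self.status = "requested"
        self.audit_trail = [("requested", None)]  # full history, per bullet 1

    def advance(self, new_status, actor):
        # Reject transitions the workflow does not allow (bullet 2: no
        # incomplete or unapproved documents slipping through).
        if new_status not in ALLOWED.get(self.status, []):
            raise ValueError(f"cannot move from {self.status} to {new_status}")
        self.status = new_status
        self.audit_trail.append((new_status, actor))
        if new_status == "approved":
            self.start_document_update()

    def start_document_update(self):
        # In a real QMS this would open a controlled revision of doc_id.
        print(f"Document update started for {self.doc_id}")

cr = ChangeRequest("SOP-042")
cr.advance("under_review", "quality_lead")
cr.advance("approved", "change_board")  # kicks off the document update
```

The audit trail gives the transparency described in bullet 3: at any point, anyone can see exactly where a change stands and who moved it there.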

Content Management and Change Management are Better Together

On their own, these disciplines are strong, but together they are extremely agile, and they drive continuous improvement and overall organizational quality. They also perform strongly when viewed from a higher-level Quality Management perspective. Working in tandem, Content Management and Change Management benefit Quality operations through:

  • Accessibility: Organized, current, and visible documentation provides an easily accessible audit trail to keep the organization on track and to satisfy regulatory requirements at a moment’s notice.
  • Collaboration: When electronic change requests integrate with electronic document management, they expedite the document update process and enhance project collaboration among impacted departments and functions.
  • Security: Controlled storage and accessibility of current documents, particularly SOPs, ensure that the right individuals receive the right documents at the right time. When change is in focus, incomplete documents or those not applicable to certain departments cannot be accessed through a “back door.”

Organizations Should Consider Adoption

Organizations would do well to adopt both quality management processes, whether on their own or as part of an automated enterprise-wide Quality Management System (QMS).

An effective automated system will integrate document and change control procedures. It also will integrate with other solutions, providing access to approved, controlled documents in other areas of the quality system, including audits, CAPAs and employee training. In these cases, an automated system’s search and retrieval capabilities, dashboards, and repositories expedite the processes.

Industry standards and regulatory guidelines recommend quality management processes which are integrated across the entire organization.