Thursday, February 11, 2021

Mastering Fractured Data

Data complexity in companies can be a major obstacle to achieving efficient operations and excellent customer service.

Companies are broken down into various departments. They have hundreds, thousands, or even hundreds of thousands of employees performing various tasks. Adding to the complexity, customer information is stored in so many different applications that wide gaps exist among data sources. Bridging those gaps so every employee in the organization has a consistent view of data is possible and necessary.

Various applications collect customer information in different ways. For example, CRM solutions focus on process management and not on data management.

Consequently, customer data is entered into numerous autonomous systems that were not designed to talk to one another. Client data is housed one way in a sales application, another way in an inventory system, and yet another way in contact center systems.

Other organizational factors further splinter the data, which can vary depending on the products in which a customer is interested, where the product resides, and who (the company or a partner) delivers it.

In addition, information is entered in various ways, including manually, either by the customer or an employee, or via voice recognition. And applications store the information in unique ways. One system might limit the field for customers’ last names to 16 characters while another could allow for 64 characters.

The challenge is further exacerbated by software design and vendors’ focus. CRM vendors concentrate on adding application features and do not spend as much time on data quality.

Customers can input their personal information in 10 different ways, and most applications do not check for duplicates when new customer information is entered.

Human error creates additional problems. Employees are often quite busy, move frequently and quickly from one task to the next, and, consequently, sometimes do not follow best practices fully.

The data becomes fractured, and different versions of the truth emerge. It features a tremendous amount of duplication, inconsistency, and inefficiency.

The inconsistencies exist because fixing such problems is a monumental task, one that requires companies to tackle both technical and organizational issues. Master data management (MDM) solutions, which have been sold for decades, are designed to address the technical issues. They are built to clean up the various inconsistencies, a process dubbed data cleansing.

The work sounds straightforward, but it is time-consuming and excruciatingly complex. The company has to audit all of its applications and determine what is stored where and how it is formatted. In many cases, companies work with terabytes and petabytes of information. Usually, they find many more sources than initially anticipated because cloud and other recent changes enable departments to set up their own data lakes.

Cleansing Process

Cleansing starts with mundane tasks, like identifying and fixing typos. The MDM solution might also identify where necessary information is missing.
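As a hypothetical illustration of these mundane steps, the sketch below normalizes a few customer fields and removes duplicates. The field names, capitalization rules, and 64-character limit are invented for the example and are not taken from any specific MDM product.

```python
# Illustrative sketch of field normalization and deduplication during
# data cleansing. Field names and rules are hypothetical.

def normalize_record(record: dict) -> dict:
    """Apply simple normalization rules to a customer record."""
    normalized = {}
    for field, value in record.items():
        value = " ".join(str(value).split())      # collapse stray whitespace
        if field in ("first_name", "last_name", "city"):
            value = value.title()                 # standard capitalization
        if field == "last_name":
            value = value[:64]                    # agreed standard field length
        normalized[field] = value
    return normalized

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep one record per (last name, email) key after normalization."""
    seen, unique = set(), []
    for record in map(normalize_record, records):
        key = (record.get("last_name"), record.get("email", "").lower())
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique
```

A real cleansing engine applies hundreds of such rules and uses fuzzy matching rather than exact keys, but the shape of the work is the same: normalize first, then match.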

To start the process, companies need to normalize fields and field values and develop standard naming conventions. The data clean-up process can be streamlined in a few ways. If a company buys all of its applications from a single vendor, the chances increase that its data will have a consistent format, because vendors typically use the same formats across all of their solutions. In some cases, they also include add-on modules to help customers harmonize their data.

But that is not typically the case. Most companies purchase software from different suppliers, and data cleaning has largely been done in an ad hoc fashion, with companies harmonizing information application by application. Recognizing the need for better integration, suppliers sometimes include MDM links to popular systems, like Salesforce Sales Cloud, Microsoft Dynamics, and Marketo.

Artificial intelligence and machine learning are emerging to help companies grapple with such issues, but the work is still in the very early stages of development.

Still other challenges stem from internal company policies—or a lack thereof—and corporate politics. Businesses need to step back from their traditional departmental views of data and create an enterprise-wide architecture. They must understand data hierarchies and dependencies; develop a data governance policy; ensure that all departments understand and follow that policy; and assign data stewards to promote it.

The relationship between company departments and IT has sometimes been strained. The latter’s objectives to keep infrastructure costs low and to put central policies in place to create data consistency often conflict with the company departments' drivers. And while departments have taken more control over the data, they often lack the technical skills to manage it on their own.

It is a good idea to start with a small area and then expand to other areas.

Clean, well-organized data makes a company's operations much more effective and enables it to optimize customer service. Any company can take steps to improve its data quality.

Please contact us for more information or for a free consultation.

Saturday, January 30, 2021

Digital Transformation

Digital technology is drastically changing how companies do business and how they relate to their customers. While customers gain the power of information and choice, digital technology dramatically improves the economics of business. The rules of business are being rewritten nearly every day by new digital technologies. Every company has a unique digital transformation opportunity.

However, transformation involves far more than merely converting paper processes to electronic ones. Companies undergoing a digital transformation also need to make sure that all of their digital processes are interconnected. Even more important, though, digital transformation requires a company-wide culture transformation.

For a successful digital transformation, companies first need to understand that it is more than simply a technology change or software adoption. It requires a cultural shift and a change in how the business behaves in response to changing customer demands. That shift requires complete support within the company, from top managers to rank-and-file personnel.

Well-timed adoption and use of technology and software can support this shift by enabling a seamless flow of information between the company and the customer.

Digital transformation is also about focusing on the company's customers: enabling them to educate themselves and follow their own journey in a self-guided manner, and then determining where to intersperse human touchpoints along that journey to add value to the digital ones.

The best practice is a mix of digital content and human interaction that is orchestrated around customers and how they want to learn about and experience the company and its brands.

Another critical element of digital transformation is the interconnection of all of a company's data. Companies are organized into many departments, with hundreds, thousands, or even hundreds of thousands of employees performing various tasks, and customer information is stored in so many different applications that wide gaps exist among data sources. Bridging those gaps so that every employee in the organization has a consistent view of the data is both possible and necessary.

But the task requires large investments of money and manpower and sweeping process changes, steps that most organizations have not been willing to make thus far.

It’s not an easy task, but it is getting simpler, particularly as a wide and growing variety of applications emerge. Vendors are now building solutions to streamline workflows for employees inputting data or responding to various triggers, like customers calling in with a problem.

Please contact us today for more information and for a free consultation.

Monday, December 28, 2020

Electronic Signature and Content Management


At this time of digital transformation, it is difficult to talk about content management without talking about electronic signatures. E-signatures make it possible to create digital workflows, help maximize ROI from content management, and enhance productivity, compliance, security, and analytics.

Quite a few content management tools, such as SharePoint, Box, and other content management systems (CMS), include e-signature integration.

Electronic signatures, digital business, and content management are interdependent. Without e-signature capability, documents continue to be printed for signing, then photocopied, shipped, corrected, imaged back into the system, archived, and shredded. Using e-signatures can save 90 percent of the time and labor cost dedicated to managing paper, and it brings other benefits as well, such as faster decision making, shorter sales cycles, and an improved customer experience.

In the last few years, financial services, insurance, healthcare, and government have embraced digital transformation. A major driver is compliance and risk. Many organizations are concerned about legal risk or they struggle with the constantly changing regulatory landscape in their industries, in part because manual processing is very prone to errors.

Rather than react to regulatory pressure with additional people, manual controls, and process complexity, organizations that adopt e-signatures can:

  • Leverage workflow rules to execute transactions correctly and consistently.
  • Capture a full audit trail and electronic evidence.
  • Minimize exposure to risk due to misplaced or lost documents.
  • Make the process of e-discovery easier, more reliable, and less expensive.
  • Demonstrate compliance and reduce legal risk through the ability to play back the exact process that was used to capture signatures.

Consider this example: the VP of compliance asks for transaction records from five years ago. How helpful would it be to quickly produce all signed records in good order and replay the entire web-based signing process for context?

According to Forrester Research, organizations and customers now recognize that e-signature is an important enabler of digital business.

Today, business is digital, and e-signature is a foundational technology enabling end-to-end digitization. Consider another example: a customer fills out an insurance application. When the package is ready for the customer's signature, traditionally it would revert to paper. Instead, the documents are handed off to the electronic signature solution, which manages every aspect of the e-sign process, including notifying and authenticating signers, presenting documents for review, capturing intent, securing documents, and collecting evidence.

Once e-signed, the documents can be downloaded in PDF format and stored in any archiving system. The e-signature audit trail and security travel seamlessly with the document, ensuring that the record can be verified independently of the e-signature service.

A document-centric approach to embedding e-signatures within signed records allows for greater portability and easier long-term storage in a CMS solution. Additional metadata related to the e-sign transaction can be handed off to the CMS as well for analytics purposes.
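As a hypothetical sketch of such a handoff, the snippet below packages transaction details as a JSON payload a CMS could ingest. The field names are invented for illustration; real e-signature and CMS APIs define their own schemas.

```python
import json
from datetime import datetime, timezone

# Hypothetical sketch: packaging e-sign transaction metadata for a CMS handoff.
# All field names here are illustrative, not from any specific product.

def build_cms_metadata(transaction_id: str, signers: list[str],
                       document_name: str) -> str:
    """Return a JSON payload describing a completed e-sign transaction."""
    payload = {
        "transaction_id": transaction_id,
        "document_name": document_name,
        "signers": signers,
        "completed_at": datetime.now(timezone.utc).isoformat(),
        "audit_trail_embedded": True,   # evidence travels with the PDF itself
    }
    return json.dumps(payload, indent=2)
```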

Adopting electronic signatures is quick and easy and does not require IT or programming resources. For companies looking for a more integrated, automated workflow, e-signature plugins for SharePoint, Salesforce, and Box are available.

Organizations can quickly and easily enhance approval workflows with a more robust e-signature solution than a checkbox on an approval routing sheet, while also automating archival.

Thursday, July 30, 2020

Metadata-Driven Solutions

Metadata is data that provides information about other data. Many distinct types of metadata exist, including descriptive metadata, structural metadata, administrative metadata, reference metadata, and statistical metadata.
  • Descriptive metadata is descriptive information about a resource. It is used for discovery and identification. It includes elements such as title, abstract, author, and keywords.
  • Structural metadata is metadata about containers of data and indicates how compound objects are put together, for example, how pages are ordered to form chapters. It describes the types, versions, relationships and other characteristics of digital materials.
  • Administrative metadata is information to help manage a resource, like resource type, permissions, and when and how it was created.
  • Reference metadata is information about the contents and quality of statistical data.
  • Statistical metadata, also called process data, may describe processes that collect, process, or produce statistical data.
Metadata, properly managed, is a powerful tool for making things happen. Processes and solutions can be driven by metadata.

In a metadata-driven application-building process, instead of building the desired application directly, we define the application's specifications. These specifications are fed to an engine that builds the application for us using predefined rules.

Instead of building a package to create a dimension, for example, we can provide the dimension description (metadata) to a package generating engine. This engine is then responsible for creating the defined package. Once the package is executed, it will create and maintain the prescribed dimension.
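The idea can be sketched in a few lines. Here, a small engine turns a dimension description into DDL; the table and column names, and the surrogate-key rule, are hypothetical conventions chosen for the example.

```python
# Illustrative sketch of a metadata-driven generator: instead of hand-writing
# DDL for each dimension, we describe it and let an engine emit the SQL.
# Naming conventions and the surrogate-key rule are hypothetical.

def generate_dimension_ddl(spec: dict) -> str:
    """Build a CREATE TABLE statement for a dimension from its metadata."""
    columns = [f"{spec['name']}_key INTEGER PRIMARY KEY"]   # surrogate key rule
    for attr in spec["attributes"]:
        columns.append(f"{attr['name']} {attr['type']}")
    body = ",\n  ".join(columns)
    return f"CREATE TABLE dim_{spec['name']} (\n  {body}\n);"

customer_spec = {
    "name": "customer",
    "attributes": [
        {"name": "full_name", "type": "VARCHAR(64)"},
        {"name": "segment", "type": "VARCHAR(32)"},
    ],
}
```

Changing the dimension now means editing the specification and regenerating, which is what makes the approach adaptable: the engine, not each developer, encodes the best practices.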

Why is the metadata-driven approach so much more efficient than traditional methods?

  • Creating a definition of a process is much faster than creating the process itself. A metadata-driven approach builds the same asset in less time than traditional methods.
  • Quality standards are enforced. The rules engine becomes the gatekeeper by enforcing best practices.
  • The rules engine becomes a growing knowledge base from which all processes benefit.
  • It easily adapts to change and extension: simply edit the definition and submit it to the engine for a rebuild. Need to inject a custom process? No problem: create a package the old-fashioned way.
  • It enables agile data warehousing. Agile becomes possible due to the greatly increased speed of development and the reduced rework required by change.
The ongoing proliferation of devices joined with the distributed nature of data sources has created an indispensable role for metadata. Metadata provides knowledge such as location of the device and nature of the data, which facilitates integration of data regardless of its origin or structure.

Enterprises are incorporating data quality and data governance functions as part of data integration flows. Embedding these processes in the integration pipeline necessitates sharing of metadata between the integration tools, and the quality and governance tools.

Metadata also facilitates performance optimization in integration scenarios by providing information on the characteristics of underlying sources in support of dynamic optimization strategies.

In content management, a folderless approach allows you to search for and access files however you want: by client, project type, date, status, or other criteria. It is completely dynamic, enabling you to organize and display information the way you need it, without the limitations of an antiquated, static folder structure.

All you do is save the file and tag it with the properties you need, and you are done. No more wandering through complex, hard-to-navigate folder structures, trying to guess where a file was saved. With metadata, you simply describe what you are looking for.
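A minimal sketch of this folderless retrieval: files carry tags, and a query matches on tag values rather than on paths. The tag names and file entries are invented for the example.

```python
# Minimal sketch of folderless, metadata-based retrieval: files are tagged
# with properties and found by criteria rather than by folder path.
# Tag names and entries are illustrative.

def find_files(files: list[dict], **criteria) -> list[str]:
    """Return names of files whose tags match every given criterion."""
    return [
        f["name"]
        for f in files
        if all(f["tags"].get(k) == v for k, v in criteria.items())
    ]

library = [
    {"name": "contract.docx", "tags": {"client": "Acme", "status": "final"}},
    {"name": "draft.docx",    "tags": {"client": "Acme", "status": "draft"}},
    {"name": "invoice.pdf",   "tags": {"client": "Globex", "status": "final"}},
]
```

Note that the same file can be reached through any combination of its tags, which is exactly what a static folder tree cannot offer.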

Metadata management software provides context and information for data assets stored across the enterprise. Metadata management tools include data catalogs: assemblages of data organized into datasets (for example, searchable tables or other arrangements that facilitate exploration).

Wednesday, April 29, 2020

IT Systems Validation

GMP guidelines require that IT systems be validated through adequate, documented testing. Validation is required in all regulated industries: pharmaceuticals, medical devices, food and beverage, and cosmetics.

GMP Requirements - “The software development process should be sufficiently well planned, controlled, and documented to detect and correct unexpected results from software changes."

Validation is defined as the documented act of demonstrating that a procedure, process, and activity will consistently lead to the expected results. This is the formal testing to demonstrate that the software meets its specified requirements.

This is a documented process to verify that a computerized system does exactly what it is designed to do in a consistent and reproducible manner, that it will produce information or data that meets a set of defined requirements. If a system meets these requirements, it can be assumed that it is consistently performing in the way it was intended.

Validation helps to ensure that both new and existing computer systems consistently fulfill their intended purpose and produce accurate and reliable results that enable regulatory compliance, fulfillment of user requirements, and the ability to discern invalid and/or altered records.

Computer systems need to be examined to confirm that the systems will work in all situations.

Computer system validation is required when configuring a new system or making a change in a validated system (upgrades, patches, extensions, etc.).

Validation processes should be based on applicable regulations and guidance, best practices for the domain, and the characteristics of the system being validated.

To validate software, it must be:

  • structured, documented, and evaluated as it is developed;
  • checked to make sure that it meets specifications;
  • adequately tested with the assigned hardware systems;
  • operated under varied conditions by the intended operators or persons of like training to assure that it will perform consistently and correctly.
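As a hedged illustration of what "adequately tested" can look like in automated form, the sketch below ties each test case to a requirement ID and a predetermined expected result, so the test run itself becomes documented evidence. The requirement IDs and the dose-limiting function are hypothetical.

```python
# Illustrative sketch of documented validation test cases: each case records
# a requirement ID, an input, and a predetermined expected result.
# Requirement IDs and the dose function are hypothetical.

def dose_within_limit(requested: float, limit: float) -> float:
    """Clamp a requested dose to the configured safety limit."""
    return min(requested, limit)

VALIDATION_CASES = [
    # (requirement id, requested, limit, expected)
    ("REQ-001", 2.0, 5.0, 2.0),   # normal request passes through unchanged
    ("REQ-002", 9.0, 5.0, 5.0),   # excessive request is clamped to the limit
]

def run_validation() -> list[tuple[str, bool]]:
    """Execute all cases; return (requirement id, pass/fail) evidence."""
    return [
        (req_id, dose_within_limit(requested, limit) == expected)
        for req_id, requested, limit, expected in VALIDATION_CASES
    ]
```

Real validation protocols add signatures, environment records, and traceability matrices, but the core discipline is the same: every requirement maps to a test with a predetermined expected result.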

Why Do We Need IT Systems Validation?

Regulated industries have to adopt many compliance procedures to make sure their final product is safe for distribution or sale.
Regulatory agencies around the world require validation processes to confirm the accuracy and integrity of data in computerized systems in order to ensure product safety and effectiveness.

The process is used to make sure that IT systems are completely transparent, robust, and tamper-proof, because they directly impact public health and safety.

It is critical that these systems can be relied upon to produce data consistently and store those electronic data records so that they stand the test of time.

Validation is one of those compliance requirements and is part of the Quality Management System within regulated industries.

There are many examples of why software validation is important. We can look at our library of FDA warning letters to see more than 200 reasons to validate your software or systems.

There is a case study about Therac-25, a radiation therapy machine from the 1980s.

Due to programming issues, the machine could administer the wrong amount of radiation to patients (often as a huge overdose), which led to serious injuries and even death. Had there been software validation standards in place, these types of instances could have been identified and remediated prior to the treatment of patients.

Computer systems validation is serious and the FDA and other regulatory agencies do not take this lightly.

What benefits does validation deliver?
  • Accuracy – when test outcomes are routinely checked against predetermined expected results, the accuracy of computer systems within the manufacturing process can be relied upon.
  • Security – validation processes make clear when entries to the system have been altered.
  • Reliability – the process ensures that system outputs can be relied upon throughout the life cycle.
  • Consistency – it also ensures that the system output is consistent across its life cycle.
  • Optimization – following the process also means that computer systems can be more easily optimized. Optimization is a key feature of an effective and efficient manufacturing site.
When used as intended, systems validation can provide increased process reliability and confidence, improved production results, and significantly reduced operating expenses.

Saturday, March 28, 2020

Purpose of Document Control and its Role in Quality Assurance

GxP/GMP, GDocP, ISO 9000 and documentation

GxP stands for "good practice," a set of quality guidelines and regulations. The "x" stands for the various fields: Good Documentation Practice (GDocP), Good Financial Practice (GFP), and so on. There are many instances of these regulations; one instance of GxP is Good Manufacturing Practice (GMP).

GMP describes the Quality Management System (QMS) required for manufacturing, testing, and quality assurance in order to ensure that products are safe, pure, and effective. GMP's ultimate goal is to enable companies to minimize or eliminate contamination and errors, protecting consumers from purchasing a product that is ineffective or even dangerous. GMP regulations are required in regulated industries such as food and beverage, pharmaceuticals, medical devices, and cosmetics.

GMP documentation requirements are aligned with Good Documentation Practice (GDocP). GDocP is the standard in the regulated industries by which documents are created and maintained. It is the systematic set of procedures of preparation, reviewing, approving, issuing, recording, storing, and archiving documents.

ISO 9000 is a set of standards that deals with the fundamentals of a Quality Management System (QMS), helping organizations ensure that they meet customers' needs within the statutory and regulatory requirements related to a product or service. ISO 9001 specifies the requirements that organizations wishing to meet the standard must fulfil.

GxP/GMP, GDocP, and ISO 9000 all concern a QMS in which an organization needs to demonstrate its ability to consistently provide a product that meets customer and applicable statutory and regulatory requirements.

Documentation is the key to compliance with these regulations and ensures traceability of all development, manufacturing, and testing activities. Documentation provides the route for auditors to assess the overall quality of operations within a company and the final product. GMP, GDocP, and ISO 9000 are enforced by regulatory agencies. Auditors pay particular attention to documentation to make sure that it complies with these regulations.

Therefore, in order for an organization to meet these requirements, it must have documentation procedures in place. Documentation is a critical tool for ensuring this compliance.

Purpose of document control and its role in Quality Assurance (QA)

The primary purpose of document control is to ensure that only current documents, not superseded ones, are used to perform work and that obsolete versions are removed. Document control also ensures that current documents are approved by the people competent and responsible for the specific job, and that documents are distributed to the places where they are used.

Document control is an essential preventive measure, ensuring that only approved, current documents are used throughout an organization. Inadvertent use of out-of-date or unapproved documents can have significant negative consequences for quality, costs, and customer satisfaction, and can even cause death.
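The core rule can be sketched as a small registry that refuses to issue anything but the current approved revision. The document ID and status names are invented for the example; real document management systems track far richer state.

```python
# Minimal sketch of a document control check: only the current, approved
# revision of a document may be issued for use. IDs and statuses are
# illustrative.

class DocumentRegistry:
    def __init__(self):
        self._docs = {}   # doc_id -> list of (revision, status)

    def add_revision(self, doc_id: str, revision: int, status: str):
        self._docs.setdefault(doc_id, []).append((revision, status))

    def issue(self, doc_id: str) -> int:
        """Return the highest approved revision; refuse drafts and superseded copies."""
        approved = [r for r, s in self._docs.get(doc_id, []) if s == "approved"]
        if not approved:
            raise ValueError(f"No approved revision of {doc_id} to issue")
        return max(approved)
```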

The role of QA with regard to the document control system is one of management and oversight.

QA ensures that all documents are maintained in a controlled fashion and that all controlled documents are approved by the appropriate subject matter experts, are consistent with other documents, and are the most current version.

One way that QA ensures this is by being the last signature on all approved documents. All documents, whether current, obsolete, or superseded, as well as the full history of each document's creation and revision, should be kept in Quality Assurance.