Monday, December 28, 2020

Electronic Signature and Content Management


At this time of digital transformation, it is difficult to talk about content management without talking about electronic signatures. E-signatures make it possible to create digital workflows, help to maximize ROI from content management, and enhance productivity, compliance, security, and analytics.

Quite a few content management tools, including SharePoint, Box, and other content management systems (CMS), offer e-signature integration.

Electronic signatures, digital business, and content management are interdependent. Without e-signature capability, documents continue to be printed for signing, then photocopied, shipped, corrected, imaged back into the system, archived, and shredded. Using e-signatures can save 90% of the time and labor cost dedicated to managing paper. E-signatures also bring other benefits, such as faster decision making, shorter sales cycles, and improved customer experience.

In the last few years, financial services, insurance, healthcare, and government have embraced digital transformation. A major driver is compliance and risk. Many organizations are concerned about legal risk or they struggle with the constantly changing regulatory landscape in their industries, in part because manual processing is very prone to errors.

Rather than react to regulatory pressure with additional people, manual controls, and process complexity, organizations that adopt e-signatures can:

  • Leverage workflow rules to execute transactions correctly and consistently.
  • Capture a full audit trail and electronic evidence.
  • Minimize exposure to risk due to misplaced or lost documents.
  • Make the process of e-discovery easier, more reliable, and less expensive.
  • Demonstrate compliance and reduce legal risk through the ability to play back the exact process that was used to capture signatures.

Let's look at this example: the VP of compliance asks for transaction records from five years ago. How helpful would it be to quickly produce all signed records, in good order, and replay the entire web-based signing process for context?

According to Forrester Research, organizations and customers now recognize that e-signature is an important enabler of digital business.

Today, business is digital, and e-signature is a foundational technology enabling end-to-end digitization. Let's look at another example: a customer fills out an insurance application. Traditionally, when the package is ready for the customer to sign, the process reverts to paper. Instead, the documents are handed off to the electronic signature solution, which manages every aspect of the e-sign process: notifying and authenticating signers, presenting documents for review, capturing intent, securing documents, collecting evidence, and so on.

Once e-signed, the documents can be downloaded in PDF format and stored in any archiving system. The e-signature audit trail and security travel seamlessly with the document, ensuring the record can be verified independently of the e-signature service.
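
As a rough sketch of what this handoff and retrieval can look like in code, the example below submits a package to a hypothetical REST-style e-signature service and later downloads the signed PDF with its embedded audit trail. The esign.example.com host, endpoints, and field names are invented for illustration and do not correspond to any specific vendor's API.

    import requests

    ESIGN_API = "https://esign.example.com/v1"   # hypothetical e-signature service
    API_KEY = "replace-with-issued-key"          # credentials issued by the service

    # Hand the prepared application package off to the e-signature solution.
    # From here the service handles notification, signer authentication,
    # document presentment, capture of intent, and evidence collection.
    package = {
        "documents": [{"name": "application.pdf", "content_base64": "<omitted>"}],
        "signers": [{"name": "Jane Doe", "email": "jane@example.com", "auth": "sms"}],
    }
    resp = requests.post(f"{ESIGN_API}/packages", json=package,
                         headers={"Authorization": f"Bearer {API_KEY}"})
    package_id = resp.json()["id"]               # hypothetical response shape

    # Once all signers have completed the ceremony, download the signed PDF.
    # The audit trail and tamper-evident seal travel inside the document itself.
    signed = requests.get(f"{ESIGN_API}/packages/{package_id}/documents/signed",
                          headers={"Authorization": f"Bearer {API_KEY}"})
    with open("application-signed.pdf", "wb") as f:
        f.write(signed.content)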

A document-centric approach to embedding e-signatures within signed records allows for greater portability and easier long-term storage in a CMS solution. Additional metadata related to the e-sign transaction can be handed off to the CMS as well for analytics purposes.
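
Continuing the sketch above, the minimal example below archives the signed PDF as-is (its evidence travels inside the file) and saves the e-sign transaction metadata alongside it so the CMS or an analytics tool can index it. The folder layout and field names are assumptions rather than any particular CMS's API.

    import json
    import os
    import shutil
    from datetime import date

    # Transaction metadata captured from the e-sign process; the exact fields
    # depend on what the e-signature solution exposes (these are illustrative).
    metadata = {
        "document": "application-signed.pdf",
        "transaction_id": "TXN-2020-001234",
        "signers": ["Jane Doe"],
        "completed_on": str(date.today()),
        "status": "signed",
    }

    # CMS-agnostic archival: store the signed PDF unchanged and keep the
    # transaction metadata next to it for search and analytics.
    os.makedirs("archive", exist_ok=True)
    shutil.copy("application-signed.pdf", "archive/application-signed.pdf")
    with open("archive/application-signed.pdf.metadata.json", "w") as f:
        json.dump(metadata, f, indent=2)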

Adopting electronic signatures is quick and easy and does not require IT or programming resources. For companies looking for a more integrated, automated workflow, e-signature plug-ins for SharePoint, Salesforce, and Box are available.

Organizations can quickly and easily enhance approval workflows with a more robust e-signature solution than a checkbox on an approval routing sheet, while also automating archival.

Thursday, July 30, 2020

Metadata Driven Solutions

Metadata is data that provides information about other data. Many distinct types of metadata exist, including descriptive metadata, structural metadata, administrative metadata, reference metadata, and statistical metadata.
  • Descriptive metadata is descriptive information about a resource. It is used for discovery and identification. It includes elements such as title, abstract, author, and keywords.
  • Structural metadata is metadata about containers of data and indicates how compound objects are put together, for example, how pages are ordered to form chapters. It describes the types, versions, relationships and other characteristics of digital materials.
  • Administrative metadata is information to help manage a resource, like resource type, permissions, and when and how it was created.
  • Reference metadata is information about the contents and quality of statistical data.
  • Statistical metadata, also called process data, may describe processes that collect, process, or produce statistical data.
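
As a small illustration, the first three categories attached to a single stored report might look like the following record (all field names and values are invented):

    # Illustrative metadata for one stored document, grouped by category.
    document_metadata = {
        "descriptive": {              # supports discovery and identification
            "title": "Q4 Sales Report",
            "author": "A. Analyst",
            "keywords": ["sales", "quarterly", "2020"],
        },
        "structural": {               # how the compound object is put together
            "format": "PDF",
            "pages": 24,
            "part_of": "Annual Report 2020",
        },
        "administrative": {           # helps manage the resource
            "resource_type": "report",
            "created": "2020-12-15",
            "permissions": ["finance-team:read", "qa:read"],
        },
    }
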
Metadata, properly managed, is a powerful tool for making things happen. We can have processes and solutions that are driven by metadata.

In a metadata-driven application building process, instead of building the desired application directly, we define the application's specifications. These specifications are fed to an engine that builds the application for us using predefined rules.

Instead of building a package to create a dimension, for example, we can provide the dimension description (metadata) to a package generating engine. This engine is then responsible for creating the defined package. Once the package is executed, it will create and maintain the prescribed dimension.
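
Here is a minimal sketch of the idea, assuming for simplicity that the "package" being generated is just the DDL for a dimension table: the definition is pure metadata, and a small engine applies predefined rules to turn it into a working artifact. A real package-generating engine would also produce the load and maintenance logic.

    # Metadata: the description of the dimension we want, not the dimension itself.
    customer_dim = {
        "name": "dim_customer",
        "columns": [
            {"name": "customer_key", "type": "INT", "role": "surrogate_key"},
            {"name": "customer_name", "type": "VARCHAR(200)"},
            {"name": "region", "type": "VARCHAR(50)"},
        ],
    }

    def generate_dimension_ddl(definition):
        """Engine: apply predefined rules to turn the definition into DDL."""
        cols = []
        for col in definition["columns"]:
            constraint = " PRIMARY KEY" if col.get("role") == "surrogate_key" else ""
            cols.append(f'    {col["name"]} {col["type"]}{constraint}')
        return f'CREATE TABLE {definition["name"]} (\n' + ",\n".join(cols) + "\n);"

    print(generate_dimension_ddl(customer_dim))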

Why is a metadata-driven approach so much more efficient than traditional methods?

  • Creating the definition of a process is much faster than creating the process itself. A metadata-driven approach builds the same asset in less time than traditional methods.
  • Quality standards are enforced. The rules engine becomes the gatekeeper by enforcing best practices.
  • The rules engine becomes a growing knowledge base which all processes benefit from.
  • Easily adapts to change and extension. Simply edit the definition and submit it to the engine for a build. Need to inject a custom process? No problem: create that package the old-fashioned way.
  • Enables agile data warehousing. Agile becomes possible due to greatly increased speed of development and reduced rework required by change.
The ongoing proliferation of devices, combined with the distributed nature of data sources, has created an indispensable role for metadata. Metadata provides knowledge such as the location of the device and the nature of the data, which facilitates integration of data regardless of its origin or structure.

Enterprises are incorporating data quality and data governance functions as part of data integration flows. Embedding these processes in the integration pipeline necessitates sharing of metadata between the integration tools, and the quality and governance tools.

Metadata also facilitates performance optimization in integration scenarios by providing information on the characteristics of underlying sources in support of dynamic optimization strategies.

In content management, a folder-less approach allows you to search for and access files however you want – by client, project type, date, status, or other criteria. It's completely dynamic, enabling you to organize and display information the way you need it, without the limitations of an antiquated, static folder structure.

All you do is save the file, tag it with the properties you need, and you are done. No more wandering through complex, hard-to-navigate folder structures, trying to guess where to save a file. With metadata you simply describe what you are looking for.
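
A toy sketch of the folder-less idea: every file is saved once and described with properties, and any combination of those properties becomes a view over the same content. The in-memory list below stands in for whatever index a real content management system maintains.

    # Each document is stored once; tags (metadata) do the organizing.
    documents = [
        {"file": "contract-001.pdf", "client": "Acme", "type": "contract", "status": "signed"},
        {"file": "proposal-007.docx", "client": "Acme", "type": "proposal", "status": "draft"},
        {"file": "contract-002.pdf", "client": "Globex", "type": "contract", "status": "review"},
    ]

    def find(**criteria):
        """Return every document whose metadata matches all of the given properties."""
        return [d for d in documents
                if all(d.get(k) == v for k, v in criteria.items())]

    # The same files can be viewed by client, by type, by status, and so on,
    # with no folder hierarchy to navigate or guess at.
    print(find(client="Acme"))
    print(find(type="contract", status="signed"))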

Metadata management software provides context and information for data assets stored across the enterprise. Metadata management tools include data catalogs, or assemblages of data organized into datasets (e.g., searchable tables or other arrangements that facilitate exploration).

Wednesday, April 29, 2020

IT Systems Validation

GMP guidelines require that IT systems be validated through adequate and documented testing. This is required in all regulated industries: pharmaceuticals, medical devices, food and beverages, and cosmetics.

GMP Requirements - “The software development process should be sufficiently well planned, controlled, and documented to detect and correct unexpected results from software changes."

Validation is defined as the documented act of demonstrating that a procedure, process, and activity will consistently lead to the expected results. This is the formal testing to demonstrate that the software meets its specified requirements.

This is a documented process to verify that a computerized system does exactly what it is designed to do in a consistent and reproducible manner, and that it will produce information or data that meets a set of defined requirements. If a system meets these requirements, it can be assumed that it is consistently performing in the way it was intended.

Validation helps to ensure that both new and existing computer systems consistently fulfill their intended purpose and produce accurate and reliable results that enable regulatory compliance, fulfillment of user requirements, and the ability to discern invalid and/or altered records.

Computer systems need to be examined to confirm that the systems will work in all situations.

Computer system validation is required when configuring a new system or making a change in a validated system (upgrades, patches, extensions, etc.).

Validation processes should be based on applicable regulations and guidance, best practices for the domain, and the characteristics of the system being validated.

To validate software, it must be:

  • structured, documented, and evaluated as it is developed;
  • checked to make sure that it meets specifications;
  • adequately tested with the assigned hardware systems;
  • operated under varied conditions by the intended operators or persons of like training to assure that it will perform consistently and correctly.
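
As one small, concrete slice of the "checked against specifications" and "adequately tested" points above, a validation test case compares system output against predetermined expected results and leaves documented evidence of the run. The function under test, test IDs, and operator name below are purely illustrative.

    import json
    from datetime import datetime

    def dose_in_units(prescribed_mg, calibration_factor):
        """Stand-in for the system behavior being validated."""
        return prescribed_mg * calibration_factor

    # Predetermined expected results, taken from the approved test protocol.
    test_cases = [
        {"id": "TC-001", "inputs": (100, 0.5), "expected": 50.0},
        {"id": "TC-002", "inputs": (200, 0.5), "expected": 100.0},
    ]

    evidence = []
    for case in test_cases:
        actual = dose_in_units(*case["inputs"])
        evidence.append({
            "test_id": case["id"],
            "expected": case["expected"],
            "actual": actual,
            "result": "PASS" if actual == case["expected"] else "FAIL",
            "executed_at": datetime.utcnow().isoformat(),
            "executed_by": "qa.tester",          # illustrative operator ID
        })

    # The evidence record becomes part of the validation documentation.
    print(json.dumps(evidence, indent=2))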

Why Do We Need IT Systems Validation?

Regulated industries have to adopt many compliance procedures to make sure their final product is safe for distribution or sale.
Regulatory agencies around the world require validation processes to confirm the accuracy and integrity of data in computerized systems in order to ensure product safety and effectiveness.

The process is used to make sure that the IT systems are completely transparent, robust and tamper proof because they directly impact public health and safety.

It is critical that these systems can be relied upon to produce data consistently and store those electronic data records so that they stand the test of time.

Validation is one of those compliance requirements and is part of the Quality Management System within regulated industries.

There are many examples of why software validation is important. A look through published FDA Warning Letters turns up more than 200 reasons to validate your software and systems.

There is a case study about Therac-25, a radiation therapy machine from the 1980s.

Due to programming defects, the machine could administer the wrong amount of radiation to patients, in some cases a massive overdose, which led to serious injuries and even deaths. Had software validation standards been in place, these defects could have been identified and remediated before patients were treated.

Computer systems validation is serious business, and the FDA and other regulatory agencies do not take it lightly.

What benefits does validation deliver?
  • Accuracy – when test outcomes are routinely checked against predetermined expected results, the accuracy of computer systems within the manufacturing process can be relied upon.
  • Security – validation processes make clear when entries to the system have been altered.
  • Reliability – the process ensures that system outputs can be relied upon throughout the life cycle.
  • Consistency – it also ensures that the system output is consistent across its life cycle.
  • Optimization – following the process also means that computer systems can be more easily optimized. Optimization is a key feature of an effective and efficient manufacturing site.
When used as intended, systems validation can significantly increase process reliability and confidence, improve production results, and reduce operating expenses.

Saturday, March 28, 2020

Purpose of Document Control and its Role in Quality Assurance

GxP/GMP, GDocP, ISO 9000 and documentation

GxP stands for "Good Practice," a family of quality guidelines and regulations. The "x" stands for the various fields, for example Good Documentation Practice (GDocP), Good Financial Practice (GFP), and so on. There are many instances of these regulations; one instance of GxP is Good Manufacturing Practice (GMP).

GMP describes the Quality Management System (QMS) required for manufacturing, testing, and quality assurance in order to ensure that products are safe, pure, and effective. The ultimate goal of GMP is to enable companies to minimize or eliminate contamination and errors, which protects consumers from purchasing a product that is ineffective or even dangerous. GMP regulations are mandatory in regulated industries such as food and beverages, pharmaceuticals, medical devices, and cosmetics.

GMP documentation requirements are aligned with Good Documentation Practice (GDocP). GDocP is the standard by which documents are created and maintained in regulated industries. It is the systematic set of procedures for preparing, reviewing, approving, issuing, recording, storing, and archiving documents.

ISO 9000 is a set of standards dealing with the fundamentals of a Quality Management System (QMS), helping organizations ensure that they meet customers' needs within the statutory and regulatory requirements related to a product or service. ISO 9001 defines the requirements that organizations wishing to meet the standard must fulfil.

GxP/GMP, GDocP, and ISO 9000 all center on the QMS, through which an organization must demonstrate its ability to consistently provide a product that meets customer and applicable statutory and regulatory requirements.

Documentation is the key to compliance with these regulations and ensures traceability of all development, manufacturing, and testing activities. Documentation provides the route for auditors to assess the overall quality of operations within a company and the final product. GMP, GDocP, and ISO 9000 are enforced by regulatory agencies. Auditors pay particular attention to documentation to make sure that it complies with these regulations.

Therefore, in order for an organization to meet these requirements, it must have documentation procedures in place. Documentation is a critical tool for ensuring this compliance.

Purpose of document control and its role in Quality Assurance (QA)

The primary purpose of document control is to ensure that only current documents, not documents that have been superseded, are used to perform work, and that obsolete versions are removed. Document control also ensures that current documents are approved by people who are competent and responsible for the specific job, and that documents are distributed to the places where they are used.

Document control is an essential preventive measure ensuring that only approved, current documents are used throughout an organization. Inadvertent use of out-of-date or unapproved documents can have significant negative consequences for quality, costs, and customer satisfaction, and can even cause death.
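
As a sketch of what "only current, approved documents" can mean in data terms, a document control record typically carries a version, a status, and its approvals, and only the single effective version is ever issued for use. The record fields below are illustrative, not any specific system's schema.

    # Illustrative document control records: one effective version is issued for
    # use; superseded versions are retained as history but never distributed.
    sop_cleaning = [
        {"doc_id": "SOP-014", "version": "3.0", "status": "EFFECTIVE",
         "approved_by": ["process.owner", "qa.manager"],   # QA signs last
         "effective_date": "2020-02-01"},
        {"doc_id": "SOP-014", "version": "2.0", "status": "SUPERSEDED",
         "approved_by": ["process.owner", "qa.manager"],
         "effective_date": "2018-06-15"},
    ]

    def current_version(records):
        """Return the only version that may be distributed to points of use."""
        effective = [r for r in records if r["status"] == "EFFECTIVE"]
        assert len(effective) == 1, "exactly one effective version must exist"
        return effective[0]

    print(current_version(sop_cleaning)["version"])   # -> 3.0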

The role of QA with regard to the document control system is one of management and oversight.

QA ensures that all documents are maintained in a controlled fashion and that all controlled documents are approved by the appropriate subject matter experts, are consistent with other documents, and are the most current version.

One way QA ensures this is by providing the last signature on all approved documents. All documents – current, obsolete, and superseded – as well as the full history of each document's creation and revision, should be retained by Quality Assurance.