Wednesday, April 29, 2020

IT Systems Validation

GMP guidelines require that IT systems be validated through adequate, documented testing. Validation is required in all regulated industries: pharmaceuticals, medical devices, food and beverages, and cosmetics.

GMP Requirements: “The software development process should be sufficiently well planned, controlled, and documented to detect and correct unexpected results from software changes.”

Validation is defined as the documented act of demonstrating that a procedure, process, or activity will consistently lead to the expected results. This is the formal testing undertaken to demonstrate that the software meets its specified requirements.

This is a documented process to verify that a computerized system does exactly what it is designed to do in a consistent and reproducible manner, and that it will produce information or data that meet a set of defined requirements. If a system meets these requirements, it can be assumed that it is consistently performing in the way it was intended.

Validation helps to ensure that both new and existing computer systems consistently fulfill their intended purpose and produce accurate and reliable results that enable regulatory compliance, fulfillment of user requirements, and the ability to discern invalid and/or altered records.

Computer systems need to be examined to confirm that they will work under all anticipated conditions of use.

Computer system validation is required when configuring a new system or making a change in a validated system (upgrades, patches, extensions, etc.).

Validation processes should be based on applicable regulations and guidance, best practices for the domain, and the characteristics of the system being validated.

To validate software, it must be:

  • structured, documented, and evaluated as it is developed;
  • checked to make sure that it meets specifications;
  • adequately tested with the assigned hardware systems;
  • operated under varied conditions by the intended operators or persons of like training to assure that it will perform consistently and correctly.
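
As a minimal, hypothetical illustration of the "checked against specifications" and "adequately tested" points above: each validation test case is traceable to a documented requirement, and its expected results are defined before execution. The requirement ID, calculation, and figures below are invented for illustration and do not come from any particular system or regulation.

```typescript
// Hypothetical example: an automated check that is traceable to a documented
// requirement. Requirement ID, tolerance, and function are invented here.
import assert from "node:assert/strict";

// URS-042 (hypothetical requirement): "The system shall calculate the batch
// yield as produced quantity divided by theoretical quantity, in percent,
// rounded to two decimal places."
function batchYieldPercent(produced: number, theoretical: number): number {
  if (theoretical <= 0) {
    throw new Error("Theoretical quantity must be greater than zero");
  }
  return Math.round((produced / theoretical) * 10000) / 100;
}

// Test case TC-042-01, executed against the specification above.
// Expected results are predetermined and documented before execution.
assert.equal(batchYieldPercent(950, 1000), 95);
assert.equal(batchYieldPercent(997, 1000), 99.7);
assert.throws(() => batchYieldPercent(950, 0));

console.log("TC-042-01 passed: batch yield calculation meets URS-042");
```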

Why Do We Need IT Systems Validation?

Regulated industries have to adopt many compliance procedures to make sure their final product is safe for distribution or sale.
Regulatory agencies around the world require validation processes to confirm the accuracy and integrity of data in computerized systems in order to ensure product safety and effectiveness.

The process is used to make sure that the IT systems are completely transparent, robust, and tamper-proof, because they directly impact public health and safety.

It is critical that these systems can be relied upon to produce data consistently and store those electronic data records so that they stand the test of time.

Validation is one of those compliance requirements and is part of the Quality Management System within regulated industries.

There are many examples of why software validation is important. We can look at our library of FDA Warning Letters to find more than 200 reasons to validate your software or systems.

There is a case study about Therac-25, a radiation therapy machine from the 1980s.

Due to programming issues, the machine could administer the wrong amount of radiation to patients (often as a huge overdose), which led to serious injuries and even death. Had there been software validation standards in place, these types of instances could have been identified and remediated prior to the treatment of patients.

Computer systems validation is serious, and the FDA and other regulatory agencies do not take it lightly.

What benefits does validation deliver?
  • Accuracy – when test outcomes are routinely checked against predetermined expected results, the accuracy of computer systems within the manufacturing process can be relied upon.
  • Security – validation processes make clear when entries to the system have been altered.
  • Reliability – the process ensures that system outputs can be relied upon throughout the life cycle.
  • Consistency – it also ensures that the system output is consistent across its life cycle.
  • Optimization – following the process also means that computer systems can be more easily optimized. Optimization is a key feature of an effective and efficient manufacturing site.
When used as intended, systems validation can provide increased process reliability and confidence, improved production results, and significantly reduced operating expenses.

Saturday, March 28, 2020

Purpose of Document Control and its Role in Quality Assurance

GxP/GMP, GDocP, ISO 9000 and documentation

GxP stands for "Good Practice," a term covering quality guidelines and regulations. The "x" stands for the various fields, for example Good Documentation Practice (GDocP), Good Financial Practice (GFP), and so on. There are many instances of these regulations. One instance of GxP is Good Manufacturing Practice (GMP).

GMP describes the Quality Management System (QMS) required for manufacturing, testing, and quality assurance in order to ensure that products are safe, pure, and effective. The ultimate goal of GMP is to enable companies to minimize or eliminate contamination and errors, which protects consumers from purchasing a product that is ineffective or even dangerous. GMP regulations are required in regulated industries such as food and beverages, pharmaceuticals, medical devices, and cosmetics.

GMP documentation requirements are aligned with Good Documentation Practice (GDocP). GDocP is the standard in the regulated industries by which documents are created and maintained. It is the systematic set of procedures for preparing, reviewing, approving, issuing, recording, storing, and archiving documents.

ISO 9000 is a family of standards dealing with the fundamentals of a Quality Management System (QMS); it helps organizations ensure that they meet customers’ needs within the statutory and regulatory requirements related to a product or service. ISO 9001 sets out the requirements that organizations wishing to meet the standard must fulfil.

GxP/GMP, GDocP, and ISO 9000 all revolve around a QMS in which an organization needs to demonstrate its ability to consistently provide a product that meets customer and applicable statutory and regulatory requirements.

Documentation is the key to compliance with these regulations and ensures traceability of all development, manufacturing, and testing activities. Documentation provides the route for auditors to assess the overall quality of operations within a company and the final product. GMP, GDocP, and ISO 9000 are enforced by regulatory agencies. Auditors pay particular attention to documentation to make sure that it complies with these regulations.

Therefore, in order for an organization to meet these requirements, it must have documentation procedures in place. Documentation is a critical tool for ensuring this compliance.

Purpose of document control and its role in Quality Assurance (QA)

The primary purpose of document control is to ensure that only current documents, not superseded ones, are used to perform work, and that obsolete versions are removed. Document control also ensures that current documents are approved by people who are competent and responsible for the specific job, and that documents are distributed to the places where they are used.

Document control is an essential preventive measure ensuring that only approved, current documents are used throughout an organization. Inadvertent use of out-of-date or unapproved documents can have significant negative consequences for quality, costs, and customer satisfaction, and can even cause death.

The role of QA with regard to the document control system is one of management and oversight.

QA ensures that all documents are maintained in a controlled fashion and that all controlled documents are approved by the appropriate subject matter experts, are consistent with other documents, and are the most current version.

One way that QA ensures this is by being the last signature on all approved documents. All documents (current, obsolete, and superseded), along with the full history of each document's creation and revision, should be retained by Quality Assurance.

Monday, December 30, 2019

Headless CMS - Contentful

In the last post, we described headless CMS. Headless CMS architecture is rising in popularity in the development world.

This model allows breakthrough user experiences, gives developers great flexibility to innovate, and helps site owners future-proof their builds by allowing them to refresh the design without re-implementing the whole CMS.

One such headless CMS is Contentful. The Contentful platform lets you create, manage, and distribute content to any platform. It gives you total freedom to create your own content model so you can decide which content you want to manage.

With an uncluttered user interface, Contentful is an efficient tool for creating and managing your content online, either alone or in a team. You can assign custom roles and permissions to team members, add validations depending on the kind of content you have to insert, and add media such as images, documents, sounds, or video.

Contentful follows a three-step process. First, you define a content model, independent of any presentation layer, that describes what kind of content you want to manage. In the second step, you and other internal or external editors manage all of the content in an easy-to-use, interactive editing interface. In the third step, the content is served in a presentation-independent way.

Being presentation-layer agnostic is one of the strengths of Contentful because you will be able to reuse your content across any platform.

To create a web site, you will either have to code it yourself and load content from the Contentful API, or work with someone who can develop the web site for you. Contentful is the platform where you can update the content of your web site, a mobile app, or any other platform that displays content.

Contentful runs in all major browsers. It offers powerful REST APIs as well as an enterprise-grade GraphQL API.

There are three steps you'll have to take in order to deliver content from Contentful to your apps and websites.

1. Create your Content Model

The Content Model is the first step to structuring your content properly. It consists of creating content types that accept only certain types of data for each entry. For example, when creating an interactive quiz, you will need a question, multiple answers, an indicator of the correct answer, and potentially an image. This can be set up in the content model, so you can then easily add as many "Questions" to your Quiz as you want.
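
For illustration only, here is a rough sketch of how such a "Question" content type might be defined. The field ids, names, and types below are hypothetical choices, not taken from an existing space; in practice you would set this up through the Contentful web app or its Content Management API.

```typescript
// Illustrative content type definition for the quiz example.
// Field ids, names, and types are hypothetical; adjust to your own model.
const questionContentType = {
  name: "Question",
  description: "A single quiz question with its possible answers",
  displayField: "prompt",
  fields: [
    { id: "prompt", name: "Prompt", type: "Text", required: true },
    { id: "answers", name: "Answers", type: "Array", items: { type: "Symbol" } },
    { id: "correctAnswerIndex", name: "Correct Answer Index", type: "Integer" },
    { id: "image", name: "Image", type: "Link", linkType: "Asset", required: false },
  ],
};

export default questionContentType;
```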

2. Add Entries and Assets

Entries refer to the content itself. Entries could be blog posts, product features, recipe ingredients, or any other content. These entries will depend on your previously created content model. In this phase you can also add assets such as images, sounds, videos, and many other files.
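
To make the shape of such entries concrete, a hypothetical TypeScript interface mirroring the illustrative "Question" model above might look like this; the field names and sample values are invented.

```typescript
// Hypothetical shape of a "Question" entry's fields, mirroring the
// illustrative content model sketched in step 1.
interface QuestionFields {
  prompt: string;
  answers: string[];
  correctAnswerIndex: number;
  image?: {
    fields: {
      title: string;
      file: { url: string; contentType: string };
    };
  };
}

// Example entry content as an editor might enter it.
const exampleQuestion: QuestionFields = {
  prompt: "Which regulation governs electronic records and signatures in the US?",
  answers: ["21 CFR Part 11", "ISO 9001", "GDPR"],
  correctAnswerIndex: 0,
};
```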

3. Deliver your content with the API

Content delivery may or may not be left solely to developers. In this step you set up API keys that determine which content goes to which platform. Once delivery is set up correctly, your content becomes available for consumption as soon as you hit the “Publish” button.
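
As a rough sketch of that delivery step, here is how fetching published entries of the hypothetical "question" content type could look with the official contentful JavaScript/TypeScript SDK and the Content Delivery API; the space ID and access token are placeholders, not real credentials.

```typescript
import { createClient } from "contentful";

// Placeholders: use your own space ID and Content Delivery API token.
const client = createClient({
  space: "<SPACE_ID>",
  accessToken: "<CONTENT_DELIVERY_API_TOKEN>",
});

async function listQuestions(): Promise<void> {
  // Fetch all published entries of the hypothetical "question" content type.
  const entries = await client.getEntries({ content_type: "question" });
  for (const entry of entries.items) {
    console.log(entry.fields);
  }
}

listQuestions().catch(console.error);
```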

We have over 18 years of experience with numerous CMSs, so we can help you with Contentful as well. Call us today for a free, no-obligation consultation.

Saturday, November 30, 2019

Headless CMS vs Decoupled CMS

A headless content management system, or headless CMS, is a back-end only content management system (CMS) built as a content repository that makes content accessible via a RESTful API for display on any device.

The term “headless” comes from the concept of chopping the “head” (the front end, i.e. the web site) off the “body” (the back end, i.e. the content repository).

Whereas a traditional CMS typically combines the content and presentation layers of a web site, a headless CMS is just the content component, focusing entirely on the administrative interface for content creators, the facilitation of content workflows and collaboration, and the organization of content into taxonomies.

It doesn’t include presentation layers, templates, site structure, or design, but rather stores its content in pure format and provides access to other components (e.g. delivery front ends, analytics tools, etc.) through stateless or loosely coupled APIs.

The headless CMS concept is one born of the demands of the digital era and a business’s need to focus on engaging customers with personalized content via multiple channels at all stages of the customer journey. As the content in a headless CMS is considered “pure” (because it has no presentation layer attached), just one instance of it can be used for display on any device: web site, mobile, tablet, smart watch, etc.
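
To make the "one pure content instance, many heads" idea concrete, here is a hedged sketch: a hypothetical headless CMS exposes an article as JSON over HTTP, and each channel decides how to present it. The endpoint, field names, and renderers are invented for illustration.

```typescript
// Hypothetical headless CMS endpoint and response shape, for illustration only.
interface Article {
  title: string;
  body: string;
}

async function fetchArticle(id: string): Promise<Article> {
  const response = await fetch(`https://cms.example.com/api/articles/${id}`);
  if (!response.ok) {
    throw new Error(`CMS request failed: ${response.status}`);
  }
  return (await response.json()) as Article;
}

// The same content, rendered differently per "head".
function renderForWeb(article: Article): string {
  return `<article><h1>${article.title}</h1><p>${article.body}</p></article>`;
}

function renderForWatch(article: Article): string {
  // A smart watch might only show a truncated headline.
  return article.title.slice(0, 40);
}
```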

There is some confusion around what makes a headless CMS truly “headless”, as vendors use the term somewhat loosely to label their decoupled or hybrid CMS systems. But a true headless CMS is one that was built from the ground up to be API-first, not a full monolith CMS with APIs attached afterwards.

Cloud-first headless CMSs are those that were also built with a multi-tenant cloud model at their core and whose vendor promotes Software as a Service (SaaS), promising high availability, scalability, and full management of security, upgrades, hot fixes, etc. on behalf of clients.

Coupled CMS vs. Headless CMS

Most traditional (monolithic) CMS systems are “coupled”, meaning that the content management application (CMA) and the content delivery application (CDA) come together in a single application, making back-end user tools, content editing and taxonomy, web site design, and templates inseparable.

Coupled systems are useful for blogs and basic web sites, as everything can be managed in one place. But the CMS code is tightly connected to any custom code and templates, so developers have to spend more time on installations, customization, upgrades, hot fixes, etc., and they cannot easily move their code to another CMS.

There is a lot of confusion around the differences between a decoupled CMS and a headless one because they have a lot in common.

A decoupled CMS separates the CMA and CDA environments, typically with content being created behind the firewall and then being synchronized and pushed to the delivery environment.

The main difference between a decoupled CMS and a headless CMS is that the decoupled architecture is active: it prepares content for presentation and then pushes it into the delivery environment. A headless CMS, by contrast, is reactive: it sits idle until a request for content is sent.
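
That active/reactive distinction can be sketched roughly in code. Everything below is hypothetical pseudo-infrastructure, not a real product API: the decoupled flow pushes prepared content to the delivery tier at publish time, while the headless flow only answers requests as they arrive.

```typescript
// Hypothetical sketch of the two delivery styles; names are illustrative.

interface RenderedPage {
  path: string;
  html: string;
}

// Decoupled (active): at publish time, content is prepared and pushed
// to the delivery environment ahead of any request.
async function onPublishDecoupled(entryId: string): Promise<void> {
  const content = await loadFromRepository(entryId);
  const page: RenderedPage = { path: `/pages/${entryId}`, html: `<h1>${content.title}</h1>` };
  await pushToDeliveryEnvironment(page);
}

// Headless (reactive): nothing happens until a client asks for content.
async function onRequestHeadless(entryId: string): Promise<string> {
  const content = await loadFromRepository(entryId);
  return JSON.stringify(content); // raw content; the client decides how to render it
}

// Stubs standing in for real infrastructure.
async function loadFromRepository(entryId: string): Promise<{ title: string }> {
  return { title: `Entry ${entryId}` };
}
async function pushToDeliveryEnvironment(page: RenderedPage): Promise<void> {
  console.log(`Deployed ${page.path}`);
}
```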

Decoupled architecture allows for easier scalability and provides better security than coupled architecture, but it does not provide the same support for omni-channel delivery. Plus, there are multiple environments to manage, which increases infrastructure and maintenance costs.

Advantages of Headless CMS
  • Omnichannel readiness: the content created in a headless CMS is “pure” and can be re-purposed across multiple channels, including web sites, mobile applications, digital assistants, virtual reality, smart watches, etc.; in other words, anywhere and at any time throughout the customer journey.
  • Low operating costs: headless CMSs are usually cheaper to install and run than their monolith counterparts, especially as they are typically built on a cloud model where multi-tenant options keep the running costs low.
  • Reduces time to market: a headless CMS promotes an agile way of working because content creators and developers can work simultaneously, and projects can be finished faster.
  • Easy to use: traditional CMSs tend to be cumbersome and complex as vendors attempt to offer every available feature in one box. Headless systems focus on content management, keeping things simple for those who use it on a daily basis. The entire user experience can usually be managed from within one back end.
  • Flexibility: content editors can work in whichever headless CMS they like and developers can build any kind of front end they want in their preferred language (e.g. Ruby, PHP, Java, or Swift) and then simply integrate the two via APIs exchanging JSON or XML over RESTful communication. This allows for polyglot programming, where multiple programming paradigms can be used to deliver content to multiple channels, and enables a company to benefit from the latest developments in language frameworks, promoting a micro-services architecture.
  • Cloud Scalability: the content purity and stateless APIs of headless CMSs enable high scalability, especially as the architecture fully leverages the elasticity of a cloud platform.
  • System Security: since the content is typically provided through a high-performance Content Delivery Network (rather than directly from the database), the risk of distributed denial-of-service (DDoS) attacks is reduced.
Disadvantages of Headless CMS
  • Marketing dependency: marketers may end up relying more on developers in certain scenarios, e.g. creating a landing page with a custom layout.
  • Multiple services: managing multiple systems can be challenging and a team’s knowledge base must cover them all.
  • No channel-specific support: since pure headless CMSs don’t deal with the presentation layer, developers may have to create some functionality, such as web site navigation.
  • Content organization: as pure headless CMSs do not typically provide the concept of pages and web site maps, content editors need to adapt to the fact that content is organized in its pure form, independently of the web site or any other channel.
Headless CMS architecture is rising in popularity in the development world. This model allows breakthrough user experiences, gives developers great flexibility to innovate, and helps site owners future-proof their builds by allowing them to refresh the design without re-implementing the whole CMS.

In the following posts, we will look more into headless CMS and will describe specific headless CMS. Stay tuned.