How to implement data integrity on computerized systems, starting with R&D processes
The pharmaceutical industry is one of the most highly regulated industries on the planet. Regulations and requirements are constantly changing and evolving, and one thing is clear: the future will only bring more regulations and challenges, not fewer. TAPI is ahead of the game and intends to remain so. In this article we will demonstrate why it is important to keep compliance in mind when developing new molecules in R&D, and how we recommend using computerised systems to do so in order to align with the regulations and requirements of authorities and industry bodies such as the FDA, ISPE and PIC/S.
Data regulations are more complicated when it comes to R&D
R&D uses a wide variety of analytical technologies and is quick to implement novel ones such as the Corona detector, UPLC, TGA-MS, LC-MS, GC-MS and LC-MS/MS. The same is true of software: the 21 CFR Part 11 / EU Annex 11 compliance capabilities vary widely from one package to another, especially when the technology is not supported by the existing global Chromatographic Data System (CDS), which is fully compliant with all the regulatory requirements.
To successfully meet data integrity requirements, TAPI R&D has developed a unique approach that allows us to make the best decisions when assessing new technology and its implementation. Our approach includes a package of universal solutions to assess the built-in compliance level of software and new computerised procedures.
Our approach is designed to adapt to changes in regulations as well as innovations in procedures and processes, especially the wide range of computerised processes used in R&D. This is only made possible by close co-operation between our R&D department, IT and QA. This co-operation is designed to allow all three departments to challenge and question proposals from the other departments, leading to the best possible solutions. The team is also leading the implementation of this approach in all TAPI R&D sites, as well as sharing and consolidation with other TAPI and TEVA units.
Data integrity in R&D - How we do it and best practices
Our data integrity procedure starts at the supplier evaluation stage, with the decision to purchase or implement a new technology. We begin by evaluating how to ensure the data integrity of the process in compliance with all data integrity requirements. It is important to note that even in the best case, where a fully compliant software solution is available, there are still some issues that need to be addressed by the IT department; these include:
Backup and Restore
It's important to make sure you have a proper and protected data backup and storage platform, so that you can maintain and access your data when needed. TEVA has a well-established backup platform, which is also used for laboratory stand-alone systems. Its Vault module is built as a global server-based solution, serving as a centralized management tool for backup operations in a harmonized approach across all sites. Implementing this system requires a good understanding of the data management mechanism of each connected software package to be backed up. This clarifies whether the solution is suitable, as it may require dedicated configuration, additional routine activities or a different solution altogether. Finally, the backup and restore procedure is tested as part of each software qualification (usually the OQ phase).
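The restore verification performed during qualification can be sketched as a checksum comparison: back up a data file, then confirm the restored copy is bit-identical to the original. This is a minimal illustration, not TEVA's Vault implementation; the file paths and function names are assumptions.

```python
import hashlib
import shutil
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_and_verify(source: Path, backup_dir: Path) -> bool:
    """Copy a data file to the backup location, then confirm the
    backed-up (restorable) copy matches the original bit for bit."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    target = backup_dir / source.name
    shutil.copy2(source, target)  # the "backup" step
    return file_sha256(source) == file_sha256(target)  # the "verify" step
```

In a real OQ test the comparison would run against a file restored from the backup platform rather than a direct copy, but the pass/fail criterion is the same.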
User Identification and Security
While it is not mandatory, a solution like LDAP user management offers advantages over individual, separate systems, as it is more efficient and maintenance friendly. It is also more secure to use the existing user accounts of the corporate LDAP, where accounts are managed under a single corporate security policy, in accordance with electronic records and security guidelines.
In cases where LDAP integration is not supported by the software, there is an additional in-house solution that configures a direct, automated link between corporate LDAP user accounts and the software's internal accounts. Here we use a system that performs an automatic login to the local software using an internal user account uniquely linked to the corporate LDAP account, based on the user's identification in that system. As with the backup system, this is a global server-based solution that allows centralized, harmonized management of user identification and security policy implementation across sites when these are not well covered by the local software.
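The account-linking idea can be sketched as a one-to-one mapping from corporate LDAP accounts to the software's internal accounts, with an automatic login on the user's behalf. All names below (the `LocalSoftware` class, the account identifiers, the link table) are hypothetical stand-ins; a real deployment would query the corporate directory rather than an in-memory table.

```python
class LocalSoftware:
    """Stand-in for laboratory software with its own internal accounts."""

    def __init__(self, internal_accounts):
        self.internal_accounts = set(internal_accounts)
        self.current_user = None

    def login(self, internal_account: str) -> None:
        if internal_account not in self.internal_accounts:
            raise PermissionError(f"unknown internal account: {internal_account}")
        self.current_user = internal_account

# One-to-one link table: corporate LDAP account -> internal software account.
LDAP_TO_INTERNAL = {
    "CORP\\jsmith": "lab_jsmith",
    "CORP\\adoe": "lab_adoe",
}

def auto_login(software: LocalSoftware, ldap_account: str) -> str:
    """Log the identified corporate user into the local software via the
    uniquely linked internal account; deny unlinked accounts."""
    internal = LDAP_TO_INTERNAL.get(ldap_account)
    if internal is None:
        raise PermissionError(f"no internal account linked to {ldap_account}")
    software.login(internal)
    return internal
```

The key property is that every action in the local software remains attributable to a single, centrally managed corporate identity.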
Time Reference
The time reference is a key aspect of data integrity and is managed globally by the network, although it can be affected by inappropriate user privileges on local stations. Therefore, it is crucial to remove these privileges from end users. Some readers may recall laboratory software (mostly older-generation software) that requires full Windows administration permissions. These cases should be covered either by the global Windows policy or by alternative solutions as described below.
Audit Trail
An audit trail mostly exists even in software with only partial suitability to data integrity requirements. The related challenges usually fall into one of the following two cases:
a. An audit trail exists but does not record the old and new values for updated data. If the software is built or configured to prevent overwriting of updated data (either by creating a new version or a new record for each change), the data is still auditable, and a correctly defined audit trail review procedure may be enough. If the software offers no protection against data overwriting, it has to be prevented by a third-party solution (e.g. Windows Security or other alternative solutions, as described in the following).
b. No audit trail is available in the software. In this case a third-party solution has to be implemented to track the data management and user identification activities on the station, coupled with a data security solution as detailed below. An alternative, but less preferable, option is manual audit trail management using a paper notebook, an ELN or an intermediate electronic solution in which the user records changes in a non-automated manner.
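The overwrite-prevention pattern in case (a) can be sketched as an append-only record: every change adds a new trail entry carrying the timestamp, user, old value and new value, so no earlier value is ever lost. The class below is an illustrative assumption, not the behaviour of any specific laboratory software.

```python
import datetime

class AuditedRecord:
    """A value whose every change is appended to an audit trail
    (timestamp, user, old value, new value); nothing is overwritten."""

    def __init__(self, name, value, user):
        self.name = name
        self._value = value
        # Creation entry: old value is None.
        self.trail = [(datetime.datetime.now(datetime.timezone.utc),
                       user, None, value)]

    @property
    def value(self):
        return self._value

    def update(self, new_value, user):
        # Append the change before applying it, capturing old and new values.
        self.trail.append((datetime.datetime.now(datetime.timezone.utc),
                           user, self._value, new_value))
        self._value = new_value
```

With this structure, an audit trail review procedure can reconstruct the full history of any result from the trail alone.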
Data Security
This is the most challenging issue, as software varies widely in its level of compliance with data security requirements. Generally, there are three options:
- Complete Data Security (DS) - data protection from altering, both within the software user interface by the privileges, and at Windows level
- Internal Only DS - data protection within the software user interface only but not at Windows level
- Zero DS - no data protection, either within the software user interface or at Windows level
Complete Data Security is typical of well-designed software built in accordance with data integrity requirements, and when LDAP is supported, it is the best solution to choose at the vendor/product selection stage. It only requires configuration of the backup and restore solution and the correct software security setup. If no LDAP integration is supported, the system will require local internal user management, with or without the additional SSO feature implementation mentioned above.
Internal Only DS software requires an additional third-party solution to prevent unauthorized data alteration at Windows level. The easiest solution might be Windows Security configuration on a specific data storage folder, where applicable, with the remaining protection requirements completed by the Windows Security Policy (such as enforcing the data storage path to the only secured folder). Unfortunately, in most cases Windows Security is too limited to fully differentiate between user accounts and software service accounts or processes, that is, to protect the data while still enabling the software to create or update it in authorized processes. In addition, managing different, dedicated, per-station Windows policies is almost impossible in a globally managed environment using Global Windows Policies. There are more elegant and effective tools that make our life much easier. One solution we use permits the creation of policies with improved (versus Windows) resolution in differentiating between accounts, processes, etc., with the benefits of harmonization and centralized management once established as a global server-based solution, as we do in TEVA.
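The finer-grained account/process differentiation described above can be illustrated with a toy policy checker: the instrument software's service account may write to the secured data folder, while interactive users get read-only access and unknown accounts get nothing. The account names and policy layout are purely illustrative assumptions, not any vendor's configuration.

```python
# Toy access policy for a secured data folder: service accounts write,
# interactive analysts read, everyone else is denied by default.
DATA_FOLDER_POLICY = {
    "svc_chromatography": {"read", "write"},  # software service account
    "lab_user": {"read"},                     # interactive analyst account
}

def is_allowed(account: str, action: str) -> bool:
    """Return True if the account may perform the action on the secured
    data folder; accounts absent from the policy are denied everything."""
    return action in DATA_FOLDER_POLICY.get(account, set())
```

The point of the sketch is the deny-by-default rule: data stays writable only to the authorized process, which is exactly the distinction plain folder ACLs often cannot express cleanly.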
Zero DS is only encountered in older systems, as most vendors are aware of the ERES guidance and the demands of the pharmaceutical industry (as customers). The only option here is to upgrade the software. Sometimes the data integrity features are supplied as separate options with an additional licensing cost. Where the features are not available even with a software upgrade, the business has to decide either to invest in a new system or to implement a third-party solutions package to cover the missing features. Designing a suitable package that includes all or most of the required tools can be very complicated, but here is a good summary of the steps you can start with:
- Evaluate and implement a Backup and Restore solution.
- Be in Control – Provide User Identification and Security solutions - mostly MS Windows User Identification and privileges management coupled with SSO-like systems, where relevant.
- Include a Time Reference protection solution by MS Windows User Privileges management or Security Policy or alternative 3rd party solutions.
- Take Care of the Audit Trail tracking solution - 3rd party solutions ranging from fully automated software to manual audit trail management using a paper notebook.
- Make Sure Your Data is Secured – Use Data Security solution by MS Windows Folder Security and/or Security Policy and/or other alternative and complementary solutions.
In summary, our best practice approach is based on the following foundations:
- Start by evaluating the software as part of vendor selection, prior to instrument/technology purchase, giving higher priority to a vendor or instrument supplied with a better Data Integrity compliance level.
- Plan in advance the complementary solutions required from the laboratory/corporate IT in order to achieve complete Data Integrity compliance during the software’s initial installation and configuration.
- Define and manage the routine activities that will guarantee continuous Data Integrity compliance.
- Evaluate periodically the guidance changes and update your Data Integrity approach in accordance with these changes.
- Evaluate periodically novel software solutions or advanced revisions for improved Data Integrity compliance, with the aim of upgrading your systems, especially older systems without complete Data Integrity capabilities.
In conclusion, TAPI R&D is highly aware of and familiar with data integrity guidelines and continuously invests significant resources to lead in compliance, starting from the R&D stage of a product's lifecycle. The Israeli R&D team is leading this activity, from developing the approach, through its implementation and continuous improvement at all TAPI R&D sites, to sharing and consolidating it with other TEVA units.
We are confident this strategy delivers higher value to our customers through high quality products. And in light of the complicated and strict guidance requirements, it may be a valuable differentiator of TAPI from other players in our industry.
This article is provided as courtesy only, and should not be considered in any way as legal, regulatory or otherwise professional advice. You should seek appropriate professional counsel for your own situation and requirements.
About the author
Rahamin Aminov is an expert in laboratory computerized systems at TAPI R&D in Israel. In this role, he integrates chemical and informatics knowledge for the effective definition of the laboratories' needs, as well as communicating with IT units, finding the best solutions, and managing implementations. Rahamin has rich experience in analytical R&D from previous roles as a researcher and team leader.