Software risk management often seems synonymous with extra documentation. But during a 17 October 2019 Ask the Expert webinar, Martin Laferriere (director of information technology services at Avid Bioservices) explained how a thoughtful risk-assessment process at early project management stages can optimize software system validation efforts. Laferriere illustrated the benefits of developing strategic structures for documenting system use, hazards, and testing requirements. Such structures can help biomanufacturers clarify which records and functionalities they need to concentrate on, enabling them to reduce their testing efforts while still mitigating risk robustly and remaining compliant.
Laferriere’s Presentation
Effective risk management demands careful evaluation, and software system compliance requires structured but flexible risk management to yield useful, regulatory-compliant results. Biopharmaceutical companies therefore need to evaluate system capabilities consistently against predefined criteria early in the implementation cycle.
Companies that use small software systems to perform high-risk functions often decide to validate those systems in full to save time and capital, and rightly so. But companies with large systems that execute a broad swath of functions often apply the same strategy, qualifying 80–100% of system functionality when far less traditional validation is needed.
A clear framework for systems assessment enhances that process and makes it repeatable. Simple steps taken at project initiation can evaluate a whole system for the applicability of good practice (GxP) regulations, enabling classification of systems by overall risk. The next step is to define a system’s intended uses correctly in user requirements (UR) statements, which establish what a business needs its system to do. Adding regulatory-required statements (such as data integrity protocols) to UR documents paints a complete picture of how a system should be built and configured for compliant use.
In modular systems, each component should be assessed because whole groups of functionality can be excluded from validation testing if an entire module does not need to be GxP compliant. The next step is to evaluate each function. Because functional specification (FS) documents are required for applications and describe the functionality being configured to satisfy intended use, they are prime locations for evaluating and documenting function risks and the resulting testing requirements.
Such analyses require more work up front but save time at critical project stages. Teams can optimize testing efforts by excluding zero- and low-risk functions. Companies can leverage vendor testing or other internally documented testing (from information technology units, for example) for moderately risky functions and reserve full validation for higher-risk functions.
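As a rough illustration of that tiering, consider the sketch below. The risk levels, strategy wording, and function name are assumed for illustration, not Laferriere’s actual categories:

# Hypothetical sketch of a risk-tiered testing decision rule.
# Risk levels and testing strategies are illustrative assumptions.

RISK_TO_TESTING = {
    "zero":   "exclude from validation testing",
    "low":    "exclude from validation testing",
    "medium": "rely on documented vendor or internal IT testing",
    "high":   "perform full validation testing",
}

def testing_strategy(function_name: str, risk_level: str) -> str:
    """Map a function's assessed risk level to a testing approach."""
    if risk_level not in RISK_TO_TESTING:
        raise ValueError(f"Unknown risk level: {risk_level!r}")
    return f"{function_name}: {RISK_TO_TESTING[risk_level]}"

print(testing_strategy("audit-trail export", "high"))
# audit-trail export: perform full validation testing

Encoding the tiers in a single lookup table keeps the testing decision consistent across evaluators, which is the repeatability the framework aims for.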
Biomanufacturers should spend time refining their assessment process before implementing it: piloting and testing predefined evaluation criteria helps ensure consistent, accurate results. Thorough, predefined analysis and risk documentation enable consistent recordkeeping and can facilitate conversations with auditors, helping them understand how a company classifies its risks and how its procedures address them.
Risk-based approaches to systems compliance establish clear, consistent criteria for evaluating software risk, ultimately letting companies streamline their documentation practices and limit unnecessary validation effort. That upfront work is a smart investment that helps optimize implementations.
Questions and Answers
What is the best way to classify an FS as a low, medium, or high risk? Most experts consult the Good Automated Manufacturing Practice (GAMP) 5 guide, which highlights three criteria: severity, probability, and detectability. But the last two are difficult to pin down for a software function and thus are less useful for classification than severity is. It is more helpful to determine whether a function has a direct or indirect impact on GxP compliance; that is easier for evaluators to gauge than, for example, the probability of a system function failing.
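A minimal sketch of that impact-and-severity logic appears below. The category names and thresholds are hypothetical assumptions, not taken from the GAMP 5 guide or the webinar:

# Hypothetical classifier using severity and direct/indirect GxP impact,
# since probability and detectability are hard to estimate for software.

def classify_fs_risk(gxp_impact: str, severity: str) -> str:
    """Classify an FS function as low, medium, or high risk.

    gxp_impact: "none", "indirect", or "direct" impact on GxP compliance
    severity:   "minor" or "major" consequence if the function fails
    """
    if gxp_impact == "none":
        return "low"
    if gxp_impact == "indirect":
        return "medium" if severity == "major" else "low"
    # Direct GxP impact: severity decides between medium and high.
    return "high" if severity == "major" else "medium"

print(classify_fs_risk("direct", "major"))    # high
print(classify_fs_risk("indirect", "minor"))  # low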
Why should risk assessment structures incorporate a small element of “vagueness”? Evaluations can identify functions that fall between low and medium or between medium and high risk. Flexible risk assessment enables managers to consult with quality assurance units, information technology groups, and validation teams to classify such “grey area” risks appropriately.
How can companies streamline requirements and specifications documentation? Most current systems are prebuilt, and companies use their functions mostly “as is,” so the separation between UR and FS is weak. Combining UR and FS statements saves time while maintaining compliance.
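One way to picture such a combined document is as a single record carrying both kinds of statements plus the risk classification that drives testing scope. The sketch below is hypothetical, with illustrative field names:

# Hypothetical combined UR/FS record, assuming the merged-document approach
# described above; all field names and values are illustrative.

from dataclasses import dataclass

@dataclass
class RequirementSpec:
    req_id: str          # single identifier for the combined UR/FS statement
    intended_use: str    # what the business needs the function to do (UR)
    configuration: str   # how the prebuilt function satisfies that use (FS)
    gxp_impact: str      # "none", "indirect", or "direct"
    risk_level: str      # derived classification driving testing scope

rec = RequirementSpec(
    req_id="REQ-042",
    intended_use="Record batch release approvals with electronic signatures",
    configuration="Enable vendor e-signature module with dual sign-off",
    gxp_impact="direct",
    risk_level="high",
)
print(rec.req_id, rec.risk_level)  # REQ-042 high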
More Online
The full presentation of this webcast can be found on the BioProcess International website at the link below.