and Brent H. Hill, Director of Validation, CETech Validation Services, Inc.
Pharmaceutical organizations have a training need for validation skills that cover the areas of protocol execution, protocol development, validation project management, and documentation control. This need can best be met through knowledge and skills training that is customized for individual organizations. Customized training is intended to enhance the transfer of knowledge and skills from the learning environment to the working environment. Critical to getting the greatest benefit out of customized-training dollars is selecting the best candidates for training, and providing immediate opportunities after training to use newly acquired knowledge and skills.
The goals of the customized training effort described here were to design, develop, and evaluate a validation training program. The objective of the training program was to produce qualified personnel for Installation Qualification (IQ) and Operational Qualification (OQ) protocol execution and development. Core aspects of the training program are presented, along with the training evaluation results. Presented first is an overview of the training design, development, and delivery process.
Validation Training Design
The challenge in designing a training program such as the one described here is to create a modular system that can readily be customized to meet customer needs. Thus, a needs assessment is required before decisions can be made on the instructional method and media to be used in the training design. Part of the needs assessment involves an analysis of the organization, which includes detailing customer needs. The other two parts of the needs assessment are a task analysis and person analysis. Our approach to all three analyses is described in the following sections.
Organizational Analysis
Analysis of the organization included interviewing stakeholders and surveying department managers. Customer needs were identified and nominees for training were collected in the interviews. The expected number of employees to be trained, and to what level of performance, was established from the interview data. It was determined that the customer had an immediate need for qualified personnel to execute IQ and OQ protocols, and an imminent need for qualified personnel to develop IQ/OQ protocols. Having qualified protocol executors in-house meant that contractor services could begin to be phased out, with the goal of reducing costs (Wenzel, Hill, Novosad & Eckstrom, 2001).1
The departmental survey established the climate of the work environment. It was important to the success of the training program to know if the work environment provided opportunities to use knowledge and skills acquired in training, and if it was supportive of newly trained employees. The departmental survey also asked for nominees to be considered for validation training. Several candidates in the trainee pool came from responses to the departmental survey.
Task Analysis
Conducting the task analysis involved breaking down the jobs of Protocol Executor and Protocol Developer to a level of granularity that specified the Knowledge, Skills, Abilities, and Experiences (KSAEs) necessary to do the jobs. Results from the task analysis were key in determining the training objectives, and they also served as the basis for the person analysis. Prior to submitting the task analysis results to the training designer, a Subject Matter Expert (SME) rated each protocol executor and protocol developer KSAE on the dimensions shown in Figure 1.
A KSAE was included in the training criteria if it received the following ratings from the SME: “2” on Knowledge or Skill is Expected to be Acquired, “3” or greater on Relative Frequency of Application of Knowledge or Skill to Validation, and “3” or greater on Importance of Task to Effective Job Performance. KSAEs that made the first cut were flagged if they received a “3” on Level of Recall Necessary to Apply Knowledge on the Job, or a “3” or greater on Level of Difficulty Learning. The flags indicated to the training designer that the related training materials needed emphasis.
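The inclusion and flagging rule is, in effect, a small decision procedure. The sketch below (in Python) illustrates one way it could be expressed; the rating keys and the example values are assumptions made for illustration and are not taken from the actual instrument shown in Figure 1.

```python
# Illustrative sketch of the KSAE screening rule described above.
# Dimension keys and the sample ratings are hypothetical; the actual
# rating instrument is shown in Figure 1.

def screen_ksae(rating):
    """Return (include, flag) for a single KSAE rated by the SME."""
    include = (
        rating["expected_to_be_acquired"] == 2
        and rating["frequency_of_application"] >= 3
        and rating["importance_to_job_performance"] >= 3
    )
    # Flag included KSAEs that demand high recall or are hard to learn,
    # so the training designer can emphasize the related materials.
    flag = include and (
        rating["level_of_recall"] == 3
        or rating["difficulty_learning"] >= 3
    )
    return include, flag

# Hypothetical example: one KSAE rated by the SME.
example = {
    "expected_to_be_acquired": 2,
    "frequency_of_application": 4,
    "importance_to_job_performance": 3,
    "level_of_recall": 3,
    "difficulty_learning": 2,
}

print(screen_ksae(example))  # -> (True, True)
```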
Person Analysis
Three attributes of an employee constitute an excellent candidate for training. One, the employee is motivated to learn. Two, the employee will use the new knowledge and skills acquired in training. Three, the employee possesses the prerequisite skills needed to get the greatest benefit out of training. Of course, the employee also has to be available to attend training.
A self-assessment survey, similar to the instrument the SME used to rate the KSAEs, was created from the list of KSAEs. Each person considered for training completed the survey by rating his or her KSAEs on the dimensions shown in Figure 2.
Candidates for protocol executor and protocol developer training were selected by analyzing data from the self-assessment survey. The survey data were submitted to a six-step process, shown in Figure 3, to create the candidate lists.
A primary candidate list and secondary candidate list were created. The organization initially wanted a group of ten individuals trained on protocol execution, and a smaller select group from the original group trained on protocol development. However, the group of ten that was initially chosen attended both the protocol executor and protocol developer training.
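The actual six steps appear in Figure 3 and are not reproduced in the text. Purely as an illustration of what a screening procedure of this kind can look like, the sketch below ranks candidates by a composite of hypothetical self-assessment dimensions and splits them into primary and secondary lists; the dimensions, weights, and cutoff are invented and should not be read as the procedure the authors used.

```python
# Illustrative only: a composite-score approach to splitting candidates
# into primary and secondary lists. The actual six-step process is shown
# in Figure 3; the dimensions, weights, and cutoff below are hypothetical.

def rank_candidates(survey, cutoff=10, primary_size=10):
    """survey: dict mapping candidate name -> dict of self-assessment ratings."""
    scored = []
    for name, r in survey.items():
        # Composite weights prerequisite skills most heavily, then expected
        # use of the training and motivation to learn.
        score = 2 * r["prerequisite_skills"] + r["expected_use"] + r["motivation"]
        scored.append((score, name))
    scored.sort(reverse=True)

    primary = [name for score, name in scored if score >= cutoff][:primary_size]
    secondary = [name for score, name in scored if name not in primary]
    return primary, secondary

# Hypothetical survey responses for three candidates.
survey = {
    "A": {"prerequisite_skills": 4, "expected_use": 5, "motivation": 5},
    "B": {"prerequisite_skills": 3, "expected_use": 4, "motivation": 4},
    "C": {"prerequisite_skills": 1, "expected_use": 2, "motivation": 3},
}
primary, secondary = rank_candidates(survey)
print(primary, secondary)  # -> ['A', 'B'] ['C']
```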
Training Design and Development
Results from the needs assessment were compiled and passed on to the training designer. The results identified a need for basic validation training, which therefore became a prerequisite for both protocol executor and protocol developer training. The training designer then made decisions on the training content and the training strategy to be implemented by the training developer.
Training Strategy: Method and Medium
The training designer decided on a minimalist training approach (Carroll, 1984).2 Advantages of the approach included:
(a) Training is focused on real tasks.
(b) Reading of training materials is kept to a minimum.
(c) Demonstrations are coordinated with hands-on participation (Palmiter & Elkerton, 1993).3
(d) Trainers are prepared to deal with execution errors.
(e) Trainee motivation is kept high by using actual work tasks (in this case, executing and developing protocols).
The instructional elements were designed according to Gagné and Briggs’s (1974)4 nine events of instruction: gain attention, present the learning objective, activate prerequisite knowledge, present the material, guide learning, elicit performance (practice), provide feedback, assess performance, and facilitate retrieval and enhance transfer (Gagné, 1970).5
An audit was required to determine the organization’s in-house training capacity, i.e., its facilities, equipment, and computer resources. Based on the audit results, the training designer chose to design the training as lock-step instruction using computer-based presentations; with the lock-step method, the instructor decides when specific training materials are presented. Self-paced, interactive multimedia courseware was considered as a training strategy. However, given the short turnaround time from needs assessment to training delivery and the lack of data on the trainees’ computer skill levels, instructor-led Computer-Assisted Instruction (CAI) was the chosen strategy. Microsoft PowerPoint® was used as the training platform. PowerPoint® offered multimedia capability (sound, video, animation), easy modification, self-executing presentations, and general audience familiarity with the application.
Three training modules were targeted for development: validation boot camp, protocol executor, and protocol developer. A description of the training setting, an overview of training content, and the learning objectives for each training module are presented next.
Validation Boot Camp: Basic Validation Training
Validation boot camp presented the basics of validation. All trainees were required to attend boot camp training. It was conducted in a designated learning center equipped with a Pentium III personal computer connected to a projector. Boot camp training consisted of a PowerPoint® presentation that covered definitions of validation, the FDA’s role, the company’s validation philosophy, flowcharts of validation-related processes, validation-related Standard Operating Procedures (SOPs), the importance of documentation, and documentation control issues.
To gain and keep the trainees’ attention, the instructor donned a drill instructor uniform and attitude. It was extremely effective in this particular setting, but may not be effective in all training settings. Trainees participated by reading instructional material aloud. The learning objectives for boot camp are found in Figure 4.
Protocol Executor Training
Protocol Executor (PE) training was conducted in a small conference room. A laptop computer, projector, and overhead projector were used for training. The laptop and projector were used for the PowerPoint® presentation; the overhead projector was used to display transparencies of IQ and OQ protocols.
PE training consisted of both lock-step instruction and hands-on training.
The PowerPoint® presentation outlined the protocol execution process, the deviation report process, the documentation necessary for protocol execution, critical equipment, test instrument calibration, and how and where to acquire resources. Instruction was designed to provide evaluative feedback to trainees on their pretest performance (see the Training Evaluation Instruments and Results section). Group exercises were conducted so trainees could acquire experience completing IQ protocols, OQ protocols, and deviation report forms. Sample protocols were distributed along with dummy data for the group exercises. Blank deviation report forms were available for situations in which the dummy data fell outside the acceptance criteria. The learning objectives for PE training included increases in both knowledge and skills; they are listed in Figure 5.
Protocol Developer Training
Protocol Developer (PD) training was conducted in a computer laboratory. A computer with projector setup served as the training platform. All of the necessary electronic files (e.g., IQ template, OQ template, final report template, example data forms) were loaded on the hard drive of the computer, along with the PowerPoint® presentation.
A trainee facing the upcoming validation of a new piece of equipment provided a Turn Over Package (TOP) for training purposes. The TOP served as the basis for the protocol development exercises conducted during PD training. The exercises were conducted as a group. Each trainee had an opportunity to sit at the keyboard and complete development tasks. Included in the group exercises were:
- Modifying the IQ and OQ templates to reflect the new equipment.
- Developing data forms and inserting them into the modified IQ template.
- Developing functional test forms and inserting them into the modified OQ template.
Hands-on instruction showing how to prepare a final report package was provided to trainees. Small groups of trainees were supplied with the necessary documents to practice preparing final report packages. The learning objectives for PD training focused on increases in knowledge. The learning objectives are listed in Figure 6.
On the final day of PD training, the documentation control person presented additional information on accessing and modifying protocol templates and on submitting electronic protocol drafts. Much of the knowledge and many of the skills covered related directly to the hands-on portion of PD training.
Training Schedule
Boot camp training was conducted in a four-hour block. Following boot camp, PE training was conducted in two four-hour blocks across two days. PD training was conducted in three four-hour blocks across three days during the week immediately following PE training. All training modules were offered in morning and afternoon sessions to accommodate first- and second-shift workers. One individual worked third shift and had his workweek adjusted to fit the training schedule. Attendance fluctuated between six or seven trainees in the morning sessions and three or four in the afternoon sessions. This was not a problem because the modules were standardized across sessions.
Training Evaluation Instruments and Results
Evaluation is a critical aspect of a complete training program. Only through evaluation can it be determined if training criteria and organizational needs are met. In addition, evaluation results are critical to improving training.
The evaluation instruments that were used are described next, along with the evaluation results.
Comparison of Trainee Candidate Lists
A comparison of the primary and secondary candidate lists (results from analysis of the self-assessment survey) to the employer-selected candidate list revealed that seven of the ten who attended training were on the primary candidate list, two were on the secondary candidate list, and one appeared only on the employer-selected list. The trainee who appeared only on the employer-selected list had not completed the self-assessment survey prior to training.
Evaluation Procedure
Trainee anonymity was established by assigning each trainee an arbitrary number to use for identification on the evaluations. Evaluation booklets were created for each training module: boot camp, protocol executor, and protocol developer. The evaluation booklets contained an informed consent form, pre- and post-measures, confidence items, and the Training Assessment Questionnaire. Verbal and written instructions were provided so that trainees knew when to start and stop each evaluation. All PE and PD knowledge and skills pre-measures were administered before the start of boot camp so that overlapping training content across the modules could not affect the baseline scores.
The pre-measures of PE and PD knowledge were multiple-choice written tests. The pre-measure of PE skills involved completing pages from IQ and OQ protocols based on data provided. The post-measure of PE knowledge was an alternate form of the multiple-choice test. The post-measure of PE skills involved field execution of dummy IQ/OQ protocols for a 75-cubic-foot V-Blender. The protocols were printed on paper marked “SAMPLE” so they would not be confused with official documents. The post-measure of PD knowledge was an alternate form of the multiple-choice test.
Knowledge and skill gains were assessed by measuring the increase in the percentage of correct responses. One-tailed (directional) paired-sample t-tests were used to determine the effects of training on knowledge and skills. Results are shown in Figure 7. All increases in knowledge and skill scores were statistically significant. The average percent correct on the PE knowledge test increased from 46 before training to 89 after training (t(9) = 9.7, p < .0001). The average percent correct on the PE skill test increased from 64 before training to 93 after training (t(9) = 4.9, p < .001). The average percent correct on the PD knowledge test increased from 50 before training to 80 after training (t(9) = 5.2, p < .001).
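For readers who want to run this kind of analysis themselves, the sketch below performs a one-tailed paired-sample t-test on pre- and post-training percent-correct scores using SciPy (the alternative keyword requires SciPy 1.6 or later). The score arrays are placeholder values, not the study data.

```python
# Sketch of a one-tailed paired-sample t-test on pre/post training scores.
# The arrays below are placeholder values, not the study's actual data.
import numpy as np
from scipy import stats

pre  = np.array([40, 45, 50, 42, 48, 55, 38, 47, 52, 43])   # percent correct before training
post = np.array([85, 90, 92, 80, 88, 95, 82, 91, 94, 89])   # percent correct after training

# Directional test: post-training scores are expected to exceed pre-training scores.
t_stat, p_value = stats.ttest_rel(post, pre, alternative="greater")

print(f"mean gain = {np.mean(post - pre):.1f} points")
print(f"t({len(pre) - 1}) = {t_stat:.2f}, one-tailed p = {p_value:.4g}")
```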
The extensive hands-on experience provided in the PE training likely contributed to the dramatic increase in “executor” knowledge. Minimal hands-on experience was provided in the PD training due to restricted computer resources; this lack of extensive hands-on experience likely contributed to the smaller, yet still significant, increase in PD knowledge.
Confidence in Knowledge and Skills
Trainees were asked to rate their confidence in their knowledge and abilities with IQ and OQ execution and development before and after both the PE and PD training. Figure 8 shows a statistically significant increase in confidence as a function of training.
Trainees reported being more confident in their protocol execution knowledge and abilities than their protocol development knowledge and abilities across both pre- and post-measures.
Reactions to the Training Experience
Trainees’ reactions to the training experience were collected in two ways. At the end of each training module, the Training Assessment Questionnaire (TAQ) was administered, along with three open-ended items. The TAQ covers 11 aspects of effective training. Each survey item was rated on a seven-point scale anchored by a pair of reciprocal descriptors. Here is an example of a TAQ item:
“The lesson objective was not clearly/clearly presented” (“1” represented not clearly and “7” represented clearly).
The items that received average ratings below “5” were “Question-and-answer sessions were inadequate/adequate for learning” (M = 4.8) and “The pace of training was inadequate/adequate for learning” (M = 4.9), and only for the boot camp training module. Boot camp training for the first group diverged from PE and PD training in two ways. First, the head of engineering sat in on the morning training session. Second, the instructor was told that he had an hour less than he had planned for; therefore, he accelerated the presentation of the materials. These differences are potentially reflected in the lower ratings on the question-and-answer and pace-of-training items.
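As a minimal illustration of how TAQ results like these can be summarized, the sketch below computes the mean rating for each item on the seven-point scale and flags items whose means fall below 5. The item labels and ratings are invented placeholders, not the actual TAQ data.

```python
# Sketch: summarize TAQ ratings (seven-point scale) and flag items with
# mean ratings below 5. Item names and ratings are invented placeholders.
from statistics import mean

taq_ratings = {
    "Lesson objective clearly presented": [6, 7, 6, 5, 7, 6],
    "Question-and-answer sessions adequate": [5, 4, 5, 5, 5, 4],
    "Pace of training adequate": [5, 5, 4, 5, 5, 5],
}

for item, ratings in taq_ratings.items():
    m = mean(ratings)
    flag = "  <-- below 5, review this aspect of training" if m < 5 else ""
    print(f"{item}: M = {m:.1f}{flag}")
```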
At the end of both the PE and PD training modules, trainees were asked to respond to three open-ended items: What was of most value? What was your favorite part? How could training be improved? The responses are listed in Figure 10 and Figure 11.
The hands-on experience that trainees received proved to be a favored aspect of training. An interesting finding from trainee responses to the “how could training be improved” item for PE (although not in direct response to the item) was the expressed desire to get into the field and put their new knowledge and skills to work.
Discussion
The validation training program described here is highly effective in producing increases in knowledge and skills related to IQ/OQ protocol execution and development. Trainees find the training motivating and highly relevant to their jobs. Further, the training produces increased confidence in trainees’ self-reported IQ/OQ protocol execution and development knowledge and abilities. Missing from the training evaluation is an assessment of the degree to which training transferred to the work environment.
Trainees were most receptive to learning about and doing protocol execution. Protocol execution does not require intimate knowledge of process equipment, nor does it call for a full understanding of validation requirements. As a rule of thumb, ninety percent of validation knowledge and skills are easily acquired through training and experience, and the remaining ten percent take years of experience to acquire. Protocol execution falls into the ninety percent category. Organizations should take this into consideration when deciding where to spend their validation training dollars.
The modular approach taken in the design stage produced a training program that was readily modifiable to meet changing organizational needs. Training needs may include modules for validation project managers or documentation control personnel, both experienced and non-experienced personnel, refresher training, and train-the-trainer programs. Regardless of the training need, training should be contextualized to enhance the transfer of knowledge and skills from the learning environment to the working environment.
The time required to produce a validation training program such as the one described here varies. It is inversely related to the experience level of the instructional developers and trainers and to the accessibility of validation expertise, and directly related to the number of training hours delivered and the quality of the instruction.
Organizations may believe that there is economy in eliminating the needs analysis from the training development process. However, training criteria and trainee selection are two vital outcomes of the analysis, and both are essential to the success of a training program.
Validation training dollars are often wasted unnecessarily. Organizations should be cautious and ask for evidence that the training they are purchasing is able to produce results, using more than the dollar sign as the selection criterion. Ask the group offering the training to provide empirical evidence that their training is effective, and ask for a list of customers. Contact companies on the customer list and ask whether they were satisfied with the training they received and, more importantly, with the training results.
About the Authors
Brenda Wenzel has a background in training development and evaluation. Her research spans the areas of human engineering, cognitive and social psychology, and training technologies. Her work has been presented at national and international conferences. Wenzel was formerly with CETech Validation Services, Inc. She is currently with the Air Force Research Laboratory, Warfighter Training Research Division. She can be reached by phone at 480-988-6561 or by e-mail at brenda.wenzel@williams.af.mil.
Brent Hill is Director of Validation at CETech Validation Services, Inc. His specialty is control system design and managing plant start-ups. Hill has 18 years of experience designing, installing, and qualifying pharmaceutical facilities. He has been with CETech Validation Services since 1994. He can be reached by phone at 303-279-4238, by fax at 303-279-3735, or by e-mail at brent@cetonline.com.
References
- Wenzel, B., Hill, B., Novosad, M., & Eckstrom, R. (2001). Transitioning from contract validation services to an in-house validation capability. Journal of Validation Technology, 7(4).
- Carroll, J. M. (1984). Minimalist training. Datamation, November, 25-136.
- Palmiter, S., & Elkerton, J. (1993). Animated demonstrations for learning procedural computer-based tasks. Human-Computer Interaction, 8, 193-216.
- Gagné, R. M., & Briggs, L. J. (1974). Principles of instructional design. New York: Holt, Rinehart and Winston.
- Gagné, R. M. (1970). The conditions of learning (2nd ed.). New York: Holt, Rinehart and Winston.