Evaluation of a Science Teachers’ Training Extension Program: Lessons Learned and Implications for Program Design
Keywords:
Evaluation, extension, Kirkpatrick model, qualitative impact protocol, training program, science investigatory project, computer literacy

Abstract
This study presented an approach to evaluating a higher education institution’s (HEI) teachers’ training extension program
using the Kirkpatrick evaluation model. The adapted model provided an excellent framework for identifying the strengths
and weaknesses of the training process. Findings revealed that the extension program was effective at the model’s reaction level (Level 1), as evidenced by a high level of participant satisfaction. Level 2 (learning) and Level 3 (behavior) of the model could not be fully documented because of limitations in the extension program’s monitoring data. However, the final results (Level 4) were examined using the Qualitative Impact Protocol (QUIP). According to the QUIP findings, teachers’ participation in various trainings and seminars on science and ICT topics was widely cited as a positive driver of change across the three domains of the training program. Most teachers made positive implicit statements that corresponded to the changes the extension program aimed to achieve, but they made no explicit reference to the project. The analysis provided extension practitioners with a holistic understanding of the preparation, design, and implementation of similar future teachers’ training extension programs in HEIs, focusing on the professional development of science teachers.