
In today’s digital landscape, the demand for skilled cybersecurity professionals is higher than ever. However, many educational programs focus primarily on theoretical concepts, leaving students with insufficient practical skills. To address this gap, students need actionable feedback on their hands-on labs and assignments.

We present an open-source scoring engine that provides iterative, step-by-step feedback, enabling students to solve complex cybersecurity problems progressively. Integrated into existing courses, the engine can enhance labs with detailed, structured feedback, bridging the gap between theoretical knowledge and practical application.

A preliminary study with 11 students showed that all participants could complete complex tasks using the feedback provided by the engine, with limited instruction from the authors. Additionally, about 90% of the students reported high satisfaction with the structured feedback. This approach has the potential to transform cybersecurity education, making it more interactive, practical, and aligned with real-world requirements.
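To make the idea of iterative, structured feedback more concrete, here is a minimal sketch of what a step-by-step check runner might look like. This is not code from the engine described above; all names (`Check`, `run_checks`, the example lab files) are hypothetical and only illustrate the general pattern of evaluating lab steps in order and surfacing one actionable hint at a time.

```python
# Hypothetical sketch of a step-by-step lab check runner.
# The class and function names below are illustrative, not part of the engine.
import os
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Check:
    description: str            # what the student is expected to achieve
    passed: Callable[[], bool]  # predicate that inspects the lab environment
    hint: str                   # feedback shown when the check fails


def run_checks(checks: List[Check]) -> None:
    """Evaluate checks in order and stop at the first failure,
    so the student receives one actionable next step at a time."""
    for i, check in enumerate(checks, start=1):
        if check.passed():
            print(f"[{i}/{len(checks)}] PASS  {check.description}")
        else:
            print(f"[{i}/{len(checks)}] FAIL  {check.description}")
            print(f"        Hint: {check.hint}")
            break


if __name__ == "__main__":
    # Example lab paths are made up for demonstration purposes.
    run_checks([
        Check("Firewall config file exists",
              lambda: os.path.exists("/etc/lab/firewall.conf"),
              "Create /etc/lab/firewall.conf before adding any rules."),
        Check("SSH hardening applied",
              lambda: os.path.exists("/etc/lab/sshd_hardened"),
              "Re-run the hardening steps from the lab sheet."),
    ])
```

In this kind of design, stopping at the first failed check and attaching a hint to it is what turns a pass/fail score into the progressive, structured feedback the post describes.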