Safe & Explainable Robotics: Verification, Safety Cases & Ethics Training Course
Safe & Explainable Robotics: Verification, Safety Cases & Ethics is a thorough training programme centered on the safety, verification, and ethical oversight of robotic systems. This course connects theoretical foundations with practical application by examining safety argument methodologies, hazard analysis, and explainable AI techniques that make robotic decision-making transparent and trustworthy. Participants will gain the skills to ensure regulatory compliance, verify operational behaviors, and document safety assurance in accordance with international standards.
Delivered as live, instructor-led training (available online or onsite), this programme is designed for intermediate-level professionals seeking to apply verification, validation, and explainability principles to facilitate the secure and ethical deployment of robotic systems.
Upon completion of this training, participants will be equipped to:
- Create and document safety arguments for robotic and autonomous systems.
- Implement verification and validation methods within simulation environments.
- Gain insight into explainable AI frameworks relevant to robotic decision-making.
- Incorporate safety and ethical principles into system design and operations.
- Effectively communicate safety and transparency requirements to stakeholders.
Course Format
- Interactive lectures and group discussions.
- Practical simulation exercises and safety analysis tasks.
- Case studies drawn from real-world robotics applications.
Customization Options
- For bespoke training on this subject, please contact us to discuss your needs.
Course Outline
Introduction to Safety and Explainability in Robotics
- Overview of safety and transparency in robotic systems
- Regulatory and ethical landscape for robotics and AI
- Relevant standards and frameworks: ISO 26262, ISO 10218, and ISO/IEC 42001
Risk and Hazard Analysis
- Identifying hazards in autonomous and semi-autonomous systems
- Conducting Failure Mode and Effects Analysis (FMEA)
- Quantifying risk and applying mitigation through safety-driven design
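Risk in FMEA is commonly quantified as a Risk Priority Number (RPN): the product of severity, occurrence, and detection ratings, each typically scored from 1 to 10. A minimal sketch of how that prioritization might look in practice; the failure modes and ratings below are illustrative assumptions, not course material:

```python
# Minimal FMEA sketch: rank hypothetical failure modes by Risk Priority Number.
# RPN = Severity x Occurrence x Detection, each rated 1 (best) to 10 (worst).

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Compute the Risk Priority Number for one failure mode."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings must be between 1 and 10")
    return severity * occurrence * detection

# Illustrative failure modes for an autonomous mobile robot.
failure_modes = [
    {"mode": "LiDAR dropout",       "S": 8,  "O": 3, "D": 4},
    {"mode": "E-stop relay sticks", "S": 10, "O": 2, "D": 6},
    {"mode": "Wheel encoder drift", "S": 4,  "O": 6, "D": 3},
]

for fm in failure_modes:
    fm["RPN"] = rpn(fm["S"], fm["O"], fm["D"])

# The highest-RPN items are addressed first in safety-driven design.
ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
for fm in ranked:
    print(f'{fm["mode"]}: RPN={fm["RPN"]}')
```

Sorting by RPN is what turns a hazard list into a mitigation order: the sticking e-stop relay (RPN 120) outranks the more visible but better-detected LiDAR dropout (RPN 96).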
Verification and Validation Techniques
- Testing robotic behaviors in simulated environments
- Formal verification and test case design
- Data-driven validation and monitoring approaches
Safety Argument Development
- Structure and content of a safety argument
- Documenting compliance and traceability
- Utilizing tools for evidence management and risk justification
Explainable AI for Robotics
- Ensuring transparency in decision-making processes
- Interpretability techniques for ML-based control systems
- Communicating robotic behaviors to users and regulators
Ethical and Governance Considerations
- Ethical principles in robotics and autonomous systems
- Bias, accountability, and responsibility in AI-driven robotics
- Balancing innovation with public trust and regulation
Hands-On Workshop: Developing a Safe and Explainable Robotics Scenario
- Designing a small robotic simulation in ROS 2 or Gazebo
- Applying verification and validation procedures
- Developing and presenting a safety argument summary
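The shape of the workshop exercise can be previewed with a toy example: a simulated robot whose commanded velocity passes through a runtime safety monitor before execution, a common pattern when verifying behavior in simulation. The limits and function names below are illustrative assumptions, not part of the course materials:

```python
# Toy stand-in for the verification checks applied to a ROS 2 / Gazebo scenario:
# every commanded velocity is filtered through a runtime safety monitor.

MAX_SPEED = 0.5          # m/s, assumed safe speed limit near humans
MIN_OBSTACLE_DIST = 0.3  # m, assumed emergency-stop distance

def safe_velocity(requested: float, obstacle_dist: float) -> float:
    """Clamp the commanded speed; stop entirely if an obstacle is too close."""
    if obstacle_dist < MIN_OBSTACLE_DIST:
        return 0.0  # emergency stop
    return max(-MAX_SPEED, min(MAX_SPEED, requested))

# A simple test trace: (requested speed, measured obstacle distance).
trace = [(0.2, 2.0), (1.5, 2.0), (0.4, 0.1)]
commands = [safe_velocity(v, d) for v, d in trace]
print(commands)  # [0.2, 0.5, 0.0]
```

Each trace entry doubles as a piece of verification evidence: the pass/fail record of such runs is exactly the kind of artifact cited in a safety argument summary.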
Summary and Next Steps
Requirements
- Fundamental knowledge of robotics systems and control architectures
- Familiarity with Python programming and simulation tools
- Understanding of systems engineering or safety management processes
Target Audience
- Systems engineers working on robotics or autonomous platforms
- Safety officers responsible for ensuring compliance with functional safety standards
- Technical managers overseeing the integration and deployment of robotic solutions
Open Training Courses require 5+ participants.
Testimonials (2)
Supply of the materials (virtual machine) to get straight into the exercises, and the explanation of the ROS 2 core. Why things work a certain way.
Arjan Bakema
Course - Autonomous Navigation & SLAM with ROS 2
Its knowledge and utilization of AI for robotics in the future.
Ryle - PHILIPPINE MILITARY ACADEMY
Course - Artificial Intelligence (AI) for Robotics
Related Courses
Artificial Intelligence (AI) for Robotics
21 Hours
The field of Artificial Intelligence (AI) in Robotics merges machine learning, control systems, and sensor fusion to develop intelligent machines that can perceive, reason, and act autonomously. By leveraging contemporary tools such as ROS 2, TensorFlow, and OpenCV, engineers are now empowered to create robots capable of intelligent navigation, planning, and interaction within real-world settings.
This instructor-led live training, available online or onsite, is designed for intermediate-level engineers looking to develop, train, and deploy AI-powered robotic systems using modern open-source technologies and frameworks.
Upon completion of this training, participants will be equipped to:
- Utilize Python and ROS 2 to construct and simulate robotic behaviors.
- Apply Kalman and Particle Filters for precise localization and tracking.
- Employ computer vision techniques via OpenCV for perception and object detection.
- Leverage TensorFlow for motion prediction and learning-based control.
- Integrate SLAM (Simultaneous Localization and Mapping) to enable autonomous navigation.
- Develop reinforcement learning models to enhance robotic decision-making processes.
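The localization item above can be illustrated with the simplest case of the filters covered: a one-dimensional Kalman filter fusing noisy position readings into a smoothed estimate. The process and measurement noise values here are assumptions chosen for the sketch:

```python
# 1-D Kalman filter sketch: fuse noisy position measurements into an estimate.
# Constant-position model; q (process noise) and r (measurement noise) are
# illustrative values, not tuned for any particular sensor.

def kalman_1d(measurements, q=0.01, r=0.5, x0=0.0, p0=1.0):
    """Return the filtered position estimate after each measurement."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q               # predict: uncertainty grows by process noise
        k = p / (p + r)         # Kalman gain: trust in measurement vs. prediction
        x = x + k * (z - x)     # update estimate toward the measurement
        p = (1 - k) * p         # shrink uncertainty after the update
        estimates.append(x)
    return estimates

# Noisy readings of a robot that is actually standing at x = 1.0.
readings = [1.2, 0.9, 1.1, 1.0, 0.95]
est = kalman_1d(readings)
print(round(est[-1], 3))
```

The same predict/update structure carries over to the multidimensional case used for real localization; only the scalars become matrices.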
Course Format
- Engaging lectures and interactive discussions.
- Practical implementation exercises using ROS 2 and Python.
- Hands-on practice with both simulated and real-world robotic environments.
Customization Options
To arrange a customized training session for this course, please get in touch with us.
AI and Robotics for Nuclear - Extended
120 Hours
In this instructor-led, live training in Greece (online or onsite), participants will learn the different technologies, frameworks, and techniques for programming various types of robots for use in the field of nuclear technology and environmental systems.
The six-week course is held five days a week. Each day is four hours long and consists of lectures, discussions, and hands-on robot development in a live lab environment. Participants will complete various real-world projects applicable to their work in order to practice their acquired knowledge.
The target hardware for this course will be simulated in 3D through simulation software. The ROS (Robot Operating System) open-source framework, C++ and Python will be used for programming the robots.
By the end of this training, participants will be able to:
- Understand the key concepts used in robotic technologies.
- Understand and manage the interaction between software and hardware in a robotic system.
- Understand and implement the software components that underpin robotics.
- Build and operate a simulated mechanical robot that can see, sense, process, navigate, and interact with humans through voice.
- Understand the necessary elements of artificial intelligence (machine learning, deep learning, etc.) applicable to building a smart robot.
- Implement filters (Kalman and Particle) to enable the robot to locate moving objects in its environment.
- Implement search algorithms and motion planning.
- Implement PID controls to regulate a robot's movement within an environment.
- Implement SLAM algorithms to enable a robot to map out an unknown environment.
- Extend a robot's ability to perform complex tasks through Deep Learning.
- Test and troubleshoot a robot in realistic scenarios.
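The PID control item in the list above can be sketched in a few lines: a discrete-time PID loop driving a toy first-order plant toward a setpoint. The gains and plant model are illustrative assumptions, not the course's lab configuration:

```python
# Discrete PID controller sketch regulating a toy first-order plant.
# Gains and plant dynamics are illustrative only.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy plant: velocity responds proportionally to the control input.
dt = 0.1
pid = PID(kp=1.2, ki=0.5, kd=0.05, dt=dt)
velocity, target = 0.0, 1.0
for _ in range(100):
    u = pid.step(target, velocity)
    velocity += u * dt  # simple first-order response

print(round(velocity, 3))
```

After ten simulated seconds the velocity has settled close to the 1.0 m/s target; tuning the three gains against overshoot and settling time is the core of the PID exercise.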
AI and Robotics for Nuclear
80 Hours
This instructor-led, live training in Greece (online or onsite) allows participants to learn the technologies, frameworks, and techniques for programming various types of robots for use in nuclear technology and environmental systems.
The four-week course takes place five days a week. Each day consists of four hours of lectures, discussions, and hands-on robot development in a live lab environment. Participants will complete real-world projects relevant to their work to practice their acquired knowledge.
The target hardware for this course will be simulated in 3D via simulation software. The code will then be deployed onto physical hardware (such as Arduino) for final testing. The ROS (Robot Operating System) open-source framework, along with C++ and Python, will be used for robot programming.
By the end of this training, participants will be able to:
- Understand the core concepts used in robotic technologies.
- Understand and manage the interaction between software and hardware in a robotic system.
- Understand and implement the software components that underpin robotics.
- Build and operate a simulated mechanical robot that can see, sense, process, navigate, and interact with humans through voice.
- Understand the necessary elements of artificial intelligence (machine learning, deep learning, etc.) applicable to building a smart robot.
- Implement filters (Kalman and Particle) to enable the robot to locate moving objects in its environment.
- Implement search algorithms and motion planning.
- Implement PID controls to regulate a robot's movement within an environment.
- Implement SLAM algorithms to enable a robot to map out an unknown environment.
- Test and troubleshoot a robot in realistic scenarios.
Autonomous Navigation & SLAM with ROS 2
21 Hours
ROS 2 (Robot Operating System 2) is an open-source framework designed to support the development of complex and scalable robotic applications.
This instructor-led, live training (online or onsite) is aimed at intermediate-level robotics engineers and developers who wish to implement autonomous navigation and SLAM (Simultaneous Localization and Mapping) using ROS 2.
By the end of this training, participants will be able to:
- Set up and configure ROS 2 for autonomous navigation applications.
- Implement SLAM algorithms for mapping and localization.
- Integrate sensors such as LiDAR and cameras with ROS 2.
- Simulate and test autonomous navigation in Gazebo.
- Deploy navigation stacks on physical robots.
Format of the Course
- Interactive lecture and discussion.
- Hands-on practice using ROS 2 tools and simulation environments.
- Live-lab implementation and testing on virtual or physical robots.
Course Customization Options
- To request a customized training for this course, please contact us to make arrangements.
Developing Intelligent Bots with Azure
14 Hours
Azure Bot Service unites the capabilities of the Microsoft Bot Framework with Azure Functions, offering a robust platform for rapidly constructing intelligent bots.
Through this instructor-led live training, participants will investigate efficient methods for developing intelligent bots utilizing Microsoft Azure.
Upon completing the training, participants will be able to:
- Comprehend the fundamental concepts underpinning intelligent bots.
- Construct intelligent bots using cloud-based applications.
- Acquire practical expertise in the Microsoft Bot Framework, the Bot Builder SDK, and Azure Bot Service.
- Implement established bot design patterns in practical scenarios.
- Create and deploy their first intelligent bot using Microsoft Azure.
Target Audience
This course is tailored for developers, hobbyists, engineers, and IT professionals keen on bot development.
Course Format
The training integrates lectures and discussions with exercises, placing a strong emphasis on practical, hands-on application.
Computer Vision for Robotics: Perception with OpenCV & Deep Learning
21 Hours
OpenCV is an open-source library for computer vision that facilitates real-time image processing, while deep learning frameworks like TensorFlow offer the necessary tools for enabling intelligent perception and decision-making capabilities within robotic systems.
This instructor-led live training, available online or on-site, is designed for intermediate-level robotics engineers, computer vision specialists, and machine learning engineers aiming to leverage computer vision and deep learning techniques for robotic perception and autonomy.
Upon completion of this training, participants will be capable of:
- Building computer vision pipelines using OpenCV.
- Integrating deep learning models for object detection and recognition.
- Leveraging vision-based data for robotic control and navigation.
- Merging classical vision algorithms with deep neural networks.
- Deploying computer vision solutions on embedded and robotic platforms.
Course Format
- Interactive lectures and discussions.
- Practical exercises using OpenCV and TensorFlow.
- Live laboratory implementation on simulated or physical robotic systems.
Customization Options
- To arrange a customized training session for this course, please contact us.
Developing a Bot
14 Hours
A bot or chatbot functions as a digital assistant designed to automate user interactions across various messaging platforms, enabling faster task completion without requiring direct human intervention.
Through this instructor-led live training, participants will learn how to begin developing bots by building sample chatbots using dedicated bot development tools and frameworks.
By the conclusion of this training, participants will be able to:
- Understand the various uses and applications of bots
- Comprehend the complete bot development process
- Explore the different tools and platforms utilized in bot construction
- Construct a sample chatbot for Facebook Messenger
- Construct a sample chatbot using the Microsoft Bot Framework
Audience
- Developers interested in creating their own bot
Format of the course
- A mix of lectures, discussions, exercises, and extensive hands-on practice
Edge AI for Robots: TinyML, On-Device Inference & Optimization
21 Hours
Edge AI empowers artificial intelligence models to operate directly on embedded or resource-constrained devices, thereby minimizing latency and power usage while enhancing the autonomy and privacy of robotic systems.
This instructor-led training, available online or onsite, is designed for intermediate-level embedded developers and robotics engineers aiming to implement machine learning inference and optimization techniques directly on robotic hardware via TinyML and edge AI frameworks.
Upon completion of this training, participants will be capable of:
- Gaining a solid understanding of TinyML and edge AI fundamentals within robotics.
- Converting and deploying AI models for on-device inference.
- Optimizing models to improve speed, reduce size, and enhance energy efficiency.
- Integrating edge AI systems into robotic control architectures.
- Evaluating performance and accuracy in real-world scenarios.
Course Format
- Interactive lectures and discussions.
- Hands-on practice utilizing TinyML and edge AI toolchains.
- Practical exercises conducted on embedded and robotic hardware platforms.
Customization Options
- To arrange customized training for this course, please contact us.
Human-Centric Physical AI: Collaborative Robots and Beyond
14 Hours
This instructor-led, live training in Greece (online or onsite) is designed for intermediate-level learners who want to explore the role of collaborative robots (cobots) and other human-centric AI systems in modern workplaces.
By the end of this training, participants will be able to:
- Understand the principles of Human-Centric Physical AI and its applications.
- Explore the role of collaborative robots in enhancing workplace productivity.
- Identify and address challenges in human-machine interactions.
- Design workflows that optimize collaboration between humans and AI-driven systems.
- Promote a culture of innovation and adaptability in AI-integrated workplaces.
Human-Robot Interaction (HRI): Voice, Gesture & Collaborative Control
21 Hours
The course 'Human-Robot Interaction (HRI): Voice, Gesture & Collaborative Control' is a practical programme aimed at equipping participants with the skills to design and implement intuitive interfaces for effective human–robot communication. This training blends theoretical knowledge, design principles, and hands-on programming to create natural and responsive interaction systems utilising speech, gestures, and shared control methods. Participants will gain expertise in integrating perception modules, developing multimodal input systems, and designing robots that collaborate safely with humans.
Delivered by an instructor, this live training (available online or onsite) is tailored for beginners to intermediate-level professionals looking to design and implement human–robot interaction systems that improve usability, safety, and overall user experience.
Upon completion of this training, participants will be capable of:
- Grasping the foundational concepts and design principles of human–robot interaction.
- Creating voice-based control and response mechanisms for robotic systems.
- Implementing gesture recognition through computer vision techniques.
- Designing collaborative control systems that ensure safe and shared autonomy.
- Assessing HRI systems against criteria of usability, safety, and human factors.
Course Format
- Interactive lectures and live demonstrations.
- Practical coding and design exercises.
- Hands-on experimentation within simulation or real-world robotic environments.
Customisation Options for the Course
- For a bespoke training version of this course, please contact us to arrange your session.
Industrial Robotics Automation: ROS-PLC Integration & Digital Twins
28 Hours
Industrial Robotics Automation: ROS-PLC Integration & Digital Twins is a practical course designed to bridge the gap between industrial automation and contemporary robotics frameworks. Participants will acquire the skills to integrate ROS-based robotic systems with PLCs for synchronized operations, while exploring digital twin environments to simulate, monitor, and optimize production processes. The curriculum emphasizes interoperability, real-time control, and predictive analysis through the use of digital replicas of physical systems.
This instructor-led, live training (available online or onsite) targets intermediate-level professionals aiming to develop practical expertise in connecting ROS-controlled robots with PLC environments and implementing digital twins to enhance automation and manufacturing efficiency.
Upon completion of this training, participants will be equipped to:
- Comprehend the communication protocols facilitating interaction between ROS and PLC systems.
- Implement real-time data exchange mechanisms between robots and industrial controllers.
- Develop digital twins for monitoring, testing, and process simulation purposes.
- Integrate sensors, actuators, and robotic manipulators within industrial workflows.
- Design and validate industrial automation systems using hybrid simulation environments.
Format of the Course
- Interactive lectures and architectural walkthroughs.
- Hands-on exercises focused on integrating ROS and PLC systems.
- Implementation of simulation and digital twin projects.
Course Customization Options
- To request a customized training for this course, please contact us to make arrangements.
Artificial Intelligence (AI) for Mechatronics
21 Hours
This instructor-led live training in Greece (available online or onsite) is designed for engineers who wish to explore the application of artificial intelligence to mechatronic systems.
By the end of this training, participants will be able to:
- Gain an overview of artificial intelligence, machine learning, and computational intelligence.
- Understand the concepts of neural networks and different learning methods.
- Choose artificial intelligence approaches effectively for real-life problems.
- Implement AI applications in mechatronic engineering.
Multi-Robot Systems and Swarm Intelligence
28 Hours
Multi-Robot Systems and Swarm Intelligence is an advanced training course that delves into the design, coordination, and control of robotic teams, drawing inspiration from biological swarm behaviors. Participants will acquire skills in modeling interactions, implementing distributed decision-making, and optimizing collaboration across multiple agents. This course integrates theoretical knowledge with practical simulation exercises, preparing learners for applications in logistics, defense, search and rescue, and autonomous exploration.
This instructor-led, live training (available online or onsite) is designed for advanced-level professionals eager to design, simulate, and implement multi-robot and swarm-based systems using open-source frameworks and algorithms.
Upon completion of this training, participants will be able to:
- Grasp the principles and dynamics of swarm intelligence and cooperative robotics.
- Design communication and coordination strategies for multi-robot systems.
- Implement distributed decision-making and consensus algorithms.
- Simulate collective behaviors such as formation control, flocking, and coverage.
- Apply swarm-based techniques to real-world scenarios and optimization problems.
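One of the simplest distributed consensus schemes in this area is iterative averaging: each robot repeatedly nudges its value toward those of its neighbours, and the whole team converges on a common value with no central coordinator. A minimal sketch over an assumed ring topology; the headings and step size are illustrative:

```python
# Distributed averaging consensus sketch: robots on a ring topology
# converge to the mean of their initial values without a central node.

def consensus_step(values, neighbours, eps=0.3):
    """One synchronous update: move each value toward its neighbours' values."""
    return [
        v + eps * sum(values[j] - v for j in neighbours[i])
        for i, v in enumerate(values)
    ]

# Five robots in a ring; each starts with a different sensed heading (degrees).
headings = [10.0, 50.0, 30.0, 90.0, 20.0]
ring = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}

for _ in range(200):
    headings = consensus_step(headings, ring)

print([round(h, 2) for h in headings])  # all values near the mean, 40.0
```

Because each update only exchanges differences with neighbours, the team average is preserved at every step, which is why the agreed value is exactly the mean of the initial headings.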
Format of the Course
- Advanced lectures featuring algorithmic deep dives.
- Hands-on coding and simulation exercises in ROS 2 and Gazebo.
- A collaborative project applying swarm intelligence principles.
Course Customization Options
- To request a customized training for this course, please contact us to make arrangements.
Smart Robots for Developers
84 Hours
A Smart Robot represents an Artificial Intelligence (AI) system capable of learning from its surroundings and past experiences, thereby enhancing its capabilities based on that acquired knowledge. These intelligent entities can collaborate with humans, working alongside them and observing their behavior to adapt. Beyond performing manual labor, Smart Robots are equipped to handle complex cognitive tasks. It is important to note that Smart Robots are not limited to physical hardware; they can also exist purely as software applications within a computer, operating without moving parts or direct physical interaction with the world.
In this instructor-led, live training, participants will explore the various technologies, frameworks, and techniques required to program different types of mechanical Smart Robots. The course culminates in participants applying this knowledge to complete their own Smart Robot projects.
The curriculum is structured into 4 distinct sections. Each section spans three days, featuring lectures, interactive discussions, and hands-on robot development within a live laboratory environment. To ensure practical mastery, each section concludes with a practical, hands-on project, allowing participants to practice and demonstrate their newly acquired skills.
The hardware targeted in this course will be simulated in 3D using specialized simulation software. Programming for the robots will utilize the open-source ROS (Robot Operating System) framework, along with C++ and Python.
Upon completion of this training, participants will be able to:
- Grasp the core concepts underpinning robotic technologies
- Understand and manage the interaction between software and hardware within a robotic system
- Comprehend and implement the software components that form the foundation of Smart Robots
- Construct and operate a simulated mechanical Smart Robot capable of seeing, sensing, processing, grasping, navigating, and interacting with humans via voice
- Enhance a Smart Robot's capacity to execute complex tasks through the application of Deep Learning
- Test and troubleshoot a Smart Robot within realistic scenarios
Audience
- Developers
- Engineers
Format of the course
- A blend of lectures, discussions, exercises, and intensive hands-on practice
Note
- To customize any aspect of this course (such as the programming language or robot model), please contact us to make arrangements.
Smart Robotics in Manufacturing: AI for Perception, Planning, and Control
21 Hours
Intelligent robotics involves embedding artificial intelligence into robotic systems to enhance perception, decision-making capabilities, and autonomous control.
This instructor-led live training, available either online or onsite, is designed for advanced robotics engineers, systems integrators, and automation leaders who aim to implement AI-driven perception, planning, and control within smart manufacturing settings.
Upon completion of this training, participants will be able to:
- Comprehend and apply AI methodologies for robotic perception and sensor fusion.
- Create motion planning algorithms for both collaborative and industrial robots.
- Implement learning-based control strategies to enable real-time decision-making.
- Seamlessly integrate intelligent robotic systems into smart factory workflows.
Course Format
- Interactive lectures and discussions.
- Extensive exercises and hands-on practice.
- Hands-on implementation within a live-lab environment.
Customization Options for the Course
- To arrange a tailored training session for this course, please get in touch with us.