Event cameras have emerged as a powerful sensing modality for robotics, offering microsecond latency, high dynamic range, and low power consumption. These characteristics make them well-suited for real-time robotic perception in scenarios affected by motion blur, occlusion, and extreme changes in illumination. Despite this potential, event-based vision, particularly through video-to-event (v2e) simulation, remains underutilized in mainstream robotics simulators, limiting the advancement of event-driven solutions for navigation and manipulation. This work presents an open-source, user-friendly v2e Robot Operating System (ROS) package for Gazebo simulation that enables seamless event stream generation from RGB camera feeds. The package is used to investigate event-based robotic policies (ERP) for real-time navigation and manipulation. Two representative scenarios are evaluated: (1) object following with a mobile robot and (2) object detection and grasping with a robotic manipulator. Transformer-based ERPs are trained by behavior cloning and compared to RGB-based counterparts under various operating conditions. Experimental results show that event-based policies consistently deliver competitive and often superior robustness in high-speed or visually challenging environments. These results highlight the potential of event-driven perception to improve real-time robotic navigation and manipulation, providing a foundation for broader integration of event cameras into robotic policy learning.
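As context for the abstract's event stream generation from RGB frames: v2e-style simulators typically follow the standard DVS event model, in which a pixel emits an event each time its log-intensity change since the last event crosses a contrast threshold. The sketch below is a minimal, illustrative implementation of that model for a pair of grayscale frames; the function name, threshold value, and timestamp interpolation scheme are assumptions for illustration, not the actual SEBVS package API.

```python
import math

def frames_to_events(prev_frame, curr_frame, t_prev, t_curr, threshold=0.2):
    """Generate DVS-style events (x, y, t, polarity) from two grayscale frames.

    A pixel fires one event per contrast-threshold crossing of its
    log-intensity change between the two frames. This is a simplified
    version of the model used by v2e-style simulators (no noise,
    refractory period, or bandwidth limits).
    """
    events = []
    eps = 1e-3  # avoid log(0) at fully dark pixels
    for y, (row_prev, row_curr) in enumerate(zip(prev_frame, curr_frame)):
        for x, (i_prev, i_curr) in enumerate(zip(row_prev, row_curr)):
            delta = math.log(i_curr + eps) - math.log(i_prev + eps)
            n = int(abs(delta) / threshold)  # number of threshold crossings
            polarity = 1 if delta > 0 else -1
            for k in range(1, n + 1):
                # spread event timestamps evenly between the two frame times
                t = t_prev + (t_curr - t_prev) * k / (n + 1)
                events.append((x, y, t, polarity))
    return events

# Example: a single pixel brightening from 0.1 to 0.5 over 10 ms
# produces a burst of positive-polarity events.
events = frames_to_events([[0.1]], [[0.5]], t_prev=0.0, t_curr=0.01)
```

In a ROS/Gazebo pipeline this per-frame-pair conversion would run on consecutive images from the simulated RGB camera topic, publishing the resulting event packets downstream.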
@inproceedings{vinod2025sebvs,
title = {SEBVS: Synthetic Event-based Visual Servoing for Robot Navigation and Manipulation},
author = {Vinod, Krishna and Ramesh, Prithvi Jai and B N, Pavan Kumar and Chakravarthi, Bharatesh},
booktitle = {},
year = {2025}
}