Event cameras are bio-inspired sensors that asynchronously capture per-pixel brightness changes with microsecond precision, offering low latency, high temporal resolution, and resilience to motion blur, properties that are critical for tracking fast-moving objects. Despite these advantages, the lack of standardized benchmarks and datasets for event-based tracking has limited the integration of event cameras into computer vision pipelines. This paper introduces eSkiTB, a specialized benchmark for event-based tracking in winter sports, focusing on ski jumpers, freestyle skiers, and alpine skiers. Derived from broadcast footage covering varied lighting, weather, and clutter conditions, eSkiTB comprises 300 sequences: 240 for training, 30 for validation, and 30 for testing. Using the v2e event simulator under the iso-informational constraint, we ensure neuromorphic fidelity while preserving temporal resolution. We evaluate state-of-the-art event- and frame-based trackers, demonstrating the efficacy of spiking neural networks and the value of fine-tuning on domain-specific data. eSkiTB advances neuromorphic vision research and establishes a foundation for event-based tracking in extreme-motion scenarios.
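
For readers who want to prototype a similar frame-to-event conversion, the sketch below drives the public v2e command-line tool from Python. It is a minimal sketch, not the authors' pipeline: the flag names follow the v2e documentation, while the contrast thresholds, cutoff frequency, output resolution, and the clips/{train,val,test} directory layout are illustrative assumptions rather than the eSkiTB settings.

import subprocess
from pathlib import Path

# Hypothetical layout (assumption): broadcast clips under clips/{train,val,test}/*.mp4,
# simulated events written to events/<split>/<clip>/.
CLIP_ROOT = Path("clips")
EVENT_ROOT = Path("events")

def run_v2e(clip: Path, out_dir: Path) -> None:
    """Convert one clip to an event stream via the v2e CLI.

    Flag names follow the v2e documentation; the parameter values below are
    illustrative placeholders, not the settings used for eSkiTB.
    """
    out_dir.mkdir(parents=True, exist_ok=True)
    cmd = [
        "v2e",
        "-i", str(clip),
        "-o", str(out_dir),
        "--overwrite",
        "--no_preview",
        "--dvs_h5", "events.h5",               # write the event stream to HDF5
        "--auto_timestamp_resolution", "True",  # let v2e pick the upsampling rate
        "--pos_thres", "0.15",                  # assumed contrast thresholds
        "--neg_thres", "0.15",
        "--cutoff_hz", "15",                    # assumed photoreceptor cutoff
        "--output_width", "346",                # DAVIS346-like resolution (assumed)
        "--output_height", "260",
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    for split in ("train", "val", "test"):
        for clip in sorted((CLIP_ROOT / split).glob("*.mp4")):
            run_v2e(clip, EVENT_ROOT / split / clip.stem)
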
@inproceedings{vinod2026eskitb,
title = {eSkiTB: A Synthetic Event-based Dataset for Tracking Skiers},
author = {Vinod, Krishna and Vishal, Joseph Raj and Chanda, Kaustav and Ramesh, Prithvi Jai and Yang, Yezhou and Chakravarthi, Bharatesh},
booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
year = {2026}
}