Event-based sensors encode visual information asynchronously with low latency and high temporal resolution.
Event-based datasets are scarce, so user-friendly methods for creating them are needed.
This repository provides the code used to record a dataset with a DAVIS240C event camera.
The code was used to record and process the Event-based Dataset of Assembly Tasks (EDAT24).
All data are captured in raw form (.aedat) and can be processed into numpy arrays (.npy) for ease of use.
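As an illustration of that conversion, the minimal Python sketch below (not the repository's actual script) reads a jAER AEDAT 2.0 recording and saves the events as a numpy array. The DVS address bit masks and the file names are assumptions and should be checked against the jAER AEDAT 2.0 format description.

```python
import numpy as np

def aedat2_to_npy(aedat_path, npy_path):
    """Sketch: convert a jAER AEDAT 2.0 file (.aedat) to a numpy array (.npy)."""
    with open(aedat_path, "rb") as f:
        # Skip the ASCII header: every header line starts with '#'.
        header_end = 0
        line = f.readline()
        while line.startswith(b"#"):
            header_end = f.tell()
            line = f.readline()
        f.seek(header_end)
        raw = np.fromfile(f, dtype=">u4")  # big-endian 32-bit words

    # AEDAT 2.0 stores each event as a 32-bit address followed by a
    # 32-bit timestamp in microseconds.
    addresses, timestamps = raw[0::2], raw[1::2]

    # Assumed DAVIS240 DVS address layout (verify against the jAER docs):
    # bit 31 = event type (0 -> DVS), bits 30-22 = y, bits 21-12 = x, bit 11 = polarity.
    dvs = (addresses >> 31) == 0
    x = (addresses[dvs] >> 12) & 0x3FF
    y = (addresses[dvs] >> 22) & 0x1FF
    p = (addresses[dvs] >> 11) & 0x1
    t = timestamps[dvs]

    events = np.stack([t, x, y, p], axis=1)  # one row per event: [t, x, y, polarity]
    np.save(npy_path, events)
    return events

# Example usage (hypothetical file names):
# events = aedat2_to_npy("recording.aedat", "recording.npy")
```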
To record the data, the following setup is required:

- A DAVIS240C event camera - to obtain the data
- The jAER open-source software - to display and record the data
- An Arduino board - to trigger the commands that start and end the recordings (see the sketch below this list)
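For illustration only, the sketch below shows how start/end trigger commands might be sent to the Arduino from Python using pyserial. The serial port, baud rate, trial duration, and command bytes are assumptions rather than the repository's actual protocol; adapt them to the Arduino sketch used for the recordings.

```python
import time
import serial  # pip install pyserial

def record_trial(port="/dev/ttyUSB0", duration_s=5.0):
    """Sketch: signal the start and end of one recorded trial via the Arduino."""
    with serial.Serial(port, 9600, timeout=1) as arduino:
        time.sleep(2)           # give the Arduino time to reset after the port opens
        arduino.write(b"S")     # assumed 'start recording' command byte
        time.sleep(duration_s)  # length of the recorded trial
        arduino.write(b"E")     # assumed 'end recording' command byte
```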
A detailed explanation of how to use the code is provided below.
Demo video: `video.mp4`
If you have found this work useful for your research, please cite our paper as follows:
```bibtex
@article{Duarte2024,
  title   = {Event-based dataset for the detection and classification of manufacturing assembly tasks},
  author  = {Laura Duarte and Pedro Neto},
  journal = {Data in Brief},
  volume  = {54},
  year    = {2024},
  doi     = {10.1016/j.dib.2024.110340}
}
```