TorchSig is an open-source signal processing machine learning toolkit built on the PyTorch data handling pipeline. This user-friendly toolkit simplifies common digital signal processing operations, augmentations, and transformations for both real- and complex-valued signals. By building these signal processing tools on PyTorch, TorchSig streamlines their integration, enabling faster and easier development and research on machine learning techniques applied to signals data, particularly within (but not limited to) the radio frequency domain. An example dataset, TorchSigNarrowband, built from many unique communication signal modulations, is included to accelerate the field of modulation classification. An example wideband dataset, TorchSigWideband, is also included; it extends TorchSigNarrowband with larger examples containing multiple signals, enabling accelerated research in wideband signal detection and recognition.
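For readers new to RF data: complex-valued IQ samples are commonly presented to real-valued ML layers as two stacked real channels. The following is a minimal NumPy sketch of that convention (illustrative only, not TorchSig code):

```python
import numpy as np

# Hypothetical illustration: a complex baseband IQ capture is often fed
# to real-valued network layers as a (2, N) array of (I, Q) channels.
num_samples = 4096
rng = np.random.default_rng(0)
iq = rng.standard_normal(num_samples) + 1j * rng.standard_normal(num_samples)

# Stack the real (I) and imaginary (Q) parts into a (2, N) float32 array.
as_real = np.stack([iq.real, iq.imag]).astype(np.float32)

print(as_real.shape)  # (2, 4096)
```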
- Ubuntu ≥ 20.04
- Hard drive storage with:
  - ≥ 500 GB for Narrowband
  - ≥ 10 TB for Wideband
- CPU with ≥ 4 cores
- GPU with ≥ 16 GB of memory (recommended)
- Python ≥ 3.9
We highly recommend using Ubuntu or a Docker container.
Clone the torchsig repository and install it using the following commands:
```
git clone https://github.com/TorchDSP/torchsig.git
cd torchsig
pip install .
```
To create the narrowband dataset:
```
python3 ./scripts/generate_narrowband.py --root ./examples/datasets --all --num-workers=4
```
To create the wideband dataset:
```
python3 ./scripts/generate_wideband.py --root ./examples/datasets --all --num-workers=4
```
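Once generated, the datasets are consumed through PyTorch-style map datasets. The toy class below sketches only the access pattern (`ToyIQDataset` and its random in-memory layout are hypothetical, not the TorchSig API): a dataset exposing `__len__`/`__getitem__` that returns a `(2, N)` real-valued view of each IQ snapshot plus a modulation label, which is all a PyTorch `DataLoader` needs.

```python
import numpy as np

# Hedged sketch (not the TorchSig API): each example is an IQ snapshot
# plus an integer modulation label. The random in-memory data stands in
# for files written under --root.
class ToyIQDataset:
    def __init__(self, num_examples=8, num_iq_samples=1024, seed=0):
        rng = np.random.default_rng(seed)
        self.data = rng.standard_normal((num_examples, num_iq_samples)) \
            + 1j * rng.standard_normal((num_examples, num_iq_samples))
        self.labels = rng.integers(0, 4, size=num_examples)

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        iq = self.data[idx]
        # Real-valued (2, N) view for real-valued networks.
        x = np.stack([iq.real, iq.imag]).astype(np.float32)
        return x, int(self.labels[idx])

ds = ToyIQDataset()
x, y = ds[0]
print(x.shape, y)
```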
Docker can be used to generate the datasets without modifying your current Python environment. Build a Docker container:
```
docker build -t torchsig -f Dockerfile .
```
To create the narrowband dataset with the Docker container:
```
docker run -u $(id -u ${USER}):$(id -g ${USER}) -v `pwd`:/workspace/code/torchsig torchsig python3 torchsig/scripts/generate_narrowband.py --root=/workspace/code/torchsig/data --all
```
To create the wideband dataset with the Docker container:
```
docker run -u $(id -u ${USER}):$(id -g ${USER}) -v `pwd`:/workspace/code/torchsig torchsig python3 torchsig/scripts/generate_wideband.py --root=/workspace/code/torchsig/data --all
```
The example Jupyter notebooks can be run within Docker with GPU support. Try the following commands:
```
docker build -t torchsig -f Dockerfile .
docker run -d --rm --network=host --shm-size=32g --gpus all --name torchsig_workspace torchsig tail -f /dev/null
docker exec torchsig_workspace jupyter notebook --allow-root --ip=0.0.0.0 --no-browser
```
Then open the URL printed in the output in your browser to run the example notebooks.
TorchSig provides many useful tools to facilitate and accelerate research on signals processing machine learning technologies:
- The `SignalData` class and its `SignalMetadata` objects enable signal objects and metadata to be seamlessly handled and operated on throughout the TorchSig infrastructure.
- The `TorchSigNarrowband` dataset is a state-of-the-art static modulations-based RF dataset meant to serve as the next baseline for RFML classification development & evaluation.
- The `ModulationsDataset` class synthetically creates, augments, and transforms the largest communications signals modulations dataset to date in a generic, flexible fashion.
- The `TorchSigWideband` dataset is a state-of-the-art static wideband RF signals dataset meant to serve as the baseline for RFML signal detection and recognition development & evaluation.
- The `WidebandModulationsDataset` class synthetically creates, augments, and transforms the largest wideband communications signals dataset in a generic, flexible fashion.
- Numerous signal processing transforms enable existing ML techniques to be employed on signals data, streamline domain-specific augmentations in signal processing machine learning experiments, and provide signals-specific data transformations that accelerate expert-feature signal processing machine learning integration.
- TorchSig also includes a model API similar to open-source code in other ML domains, where several state-of-the-art convolutional and transformer-based neural architectures have been adapted to the signals domain and pretrained on the `TorchSigNarrowband` and `TorchSigWideband` datasets. These models can easily be used for follow-on research in the form of additional hyperparameter tuning, out-of-the-box comparative analysis/evaluations, and/or fine-tuning to custom datasets.
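The transforms above operate directly on complex IQ data. As an illustration of the kind of domain-specific augmentation involved (a generic sketch, not TorchSig's implementation), the function below adds complex additive white Gaussian noise scaled to reach a target SNR:

```python
import numpy as np

# Illustrative augmentation in the spirit of signal-domain transforms
# (the target-SNR formulation here is an assumption, not TorchSig code):
# add complex AWGN so the output has a chosen SNR in dB.
def add_awgn(iq: np.ndarray, snr_db: float, rng=None) -> np.ndarray:
    rng = rng or np.random.default_rng()
    signal_power = np.mean(np.abs(iq) ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    # Split the noise power evenly between the I and Q components.
    noise = np.sqrt(noise_power / 2) * (
        rng.standard_normal(iq.shape) + 1j * rng.standard_normal(iq.shape)
    )
    return iq + noise

rng = np.random.default_rng(0)
tone = np.exp(2j * np.pi * 0.05 * np.arange(4096))  # unit-power complex tone
noisy = add_awgn(tone, snr_db=10.0, rng=rng)
print(noisy.shape)  # (4096,)
```

In a training pipeline, such a transform would typically be applied on the fly so each epoch sees the same underlying signals at different noise levels.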
Documentation can be found online or built locally by following the instructions below.
```
cd docs
pip install -r docs-requirements.txt
make html
firefox build/html/index.html
```
TorchSig is released under the MIT License. The MIT License is a popular open-source software license enabling free use, redistribution, and modification, even for commercial purposes, provided the license is included in all copies or substantial portions of the software. TorchSig has no affiliation with MIT beyond the use of this license.
| Title | Year | Cite (APA) |
| --- | --- | --- |
| TorchSig: A GNU Radio Block and New Spectrogram Tools for Augmenting ML Training | 2024 | Vallance, P., Oh, E., Mullins, J., Gulati, M., Hoffman, J., & Carrick, M. (2024, September). TorchSig: A GNU Radio Block and New Spectrogram Tools for Augmenting ML Training. In Proceedings of the GNU Radio Conference (Vol. 9, No. 1). |
| Large Scale Radio Frequency Wideband Signal Detection & Recognition | 2022 | Boegner, L., Vanhoy, G., Vallance, P., Gulati, M., Feitzinger, D., Comar, B., & Miller, R. D. (2022). Large Scale Radio Frequency Wideband Signal Detection & Recognition. arXiv preprint arXiv:2211.10335. |
| Large Scale Radio Frequency Signal Classification | 2022 | Boegner, L., Gulati, M., Vanhoy, G., Vallance, P., Comar, B., Kokalj-Filipovic, S., ... & Miller, R. D. (2022). Large Scale Radio Frequency Signal Classification. arXiv preprint arXiv:2207.09918. |
Please cite TorchSig if you use it for your research or business.
```
@misc{torchsig,
  title={Large Scale Radio Frequency Signal Classification},
  author={Luke Boegner and Manbir Gulati and Garrett Vanhoy and Phillip Vallance and Bradley Comar and Silvija Kokalj-Filipovic and Craig Lennon and Robert D. Miller},
  year={2022},
  archivePrefix={arXiv},
  eprint={2207.09918},
  primaryClass={cs-LG},
  note={arXiv:2207.09918},
  url={https://arxiv.org/abs/2207.09918}
}
```