The Natural Robustness Toolkit (NRTK) is an open-source toolkit for generating operationally realistic perturbations to evaluate the natural robustness of computer vision algorithms. The nrtk package evaluates the natural robustness of computer vision algorithms against various perturbations, including sensor-specific changes to camera focal length, aperture diameter, etc.

We have also created the nrtk-jatic module to support AI T&E use cases and workflows through interoperability with the maite library and integration with other JATIC tools. Users seeking to use NRTK to perturb MAITE-wrapped datasets or evaluate MAITE-wrapped models should start with the nrtk-jatic module.
NRTK addresses the critical gap in evaluating computer vision model resilience to real-world operational conditions beyond what traditional image augmentation libraries cover. T&E engineers need precise methods to assess how models respond to sensor-specific variables (focal length, aperture diameter, pixel pitch) and environmental factors without the prohibitive costs of exhaustive data collection. NRTK leverages pyBSM's physics-based models to rigorously simulate how imaging sensors capture and process light, enabling systematic robustness testing across parameter sweeps, identification of performance boundaries, and visualization of model degradation. This capability is particularly valuable for satellite and aerial imaging applications, where engineers can simulate hypothetical sensor configurations to support cost-performance trade-off analysis during system design—ensuring AI models maintain reliability when deployed on actual hardware facing natural perturbations in the field.
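To make the "parameter sweep" idea concrete, here is a minimal, self-contained sketch of the pattern (not the NRTK API): apply an increasingly strong perturbation to an image and record a simple degradation metric at each parameter value, just as one would sweep a sensor parameter such as jitter amplitude on an NRTK perturber. The `box_blur` helper and the metric are illustrative stand-ins.

```python
import numpy as np

def box_blur(image: np.ndarray, k: int) -> np.ndarray:
    """Illustrative perturbation: average over a (2k+1)-wide horizontal window."""
    if k == 0:
        return image.copy()
    padded = np.pad(image, ((0, 0), (k, k)), mode="edge")
    windows = np.stack(
        [padded[:, i : i + image.shape[1]] for i in range(2 * k + 1)]
    )
    return windows.mean(axis=0)

rng = np.random.default_rng(0)
image = rng.random((32, 32))

# Sweep the perturbation strength and record how far each output drifts
# from the clean input (mean absolute difference, a stand-in for a real
# task metric such as detection mAP).
results = {
    k: float(np.abs(box_blur(image, k) - image).mean()) for k in (0, 1, 2, 4)
}
```

In a real NRTK workflow, the loop body would instead construct a perturber for each parameter value and evaluate the model on the perturbed data, producing the performance-versus-parameter curves used to find operating boundaries.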
This toolkit is intended to help data scientists, developers, and T&E engineers who want to rigorously evaluate and enhance the robustness of their computer vision models. For users of the JATIC product suite, this toolkit is used to assess model robustness against natural perturbations.
NRTK installation has been tested on Unix and Linux systems.
To install the current version via pip:
pip install nrtk[<extra1>,<extra2>,...]
To install the current version via conda-forge:
conda install -c conda-forge nrtk
Certain plugins may require additional runtime dependencies. Details on these requirements can be found here.
For more detailed installation instructions, visit the installation documentation.
Explore usage examples of the nrtk package in various contexts using the Jupyter notebooks provided in the ./docs/examples/ directory.
Via the pyBSM package, NRTK exposes a large set of Optical Transfer Functions (OTFs). These OTFs can simulate different environmental and sensor-based effects. For example, the :ref:`JitterOTFPerturber <JitterOTFPerturber>` simulates different levels of sensor jitter. By modifying its input parameters, you can observe how sensor jitter affects image quality.
Below is an example of an input image that will undergo a Jitter OTF perturbation. This image represents the initial state before any transformation.
Below is some example code that applies a Jitter OTF transformation::
    import numpy as np
    from PIL import Image

    from nrtk.impls.perturb_image.pybsm.jitter_otf_perturber import JitterOTFPerturber

    # Load the input image as a NumPy array.
    INPUT_IMG_FILE = "docs/images/input.jpg"
    image = np.array(Image.open(INPUT_IMG_FILE))

    # Build the perturber with RMS jitter amplitudes (radians) in x and y.
    otf = JitterOTFPerturber(sx=8e-6, sy=8e-6, name="test_name")
    out_image = otf.perturb(image)
This code uses default values and provides a sample input image. However, you can adjust the parameters and use your own image to visualize the perturbation. The sx and sy parameters (the root-mean-squared jitter amplitudes in radians, in the x and y directions) are the primary way to customize a jitter perturber. Larger jitter amplitudes generate a larger Gaussian blur kernel.
The output image below shows the effects of the Jitter OTF on the original input. This result illustrates the Gaussian blur introduced due to simulated sensor jitter.
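As a sketch of why larger sx and sy values produce a stronger blur, the standard jitter model attenuates each spatial frequency (u, v) by a Gaussian factor, H(u, v) = exp(-2 * pi**2 * (sx**2 * u**2 + sy**2 * v**2)); a narrower Gaussian in frequency space corresponds to a wider blur kernel in image space. The function below is an illustrative implementation of that formula, not the pyBSM code itself.

```python
import numpy as np

def jitter_otf(u, v, sx, sy):
    """Gaussian jitter OTF for spatial frequencies (u, v) and RMS jitter (sx, sy)."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    return np.exp(-2.0 * np.pi**2 * (sx**2 * u**2 + sy**2 * v**2))

# The zero-frequency (DC) component always passes unchanged...
dc = jitter_otf(0.0, 0.0, 8e-6, 8e-6)

# ...while a given high frequency is attenuated more as jitter grows,
# which is exactly the stronger Gaussian blur seen in the output image.
weak = jitter_otf(5e4, 0.0, 4e-6, 4e-6)
strong = jitter_otf(5e4, 0.0, 8e-6, 8e-6)
```

Doubling the jitter amplitude quadruples the exponent, so attenuation at high frequencies falls off rapidly; this is why even small increases in sx and sy visibly degrade fine image detail.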
Documentation for both release snapshots and the latest main branch is available on ReadTheDocs.
To build the Sphinx-based documentation locally for the latest reference:
# Install dependencies
poetry install --sync --with main,linting,tests,docs
# Navigate to the documentation root
cd docs
# Build the documentation
poetry run make html
# Open the generated documentation in your browser
firefox _build/html/index.html
Contributions are encouraged!
The following points help ensure contributions follow development practices.
- Follow the JATIC Design Principles.
- Adopt the Git Flow branching strategy.
- Detailed release information is available in docs/release_process.rst.
- Additional contribution guidelines and issue reporting steps can be found in CONTRIBUTING.md.
Ensure the source tree is acquired locally before proceeding.
You can install using Poetry:
poetry install --with main,linting,tests,docs --extras "<extra1> <extra2> ..."
Pre-commit hooks ensure that code complies with required linting and formatting guidelines. These hooks run automatically before commits but can also be executed manually. To bypass checks during a commit, use the --no-verify flag.
To install and use pre-commit hooks:
# Install required dependencies
poetry install --sync --with main,linting,tests,docs
# Initialize pre-commit hooks for the repository
poetry run pre-commit install
# Run pre-commit checks on all files
poetry run pre-commit run --all-files
This associated project provides a local web application that demonstrates visual saliency generation in a user interface. It offers an example of how image perturbations generated by this package can be used in a user interface to facilitate dataset exploration. The tool is built with the trame framework.
Principal Investigator: Brian Hu (Kitware) @brian.hu
Product Owner: Austin Whitesell (MITRE) @awhitesell
Scrum Master / Tech Lead: Brandon RichardWebster (Kitware) @b.richardwebster
Deputy Tech Lead: Emily Veenhuis (Kitware) @emily.veenhuis
This material is based upon work supported by the Chief Digital and Artificial Intelligence Office under Contract No. 519TC-23-9-2032. The views and conclusions contained herein are those of the author(s) and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the U.S. Government.