This project implements a complete modular pipeline for classifying bearing faults using Convolutional Neural Networks on the CWRU (Case Western Reserve University) dataset.
```
cnn_pipeline/
├── main.py        # Main training/evaluation script
├── arguments.py   # Command-line arguments
├── dataset.py     # DataLoader generation
├── model.py       # CNN model definition
├── train.py       # Training function (with early stopping)
├── test.py        # Test function
└── preprocess.py  # CWRU .mat file preprocessing
```
This project uses the publicly available CWRU Bearing Data from Case Western Reserve University. You can download the dataset files from the official website below:
🔗 https://engineering.case.edu/bearingdatacenter/download-data-file
Please place the downloaded `.mat` files inside the `raw_data/` directory:
```
raw_data/
├── 97.mat    # Normal
├── 105.mat   # Inner race fault
├── 118.mat   # Ball fault
└── 130.mat   # Outer race fault
```
The `preprocess.py` module converts the raw vibration signals into sliding-window segments for supervised classification.
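The sliding-window step can be sketched as follows. The window length (1024 samples) and stride (512 samples) used here are illustrative assumptions, not necessarily the values used in `preprocess.py`:

```python
import numpy as np

def segment_signal(signal: np.ndarray, window: int = 1024, stride: int = 512) -> np.ndarray:
    """Slice a 1-D vibration signal into overlapping fixed-length windows.

    Window length and stride are assumed values for illustration.
    Returns an array of shape (n_segments, window).
    """
    n_segments = (len(signal) - window) // stride + 1
    return np.stack(
        [signal[i * stride : i * stride + window] for i in range(n_segments)]
    )

# Example: 5000 samples -> floor((5000 - 1024) / 512) + 1 = 8 windows
x = np.arange(5000, dtype=np.float32)
segments = segment_signal(x)
print(segments.shape)  # (8, 1024)
```

Each window then becomes one training example, labeled with the fault class of the file it came from.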
```
pip install numpy scipy torch scikit-learn matplotlib tqdm
```
```
python main.py
```
The script will:
- Preprocess raw `.mat` files into (X, Y) arrays
- Split the data into train/valid/test sets
- Train the CNN with early stopping
- Evaluate final test accuracy
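The early-stopping logic mentioned above can be sketched generically: track the best validation loss and stop once it has not improved for `patience` consecutive epochs. This is a minimal sketch, not the exact code in `train.py` (function names and the return value are assumptions):

```python
def train_with_early_stopping(train_step, validate, max_epochs=10, patience=20):
    """Generic early-stopping loop.

    `train_step(epoch)` runs one training epoch; `validate()` returns
    the current validation loss. Stops after `patience` epochs without
    improvement and returns the best validation loss seen.
    """
    best_loss, epochs_without_improvement = float("inf"), 0
    for epoch in range(max_epochs):
        train_step(epoch)
        val_loss = validate()
        if val_loss < best_loss:
            best_loss, epochs_without_improvement = val_loss, 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break  # no improvement for `patience` epochs
    return best_loss
```

In practice one would also checkpoint the model weights whenever the validation loss improves, so the best model can be restored after stopping.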
Three-layer 1D convolutional encoder with ReLU, BatchNorm, and MaxPooling followed by fully connected classification layers.
```
Input → Conv1d → ReLU → MaxPool → Conv1d → ReLU → MaxPool → Conv1d → Flatten → FC layer → FC layer → Output
```
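A minimal PyTorch sketch of that architecture is shown below. The channel counts, kernel sizes, and the input length (1024 samples, matching a common CWRU window size) are illustrative assumptions, not the exact values in `model.py`:

```python
import torch
import torch.nn as nn

class BearingCNN(nn.Module):
    """Three-layer 1D conv encoder + two FC layers (illustrative sketch)."""

    def __init__(self, n_classes: int = 4, input_len: int = 1024):
        super().__init__()
        self.encoder = nn.Sequential(
            # Conv block 1: assumed 16 channels, kernel 7
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.BatchNorm1d(16),
            nn.ReLU(), nn.MaxPool1d(2),
            # Conv block 2: assumed 32 channels, kernel 5
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.BatchNorm1d(32),
            nn.ReLU(), nn.MaxPool1d(2),
            # Conv block 3: assumed 64 channels, kernel 3
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.BatchNorm1d(64),
            nn.ReLU(),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (input_len // 4), 128),  # two MaxPool(2) halvings
            nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):  # x: (batch, 1, input_len)
        return self.classifier(self.encoder(x))

logits = BearingCNN()(torch.randn(8, 1, 1024))
print(logits.shape)  # torch.Size([8, 4])
```

The four output classes correspond to the four fault conditions in `raw_data/` (normal, inner race, ball, outer race).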
| Argument | Description | Default |
|---|---|---|
| `--epochs` | Number of training epochs | 10 |
| `--lr` | Learning rate | 1e-4 |
| `--lamda` | LR scheduler decay factor | 0.97 |
| `--early_stop` | Early-stopping patience (epochs) | 20 |
| `--train_size` | Train/validation split ratio | 0.8 |
| `--batch_size` | Mini-batch size | 64 |
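Based on the table above, `arguments.py` presumably defines these flags with `argparse`; a sketch (option names and defaults taken from the table, the types are assumptions):

```python
import argparse

def get_args(argv=None):
    """Parse command-line arguments; defaults mirror the table above."""
    parser = argparse.ArgumentParser(description="CWRU bearing-fault CNN pipeline")
    parser.add_argument("--epochs", type=int, default=10, help="Number of training epochs")
    parser.add_argument("--lr", type=float, default=1e-4, help="Learning rate")
    parser.add_argument("--lamda", type=float, default=0.97, help="LR scheduler decay factor")
    parser.add_argument("--early_stop", type=int, default=20, help="Early-stopping patience")
    parser.add_argument("--train_size", type=float, default=0.8, help="Train/validation split ratio")
    parser.add_argument("--batch_size", type=int, default=64, help="Mini-batch size")
    return parser.parse_args(argv)

args = get_args([])  # empty argv -> all defaults
print(args.lr, args.batch_size)  # 0.0001 64
```

Example override: `python main.py --epochs 50 --batch_size 32`.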