This repository provides a PyTorch-based framework for training and evaluating deep learning models on mmWave ISAC datasets covering gesture recognition, pose estimation, localization, and gait identification. It supports both classification and regression tasks and includes optional background subtraction where applicable.
You can download the mmHSense dataset from IEEE DataPort:
- Download the ZIP file.
- Extract all datasets.
- Place the files in the root directory of this repository.
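A minimal extraction sketch is shown below; it is not part of the repository, and the archive name `mmHSense.zip` is an assumption (use the name of the file you actually downloaded).

```python
# Sketch (assumed archive name): extract the downloaded ZIP into the repository
# root so the dataset folders end up where the data loaders expect them.
import zipfile
from pathlib import Path

repo_root = Path(".")  # run this from the repository root
with zipfile.ZipFile(repo_root / "mmHSense.zip") as archive:
    archive.extractall(repo_root)
```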
Supports multiple datasets:
- mmWGesture – mmWave gesture recognition (classification)
- 5GmmGesture – 5G mmWave gesture recognition (classification)
- mmWPose – mmWave skeletal pose estimation (regression)
- DISAC-mmVRPose – VR-based mmWave pose estimation (regression)
- mmW-Loc – mmWave localization with optional background subtraction (classification)
- mmW-GaitID – mmWave gait identification with optional background subtraction (classification)
- Generic ResNet18-based architecture for all datasets with flexible input channels.
- Supports both classification and regression loss functions (CrossEntropyLoss and MSELoss).
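For illustration, a minimal sketch of such a setup in PyTorch is shown below. The helper name `build_model` and the example channel/class counts are assumptions for the sketch, not the repository's actual code.

```python
# Sketch (assumed helper, not the repo's implementation): ResNet18 backbone with a
# configurable input stem and a task-dependent loss, as described above.
import torch.nn as nn
from torchvision.models import resnet18

def build_model(in_channels, out_dim, task="classification"):
    model = resnet18(weights=None)
    # Replace the stem so the network accepts an arbitrary number of input channels
    # (radar/ISAC inputs rarely have 3 RGB channels).
    model.conv1 = nn.Conv2d(in_channels, 64, kernel_size=7, stride=2, padding=3, bias=False)
    # Replace the head: class logits for classification, continuous targets for regression.
    model.fc = nn.Linear(model.fc.in_features, out_dim)
    criterion = nn.CrossEntropyLoss() if task == "classification" else nn.MSELoss()
    return model, criterion

# Example: 2 input channels and 10 gesture classes (both values are placeholders).
model, criterion = build_model(in_channels=2, out_dim=10, task="classification")
```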
All dataset options and hyperparameters are set via `config.yaml`. You can edit this file to choose your dataset, adjust training parameters, or enable optional features.

Example `config.yaml`:
```yaml
dataset: mmWGesture   # Options: mmWGesture, 5GmmGesture, mmWPose, DISAC-mmVRPose, mmW-Loc, mmW-GaitID
epochs: 10
batch_size: 32
lr: 0.001
background: false     # Only for mmW-Loc and mmW-GaitID
```
Usage: `python main.py`
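As a rough illustration only, the snippet below shows how an entry point could read `config.yaml`; the actual `main.py` may structure this differently.

```python
# Sketch: load the training configuration with PyYAML (assumed behavior, not main.py itself).
import yaml

with open("config.yaml") as f:
    cfg = yaml.safe_load(f)

print(f"Training on {cfg['dataset']} for {cfg['epochs']} epochs "
      f"(batch_size={cfg['batch_size']}, lr={cfg['lr']}, background={cfg['background']})")
```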
If you use the mmHSense datasets in your work, please cite:

```bibtex
@misc{bhat2025mmhsensemultimodaldistributedmmwave,
      title={mmHSense: Multi-Modal and Distributed mmWave ISAC Datasets for Human Sensing},
      author={Nabeel Nisar Bhat and Maksim Karnaukh and Stein Vandenbroeke and Wouter Lemoine and Jakob Struye and Jesus Omar Lacruz and Siddhartha Kumar and Mohammad Hossein Moghaddam and Joerg Widmer and Rafael Berkvens and Jeroen Famaey},
      year={2025},
      eprint={2509.21396},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2509.21396},
}
```