A PyTorch-based image denoising tool for removing noise from real photographs. It implements the models developed in Benoit Brummer's NIND Denoise and the Darktable workflow pioneered by Huy Hoang.
This fork makes their work easier to experiment with and accessible to a wider audience. Notable features include:
- One-click setup: Download the Lua script and let Darktable handle all Python dependencies automatically
- Support for most hardware via `torch.accelerator` (see the sketch after this list)
  - CPU-only (universal), nVidia GPU (CUDA), Intel XPU/GPU, and AMD GPU acceleration
  - Intel Xe graphics are roughly 6x faster than CPU, though slower than CUDA
- Automatic darktable integration with export workflow
- Automatic model download and environment setup
- Processed images automatically imported and grouped with originals in Darktable library
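Device selection happens through PyTorch's `torch.accelerator` API, as referenced in the feature list above. The following is a minimal sketch (not the project's actual code) of how that API picks the best available backend:

```python
import torch

# Prefer whichever accelerator PyTorch detects (CUDA, XPU, ...);
# fall back to CPU otherwise. Requires a recent PyTorch release
# that ships the torch.accelerator module.
if torch.accelerator.is_available():
    device = torch.accelerator.current_accelerator()
else:
    device = torch.device("cpu")

print(f"Inference will run on: {device}")
```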
The easiest way to use nind-denoise is through the Darktable Lua plugin with automatic environment setup.
First, install the community lua-scripts collection through Darktable's built-in interface (this will create the necessary folder structure). See the official lua-scripts repository for installation instructions.
Once installed, download `nind_denoise_rl.lua` from this repository and place it in the contrib folder:
Linux:
curl -o ~/.config/darktable/lua/contrib/nind_denoise_rl.lua \
https://raw.githubusercontent.com/commreteris/nind-denoise/lua/src/lua-scripts/nind_denoise_rl.lua
macOS:
curl -o ~/Library/Application\ Support/darktable/lua/contrib/nind_denoise_rl.lua \
https://raw.githubusercontent.com/commreteris/nind-denoise/lua/src/lua-scripts/nind_denoise_rl.lua
Windows:
Invoke-WebRequest -Uri "https://raw.githubusercontent.com/commreteris/nind-denoise/lua/src/lua-scripts/nind_denoise_rl.lua" `
-OutFile "$env:LOCALAPPDATA\darktable\lua\contrib\nind_denoise_rl.lua"
Or manually download the file and copy it to the contrib folder:
- Linux: `~/.config/darktable/lua/contrib/`
- macOS: `~/Library/Application Support/darktable/lua/contrib/`
- Windows: `%LOCALAPPDATA%\darktable\lua\contrib\`
- Start Darktable and enable the script from the Script Manager (`lighttable > script manager`)
- In the lighttable module, find the "Update Environment" button
- Click it and wait while the script automatically:
  - Installs the `uv` package manager (if needed)
  - Creates a Python virtual environment
  - Downloads the denoising model (~50MB)
  - Installs all required dependencies (PyTorch, GMic, tifffile, etc.)
Note: First-time setup can take several minutes depending on your internet connection and system. The script will display status updates. This is a one-time process.
- Select one or more images in the lighttable
- Go to the export module
- Choose "NIND-denoise RL" as the target storage
- Configure export settings:
- Select JPEG or TIFF output format
- Adjust RL deblur parameters (sigma, iterations; see the sketch after this list)
- Check "import to darktable" to automatically import processed images back to your library
- Click export
Processed images will be automatically grouped with their originals in your Darktable library!
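About the RL deblur parameters: Richardson-Lucy deblurring deconvolves the denoised image with a Gaussian point-spread function, where sigma sets the assumed blur radius and iterations sets how many deconvolution passes are run. In this workflow the actual deblur is performed by GMic; purely as a hypothetical illustration of what the two settings control, here is a rough scikit-image equivalent (the input filename is a placeholder, and this is not the code the tool uses):

```python
import numpy as np
from skimage import img_as_float, io
from skimage.restoration import richardson_lucy

def gaussian_psf(sigma, size=9):
    """Normalized 2-D Gaussian point-spread function."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return psf / psf.sum()

# Placeholder input: an intermediate denoised image.
img = img_as_float(io.imread("denoised.tif", as_gray=True))

# sigma ~ blur radius, num_iter ~ deblur strength
# (the CLI defaults below are sigma=1, iterations=10).
sharp = richardson_lucy(img, gaussian_psf(sigma=1.0), num_iter=10)
```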
For advanced users, custom workflows, or batch processing, you can use the command-line interface directly.
To denoise an image, run:
$ python3 src/denoise.py "/path/to/photo0123.RAW"
Note: On Windows, do not use single backslashes in paths; forward slashes or doubled backslashes both work:
PS> python3 src\\denoise.py "\\good\\path\\to\\photo0123.RAW"
"""
Usage:
denoise.py [-o <outpath> | --output-path=<outpath>] [-e <e> | --extension=<e>]
[-d <darktable> | --dt=<darktable>] [-g <gmic> | --gmic=<gmic>] [ -q <q> | --quality=<q>]
[--nightmode ] [ --no_deblur ] [ --debug ] [ --sigma=<sigma> ] [ --iterations=<iter> ]
[-v | --verbose] <raw_image>
denoise.py (help | -h | --help)
denoise.py --version
Options:
-o <outpath> --output-path=<outpath> Where to save the result (defaults to current directory).
-e <e> --extension=<e> Output file extension [default: jpg].
--dt=<darktable> Path to darktable-cli. Use this only if not automatically found.
-g <gmic> --gmic=<gmic> Path to gmic. Use this only if not automatically found.
-q <q> --quality=<q> JPEG compression quality [default: 90].
--nightmode Use for very dark images. Normalizes brightness before denoise [default: False].
--no_deblur Do not perform RL-deblur [default: false].
--debug Keep intermediate files.
--sigma=<sigma> Sigma to use for RL-deblur [default: 1].
--iterations=<iter> Number of iterations for RL-deblur [default: 10].
-v --verbose                              Verbose output.
--version Show version.
-h --help Show this screen.
"""
If you prefer to set up the Python environment manually (not using the Darktable auto-setup), follow these steps:
git clone https://github.com/commreteris/nind-denoise.git
cd nind-denoise
The variant-enabled version of `uv` automatically installs the correct version of PyTorch for your GPU.
Windows:
PS> powershell -c { $env:INSTALLER_DOWNLOAD_URL = 'https://wheelnext.astral.sh'; irm https://astral.sh/uv/install.ps1 | iex }
PS> uv venv
PS> .venv/Scripts/activate
(nind-denoise) PS> Get-Command python
CommandType Name Version Definition
----------- ---- ------- ----------
Application python.exe 3.1x.xx C:\Users\<user>\...\nind-denoise\.venv\Scripts\python.exe
Make sure `python.exe` is inside the `.venv` directory before proceeding.
Linux/macOS:
curl -LsSf https://astral.sh/uv/install.sh | INSTALLER_DOWNLOAD_URL=https://wheelnext.astral.sh sh
uv venv
source .venv/bin/activate
which python
# Should show: /path/to/nind-denoise/.venv/bin/python
Make sure the Python path is inside the `.venv` directory before proceeding.
$ uv pip install -r requirements.in --upgrade
(nind-denoise) $ python
>>> import torch
>>> torch.accelerator.is_available()
True
>>> torch.cuda.is_available() # For nVidia GPUs
True
>>> torch.xpu.is_available() # For Intel GPUs
False
- Darktable installed and working
- Internet connection for first-time setup
- That's it! The Lua script handles everything else automatically.
- Darktable and raw images processed with darktable (with `.xmp` sidecar files)
- GMic CLI (download here for Windows)
- Variant-enabled uv package manager
- Proper GPU drivers with OpenCL support
nVidia:
- Driver + CUDA toolkit
AMD:
- Driver + ROCm
Intel:
- Intel GPU drivers are slightly trickier due to version mismatches. Install minimum system packages and let uv/pip handle dependencies in the venv.
- See Intel's driver guide
- Install `libze-dev` and `intel-ocloc` for PyTorch support
- On Arch Linux: `intel-compute-runtime`, `level-zero-loader`, `level-zero-headers`
- Minimum Linux kernel: 6.14
Verify GPU setup:
$ clinfo | grep device
$ darktable-cltest
Please cite Benoit Brummer's original work:
@InProceedings{Brummer_2019_CVPR_Workshops,
author = {Brummer, Benoit and De Vleeschouwer, Christophe},
title = {Natural Image Noise Dataset},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2019}
}