
MPFBench

License: MIT

This is the repository for MPFBench. It contains the link to the full dataset and the code used for training the SciML operators. Additionally, we include a collection of scripts for converting the raw simulation data into a machine-learning-ready format and for downsampling the data to lower resolutions.

We present the MPF-Bench dataset, encompassing 5500 bubble-rise and 5500 droplet-flow simulations, each containing 100 time snapshots. To our knowledge, this makes it two orders of magnitude larger, in terms of number of time snapshots, than any existing multiphase flow dataset.

Table of Contents

  1. Model
  2. Data
  3. Website

Model

We include workflows to train six types of neural operators: U-Net, the Fourier Neural Operator (FNO), the Convolutional Neural Operator (CNO), Deep Operator Networks (DeepONet), the scalable Operator Transformer (scOT), and Poseidon. The implementations of all six networks are included here.

Data

This dataset features 2D and 3D transient simulations, capturing a spectrum of flow behaviors influenced by surface tension and density/viscosity ratios. MPF-Bench includes scenarios from bubble oscillations with minor surface deformations to complete bubble breakup, offering a comprehensive resource for studying bubble rise and droplet fall dynamics. The dataset is designed to support the development of next-generation scientific machine learning (SciML) neural PDE solvers, particularly those tackling complex geometries and multiphysics phenomena.

Website

We have a project website that highlights the main results of MPFBench. The website gives an overview of our dataset, geometries, solver, and the research team behind the project.

Downloading a Folder from Our Hugging Face Repository

Below is an example of downloading a folder from our repository.

Installation

To run the example code, you need to install the following package:

pip install huggingface_hub

Example Code to Download All Files to a Local Directory

The following script demonstrates how to download a directory from the Hugging Face Hub:

from huggingface_hub import HfApi, hf_hub_download
import os
import shutil

REPO_ID = "BGLab/mpf-bench"
DIRECTORY = "BUBBLE_2D"

# Initialize the Hugging Face API
api = HfApi()

# List every file in the dataset repository
files_list = api.list_repo_files(repo_id=REPO_ID, repo_type="dataset")

# Keep only files under DIRECTORY (the trailing "/" avoids matching
# sibling folders that merely share the prefix)
files_to_download = [f for f in files_list if f.startswith(DIRECTORY + "/")]

# Download each file into the local cache, then copy it into a local
# directory tree that mirrors the repository layout (copying only the
# basename could silently overwrite files with the same name in
# different subfolders)
for file in files_to_download:
    file_path = hf_hub_download(repo_id=REPO_ID, filename=file, repo_type="dataset")
    os.makedirs(os.path.dirname(file), exist_ok=True)
    shutil.copy2(file_path, file)

print("Files downloaded successfully.")
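As an alternative to the file-by-file loop above, `huggingface_hub` also provides `snapshot_download`, which fetches an entire folder in one call via `allow_patterns`. A minimal sketch for the same `BUBBLE_2D` folder (the choice of local directory name is just a convenience):

```python
from huggingface_hub import snapshot_download


def folder_pattern(directory: str) -> str:
    """Glob pattern matching every file under `directory`."""
    return f"{directory.rstrip('/')}/*"


def download_folder(repo_id: str, directory: str, local_dir: str) -> str:
    """Fetch a whole folder from a dataset repo in one call.

    Returns the root path of the downloaded snapshot.
    """
    return snapshot_download(
        repo_id=repo_id,
        repo_type="dataset",
        allow_patterns=[folder_pattern(directory)],
        local_dir=local_dir,
    )


# Usage (requires network access):
# download_folder("BGLab/mpf-bench", "BUBBLE_2D", "BUBBLE_2D")
```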

How to Run the Operators

A. To run DeepONet, U-Net, FNO, CNO:

  1. Edit the config files - all data paths and hyperparameters are set in the config files; edit them to change the data path, batch size, learning rate, etc. The config files are located in the config directory:

    • config/deeponet/conf.yaml
    • config/unet/conf.yaml
    • config/fno/conf.yaml
    • config/cno/conf.yaml
  2. Run the training script:

python3 main_sweep.py --model "deeponet" --sweep
python3 main_sweep.py --model "unet" --sweep
python3 main_sweep.py --model "fno" --sweep
python3 main_sweep.py --model "cno" --sweep
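A config edit of the kind described in step 1 can be sanity-checked programmatically before launching a sweep. The keys below (`data_path`, `batch_size`, `learning_rate`) are illustrative assumptions, not the repository's actual schema:

```python
import yaml  # pip install pyyaml

# Illustrative snippet in the spirit of config/fno/conf.yaml;
# the real file's keys may differ.
conf_text = """
data_path: /path/to/BUBBLE_2D
batch_size: 16
learning_rate: 1.0e-3
"""

conf = yaml.safe_load(conf_text)
print(conf["batch_size"], conf["learning_rate"])  # → 16 0.001
```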

B. To run scOT/Poseidon:

  1. Edit the config files - all data paths and hyperparameters are set in the config files; edit them to change the data path, batch size, learning rate, etc. The config files are located in the config directory:

    • config/scot/conf.yaml
    • config/poseidon/conf.yaml
  2. Run the training script:

python3 scot_train.py --model "scot" --sweep
python3 scot_train.py --model "poseidon" --sweep
