Producing Lyman-alpha flux fields from N-body simulations with neural networks

This repository contains code to accompany the paper "Producing High-fidelity Flux Fields From N-body Simulations Using Physically Motivated Neural Networks", presented at the 2019 NeurIPS workshop on Machine Learning and the Physical Sciences. The repository includes files that define the network architectures, as well as scripts to train the networks, export trained models for inference, run inference on our validation dataset, and analyze the model output. Read below for instructions on downloading the data and running the code.

Requirements

The code has been developed and run with the following software:

  • numpy, scipy, matplotlib, h5py
  • tensorflow 2.0.0-beta1
  • tensorboard 1.13.1

Data & pre-trained models

A bundle of the necessary training data (the normalized dark matter density field, the real-space and redshift-space Lyman-alpha flux fields, and the baryon z-velocity field) is available here (18.4 GB). This is the data required by the code in this repository. To properly set up this data for training or analysis, simply download the bundle (either through the browser or via, e.g., wget) and unpack the compressed tar into a directory titled ./data/.

Some code in this repository requires pre-trained models. These can be generated by running your own trainings, or obtained by downloading the pre-trained models available here (1.8 GB). As with the training data, download the bundle and unpack the compressed archive into a directory titled ./trained_models/.
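
For reference, a minimal Python sketch of this setup step is shown below. The archive filenames are placeholders for whatever names the downloaded bundles actually have, and depending on how the archives are laid out internally you may need to extract into the repository root instead.

    import tarfile
    from pathlib import Path

    def unpack(archive_path, dest_dir):
        """Extract a compressed tar archive into dest_dir, creating it if needed."""
        Path(dest_dir).mkdir(parents=True, exist_ok=True)
        with tarfile.open(archive_path) as tar:  # compression is auto-detected
            tar.extractall(path=dest_dir)

    # Placeholder filenames -- substitute the actual names of the downloaded bundles.
    unpack("lya_data_bundle.tar", "./data/")
    unpack("trained_models_bundle.tar", "./trained_models/")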

Note that the training data for this work was a Nyx simulation which was normalized and re-formatted -- the raw data used can be accessed here (instructions for exploring the raw data are available in a Jupyter notebook).
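
Once the data bundle is unpacked, its contents can be inspected with h5py (listed in the requirements above). The sketch below assumes the fields are stored as HDF5 files; the filename is a placeholder, and the dataset names are printed rather than assumed.

    import h5py

    # Placeholder path -- substitute one of the files unpacked into ./data/.
    with h5py.File("./data/example_field.h5", "r") as f:
        def show(name, obj):
            # Print each dataset's name, shape, and dtype; print group names bare.
            if isinstance(obj, h5py.Dataset):
                print(f"{name}: shape={obj.shape}, dtype={obj.dtype}")
            else:
                print(name)
        f.visititems(show)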

Code

The repository contains the following code:

  • flux_mapping_net.py and warping_net.py define the network architectures and build them into an object containing necessary hyperparameters and helper routines.
  • train_flux_mapping_net.py and train_warping_net.py train each respective network. For these scripts to work, the training data must be downloaded and uncompressed. By default, the code expects the data to live in a ./data/ directory and a ./training_runs/ directory to exist for storing training details. Each training run must be given a unique tag, e.g. the proper usage is python train_flux_mapping_net.py 001. This will create a subdirectory ./training_runs/fluxmapnet_001/ to store all training details (checkpoints, tensorboard logs, etc.) and begin training the network.
  • export_flux_mapping_net.py exports a pre-trained flux mapping network to a tensorflow SavedModel format, which can be used for inference on the full 1024x128x1024 validation set at once (see the loading sketch after this list).
  • slabpredict.py runs inference on the validation set using a SavedModel created by the above script. This requires a very large amount of memory when generating predictions for the full validation set at once, and does not fit on a GPU.
  • analyze.ipynb is a Jupyter notebook which loads pre-trained weights for the warping network, performs the redshift-space distortion to produce the Lya flux field in redshift space, and produces plots and analysis comparing the generated fields to the ground truth.
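
As a starting point for inference, the sketch below shows one way to load an exported model and inspect its signature before running slabpredict.py. The export path is a placeholder, and the presence of a serving_default signature is an assumption about how export_flux_mapping_net.py saves the model, not something taken from that script.

    import tensorflow as tf

    # Placeholder path -- point this at the directory written by export_flux_mapping_net.py.
    model = tf.saved_model.load("./trained_models/fluxmapnet_exported/")
    print(list(model.signatures.keys()))      # assumed to contain "serving_default"

    infer = model.signatures["serving_default"]
    print(infer.structured_input_signature)   # expected input names, shapes, dtypes
    print(infer.structured_outputs)           # output tensor names and shapes

    # Once the input name and shape are known, a slab of the normalized density
    # field can be passed in as a keyword argument, for example:
    #   flux = infer(<input_name>=tf.constant(density_slab))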

Contact

Peter Harrington
Lawrence Berkeley National Laboratory
[email protected]
