Typo corrected #390

Open. Wants to merge 3 commits into base: dev.
9 changes: 3 additions & 6 deletions README.md
@@ -6,9 +6,7 @@
[![Documentation Status](https://readthedocs.org/projects/easyvvuq/badge/?version=latest)](https://easyvvuq.readthedocs.io/)
[![Coverage Status](https://coveralls.io/repos/github/UCL-CCS/EasyVVUQ/badge.svg?branch=dev&service=github)](https://coveralls.io/github/UCL-CCS/EasyVVUQ?branch=dev)
[![CII Best Practices](https://bestpractices.coreinfrastructure.org/projects/3796/badge)](https://bestpractices.coreinfrastructure.org/projects/3796)
[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/UCL-CCS/EasyVVUQ/dev?filepath=tutorials)

The aim of EasyVVUQ is to facilitate verification, validation and
EasyVVUQ aims to facilitate verification, validation and
uncertainty quantification (VVUQ) for a wide variety of
simulations. While very convenient for simple cases, EasyVVUQ is particularly well suited to situations where the simulations are computationally expensive,
heterogeneous computing resources are necessary, the sampling space is very large or book-keeping is prohibitively
@@ -21,11 +19,10 @@ Here are some examples of questions EasyVVUQ can answer about your code:

It also lets you construct surrogate models that are cheaper to evaluate than the complete simulation.

The high-level overview of the library is avalable at our [readthedocs](https://easyvvuq.readthedocs.io/en/dev/).
The high-level overview of the library is available at our [readthedocs](https://easyvvuq.readthedocs.io/en/dev/).

## Getting Started

For the quick start with EasyVVUQ we reccommend to check our basic interactive tutorial available [here](https://mybinder.org/v2/gh/UCL-CCS/EasyVVUQ/a6852d6c5ba36f15579e601d7a8d074505f31084?filepath=tutorials%2Fbasic_tutorial.ipynb).
For a quick start with EasyVVUQ we recommend checking our basic interactive tutorial available [here](https://mybinder.org/v2/gh/UCL-CCS/EasyVVUQ/a6852d6c5ba36f15579e601d7a8d074505f31084?filepath=tutorials%2Fbasic_tutorial.ipynb).


## Functionality
30 changes: 15 additions & 15 deletions docs/source/old/basic_tutorial.rst
@@ -54,13 +54,13 @@ The `gauss.template` is a template input file, in JSON format ::
The values for each key are tags (signified by the ``$`` delimiter) which will
be substituted by EasyVVUQ with values to sample the parameter space.
In the following tutorial, the template will be used to generate files called
`in_file.json` that will be the input to each run of `gauss.py`.
`in_file.json` will be the input to each run of `gauss.py`.

Uncertainty Quantification Workflow
-----------------------------------

In this dummy workflow we will use the *gauss* application to produce values
from normal distributions centred on 3 different means `mu`), using 5 repeat
In this dummy workflow, we will use the *gauss* application to produce values
from normal distributions centered on 3 different means `mu`), using 5 repeat
('replica') runs for each one.
The output will be collected for each run and bootstrap statistics calculated
for each set of runs.
@@ -75,7 +75,7 @@ Sections 1 to 9 contain the core EasyVVUQ workflow, section 0 sets up
convenience variables related to the application.

.. note:: In this tutorial, application execution is handled locally and by
EasyVVUQ functions. In real world applications (especially for HPC
EasyVVUQ functions. In real-world applications (especially for HPC
applications) the run step is beyond the scope of EasyVVUQ.

To run the workflow execute the following command ::
@@ -88,7 +88,7 @@ If this works you should see 15 lines that look something like:

where `<run-location>` is the directory in which you ran the script and
`EasyVVUQ_Campaign_zxe7_cb2` is an example of the unique directory that
EasyVVUQ created to hold all of the files created relating to a campaign.
EasyVVUQ has created to hold all of the files created relating to a campaign.

Followed by a results table that looks like:

@@ -107,7 +107,7 @@ The statistics represent the variation across the 5 replica runs executed for
each of the 3 'mu' values sampled.

Below we go through each section of the workflow, explaining each step and the
EasyVVUQ elements used to perform them.
EasyVVUQ elements are used to perform them.

Section 0: Application Setup
-----------------------------------
@@ -140,7 +140,7 @@ Consequently, the first step of an EasyVVUQ workflow is to create a
my_campaign = uq.Campaign(name='gauss', work_dir=".")

The reason for having a name is that in some cases it may be necessary to
combine the output of multiple *Campaigns* in a single analysis and having a
combine the output of multiple *Campaigns* in a single analysis and have a
name allows the data from each to be identified easily.

Section 2: Define Parameter Space
@@ -193,15 +193,15 @@ The only two parameters which could (somewhat) sensibly be sampled are 'mu'
Nonetheless we need to provide a range for 'num_steps'.
Notice that the keys in the parameter description match the tags in the template.
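
The parameter definition block itself is collapsed in this hunk; a minimal
sketch of its shape, showing only the two parameters named above (the ranges
and defaults here are purely illustrative)::

params = {
    "mu": {"type": "float", "min": 0.0, "max": 100.0, "default": 1.0},
    "num_steps": {"type": "integer", "min": 0, "max": 100, "default": 10}
}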

.. note:: The names of parameters here does not need to match the input of the
application directly. In the next section we will see how *Decoder*
.. note:: The name of parameters here does not need to match the input of the
application directly. In the next section, we will see how *Decoder*
elements map the parameter space to the application inputs.


Section 3: Wrap Application
---------------------------

In order for an application to be used in an EasyVVUQ workflow two processes
For an application to be used in an EasyVVUQ workflow two processes
have to be accounted for:

1. the parameters being sampled need to be converted into a format that
@@ -226,7 +226,7 @@ We create the encoder using the following code::
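# Hedged sketch of the collapsed encoder call, reusing the gauss.template and
# in_file.json names from this tutorial; the delimiter shown is the default.
encoder = uq.encoders.GenericEncoder(
    template_fname='gauss.template',
    delimiter='$',
    target_filename='in_file.json')
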
.. note:: The tags in the template here use the default $ delimiter.
Different delimiters can be specified using the `delimiter` keyword.

The output of *gauss* is a CSV format files, so we use a *Decoder* called *SimpleCSV*.
The output of *gauss* is a CSV format file, so we use a *Decoder* called *SimpleCSV*.
This requires us to specify the file to be read, the location of the header (line 0)
and the columns to keep in the data for analysis::
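
# Hedged sketch of the collapsed decoder call; the output file and column
# names below are placeholders, not taken from the original diff.
decoder = uq.decoders.SimpleCSV(
    target_filename='output.csv',
    output_columns=['Value'],
    header=0)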

@@ -265,9 +265,9 @@ In this example we simply pick 'mu' values from a uniform distribution between

my_campaign.set_sampler(my_sampler)
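
The construction of `my_sampler` falls just outside this hunk; a minimal
sketch of what it might look like, assuming a basic random sampler and an
illustrative range for 'mu'::

import chaospy as cp

vary = {"mu": cp.Uniform(1.0, 100.0)}
my_sampler = uq.sampling.RandomSampler(vary=vary)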

Real world examples are likely to use more complicated algorithms (such as
Real-world examples are likely to use more complicated algorithms (such as
quasi-Monte Carlo or stochastic collocation) but the way of specifying
parameters to vary remains the same.
parameters to vary remain the same.

Section 5: Get Run Parameters
-----------------------------
@@ -280,7 +280,7 @@ We draw the number of samples we want from the *Sampler*::
replicas=5)

Here we have chosen to have 5 replicas (repeats) of each sample drawn.
At this stage all that happens is the parameter sets are added to the
At this stage, all that happens is the parameter sets are added to the
*CampaignDB*, no input files have been generated.
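
The `replicas=5)` fragment above is the tail of a call that is truncated by
this hunk; a minimal sketch of the full call, assuming the keyword for the
sample count is `num_samples` (3 samples and 5 replicas match the three 'mu'
values and five repeats described earlier)::

my_campaign.draw_samples(num_samples=3,
                         replicas=5)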

Section 6: Create Input Directories
@@ -306,7 +306,7 @@ command we specified in Step 0::
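# Hedged sketch of the collapsed Section 6/7 code: create the run directories
# and execute the application in each one. The variable name `cmd` for the
# Step 0 command string is illustrative.
my_campaign.populate_runs_dir()
my_campaign.apply_for_each_run_dir(uq.actions.ExecuteLocal(cmd))
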
Section 8: Collate Output
-------------------------

The collection of simulation output simply handled by the *Campaign*::
The collection of simulation output is simply handled by the *Campaign*::

my_campaign.collate()

12 changes: 6 additions & 6 deletions docs/source/old/cooling_coffee_cup.rst
@@ -3,18 +3,18 @@
A Cooling Coffee Cup with Polynomial Chaos Expansion
====================================================

In this tutorial we will perform a Polynomial Chaos Expansion for a model of a cooling coffee cup.
In this tutorial, we will perform a Polynomial Chaos Expansion for a model of a cooling coffee cup.
The model uses Newton's law of cooling to evolve the temperature, :math:`T`, over time (:math:`t`) in an environment at :math:`T_{env}`:

.. math::
\frac{dT(t)}{dt} = -\kappa (T(t) -T_{env})

The constant :math:`\kappa` characterizes the rate at which the coffee cup transfers heat to the environment.
In this example we will analyze this model using the polynomial chaos expansion (PCE) UQ algorithm.
In this example, we will analyze this model using the polynomial chaos expansion (PCE) UQ algorithm.
We will use a constant initial temperature :math:`T_0 = 95 ^\circ\text{C}`, and vary :math:`\kappa` and :math:`T_{env}` using a uniform distribution in the ranges :math:`0.025-0.075` and :math:`15-25` respectively.
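
For reference, the ODE above has a simple closed-form solution, which is a
useful sanity check on the simulated temperatures:

.. math::
   T(t) = T_{env} + (T_0 - T_{env}) e^{-\kappa t}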

Below we provide a commented script that shows how the Campaign is built up and then employed.
We also provide an outline of how each element is setup.
We also provide an outline of how each element is set up.

EasyVVUQ Script Overview
------------------------
@@ -31,7 +31,7 @@ To run the script execute the following command
Import necessary libraries
--------------------------

For this example we import both easyvvuq and chaospy (for the distributions). EasyVVUQ will be referred to as 'uq' in the code. ::
For this example, we import both easyvvuq and chaospy (for the distributions). EasyVVUQ will be referred to as 'uq' in the code. ::

import easyvvuq as uq
import chaospy as cp
@@ -73,7 +73,7 @@ In this example the GenericEncoder and SimpleCSV, both included in the core Easy

GenericEncoder performs simple text substitution into a supplied template, using a specified delimiter to identify where parameters should be placed.
The template is shown below (\$ is used as the delimiter).
The template substitution approach is likely to suit most simple applications but in practice many large applications have more complex requirements, for example the multiple input files or the creation of a directory hierarchy.
The template substitution approach is likely to suit most simple applications but in practice, many large applications have more complex requirements, for example the multiple input files or the creation of a directory hierarchy.
In such cases, users may write their own encoders by extending the BaseEncoder class. ::

{
@@ -93,7 +93,7 @@ As can be inferred from its name, SimpleCSV reads CSV files produced by the cooli

The Sampler
-----------
The user specified which parameters will vary and their corresponding distributions. In this case the kappa and t\_env parameters are varied, both according to a uniform distribution: ::
The user specified which parameters will vary and their corresponding distributions. In this case, the kappa and t\_env parameters are varied, both according to a uniform distribution: ::

vary = {
"kappa": cp.Uniform(0.025, 0.075),
2 changes: 1 addition & 1 deletion docs/source/old/custom_encoder.rst
@@ -54,7 +54,7 @@ A custom decoder can be created in a very similar manner to the encoder: ::

The two methods that must be implemented here are sim_complete(),
which returns True if the simulation has completed (this is handled by
the decoder because it is an application specific issue), and
the decoder because it is an application-specific issue), and
parse_sim_output(), which returns a dictionary containing the desired
output, distilled from the simulation output files. This dictionary
has to satisfy the following restrictions:
22 changes: 11 additions & 11 deletions docs/source/old/dask_tutorial.rst
@@ -3,10 +3,10 @@
A Cooling Coffee Cup - Using Dask Jobqueue to Run on Clusters
=============================================================

In this tutorial we expand the previous :doc:`example
In this tutorial, we expand the previous :doc:`example
<cooling\_coffee\_cup>` and move our computations to computing
clusters. In order to run it you will need access to one. And if you
have access to one you most likely don't need explaining what they are
have access to one you most likely don't need to explain what they are
or how they fit in the work you do. So we will skip that part. We will
also skip the parts that are the same as in the previous tutorial. We
only outline the parts that will be different from when you ran it on
@@ -16,8 +16,8 @@ your laptop. Luckily there aren't that many differences.
Import necessary libraries
--------------------------

In addition we need to import the relevant Dask classes that will let us
set-up our cluster. Here we assume a SLURM cluster, however, other
In addition, we need to import the relevant Dask classes that will let us
set up our cluster. Here we assume a SLURM cluster, however, other
options (PBS and so on) are possible. Please refer to Dask JobQueue
`documentation <https://jobqueue.dask.org/en/latest/>`_. ::

@@ -28,7 +28,7 @@ Create a new Campaign
---------------------

As in the :doc:`Basic Tutorial <basic\_tutorial>`, we start by creating the
campaign, the only difference is that we instantiate the CampaignDask class
the campaign, the only difference is that we instantiate the CampaignDask class
instead of Campaign ::

my_campaign = uq.CampaignDask(name='coffee_pce')
@@ -39,13 +39,13 @@ Initialize Cluster
Provided that you have access to a computing cluster you can now run
your UQ workflow on it. You will need to know some technical details
about the compute nodes of your cluster. Most importantly you need to
know how many CPU cores does this node have and how much RAM. This
information is used to figure out the amount of resources we will
know how many CPU cores this node has and how much RAM. This
information is used to figure out the number of resources we will
need, namely, how many nodes to reserve.

Here we describe a single node of an example cluster. Please note that
you don't need to specify the resources you need for your run as
such. Only the resources available on a single node. Unless the
such. Only the resources are available on a single node. Unless the
resources the job needs are fewer than the node provides. For example,
if the node has 48 cores and 64 gigabytes of memory ::
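
# Hedged sketch of the collapsed cluster definition for the 48-core, 64 GB
# node described above; the queue name is a placeholder and further
# site-specific options are documented in Dask Jobqueue.
cluster = SLURMCluster(cores=48, memory="64 GB", queue="batch")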

@@ -60,7 +60,7 @@ for example, we can use ::

cluster.scale(96)

At this stage you can print the batch file that will be used to submit the
At this stage, you can print the batch file that will be used to submit the
worker processes. ::

print(cluster.job_script())
@@ -85,7 +85,7 @@ before. ::
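# Hedged sketch of the collapsed line: connect a Dask client to the cluster so
# that it can be passed to apply_for_each_run_dir() below.
client = Client(cluster)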
my_campaign.populate_runs_dir()
my_campaign.apply_for_each_run_dir(uq.actions.ExecuteLocal("python3 cooling_model.py cooling_in.json"), client)

At this stage the computation will block until the requested resources are
At this stage, the computation will block until the requested resources are
allocated and all the computations are completed.


@@ -103,7 +103,7 @@ something like this: ::
salloc --partition=interactivequeue

The system will then try to allocate resources for you to run the
interactive job and this might take a couple of moments. After that an
interactive job and this might take a couple of moments. After that, an
interactive mode prompt will appear. Commands that you execute there
will be run on compute nodes. You would then execute the script
normally, e.g. ::
30 changes: 15 additions & 15 deletions docs/source/old/fusion_tutorial.rst
@@ -80,7 +80,7 @@ and
.. math::
mtanh(x, b_{slope}) = \frac{(1 + x \cdot b_{slope}) exp(x) - exp(-x)}{exp(x) + exp(-x)}

A typical density profile used in these simulation is shown below:
A typical density profile used in this simulation is shown below:

.. figure:: ../images/ne.svg

@@ -92,7 +92,7 @@ The source is given by
where :math:`\alpha` is chosen so that :math:`\int\; S(\rho,t) dV =
Qe_{tot}`, the total heating power.

In this example we will analyze this model using the polynomial chaos
In this example, we will analyze this model using the polynomial chaos
expansion (PCE) UQ algorithm. The parameters that can be varied are:

================== ======= ======= =========
@@ -128,13 +128,13 @@ though we will restrict the variation to
for this analysis.

Below we provide a commented script that shows how the Campaign is built up and then employed.
We also provide an outline of how each element is setup.
We also provide an outline of how each element is set up.

EasyVVUQ Script Overview
------------------------

We illustrate the intended workflow using the following basic example
script, a python implementation of the reduced fusion workflow model.
We illustrate the intended workflow using the following basic example script,
a python implementation of the reduced fusion workflow model.

The input files for this tutorial are

@@ -162,7 +162,7 @@ To run the script execute the following command
Import necessary libraries
--------------------------

For this example we import both easyvvuq and chaospy (for the
For this example, we import both easyvvuq and chaospy (for the
distributions). EasyVVUQ will be referred to as 'uq' in the code. ::

import easyvvuq as uq
@@ -212,7 +212,7 @@ App Creation
------------

In this example the GenericEncoder and SimpleCSV, both included in the
core EasyVVUQ library, were used as the encoder/decoder pair for this
core EasyVVUQ library were used as the encoder/decoder pair for this
application. ::

encoder = uq.encoders.GenericEncoder(
@@ -227,8 +227,8 @@ GenericEncoder performs simple text substitution into a supplied
template, using a specified delimiter to identify where parameters
should be placed. The template is shown below (\$ is used as the
delimiter). The template substitution approach is likely to suit most
simple applications but in practice many large applications have more
complex requirements, for example the multiple input files or the
simple applications but in practice, many large applications have more
complex requirements, for example, the multiple input files or the
creation of a directory hierarchy. In such cases, users may write
their own encoders by extending the BaseEncoder class. ::

@@ -289,7 +289,7 @@ Calling the campaign's draw\_samples() method will cause the specified
number of samples to be added as runs to the campaign database,
awaiting encoding and execution. If no arguments are passed to
draw\_samples() then all samples will be drawn, unless the sampler is
not finite. In this case PCESampler is finite (produces a finite
not finite. In this case, PCESampler is finite (produces a finite
number of samples) and we elect to draw all of them at once: ::

my_campaign.draw_samples()
@@ -334,16 +334,16 @@ The output of this is dependent on the type of analysis element. ::
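# Hedged sketch of the collapsed analysis step: a PCE analysis over the
# sampler defined earlier. The quantity-of-interest column name 'te' is a
# placeholder; the resulting object typically holds statistical moments and
# the Sobol indices plotted below.
my_analysis = uq.analysis.PCEAnalysis(sampler=my_sampler, qoi_cols=['te'])
my_campaign.apply_analysis(my_analysis)
results = my_campaign.get_last_analysis()
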
Typical results
---------------

The above workflow calculates the distribution of temeperatures as the
uncertain parameters are varied. A typical results is shown below.
The above workflow calculates the distribution of temperatures as the
uncertain parameters are varied. A typical result is shown below.

.. figure:: ../images/Te.svg

Here the mean temperature, the mean plus and minus one sigma, the 10
and 90 percentiles as well as the complete range are shown as a
function of :math:`\rho`.

The sensitivity of the results to the varying paramaters can be found
The sensitivity of the results to the varying parameters can be found
from the Sobol first

.. figure:: ../images/sobols_first.svg
@@ -383,7 +383,7 @@ in easyvvuq_fusion_dask_tutorial.py and are basically:
- or using SLURM, here configured to use

- p.tok.openmp.2h QOS
- send a mail at completion of the SLURM job(s)
- send a mail at the completion of the SLURM job(s)
- use the p.tok.openmp partition ("queue")
- 8 cores per job
- 8 processes per job
@@ -446,7 +446,7 @@ References
.. [MTANH] |_| See

- E. Stefanikova, M. Peterka, P. Bohm, P. Bilkova, M. Aftanas, M. Sos, J. Urban, M. Hron and R. Panek:
Fitting of the Thomson scatteringdensity and temperature profiles on the COMPASS tokamak.
Fitting of the Thomson scattering density and temperature profiles on the COMPASS tokamak.
Presented at 21st Topical Conference on High-Temperature Plasma Diagnostics
(HTPD 2016) in Madison, Wisconsin, USA and published in
Review of Scientific Instruments 87, 11E536 (2016); https://doi.org/10.1063/1.4961554.