Bug: Fatal error if no observations available for lognormal inversion #230

Open
djvaron opened this issue Jun 27, 2024 · 1 comment

djvaron commented Jun 27, 2024

Name and Institution (Required)

Name: Daniel Varon
Institution: Harvard University

Description of your issue or question

The IMI fails if no observations are available during an inversion period (for example, due to a TROPOMI outage) when using lognormal prior errors. The error originates in merge_partial_k.py, where K is never defined when no observations are available, and propagates to lognormal_invert.py, which expects a K matrix.

The behavior should match invert.py (which assumes normal prior errors): if no observations are available, K is populated with zeros and the inversion returns the prior.
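For reference, this is why a zero Jacobian is a safe fallback in the normal-error case: with K = 0 the gain matrix vanishes and the analytical posterior collapses to the prior. A minimal numpy sketch (variable names here are illustrative, not IMI's actual ones):

import numpy as np

# Analytical posterior: xhat = xA + G (y - K xA), with gain
# G = SA K^T (K SA K^T + So)^(-1).
# If K is all zeros, G is zero and xhat == xA (the prior).
n_obs, n_sv = 5, 3
K = np.zeros((n_obs, n_sv))   # Jacobian with no observations
xA = np.ones(n_sv)            # prior state (scale factors)
y = np.zeros(n_obs)           # placeholder observation vector
SA = 0.5**2 * np.eye(n_sv)    # prior error covariance
So = np.eye(n_obs)            # observational error covariance

G = SA @ K.T @ np.linalg.inv(K @ SA @ K.T + So)
xhat = xA + G @ (y - K @ xA)
assert np.allclose(xhat, xA)  # posterior equals the prior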


djvaron commented Jun 28, 2024

Here's a kludge that seems to work but is not ideal:

1. If there are no observations, make merge_partial_k.py return K=0 and so=1

merge_partial_k.py:

if len(files) == 0:
    print("\nWarning: No observations available during inversion period. Inversion will return prior estimate.\n")
    # Scalar sentinels: downstream code checks for a size-1 K
    # to detect the no-observation case
    K = 0
    so = 1
2. If K=0, make lognormal_invert.py return early with xhat=1 (scale factors) and xhat=0 (boundary conditions), generating the bare minimum output needed to continue with the next inversion period

lognormal_invert.py:

# Assumes the imports already used in lognormal_invert.py:
# import numpy as np
# import xarray as xr
# from netCDF4 import Dataset
if K_temp.size == 1:
    # Scalar K from merge_partial_k.py signals "no observations"
    # Define xhat vector as ones, with zeros for the BC elements
    xhat = np.ones(num_sv_elems)
    xhat[-4:] = 0
    # Define xhat lat/lon array of ones
    xhat_arr = np.ones((len(lats), len(lons)))
    xhat_da = xr.DataArray(
        data=xhat_arr,
        dims=["lat", "lon"],
        coords=dict(
            lon=(["lon"], lons.values),
            lat=(["lat"], lats.values),
        ),
    )
    # Create gridded posterior scale factors = ones
    ds = xr.Dataset(
        {
            "ScaleFactor": (["lat", "lon"], xhat_da.data),
        },
        coords={"lon": ("lon", lons.data), "lat": ("lat", lats.data)},
    )
    ds.lat.attrs["units"] = "degrees_north"
    ds.lat.attrs["long_name"] = "Latitude"
    ds.lon.attrs["units"] = "degrees_east"
    ds.lon.attrs["long_name"] = "Longitude"
    ds.ScaleFactor.attrs["units"] = "1"
    ds.to_netcdf("gridded_posterior_ln.nc")
    # Create inversion results netcdf file
    results_save_path = "inversion_result_ln.nc"
    dataset = Dataset(results_save_path, "w", format="NETCDF4_CLASSIC")
    dataset.createDimension("nvar", num_sv_elems)
    dataset.createDimension("float", 1)
    nc_xn = dataset.createVariable("xhat", np.float32, ("nvar",))
    nc_xn[:] = xhat
    dataset.close()
    return
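A quick sanity check that the fallback output is readable downstream (a sketch; file and variable names are taken from the kludge above):

import numpy as np
import xarray as xr

# Verify the placeholder posterior written by the early return
post = xr.open_dataset("gridded_posterior_ln.nc")
assert np.allclose(post["ScaleFactor"].values, 1.0)

res = xr.open_dataset("inversion_result_ln.nc")
assert np.allclose(res["xhat"].values[:-4], 1.0)  # scale factors
assert np.allclose(res["xhat"].values[-4:], 0.0)  # boundary conditions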
