getNLDAS(
AOI,
varname = NULL,
- model = NULL,
+ model = "FORA0125_H.002",
startDate,
endDate = NULL,
verbose = FALSE,
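
For reviewers, a minimal sketch of what the new default means in practice. Only the getNLDAS() signature and the "FORA0125_H.002" default come from this hunk; the AOI helper call and the variable name below are illustrative assumptions, not part of the change:

# Assumed usage after this change: omitting `model` now requests the
# hourly NLDAS-2 forcing product ("FORA0125_H.002") rather than NULL.
library(AOI)        # companion package used throughout the climateR docs
library(climateR)

AOI <- aoi_get(state = "CO")        # any sf/SpatVect point or polygon

nldas <- getNLDAS(
  AOI       = AOI,
  varname   = "apcpsfc",            # hypothetical variable name, for illustration only
  startDate = "2020-06-01",
  endDate   = "2020-06-02"
)                                   # `model` defaults to "FORA0125_H.002"
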
diff --git a/docs/search.json b/docs/search.json
index 843dcd6..8f6ded7 100644
--- a/docs/search.json
+++ b/docs/search.json
@@ -1 +1 @@
-[{"path":[]},{"path":"/articles/01-intro.html","id":"usful-packages-for-climate-data","dir":"Articles","previous_headings":"","what":"Usful Packages for climate data","title":"Welcome to climateR","text":"","code":"library(AOI) library(climateR) library(tidyterra) library(ggplot2) library(terra) library(tidyr) library(sf)"},{"path":"/articles/01-intro.html","id":"climater-examples","dir":"Articles","previous_headings":"","what":"climateR examples","title":"Welcome to climateR","text":"climateR package supplemented AOI framework established AOI R package. get climate product, area interest must defined: loading polygon state North Carolina examples constructing AOI calls can found . AOI, can construct call dataset parameter(s) date(s) choice. querying PRISM dataset maximum minimum temperature October 29, 2018:","code":"AOI = aoi_get(state = \"NC\") plot(AOI$geometry) system.time({ p = getPRISM(AOI, varname = c('tmax','tmin'), startDate = \"2018-10-29\") }) #> user system elapsed #> 0.845 0.193 8.782"},{"path":"/articles/01-intro.html","id":"data-from-known-bounding-coordinates","dir":"Articles","previous_headings":"","what":"Data from known bounding coordinates","title":"Welcome to climateR","text":"climateR offers support sf, sfc, bbox objects. requesting wind velocity data four corners region USA bounding coordinates.","code":"AOI = st_as_sfc(st_bbox(c(xmin = -112, xmax = -105, ymax = 39, ymin = 34), crs = 4326)) g = getGridMET(AOI, varname = \"vs\", startDate = \"2018-09-01\")"},{"path":"/articles/01-intro.html","id":"data-through-time","dir":"Articles","previous_headings":"","what":"Data through time …","title":"Welcome to climateR","text":"addition multiple variables can request variables time, let’s look gridMET rainfall Gulf Coast Hurricane Harvey:","code":"harvey = getGridMET(aoi_get(state = c(\"TX\", \"FL\")), varname = \"pr\", startDate = \"2017-08-20\", endDate = \"2017-08-31\") ggplot() + geom_spatraster(data = harvey$precipitation_amount) + facet_wrap(~lyr) + scale_fill_whitebox_c( palette = \"muted\", na.value = \"white\") + theme_minimal()"},{"path":"/articles/01-intro.html","id":"climate-projections","dir":"Articles","previous_headings":"","what":"Climate Projections","title":"Welcome to climateR","text":"sources downscaled Global Climate Models (GCMs). allow query forecasted ensemble members different models /climate scenarios. One example MACA dataset: Getting multiple models results also quite simple: don’t know models, can always grab random set specifying number:","code":"system.time({ m = getMACA(AOI = aoi_get(state = \"FL\"), model = \"CCSM4\", varname = 'pr', scenario = c('rcp45', 'rcp85'), startDate = \"2080-06-29\", endDate = \"2080-06-30\") }) #> user system elapsed #> 0.144 0.028 4.141 models = c(\"BNU-ESM\",\"CanESM2\", \"CCSM4\") temp = getMACA(AOI = aoi_get(state = \"CO\"), varname = 'tasmin', model = models, startDate = \"2080-11-29\") temp[[1]]$mean = app(temp[[1]], mean) names(temp[[1]]) = c(models, \"Ensemble Mean\") random = getMACA(aoi_get(state = \"MI\"), model = 3, varname = \"pr\", startDate = \"2050-10-29\")"},{"path":"/articles/01-intro.html","id":"global-datasets","dir":"Articles","previous_headings":"","what":"Global Datasets","title":"Welcome to climateR","text":"datasets USA focused either. 
TerraClimate offers global, monthly data current year many variables, CHIRPS provides daily rainfall data:","code":"kenya = aoi_get(country = \"Kenya\") tc = getTerraClim(kenya, varname = \"pet\", startDate = \"2018-01-01\") chirps = getCHIRPS(kenya, startDate = \"2018-01-01\", endDate = \"2018-01-04\" )"},{"path":"/articles/01-intro.html","id":"point-based-data","dir":"Articles","previous_headings":"","what":"Point Based Data","title":"Welcome to climateR","text":"Finally, data gathering limited areal extents can retrieved time series locations.","code":"ts = data.frame(lng = -105.0668, lat = 40.55085) %>% sf::st_as_sf(coords = c('lng', 'lat'), crs = 4326) %>% getGridMET(varname = c(\"pr\", 'srad'), startDate = \"2021-01-01\", endDate = \"2021-12-31\")"},{"path":"/articles/01-intro.html","id":"point-based-ensemble","dir":"Articles","previous_headings":"","what":"Point Based Ensemble","title":"Welcome to climateR","text":"","code":"future = getMACA(geocode(\"Fort Collins\", pt = TRUE), model = 5, varname = \"tasmax\", startDate = \"2050-01-01\", endDate = \"2050-01-31\") future_long = pivot_longer(future, -date) ggplot(data = future_long, aes(x = date, y = value, col = name)) + geom_line() + theme_linedraw() + scale_color_brewer(palette = \"Dark2\") + labs(title = \"Fort Collins Temperture: January, 2050\", x = \"Date\", y = \"Degree K\", color = \"Model\")"},{"path":"/articles/01-intro.html","id":"multi-site-extraction","dir":"Articles","previous_headings":"","what":"Multi site extraction","title":"Welcome to climateR","text":"Extracting data set points interesting challenge. turns much efficient grab underlying raster stack extract time series opposed iterating locations: Starting set locations Colorado: climateR grab SpatRaster underlying bounding area points Use extract_sites extract times series locations. id parameter unique identifier site data names resulting columns. make data ‘tidy’ simply pivot date column:","code":"f = system.file(\"co/cities_colorado.rds\", package = \"climateR\") cities = readRDS(f) sites_stack = getTerraClim(AOI = cities, varname = \"tmax\", startDate = \"2018-01-01\", endDate = \"2018-12-31\") sites_wide = extract_sites(r = sites_stack, pts = cities, id = \"NAME\") sites_wide[[1]][1:5, 1:5] #> date ADAMSCITY AGATE AGUILAR AKRON #> 1 2018-01-01 9.5 8.2 11.4 7.1 #> 2 2018-02-01 8.1 7.1 9.9 5.8 #> 3 2018-03-01 14.6 14.1 15.0 13.5 #> 4 2018-04-01 17.5 16.6 17.6 16.2 #> 5 2018-05-01 25.1 25.0 25.5 24.8 tmax = tidyr::pivot_longer(sites_wide[[1]], -date) head(tmax) #> # A tibble: 6 × 3 #> date name value #> #> 1 2018-01-01 00:00:00 ADAMSCITY 9.5 #> 2 2018-01-01 00:00:00 AGATE 8.2 #> 3 2018-01-01 00:00:00 AGUILAR 11.4 #> 4 2018-01-01 00:00:00 AKRON 7.10 #> 5 2018-01-01 00:00:00 ALAMOSA 5.2 #> 6 2018-01-01 00:00:00 ALLENSPARK 6.10"},{"path":"/articles/02-catalogs.html","id":"catalogs","dir":"Articles","previous_headings":"","what":"Catalogs","title":"Catalog Automation","text":"order provide evolving, federated collection datasets, climateR makes use preprocessed catalog, updated monthly cycle. catalog hosted generated climateR-catalogs repository. catalog contains 100,000 thousand datasets 2,000 data providers/archives. following section describes design catalog data pipeline.","code":""},{"path":"/articles/02-catalogs.html","id":"design","dir":"Articles","previous_headings":"Catalogs","what":"Design","title":"Catalog Automation","text":"catalog data pipeline uses targets package establish declarative workflow using data sources target creators. 
particular, data sources treated dynamic plugins data pipeline, data sources composable within pipeline framework utilizing R6 classes. data source R6 classes expose simple interface plugin creators, adding new data source defined giving data source three things: id pull function tidy function id represents unique identifier data source contained final catalog. pull function function containing number arguments gather catalog items endpoint, collect data.frame. tidy function function accepts least single argument output pull function. function perform necessary actions conform argument close catalog schema possible. Using data sources built top R6-based framework, pipeline given targets correspond (1) loading R6 class, (2) calling pull function, (3) calling tidy function. three steps mapped across available data sources loaded pipeline environment, joined together create seamless table representing catalog. Finally, schema table handled ensure conforms catalog specification, outputs JSON Parquet released.","code":""},{"path":[]},{"path":"/articles/02-catalogs.html","id":"targets-serialization","dir":"Articles","previous_headings":"Catalogs > Design > Technical Details","what":"Targets Serialization","title":"Catalog Automation","text":"key point highlight targets R package, individual targets serialized specific format completed. Dependent targets also read serialization format back R necessary. default format targets use R RDS format. However, since pipeline already requires Apache Arrow dependency due Parquet output, take advantage Arrow IPC file/stream formats serialization targets. Specifically, pull tidy targets always return data source R6 class, succeeding targets catalog generation return data frame. targets returning R6 classes, custom serializer performs /O R6 class metadata Arrow IPC Stream format implemented. targets returning data frames, use Arrow IPC File format. Arrow IPC formats chosen fashion due smaller memory footprint performance gained zero-copy pass targets. also enables data sources built various programming languages access data needed, due zero-copy property Arrow’s IPC formats.","code":""},{"path":"/articles/02-catalogs.html","id":"pipeline-infrastructure","dir":"Articles","previous_headings":"Catalogs > Design > Technical Details","what":"Pipeline Infrastructure","title":"Catalog Automation","text":"catalog data pipeline built top R targets package, aid generating catalog, utilize GitHub Actions. Despite primarily CI/CD workflows, concept CI/CD can generalized data well. example, data engineering, Apache Airflow predominant application constructing data workflows. two, primary difference GitHub Actions generalized, offers less direct integrations data engineering. context mind, GitHub Actions workflow catalog data pipeline , essence, runner calls targets::tar_make() run pipeline. targets complete, workflow takes outputted catalog files uploads GitHub repository release. 
Furthermore, workflow scheduled run monthly basis, ensuring catalog stays consistently date latest datasets offered data providers described data source plugins.","code":""},{"path":"/articles/02-catalogs.html","id":"release-strategy","dir":"Articles","previous_headings":"Catalogs > Design > Technical Details","what":"Release Strategy","title":"Catalog Automation","text":"monthly Github Actions update, new release catalog provided JSON parquet formats release page.","code":""},{"path":"/articles/03-intro-climatepy.html","id":"useful-packages-for-climate-data","dir":"Articles","previous_headings":"","what":"Useful Packages for climate data","title":"Welcome to climatePy","text":"","code":"# climatePy import climatePy # vector data libs import geopandas as gpd import shapely from shapely.geometry import box # gridded data libs import xarray as xr # geoencoding service import geopy # misc import numpy as np import pandas as pd import random import joblib # plotting libs import matplotlib.pyplot as plt import seaborn as sns"},{"path":"/articles/03-intro-climatepy.html","id":"climatepy-examples","dir":"Articles","previous_headings":"","what":"climatePy examples","title":"Welcome to climatePy","text":"climatePy package supplemented geopy Python package allows easy use interface many geocoding APIs. get climate product, area interest must defined: loading polygon state North Carolina examples constructing AOI calls can found AOI, can construct call dataset parameter(s) date(s) choice. querying PRISM dataset maximum minimum temperature October 29, 2018:","code":"# get AOI polygon from OpenStreetMap API nom = geopy.geocoders.Nominatim(user_agent=\"climatePy\") geolocal = nom.geocode(\"North Carolina\", geometry='wkt') AOI = gpd.GeoDataFrame( {\"geometry\" : [shapely.wkt.loads(geolocal.raw['geotext'])] }, crs = \"EPSG:4326\" ) p = climatePy.getPRISM( AOI = AOI, varname = ['tmax','tmin'], startDate = \"2018-10-29\", timeRes = \"daily\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"data-from-known-bounding-coordinates","dir":"Articles","previous_headings":"","what":"Data from known bounding coordinates","title":"Welcome to climatePy","text":"climatePy offers support shapely geopandas objects. 
requesting wind velocity data four corners region USA bounding coordinates.","code":"from shapely.geometry import box # 4 corners region of USA xmin, xmax, ymin, ymax = -112, -105, 34, 39 # make bounding box AOI = box(xmin, ymin, xmax, ymax) # insert bounding box into geodataframe # AOI = gpd.GeoDataFrame(geometry=[AOI], crs ='EPSG:4326') g = climatePy.getGridMET( AOI = AOI, varname = \"vs\", startDate = \"2018-09-01\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"data-through-time","dir":"Articles","previous_headings":"","what":"Data through time …","title":"Welcome to climatePy","text":"addition multiple variables can request variables time, let’s look gridMET rainfall Gulf Coast Hurricane Harvey:","code":"texas = nom.geocode(\"Texas\", geometry='wkt') florida = nom.geocode(\"Florida\", geometry='wkt') AOI = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(texas.raw['geotext']), shapely.wkt.loads(florida.raw['geotext'])] }, crs = \"EPSG:4326\" ) harvey = climatePy.getGridMET( AOI = AOI, varname = \"pr\", startDate = \"2017-08-20\", endDate = \"2017-08-31\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"climate-projections","dir":"Articles","previous_headings":"","what":"Climate Projections","title":"Welcome to climatePy","text":"sources downscaled Global Climate Models (GCMs). allow query forecasted ensemble members different models /climate scenarios. One example MACA dataset: Getting multiple models results also quite simple: don’t know models, can always grab random set specifying number:","code":"AOI = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(florida.raw['geotext'])]}, crs = \"EPSG:4326\" ) m = climatePy.getMACA( AOI = AOI, model = \"CCSM4\", varname = \"pr\", scenario = [\"rcp45\", \"rcp85\"], startDate = \"2080-06-29\", endDate = \"2080-06-30\", dopar = False ) AOI = gpd.GeoDataFrame({\"geometry\" : [shapely.wkt.loads(nom.geocode(\"Colorado\", geometry='wkt').raw['geotext'])]}, crs = \"EPSG:4326\" ) models = [\"BNU-ESM\",\"CanESM2\", \"CCSM4\"] temp = climatePy.getMACA( AOI = AOI, varname = \"tasmin\", model = models, startDate = \"2080-11-29\", dopar = False ) # calculate average Data Array avg = temp['tasmin'].mean(dim = \"time\") avg = avg.expand_dims(time = xr.DataArray([\"tasmin_Ensemble_mean\"], dims='time')).transpose('x', 'y', 'time') # Concatonate original data arrays with average data array temp['tasmin'] = xr.concat([temp['tasmin'], avg], dim=\"time\") # AOI (Michigan, USA) AOI = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(nom.geocode(\"Michigan, USA\", geometry='wkt').raw['geotext'])] }, crs = \"EPSG:4326\" ) # get 3 random MACA models random_models = climatePy.getMACA( AOI = AOI, model = 3, varname = \"tasmin\", startDate = \"2050-10-29\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"global-datasets","dir":"Articles","previous_headings":"","what":"Global Datasets","title":"Welcome to climatePy","text":"datasets USA focused either. 
TerraClimate offers global, monthly data current year many variables, CHIRPS provides daily rainfall data:","code":"kenya = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(nom.geocode(\"Kenya\", geometry='wkt').raw['geotext'])] }, crs = \"EPSG:4326\" ) # TerraClim PET tc = climatePy.getTerraClim( AOI = kenya, varname = \"pet\", startDate = \"2018-01-01\", dopar = False ) # CHIRPS precip chirps = climatePy.getCHIRPS( AOI = kenya, startDate = \"2018-01-01\", endDate = \"2018-01-01\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"point-based-data","dir":"Articles","previous_headings":"","what":"Point Based Data","title":"Welcome to climatePy","text":"Finally, data gathering limited areal extents can retrieved time series locations.","code":"# Create a DataFrame with 'lng' and 'lat' columns df = pd.DataFrame({'lng': [-105.0668], 'lat': [40.55085]}) pt = (gpd.GeoDataFrame(geometry=gpd.points_from_xy(df['lng'], df['lat']), crs='EPSG:4326')) ts = climatePy.getGridMET( AOI = pt, varname = [\"pr\", 'srad'], startDate = \"2021-01-01\", endDate = \"2021-12-31\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"point-based-ensemble","dir":"Articles","previous_headings":"","what":"Point Based Ensemble","title":"Welcome to climatePy","text":"","code":"# Point Based Ensemble future = climatePy.getMACA( AOI = pt, model = 5, varname = \"tasmax\", startDate = \"2050-01-01\", endDate = \"2050-01-31\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"multi-site-extraction","dir":"Articles","previous_headings":"","what":"Multi Site extraction","title":"Welcome to climatePy","text":"Extracting data set points interesting challenge. turns much efficient grab underlying raster stack extract time series opposed iterating locations: Starting set 50 random points Colorado. climatePy grab DataArray underlying bounding area points Use extract_sites extract times series locations. id parameter unique identifier site data names resulting columns. Providing stack DataArrays extract_sites points_df extract raster values point across time.","code":"# Colorado state polygon AOI = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(nom.geocode(\"Colorado\", geometry='wkt').raw['geotext'])] }, crs = \"EPSG:4326\" ) # create 10 random Lat/lon points within the AOI bounding box points = [shapely.geometry.Point(random.uniform(AOI.bounds.minx[0], AOI.bounds.maxx[0]), random.uniform(AOI.bounds.miny[0], AOI.bounds.maxy[0])) for _ in range(50) ] # make geopandas dataframe from points points_df = gpd.GeoDataFrame(geometry=points, crs = \"EPSG:4326\") # create a unique identifier column points_df[\"uid\"] = [\"uid_\" + str(i) for i in range(len(points_df))] sites_stack = climatePy.getTerraClim( AOI = points_df, varname = \"tmax\", startDate = \"2018-01-01\", endDate = \"2018-12-31\" ) # extract wide sites data sites_wide = climatePy.extract_sites(r = sites_stack[\"tmax\"], pts = points_df, id = \"uid\")"},{"path":"/authors.html","id":null,"dir":"","previous_headings":"","what":"Authors","title":"Authors and Citation","text":"Mike Johnson. Author, maintainer. Justin Singh. Contributor. Angus Watters. Contributor. . Funder. . Funder.","code":""},{"path":"/authors.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Authors and Citation","text":"Johnson M (2023). climateR: climateR. 
R package version 0.3.1.4, https://github.com/mikejohnson51/climateR.","code":"@Manual{, title = {climateR: climateR}, author = {Mike Johnson}, year = {2023}, note = {R package version 0.3.1.4}, url = {https://github.com/mikejohnson51/climateR}, }"},{"path":"/index.html","id":"welcome","dir":"","previous_headings":"","what":"climateR","title":"climateR","text":"climateR simplifies steps needed get climate data R. core provides three main things: catalog 100,000k datasets 2,000 data providers/archives. See (climateR::params) catalog evolving, federated collection datasets can accessed data access utilities. general toolkit accessing remote local gridded data files bounded space, time, variable constraints (dap, dap_crop, read_dap_file) set shortcuts implement methods core set selected catalog elements ⚠️ Python Users: Data catalog access available USGS gdptools package. Directly analogous climateR functionality can found climatePy","code":"nrow(params) #> [1] 107857 length(unique(params$id)) #> [1] 2075 length(unique(params$asset)) #> [1] 4653"},{"path":"/index.html","id":"installation","dir":"","previous_headings":"","what":"Installation","title":"climateR","text":"","code":"remotes::install_github(\"mikejohnson51/AOI\") # suggested! remotes::install_github(\"mikejohnson51/climateR\")"},{"path":"/index.html","id":"basic-usage","dir":"","previous_headings":"","what":"Basic Usage","title":"climateR","text":"Finding rainfall Colorado October 29,1991 - November 6, 1991. source dataset example uses getGridMET shortcut.","code":"library(AOI) library(terra) library(climateR) AOI = aoi_get(state = \"CO\", county = \"all\") system.time({ d = getGridMET(AOI, varname = \"pr\", startDate = \"1991-10-29\", endDate = \"1991-11-06\") }) #> user system elapsed #> 0.245 0.054 0.982"},{"path":"/index.html","id":"basic-animation","dir":"","previous_headings":"","what":"Basic Animation","title":"climateR","text":"","code":"animation(d$precipitation_amount, AOI = AOI, outfile = \"man/figures/rast_gif.gif\")"},{"path":"/index.html","id":"integration-with-zonal","dir":"","previous_headings":"","what":"Integration with zonal","title":"climateR","text":"","code":"library(zonal) system.time({ county = execute_zonal(d, geom = AOI, ID = \"fip_code\") }) #> user system elapsed #> 0.328 0.018 0.366 animation(county, feild_pattern = \"pr_\", outfile = \"man/figures/vect_gif.gif\")"},{"path":"/reference/animation.html","id":null,"dir":"Reference","previous_headings":"","what":"Animate Object as GIF — animation","title":"Animate Object as GIF — animation","text":"Animate SpatRaster object gif.","code":""},{"path":"/reference/animation.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Animate Object as GIF — animation","text":"","code":"animation(data, AOI = NULL, feild_pattern = NULL, outfile, colors = blues9)"},{"path":"/reference/animation.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Animate Object as GIF — animation","text":"data SpatVect sf object AOI optional AOI sf SpatVect object overlay gif feild_pattern optional string vector filter desired attributes outfile path write gif file, must .gif extenstion colors colors plot ","code":""},{"path":"/reference/animation.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Animate Object as GIF — animation","text":"file.path","code":""},{"path":[]},{"path":"/reference/animation_raster.html","id":null,"dir":"Reference","previous_headings":"","what":"Animate 
SpatRast as GIF — animation_raster","title":"Animate SpatRast as GIF — animation_raster","text":"Animate SpatRaster object gif.","code":""},{"path":"/reference/animation_raster.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Animate SpatRast as GIF — animation_raster","text":"","code":"animation_raster(data, AOI = NULL, outfile, colors = blues9)"},{"path":"/reference/animation_raster.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Animate SpatRast as GIF — animation_raster","text":"data single SpatRast object AOI optional AOI sf SpatVect object overlay gif outfile path write gif file, must .gif extenstion colors colors plot ","code":""},{"path":"/reference/animation_raster.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Animate SpatRast as GIF — animation_raster","text":"file.path","code":""},{"path":[]},{"path":"/reference/animation_vector.html","id":null,"dir":"Reference","previous_headings":"","what":"Animate vector as GIF — animation_vector","title":"Animate vector as GIF — animation_vector","text":"Animate sf SpatVect object gif.","code":""},{"path":"/reference/animation_vector.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Animate vector as GIF — animation_vector","text":"","code":"animation_vector(data, feild_pattern = NULL, outfile, colors = blues9)"},{"path":"/reference/animation_vector.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Animate vector as GIF — animation_vector","text":"data SpatVect sf object feild_pattern optional string vector filter desired attributes outfile path write gif file, must .gif extenstion colors colors plot ","code":""},{"path":"/reference/animation_vector.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Animate vector as GIF — animation_vector","text":"file.path","code":""},{"path":[]},{"path":"/reference/catalog.html","id":null,"dir":"Reference","previous_headings":"","what":"ClimateR Catalog — catalog","title":"ClimateR Catalog — catalog","text":"ClimateR Catalog","code":""},{"path":"/reference/catalog.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"ClimateR Catalog — catalog","text":"","code":"catalog"},{"path":"/reference/catalog.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"ClimateR Catalog — catalog","text":"object class tbl_df (inherits tbl, data.frame) 37010 rows 29 columns.","code":""},{"path":[]},{"path":"/reference/checkDodsrc.html","id":null,"dir":"Reference","previous_headings":"","what":"Check dodsrc file — checkDodsrc","title":"Check dodsrc file — checkDodsrc","text":"Check netrc file valid entry urs.earthdata.nasa.gov.","code":""},{"path":"/reference/checkDodsrc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Check dodsrc file — checkDodsrc","text":"","code":"checkDodsrc(dodsrcFile = getDodsrcPath(), netrcFile = getNetrcPath())"},{"path":"/reference/checkDodsrc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Check dodsrc file — checkDodsrc","text":"dodsrcFile File path dodsrc file check. 
netrcFile File path netrc file check.","code":""},{"path":"/reference/checkDodsrc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Check dodsrc file — checkDodsrc","text":"logical","code":""},{"path":[]},{"path":"/reference/checkNetrc.html","id":null,"dir":"Reference","previous_headings":"","what":"Check netrc file — checkNetrc","title":"Check netrc file — checkNetrc","text":"Check netrc file valid entry urs.earthdata.nasa.gov.","code":""},{"path":"/reference/checkNetrc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Check netrc file — checkNetrc","text":"","code":"checkNetrc(netrcFile = getNetrcPath(), machine = \"urs.earthdata.nasa.gov\")"},{"path":"/reference/checkNetrc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Check netrc file — checkNetrc","text":"netrcFile character. File path netrc file check. machine machine logging ","code":""},{"path":"/reference/checkNetrc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Check netrc file — checkNetrc","text":"logical","code":""},{"path":[]},{"path":"/reference/climater_dap.html","id":null,"dir":"Reference","previous_headings":"","what":"ClimateR dry run — climater_dap","title":"ClimateR dry run — climater_dap","text":"ClimateR dry run","code":""},{"path":"/reference/climater_dap.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"ClimateR dry run — climater_dap","text":"","code":"climater_dap(id, args, verbose, dryrun, print.arg = FALSE)"},{"path":"/reference/climater_dap.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"ClimateR dry run — climater_dap","text":"id resource name, agency, catalog identifier args parent function arguments verbose messages emited? dryrun Return summary data prior retrieving print.arg arguments printed? Usefull debugging","code":""},{"path":"/reference/climater_dap.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"ClimateR dry run — climater_dap","text":"data.frame","code":""},{"path":[]},{"path":"/reference/climater_filter.html","id":null,"dir":"Reference","previous_headings":"","what":"ClimateR Catalog Filter — climater_filter","title":"ClimateR Catalog Filter — climater_filter","text":"Filter climateR catalog based set constraints","code":""},{"path":"/reference/climater_filter.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"ClimateR Catalog Filter — climater_filter","text":"","code":"climater_filter( id = NULL, asset = NULL, AOI = NULL, startDate = NULL, endDate = NULL, varname = NULL, model = NULL, scenario = NULL, ensemble = NULL )"},{"path":"/reference/climater_filter.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"ClimateR Catalog Filter — climater_filter","text":"id resource, agency, catalog identifier asset subdataset asset given resource AOI sf SpatVect point polygon extract data startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data varname variable name extract (e.g. 
tmin) model GCM model name generating scenario climate modeling scenario ensemble model ensemble member used generate data","code":""},{"path":"/reference/climater_filter.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"ClimateR Catalog Filter — climater_filter","text":"data.frame","code":""},{"path":[]},{"path":"/reference/dap.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Data (Data Access Protocol) — dap","title":"Get Data (Data Access Protocol) — dap","text":"function provides consistent data access protocol (dap) wide range local remote resources including VRT, TDS, NetCDF Define get data DAP resource","code":""},{"path":"/reference/dap.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Data (Data Access Protocol) — dap","text":"","code":"dap( URL = NULL, catalog = NULL, AOI = NULL, startDate = NULL, endDate = NULL, varname = NULL, grid = NULL, start = NULL, end = NULL, toptobottom = FALSE, verbose = TRUE )"},{"path":"/reference/dap.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Data (Data Access Protocol) — dap","text":"URL local file path URL catalog subset open.dap catalog AOI sf SpatVect point polygon extract data startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data varname variable name extract (e.g. tmin) grid list containing extent (), crs start non \"dated\" items, start can called index end non \"dated\" items, end can called index toptobottom data inverse? verbose dap_summary printed?","code":""},{"path":"/reference/dap.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Data (Data Access Protocol) — dap","text":"data.frame","code":""},{"path":"/reference/dap.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Get Data (Data Access Protocol) — dap","text":"Wraps dap_get dap_crop one. AOI NULL spatial crop executed. startDate endDate NULL, temporal crop executed. just endDate NULL defaults startDate.","code":""},{"path":[]},{"path":"/reference/dap_crop.html","id":null,"dir":"Reference","previous_headings":"","what":"Crop DAP file — dap_crop","title":"Crop DAP file — dap_crop","text":"Crop DAP file","code":""},{"path":"/reference/dap_crop.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Crop DAP file — dap_crop","text":"","code":"dap_crop( URL = NULL, catalog = NULL, AOI = NULL, startDate = NULL, endDate = NULL, start = NULL, end = NULL, varname = NULL, verbose = TRUE )"},{"path":"/reference/dap_crop.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Crop DAP file — dap_crop","text":"URL local file path URL catalog subset open.dap catalog AOI sf SpatVect point polygon extract data startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data start non \"dated\" items, start can called index end non \"dated\" items, end can called index varname variable name extract (e.g. 
tmin) verbose dap_summary printed?","code":""},{"path":"/reference/dap_crop.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Crop DAP file — dap_crop","text":"data.frame","code":""},{"path":[]},{"path":"/reference/dap_get.html","id":null,"dir":"Reference","previous_headings":"","what":"Get DAP resource data — dap_get","title":"Get DAP resource data — dap_get","text":"Get DAP resource data","code":""},{"path":"/reference/dap_get.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get DAP resource data — dap_get","text":"","code":"dap_get(dap, varname = NULL)"},{"path":"/reference/dap_get.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get DAP resource data — dap_get","text":"dap data.frame catalog dap_crop varname name variable extract. NULL, get ","code":""},{"path":"/reference/dap_get.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get DAP resource data — dap_get","text":"SpatRaster","code":""},{"path":[]},{"path":"/reference/dap_meta.html","id":null,"dir":"Reference","previous_headings":"","what":"Find DAP Metadata — dap_meta","title":"Find DAP Metadata — dap_meta","text":"Find DAP Metadata","code":""},{"path":"/reference/dap_meta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Find DAP Metadata — dap_meta","text":"","code":"dap_meta(raw)"},{"path":"/reference/dap_meta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Find DAP Metadata — dap_meta","text":"raw data.frame","code":""},{"path":"/reference/dap_meta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Find DAP Metadata — dap_meta","text":"data.frame","code":""},{"path":[]},{"path":"/reference/dap_summary.html","id":null,"dir":"Reference","previous_headings":"","what":"Print Summary Information About a OpenDAP Resource — dap_summary","title":"Print Summary Information About a OpenDAP Resource — dap_summary","text":"Print summary information DAP summary","code":""},{"path":"/reference/dap_summary.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Print Summary Information About a OpenDAP Resource — dap_summary","text":"","code":"dap_summary(dap = NULL, url = NULL)"},{"path":"/reference/dap_summary.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Print Summary Information About a OpenDAP Resource — dap_summary","text":"dap data.frame catalog dap_crop url Unique Resource Identifier (http local)","code":""},{"path":[]},{"path":"/reference/dap_to_local.html","id":null,"dir":"Reference","previous_headings":"","what":"Convert OpenDAP to start/count call — dap_to_local","title":"Convert OpenDAP to start/count call — dap_to_local","text":"Convert OpenDAP start/count call","code":""},{"path":"/reference/dap_to_local.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Convert OpenDAP to start/count call — dap_to_local","text":"","code":"dap_to_local(dap, get = TRUE)"},{"path":"/reference/dap_to_local.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Convert OpenDAP to start/count call — dap_to_local","text":"dap dap description get data collected?","code":""},{"path":"/reference/dap_to_local.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Convert OpenDAP to start/count call — 
dap_to_local","text":"numeric array","code":""},{"path":[]},{"path":"/reference/dap_xyzv.html","id":null,"dir":"Reference","previous_headings":"","what":"Get XYTV data from DAP URL — dap_xyzv","title":"Get XYTV data from DAP URL — dap_xyzv","text":"Get XYTV data DAP URL","code":""},{"path":"/reference/dap_xyzv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get XYTV data from DAP URL — dap_xyzv","text":"","code":"dap_xyzv(obj, varname = NULL, varmeta = FALSE)"},{"path":"/reference/dap_xyzv.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get XYTV data from DAP URL — dap_xyzv","text":"obj OpenDap URL NetCDF object varname name variable extract. NULL, get varmeta variable metadata appended?","code":""},{"path":"/reference/dap_xyzv.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get XYTV data from DAP URL — dap_xyzv","text":"data.frame (varname, X_name, Y_name, T_name)","code":""},{"path":[]},{"path":"/reference/dot-resource_grid.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract grid metadata from NC Pointer — .resource_grid","title":"Extract grid metadata from NC Pointer — .resource_grid","text":"Extract grid metadata NC Pointer","code":""},{"path":"/reference/dot-resource_grid.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract grid metadata from NC Pointer — .resource_grid","text":"","code":".resource_grid(URL, X_name = NULL, Y_name = NULL, stopIfNotEqualSpaced = TRUE)"},{"path":"/reference/dot-resource_grid.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract grid metadata from NC Pointer — .resource_grid","text":"URL location data process X_name Name X diminsion. NULL found Y_name Name Y diminsion. NULL found stopIfNotEqualSpaced stop equal space grid","code":""},{"path":"/reference/dot-resource_grid.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract grid metadata from NC Pointer — .resource_grid","text":"list","code":""},{"path":[]},{"path":"/reference/dot-resource_time.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract time metadata from NC Pointer — .resource_time","title":"Extract time metadata from NC Pointer — .resource_time","text":"Extract time metadata NC Pointer","code":""},{"path":"/reference/dot-resource_time.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract time metadata from NC Pointer — .resource_time","text":"","code":".resource_time(URL, T_name = NULL)"},{"path":"/reference/dot-resource_time.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract time metadata from NC Pointer — .resource_time","text":"URL location data process T_name Name T dimension. 
NULL found","code":""},{"path":"/reference/dot-resource_time.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract time metadata from NC Pointer — .resource_time","text":"list","code":""},{"path":[]},{"path":"/reference/extract_sites.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract Sites — extract_sites","title":"Extract Sites — extract_sites","text":"extract timeseries values raster stack set points","code":""},{"path":"/reference/extract_sites.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract Sites — extract_sites","text":"","code":"extract_sites(r, pts, id)"},{"path":"/reference/extract_sites.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract Sites — extract_sites","text":"r SpatRaster object pts point extract id unique identifier point (column name pts)","code":""},{"path":"/reference/extract_sites.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract Sites — extract_sites","text":"data.frame columes representing points, rows time periods","code":""},{"path":[]},{"path":"/reference/get3DEP.html","id":null,"dir":"Reference","previous_headings":"","what":"Get USGS 3DEP DEMs — get3DEP","title":"Get USGS 3DEP DEMs — get3DEP","text":"Get USGS 3DEP DEMs","code":""},{"path":"/reference/get3DEP.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get USGS 3DEP DEMs — get3DEP","text":"","code":"get3DEP(AOI, resolution = \"30m\")"},{"path":"/reference/get3DEP.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get USGS 3DEP DEMs — get3DEP","text":"AOI sf SpatVect point polygon extract data resolution DEM resolution (10m 30m (default))","code":""},{"path":"/reference/get3DEP.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get USGS 3DEP DEMs — get3DEP","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getBCCA.html","id":null,"dir":"Reference","previous_headings":"","what":"Get BCCA data — getBCCA","title":"Get BCCA data — getBCCA","text":"Get BCCA data","code":""},{"path":"/reference/getBCCA.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get BCCA data — getBCCA","text":"","code":"getBCCA( AOI, varname, model = \"CCSM4\", scenario = \"rcp45\", ensemble = NULL, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getBCCA.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get BCCA data — getBCCA","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating scenario climate modeling scenario ensemble model ensemble member used generate data startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getBCCA.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get BCCA data — getBCCA","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getCHIRPS.html","id":null,"dir":"Reference","previous_headings":"","what":"Get CHIRPS data — getCHIRPS","title":"Get CHIRPS data — getCHIRPS","text":"Get CHIRPS data","code":""},{"path":"/reference/getCHIRPS.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get CHIRPS data — getCHIRPS","text":"","code":"getCHIRPS( AOI, varname = NULL, timeRes = \"daily\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getCHIRPS.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get CHIRPS data — getCHIRPS","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) timeRes \"Pentad\", \"Annual\", \"Daily\" (default), \"Monthly\" startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getCHIRPS.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get CHIRPS data — getCHIRPS","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getDaymet.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Daymet Climate Data for an Area of Interest — getDaymet","title":"Get Daymet Climate Data for an Area of Interest — getDaymet","text":"dataset provides Daymet Version 4 model output data gridded estimates daily weather parameters North America. Daymet output variables include following parameters: minimum temperature, maximum temperature, precipitation, shortwave radiation, vapor pressure, snow water equivalent, day length. dataset covers period January 1, 1980 December 31 recent full calendar year. subsequent year processed individually close calendar year allowing adequate time input weather station data archive quality. Daymet variables continuous surfaces provided individual files, year, 1-km x 1-km spatial resolution daily temporal resolution. Data Lambert Conformal Conic projection North America netCDF file format compliant Climate Forecast (CF) metadata conventions.","code":""},{"path":"/reference/getDaymet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Daymet Climate Data for an Area of Interest — getDaymet","text":"","code":"getDaymet( AOI, varname = NULL, startDate = NULL, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getDaymet.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Daymet Climate Data for an Area of Interest — getDaymet","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getDaymet.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Daymet Climate Data for an Area of Interest — getDaymet","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getDodsrcPath.html","id":null,"dir":"Reference","previous_headings":"","what":"Get a default dodsrc file path — getDodsrcPath","title":"Get a default dodsrc file path — getDodsrcPath","text":"Get default dodsrc file path","code":""},{"path":"/reference/getDodsrcPath.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get a default dodsrc file path — getDodsrcPath","text":"","code":"getDodsrcPath()"},{"path":"/reference/getDodsrcPath.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get a default dodsrc file path — getDodsrcPath","text":"character vector containing default netrc file path","code":""},{"path":[]},{"path":"/reference/getDodsrcPath.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get a default dodsrc file path — getDodsrcPath","text":"","code":"getDodsrcPath() #> [1] \"/Users/mjohnson/.dodsrc\""},{"path":"/reference/getGLDAS.html","id":null,"dir":"Reference","previous_headings":"","what":"Get GLDAS data — getGLDAS","title":"Get GLDAS data — getGLDAS","text":"Get GLDAS data","code":""},{"path":"/reference/getGLDAS.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get GLDAS data — getGLDAS","text":"","code":"getGLDAS( AOI, varname = NULL, model = NULL, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getGLDAS.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get GLDAS data — getGLDAS","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getGLDAS.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get GLDAS data — getGLDAS","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getGridMET.html","id":null,"dir":"Reference","previous_headings":"","what":"Get GridMet Climate Data for an Area of Interest — getGridMET","title":"Get GridMet Climate Data for an Area of Interest — getGridMET","text":"gridMET dataset daily high-spatial resolution (~4-km, 1/24th degree) surface meteorological data covering contiguous US 1979-yesterday. data updated daily.","code":""},{"path":"/reference/getGridMET.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get GridMet Climate Data for an Area of Interest — getGridMET","text":"","code":"getGridMET( AOI, varname, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getGridMET.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get GridMet Climate Data for an Area of Interest — getGridMET","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. 
tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getGridMET.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get GridMet Climate Data for an Area of Interest — getGridMET","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLCMAP.html","id":null,"dir":"Reference","previous_headings":"","what":"Get USGS LCMAP — getLCMAP","title":"Get USGS LCMAP — getLCMAP","text":"Land Change Monitoring, Assessment, Projection","code":""},{"path":"/reference/getLCMAP.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get USGS LCMAP — getLCMAP","text":"","code":"getLCMAP(AOI, year = 2019, type = \"primary landcover\")"},{"path":"/reference/getLCMAP.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get USGS LCMAP — getLCMAP","text":"AOI sf SpatVect point polygon extract data year Land cover product year 1985 - 2019 (default = 2019) type product type (primary landcover (default), secondary landcover, primary confidence, secondary confidence, cover change, change day, change magniture, model cquality, spectral stability, spectral lastchance)","code":""},{"path":"/reference/getLCMAP.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get USGS LCMAP — getLCMAP","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLOCA.html","id":null,"dir":"Reference","previous_headings":"","what":"Get LOCA Climate Data for an Area of Interest — getLOCA","title":"Get LOCA Climate Data for an Area of Interest — getLOCA","text":"LOCA statistical downscaling technique uses past history add improved fine-scale detail global climate models. LOCA used downscale 32 global climate models CMIP5 archive 1/16th degree spatial resolution, covering North America central Mexico Southern Canada. historical period 1950-2005, two future scenarios available: RCP 4.5 RCP 8.5 period 2006-2100 (although models stop 2099). variables currently available daily minimum maximum temperature, daily precipitation. information visit: http://loca.ucsd.edu/.","code":""},{"path":"/reference/getLOCA.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get LOCA Climate Data for an Area of Interest — getLOCA","text":"","code":"getLOCA( AOI, varname, model = \"CCSM4\", scenario = \"rcp45\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getLOCA.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get LOCA Climate Data for an Area of Interest — getLOCA","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating scenario climate modeling scenario startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getLOCA.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get LOCA Climate Data for an Area of Interest — getLOCA","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLOCA_hydro.html","id":null,"dir":"Reference","previous_headings":"","what":"Get LOCA Hydrology data — getLOCA_hydro","title":"Get LOCA Hydrology data — getLOCA_hydro","text":"Get LOCA Hydrology data","code":""},{"path":"/reference/getLOCA_hydro.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get LOCA Hydrology data — getLOCA_hydro","text":"","code":"getLOCA_hydro( AOI, varname, model = \"CCSM4\", scenario = \"rcp45\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getLOCA_hydro.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get LOCA Hydrology data — getLOCA_hydro","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating scenario climate modeling scenario startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getLOCA_hydro.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get LOCA Hydrology data — getLOCA_hydro","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLivneh.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Livneh data — getLivneh","title":"Get Livneh data — getLivneh","text":"Get Livneh data","code":""},{"path":"/reference/getLivneh.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Livneh data — getLivneh","text":"","code":"getLivneh( AOI, varname = NULL, startDate, endDate = NULL, timeRes = \"daily\", verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getLivneh.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Livneh data — getLivneh","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data timeRes daily monthly verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getLivneh.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Livneh data — getLivneh","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLivneh_fluxes.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Livneh Flux data — getLivneh_fluxes","title":"Get Livneh Flux data — getLivneh_fluxes","text":"Get Livneh Flux data","code":""},{"path":"/reference/getLivneh_fluxes.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Livneh Flux data — getLivneh_fluxes","text":"","code":"getLivneh_fluxes( AOI, varname = NULL, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getLivneh_fluxes.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Livneh Flux data — getLivneh_fluxes","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getLivneh_fluxes.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Livneh Flux data — getLivneh_fluxes","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getMACA.html","id":null,"dir":"Reference","previous_headings":"","what":"Get MACA Climate Data for an Area of Interest — getMACA","title":"Get MACA Climate Data for an Area of Interest — getMACA","text":"Multivariate Adaptive Constructed Analogs (MACA) statistical method downscaling Global Climate Models (GCMs) native coarse resolution higher spatial resolution captures reflects observed patterns daily near-surface meteorology simulated changes GCMs experiments.","code":""},{"path":"/reference/getMACA.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get MACA Climate Data for an Area of Interest — getMACA","text":"","code":"getMACA( AOI, varname, timeRes = \"day\", model = \"CCSM4\", scenario = \"rcp45\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getMACA.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get MACA Climate Data for an Area of Interest — getMACA","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) timeRes daily monthly model GCM model name generating scenario climate modeling scenario startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getMACA.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get MACA Climate Data for an Area of Interest — getMACA","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getMODIS.html","id":null,"dir":"Reference","previous_headings":"","what":"Get MODIS data — getMODIS","title":"Get MODIS data — getMODIS","text":"Get MODIS data","code":""},{"path":"/reference/getMODIS.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get MODIS data — getMODIS","text":"","code":"getMODIS( AOI, asset = NULL, varname = NULL, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getMODIS.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get MODIS data — getMODIS","text":"AOI sf SpatVect point polygon extract data asset MODIS sensor varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getMODIS.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get MODIS data — getMODIS","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getNASADEM.html","id":null,"dir":"Reference","previous_headings":"","what":"Get NASA Global DEM — getNASADEM","title":"Get NASA Global DEM — getNASADEM","text":"Get NASA Global DEM","code":""},{"path":"/reference/getNASADEM.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get NASA Global DEM — getNASADEM","text":"","code":"getNASADEM(AOI)"},{"path":"/reference/getNASADEM.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get NASA Global DEM — getNASADEM","text":"AOI sf SpatVect point polygon extract data ","code":""},{"path":"/reference/getNASADEM.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get NASA Global DEM — getNASADEM","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getNLCD.html","id":null,"dir":"Reference","previous_headings":"","what":"Get USGS National Land Cover Dataset — getNLCD","title":"Get USGS National Land Cover Dataset — getNLCD","text":"Get USGS National Land Cover Dataset","code":""},{"path":"/reference/getNLCD.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get USGS National Land Cover Dataset — getNLCD","text":"","code":"getNLCD(AOI, year = 2019, type = \"land cover\")"},{"path":"/reference/getNLCD.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get USGS National Land Cover Dataset — getNLCD","text":"AOI sf SpatVect point polygon extract data year Landcover product year (2001, 2011,2016,2019) type product type","code":""},{"path":"/reference/getNLCD.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get USGS National Land Cover Dataset — getNLCD","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getNLDAS.html","id":null,"dir":"Reference","previous_headings":"","what":"Get NLDAS data — getNLDAS","title":"Get NLDAS data — 
getNLDAS","text":"Get NLDAS data","code":""},{"path":"/reference/getNLDAS.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get NLDAS data — getNLDAS","text":"","code":"getNLDAS( AOI, varname = NULL, model = NULL, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getNLDAS.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get NLDAS data — getNLDAS","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getNLDAS.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get NLDAS data — getNLDAS","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getNetrcPath.html","id":null,"dir":"Reference","previous_headings":"","what":"Get the default netrc file path — getNetrcPath","title":"Get the default netrc file path — getNetrcPath","text":"Get default netrc file path","code":""},{"path":"/reference/getNetrcPath.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get the default netrc file path — getNetrcPath","text":"","code":"getNetrcPath()"},{"path":"/reference/getNetrcPath.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get the default netrc file path — getNetrcPath","text":"character vector containing default netrc file path","code":""},{"path":[]},{"path":"/reference/getNetrcPath.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get the default netrc file path — getNetrcPath","text":"","code":"getNetrcPath() #> [1] \"/Users/mjohnson/.netrc\""},{"path":"/reference/getPRISM.html","id":null,"dir":"Reference","previous_headings":"","what":"Get PRISM data — getPRISM","title":"Get PRISM data — getPRISM","text":"Get PRISM data","code":""},{"path":"/reference/getPRISM.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get PRISM data — getPRISM","text":"","code":"getPRISM( AOI, varname = NULL, startDate, endDate = NULL, timeRes = \"daily\", verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getPRISM.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get PRISM data — getPRISM","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data timeRes daily monthly verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getPRISM.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get PRISM data — getPRISM","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getTerraClim.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Terra Climate Data for an Area of Interest — getTerraClim","title":"Get Terra Climate Data for an Area of Interest — getTerraClim","text":"Get Terra Climate Data Area Interest","code":""},{"path":"/reference/getTerraClim.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Terra Climate Data for an Area of Interest — getTerraClim","text":"","code":"getTerraClim( AOI, varname = NULL, startDate = NULL, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getTerraClim.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Terra Climate Data for an Area of Interest — getTerraClim","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getTerraClim.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Terra Climate Data for an Area of Interest — getTerraClim","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getTerraClimNormals.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","title":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","text":"layers TerraClimate creating using climatically aided interpolation monthly anomalies CRU Ts4.0 Japanese 55-year Reanalysis (JRA-55) datasets WorldClim v2.0 climatologies.","code":""},{"path":"/reference/getTerraClimNormals.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","text":"","code":"getTerraClimNormals( AOI, varname, scenario = \"19812010\", month = 1:12, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getTerraClimNormals.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) scenario climate modeling scenario month numeric. month vector months access. Default 1:12 verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getTerraClimNormals.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getVIC.html","id":null,"dir":"Reference","previous_headings":"","what":"Get VIC data — getVIC","title":"Get VIC data — getVIC","text":"Get VIC data","code":""},{"path":"/reference/getVIC.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get VIC data — getVIC","text":"","code":"getVIC( AOI, varname, model = \"CCSM4\", scenario = \"rcp45\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getVIC.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get VIC data — getVIC","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating scenario climate modeling scenario startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getVIC.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get VIC data — getVIC","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getWorldClim.html","id":null,"dir":"Reference","previous_headings":"","what":"Get WorlClim gridded weather and climate data for historical (near current) conditions. — getWorldClim","title":"Get WorlClim gridded weather and climate data for historical (near current) conditions. — getWorldClim","text":"WorldClim database high spatial resolution global weather climate data. data can used mapping spatial modeling.","code":""},{"path":"/reference/getWorldClim.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get WorlClim gridded weather and climate data for historical (near current) conditions. — getWorldClim","text":"","code":"getWorldClim( AOI = NULL, varname = NULL, model = \"wc2.1_5m\", month = 1:12, verbose = TRUE )"},{"path":"/reference/getWorldClim.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get WorlClim gridded weather and climate data for historical (near current) conditions. — getWorldClim","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating month numeric. month vector months access. Default 1:12 verbose messages emited?","code":""},{"path":"/reference/getWorldClim.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get WorlClim gridded weather and climate data for historical (near current) conditions. 
— getWorldClim","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/get_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Get DAP Array — get_data","title":"Get DAP Array — get_data","text":"Get DAP Array","code":""},{"path":"/reference/get_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get DAP Array — get_data","text":"","code":"get_data(dap)"},{"path":"/reference/get_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get DAP Array — get_data","text":"dap dap description","code":""},{"path":"/reference/get_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get DAP Array — get_data","text":"SpatRast","code":""},{"path":[]},{"path":"/reference/go_get_dap_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Read formated DAP URL as SpatRast — go_get_dap_data","title":"Read formated DAP URL as SpatRast — go_get_dap_data","text":"Read formated DAP URL SpatRast","code":""},{"path":"/reference/go_get_dap_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Read formated DAP URL as SpatRast — go_get_dap_data","text":"","code":"go_get_dap_data(dap)"},{"path":"/reference/go_get_dap_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Read formated DAP URL as SpatRast — go_get_dap_data","text":"dap output dap_crop","code":""},{"path":"/reference/go_get_dap_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Read formated DAP URL as SpatRast — go_get_dap_data","text":"SpatRast","code":""},{"path":[]},{"path":"/reference/grid_meta.html","id":null,"dir":"Reference","previous_headings":"","what":"Find DAP grid metadata — grid_meta","title":"Find DAP grid metadata — grid_meta","text":"Find DAP grid metadata","code":""},{"path":"/reference/grid_meta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Find DAP grid metadata — grid_meta","text":"","code":"grid_meta(raw)"},{"path":"/reference/grid_meta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Find DAP grid metadata — grid_meta","text":"raw data.frame","code":""},{"path":"/reference/grid_meta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Find DAP grid metadata — grid_meta","text":"data.frame","code":""},{"path":[]},{"path":"/reference/make_ext.html","id":null,"dir":"Reference","previous_headings":"","what":"Convert catalog entry to extent — make_ext","title":"Convert catalog entry to extent — make_ext","text":"Convert catalog entry extent","code":""},{"path":"/reference/make_ext.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Convert catalog entry to extent — make_ext","text":"","code":"make_ext(cat)"},{"path":"/reference/make_ext.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Convert catalog entry to extent — make_ext","text":"cat catalog entry (data.frame Xn, X1, Yn, Y1, crs)","code":""},{"path":"/reference/make_ext.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Convert catalog entry to extent — make_ext","text":"SpatExtent","code":""},{"path":[]},{"path":"/reference/make_vect.html","id":null,"dir":"Reference","previous_headings":"","what":"Make Vector — 
make_vect","title":"Make Vector — make_vect","text":"Make Vector","code":""},{"path":"/reference/make_vect.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Make Vector — make_vect","text":"","code":"make_vect(cat)"},{"path":"/reference/make_vect.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Make Vector — make_vect","text":"cat catalog entry (data.frame Xn, X1, Yn, Y1, crs)","code":""},{"path":"/reference/make_vect.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Make Vector — make_vect","text":"SpatVect","code":""},{"path":[]},{"path":"/reference/merge_across_time.html","id":null,"dir":"Reference","previous_headings":"","what":"Merge List of SpatRaster's across time — merge_across_time","title":"Merge List of SpatRaster's across time — merge_across_time","text":"Given list SpatRasters possibly shared names, merge across time","code":""},{"path":"/reference/merge_across_time.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Merge List of SpatRaster's across time — merge_across_time","text":"","code":"merge_across_time(data)"},{"path":"/reference/merge_across_time.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Merge List of SpatRaster's across time — merge_across_time","text":"data list names SpatRasters","code":""},{"path":"/reference/merge_across_time.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Merge List of SpatRaster's across time — merge_across_time","text":"data.frame (varname, X_name, Y_name, T_name)","code":""},{"path":[]},{"path":"/reference/params.html","id":null,"dir":"Reference","previous_headings":"","what":"ClimateR Catalog — params","title":"ClimateR Catalog — params","text":"ClimateR Catalog","code":""},{"path":"/reference/params.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"ClimateR Catalog — params","text":"","code":"params"},{"path":"/reference/params.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"ClimateR Catalog — params","text":"object class data.table (inherits data.frame) 107857 rows 28 columns.","code":""},{"path":[]},{"path":"/reference/parse_date.html","id":null,"dir":"Reference","previous_headings":"","what":"Parse Dates from duration and interval — parse_date","title":"Parse Dates from duration and interval — parse_date","text":"Parse Dates duration interval","code":""},{"path":"/reference/parse_date.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Parse Dates from duration and interval — parse_date","text":"","code":"parse_date(duration, interval)"},{"path":"/reference/parse_date.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Parse Dates from duration and interval — parse_date","text":"duration time duration interval time interval","code":""},{"path":[]},{"path":"/reference/read_dap_file.html","id":null,"dir":"Reference","previous_headings":"","what":"Read from a OpenDAP landing page — read_dap_file","title":"Read from a OpenDAP landing page — read_dap_file","text":"Reads OpenDap resources returns metadata","code":""},{"path":"/reference/read_dap_file.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Read from a OpenDAP landing page — read_dap_file","text":"","code":"read_dap_file(URL, varname = NULL, id, varmeta = 
TRUE)"},{"path":"/reference/read_dap_file.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Read from a OpenDAP landing page — read_dap_file","text":"URL URL OpenDap resource varname name variable extract. NULL, get id character. Uniquely named dataset identifier varmeta variable metadata appended?","code":""},{"path":"/reference/read_dap_file.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Read from a OpenDAP landing page — read_dap_file","text":"data.frame","code":""},{"path":[]},{"path":"/reference/read_ftp.html","id":null,"dir":"Reference","previous_headings":"","what":"Read from FTP — read_ftp","title":"Read from FTP — read_ftp","text":"Read FTP","code":""},{"path":"/reference/read_ftp.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Read from FTP — read_ftp","text":"","code":"read_ftp(URL, cat, lyrs = 1, AOI, ext = NULL, crs = NULL, dates = NULL)"},{"path":"/reference/read_ftp.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Read from FTP — read_ftp","text":"URL Unique Resource Identifier (http local) cat catalog element lyrs lyrs extract AOI Area Interest ext extent source (needed) crs crs source (needed) dates dates data","code":""},{"path":"/reference/read_ftp.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Read from FTP — read_ftp","text":"SpatRaster","code":""},{"path":[]},{"path":"/reference/read_live_catalog.html","id":null,"dir":"Reference","previous_headings":"","what":"Read Live Catalog from Github release — read_live_catalog","title":"Read Live Catalog from Github release — read_live_catalog","text":"Every month, data catalog refreshed. function reads current catalog Github release.","code":""},{"path":"/reference/read_live_catalog.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Read Live Catalog from Github release — read_live_catalog","text":"","code":"read_live_catalog( url = \"https://github.com/mikejohnson51/climateR-catalogs/releases/latest/download/catalog.parquet\" )"},{"path":"/reference/read_live_catalog.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Read Live Catalog from Github release — read_live_catalog","text":"url URL read","code":""},{"path":"/reference/read_live_catalog.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Read Live Catalog from Github release — read_live_catalog","text":"data.frame","code":""},{"path":[]},{"path":"/reference/time_meta.html","id":null,"dir":"Reference","previous_headings":"","what":"Find DAP time metadata — time_meta","title":"Find DAP time metadata — time_meta","text":"Find DAP time metadata","code":""},{"path":"/reference/time_meta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Find DAP time metadata — time_meta","text":"","code":"time_meta(raw)"},{"path":"/reference/time_meta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Find DAP time metadata — time_meta","text":"raw data.frame","code":""},{"path":"/reference/time_meta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Find DAP time metadata — time_meta","text":"data.frame","code":""},{"path":[]},{"path":"/reference/try_att.html","id":null,"dir":"Reference","previous_headings":"","what":"TryCatch around RNetCDF::att.get.nc() — 
try_att","title":"TryCatch around RNetCDF::att.get.nc() — try_att","text":"TryCatch around RNetCDF::att.get.nc()","code":""},{"path":"/reference/try_att.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"TryCatch around RNetCDF::att.get.nc() — try_att","text":"","code":"try_att(nc, variable, attribute)"},{"path":"/reference/try_att.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"TryCatch around RNetCDF::att.get.nc() — try_att","text":"nc \"NetCDF\" object points NetCDF dataset. Found RNetCDF::open.nc. variable ID name variable attribute read, \"NC_GLOBAL\" global attribute. attribute Attribute name ID.","code":""},{"path":"/reference/try_att.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"TryCatch around RNetCDF::att.get.nc() — try_att","text":"Vector data type depends NetCDF variable. NetCDF variables type NC_CHAR, R type either character raw, specified argument rawchar. NC_STRING, R type character. Numeric variables read double precision default, smallest R type exactly represents external type used fitnum TRUE.","code":""},{"path":[]},{"path":"/reference/var_to_terra.html","id":null,"dir":"Reference","previous_headings":"","what":"Variable Array to SpatRast — var_to_terra","title":"Variable Array to SpatRast — var_to_terra","text":"Variable Array SpatRast","code":""},{"path":"/reference/var_to_terra.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Variable Array to SpatRast — var_to_terra","text":"","code":"var_to_terra(var, dap)"},{"path":"/reference/var_to_terra.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Variable Array to SpatRast — var_to_terra","text":"var numeric array dap dap description","code":""},{"path":"/reference/var_to_terra.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Variable Array to SpatRast — var_to_terra","text":"SpatRast","code":""},{"path":[]},{"path":"/reference/variable_meta.html","id":null,"dir":"Reference","previous_headings":"","what":"Find DAP variable metadata — variable_meta","title":"Find DAP variable metadata — variable_meta","text":"Find DAP variable metadata","code":""},{"path":"/reference/variable_meta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Find DAP variable metadata — variable_meta","text":"","code":"variable_meta(raw, verbose = TRUE)"},{"path":"/reference/variable_meta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Find DAP variable metadata — variable_meta","text":"raw data.frame verbose emit messages","code":""},{"path":"/reference/variable_meta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Find DAP variable metadata — variable_meta","text":"data.frame","code":""},{"path":[]},{"path":"/reference/vrt_crop_get.html","id":null,"dir":"Reference","previous_headings":"","what":"VRT Crop — vrt_crop_get","title":"VRT Crop — vrt_crop_get","text":"VRT Crop","code":""},{"path":"/reference/vrt_crop_get.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"VRT Crop — vrt_crop_get","text":"","code":"vrt_crop_get( URL = NULL, catalog = NULL, AOI = NULL, grid = NULL, varname = NULL, start = NULL, end = NULL, toptobottom = FALSE, verbose = TRUE 
)"},{"path":"/reference/vrt_crop_get.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"VRT Crop — vrt_crop_get","text":"URL local file path URL catalog subset open.dap catalog AOI sf SpatVect point polygon extract data grid list containing extent (), crs varname variable name extract (e.g. tmin) start non \"dated\" items, start can called index end non \"dated\" items, end can called index toptobottom data inverse? verbose dap_summary printed?","code":""},{"path":"/reference/vrt_crop_get.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"VRT Crop — vrt_crop_get","text":"SpatRaster","code":""},{"path":[]},{"path":"/reference/writeDodsrc.html","id":null,"dir":"Reference","previous_headings":"","what":"Write dodsrc file — writeDodsrc","title":"Write dodsrc file — writeDodsrc","text":"Write dodsrc file valid netrc file","code":""},{"path":"/reference/writeDodsrc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Write dodsrc file — writeDodsrc","text":"","code":"writeDodsrc(netrcFile = getNetrcPath(), dodsrcFile = \".dodsrc\")"},{"path":"/reference/writeDodsrc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Write dodsrc file — writeDodsrc","text":"netrcFile character. path netrc file . dodsrcFile path dodsrc file want write default go home directory, advised","code":""},{"path":"/reference/writeDodsrc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Write dodsrc file — writeDodsrc","text":"character vector containing netrc file path","code":""},{"path":[]},{"path":"/reference/writeNetrc.html","id":null,"dir":"Reference","previous_headings":"","what":"Write netrc file — writeNetrc","title":"Write netrc file — writeNetrc","text":"Write netrc file valid accessing urs.earthdata.nasa.gov","code":""},{"path":"/reference/writeNetrc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Write netrc file — writeNetrc","text":"","code":"writeNetrc( login, password, machine = \"urs.earthdata.nasa.gov\", netrcFile = getNetrcPath(), overwrite = FALSE )"},{"path":"/reference/writeNetrc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Write netrc file — writeNetrc","text":"login character. Email address used logging earthdata password character. Password associated login. machine machine logging netrcFile character. path netrc file written. default go home directory, advised overwrite logical. overwrite existing netrc file?","code":""},{"path":"/reference/writeNetrc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Write netrc file — writeNetrc","text":"character vector containing netrc file path","code":""},{"path":"/reference/writeNetrc.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Write netrc file — writeNetrc","text":"database accessed user's credentials. netrc file storing login password information required. See . 
set must following (1) Login EarthData (2) Go Applications > Authorized Apps (3) NASA GESDISC DATA ARCHIVE Approved Applications list, select APPROVE APPLICATIONS (4) Find NASA GESDISC DATA ARCHIVE click AUTHORIZE instruction register set DataSpace credential.","code":""},{"path":[]},{"path":"/reference/writeNetrc.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Write netrc file — writeNetrc","text":"","code":"if (FALSE) { writeNetrc( login = \"XXX@email.com\", password = \"yourSecretPassword\" ) }"}]
+[{"path":[]},{"path":"/articles/01-intro.html","id":"usful-packages-for-climate-data","dir":"Articles","previous_headings":"","what":"Usful Packages for climate data","title":"Welcome to climateR","text":"","code":"library(AOI) library(climateR) library(tidyterra) library(ggplot2) library(terra) library(tidyr) library(sf)"},{"path":"/articles/01-intro.html","id":"climater-examples","dir":"Articles","previous_headings":"","what":"climateR examples","title":"Welcome to climateR","text":"climateR package supplemented AOI framework established AOI R package. get climate product, area interest must defined: loading polygon state North Carolina examples constructing AOI calls can found . AOI, can construct call dataset parameter(s) date(s) choice. querying PRISM dataset maximum minimum temperature October 29, 2018:","code":"AOI = aoi_get(state = \"NC\") plot(AOI$geometry) system.time({ p = getPRISM(AOI, varname = c('tmax','tmin'), startDate = \"2018-10-29\") }) #> user system elapsed #> 0.442 0.058 2.508"},{"path":"/articles/01-intro.html","id":"data-from-known-bounding-coordinates","dir":"Articles","previous_headings":"","what":"Data from known bounding coordinates","title":"Welcome to climateR","text":"climateR offers support sf, sfc, bbox objects. requesting wind velocity data four corners region USA bounding coordinates.","code":"AOI = st_as_sfc(st_bbox(c(xmin = -112, xmax = -105, ymax = 39, ymin = 34), crs = 4326)) g = getGridMET(st_as_sf(AOI), varname = \"vs\", startDate = \"2018-09-01\")"},{"path":"/articles/01-intro.html","id":"data-through-time","dir":"Articles","previous_headings":"","what":"Data through time …","title":"Welcome to climateR","text":"addition multiple variables can request variables time, let’s look gridMET rainfall Gulf Coast Hurricane Harvey:","code":"harvey = getGridMET(aoi_get(state = c(\"TX\", \"FL\")), varname = \"pr\", startDate = \"2017-08-20\", endDate = \"2017-08-31\") ggplot() + geom_spatraster(data = harvey$precipitation_amount) + facet_wrap(~lyr) + scale_fill_whitebox_c( palette = \"muted\", na.value = \"white\") + theme_minimal()"},{"path":"/articles/01-intro.html","id":"climate-projections","dir":"Articles","previous_headings":"","what":"Climate Projections","title":"Welcome to climateR","text":"sources downscaled Global Climate Models (GCMs). allow query forecasted ensemble members different models /climate scenarios. One example MACA dataset: Getting multiple models results also quite simple: don’t know models, can always grab random set specifying number:","code":"system.time({ m = getMACA(AOI = aoi_get(state = \"FL\"), model = \"CCSM4\", varname = 'pr', scenario = c('rcp45', 'rcp85'), startDate = \"2080-06-29\", endDate = \"2080-06-30\") }) #> user system elapsed #> 0.205 0.036 2.479 models = c(\"BNU-ESM\",\"CanESM2\", \"CCSM4\") temp = getMACA(AOI = aoi_get(state = \"CO\"), varname = 'tasmin', model = models, startDate = \"2080-11-29\") temp[[1]]$mean = app(temp[[1]], mean) names(temp[[1]]) = c(models, \"Ensemble Mean\") random = getMACA(aoi_get(state = \"MI\"), model = 3, varname = \"pr\", startDate = \"2050-10-29\")"},{"path":"/articles/01-intro.html","id":"global-datasets","dir":"Articles","previous_headings":"","what":"Global Datasets","title":"Welcome to climateR","text":"datasets USA focused either. 
TerraClimate offers global, monthly data current year many variables, CHIRPS provides daily rainfall data:","code":"kenya = aoi_get(country = \"Kenya\") tc = getTerraClim(kenya, varname = \"pet\", startDate = \"2018-01-01\") chirps = getCHIRPS(kenya, startDate = \"2018-01-01\", endDate = \"2018-01-04\" )"},{"path":"/articles/01-intro.html","id":"point-based-data","dir":"Articles","previous_headings":"","what":"Point Based Data","title":"Welcome to climateR","text":"Finally, data gathering limited areal extents can retrieved time series locations.","code":"ts = data.frame(lng = -105.0668, lat = 40.55085) %>% st_as_sf(coords = c('lng', 'lat'), crs = 4326) %>% getGridMET(varname = c(\"pr\", 'srad'), startDate = \"2021-01-01\", endDate = \"2021-12-31\")"},{"path":"/articles/01-intro.html","id":"point-based-ensemble","dir":"Articles","previous_headings":"","what":"Point Based Ensemble","title":"Welcome to climateR","text":"","code":"future = getMACA(geocode(\"Fort Collins\", pt = TRUE), model = 5, varname = \"tasmax\", startDate = \"2050-01-01\", endDate = \"2050-01-31\") future_long = pivot_longer(future, -date) ggplot(data = future_long, aes(x = date, y = value, col = name)) + geom_line() + theme_linedraw() + scale_color_brewer(palette = \"Dark2\") + labs(title = \"Fort Collins Temperture: January, 2050\", x = \"Date\", y = \"Degree K\", color = \"Model\")"},{"path":"/articles/01-intro.html","id":"multi-site-extraction","dir":"Articles","previous_headings":"","what":"Multi site extraction","title":"Welcome to climateR","text":"Extracting data set points interesting challenge. turns much efficient grab underlying raster stack extract time series opposed iterating locations: Starting set locations Colorado: climateR grab SpatRaster underlying bounding area points Use extract_sites extract times series locations. id parameter unique identifier site data names resulting columns. make data ‘tidy’ simply pivot date column:","code":"f = system.file(\"co/cities_colorado.rds\", package = \"climateR\") cities = readRDS(f) sites_stack = getTerraClim(AOI = cities, varname = \"tmax\", startDate = \"2018-01-01\", endDate = \"2018-12-31\") sites_wide = extract_sites(r = sites_stack, pts = cities, id = \"NAME\") sites_wide[[1]][1:5, 1:5] #> date ADAMSCITY AGATE AGUILAR AKRON #> 1 2018-01-01 9.5 8.2 11.4 7.1 #> 2 2018-02-01 8.1 7.1 9.9 5.8 #> 3 2018-03-01 14.6 14.1 15.0 13.5 #> 4 2018-04-01 17.5 16.6 17.6 16.2 #> 5 2018-05-01 25.1 25.0 25.5 24.8 tmax = tidyr::pivot_longer(sites_wide[[1]], -date) head(tmax) #> # A tibble: 6 × 3 #> date name value #> #> 1 2018-01-01 00:00:00 ADAMSCITY 9.5 #> 2 2018-01-01 00:00:00 AGATE 8.2 #> 3 2018-01-01 00:00:00 AGUILAR 11.4 #> 4 2018-01-01 00:00:00 AKRON 7.10 #> 5 2018-01-01 00:00:00 ALAMOSA 5.2 #> 6 2018-01-01 00:00:00 ALLENSPARK 6.10"},{"path":"/articles/02-catalogs.html","id":"catalogs","dir":"Articles","previous_headings":"","what":"Catalogs","title":"Catalog Automation","text":"order provide evolving, federated collection datasets, climateR makes use preprocessed catalog, updated monthly cycle. catalog hosted generated climateR-catalogs repository. catalog contains 100,000 thousand datasets 2,000 data providers/archives. following section describes design catalog data pipeline.","code":""},{"path":"/articles/02-catalogs.html","id":"design","dir":"Articles","previous_headings":"Catalogs","what":"Design","title":"Catalog Automation","text":"catalog data pipeline uses targets package establish declarative workflow using data sources target creators. 
particular, data sources treated dynamic plugins data pipeline, data sources composable within pipeline framework utilizing R6 classes. data source R6 classes expose simple interface plugin creators, adding new data source defined giving data source three things: id pull function tidy function id represents unique identifier data source contained final catalog. pull function function containing number arguments gather catalog items endpoint, collect data.frame. tidy function function accepts least single argument output pull function. function perform necessary actions conform argument close catalog schema possible. Using data sources built top R6-based framework, pipeline given targets correspond (1) loading R6 class, (2) calling pull function, (3) calling tidy function. three steps mapped across available data sources loaded pipeline environment, joined together create seamless table representing catalog. Finally, schema table handled ensure conforms catalog specification, outputs JSON Parquet released.","code":""},{"path":[]},{"path":"/articles/02-catalogs.html","id":"targets-serialization","dir":"Articles","previous_headings":"Catalogs > Design > Technical Details","what":"Targets Serialization","title":"Catalog Automation","text":"key point highlight targets R package, individual targets serialized specific format completed. Dependent targets also read serialization format back R necessary. default format targets use R RDS format. However, since pipeline already requires Apache Arrow dependency due Parquet output, take advantage Arrow IPC file/stream formats serialization targets. Specifically, pull tidy targets always return data source R6 class, succeeding targets catalog generation return data frame. targets returning R6 classes, custom serializer performs /O R6 class metadata Arrow IPC Stream format implemented. targets returning data frames, use Arrow IPC File format. Arrow IPC formats chosen fashion due smaller memory footprint performance gained zero-copy pass targets. also enables data sources built various programming languages access data needed, due zero-copy property Arrow’s IPC formats.","code":""},{"path":"/articles/02-catalogs.html","id":"pipeline-infrastructure","dir":"Articles","previous_headings":"Catalogs > Design > Technical Details","what":"Pipeline Infrastructure","title":"Catalog Automation","text":"catalog data pipeline built top R targets package, aid generating catalog, utilize GitHub Actions. Despite primarily CI/CD workflows, concept CI/CD can generalized data well. example, data engineering, Apache Airflow predominant application constructing data workflows. two, primary difference GitHub Actions generalized, offers less direct integrations data engineering. context mind, GitHub Actions workflow catalog data pipeline , essence, runner calls targets::tar_make() run pipeline. targets complete, workflow takes outputted catalog files uploads GitHub repository release. 
Furthermore, workflow scheduled run monthly basis, ensuring catalog stays consistently date latest datasets offered data providers described data source plugins.","code":""},{"path":"/articles/02-catalogs.html","id":"release-strategy","dir":"Articles","previous_headings":"Catalogs > Design > Technical Details","what":"Release Strategy","title":"Catalog Automation","text":"monthly Github Actions update, new release catalog provided JSON parquet formats release page.","code":""},{"path":"/articles/03-intro-climatepy.html","id":"useful-packages-for-climate-data","dir":"Articles","previous_headings":"","what":"Useful Packages for climate data","title":"Welcome to climatePy","text":"","code":"# climatePy import climatePy # vector data libs import geopandas as gpd import shapely from shapely.geometry import box # gridded data libs import xarray as xr # geoencoding service import geopy # misc import numpy as np import pandas as pd import random import joblib # plotting libs import matplotlib.pyplot as plt import seaborn as sns"},{"path":"/articles/03-intro-climatepy.html","id":"climatepy-examples","dir":"Articles","previous_headings":"","what":"climatePy examples","title":"Welcome to climatePy","text":"climatePy package supplemented geopy Python package allows easy use interface many geocoding APIs. get climate product, area interest must defined: loading polygon state North Carolina examples constructing AOI calls can found AOI, can construct call dataset parameter(s) date(s) choice. querying PRISM dataset maximum minimum temperature October 29, 2018:","code":"# get AOI polygon from OpenStreetMap API nom = geopy.geocoders.Nominatim(user_agent=\"climatePy\") geolocal = nom.geocode(\"North Carolina\", geometry='wkt') AOI = gpd.GeoDataFrame( {\"geometry\" : [shapely.wkt.loads(geolocal.raw['geotext'])] }, crs = \"EPSG:4326\" ) p = climatePy.getPRISM( AOI = AOI, varname = ['tmax','tmin'], startDate = \"2018-10-29\", timeRes = \"daily\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"data-from-known-bounding-coordinates","dir":"Articles","previous_headings":"","what":"Data from known bounding coordinates","title":"Welcome to climatePy","text":"climatePy offers support shapely geopandas objects. 
requesting wind velocity data four corners region USA bounding coordinates.","code":"from shapely.geometry import box # 4 corners region of USA xmin, xmax, ymin, ymax = -112, -105, 34, 39 # make bounding box AOI = box(xmin, ymin, xmax, ymax) # insert bounding box into geodataframe # AOI = gpd.GeoDataFrame(geometry=[AOI], crs ='EPSG:4326') g = climatePy.getGridMET( AOI = AOI, varname = \"vs\", startDate = \"2018-09-01\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"data-through-time","dir":"Articles","previous_headings":"","what":"Data through time …","title":"Welcome to climatePy","text":"addition multiple variables can request variables time, let’s look gridMET rainfall Gulf Coast Hurricane Harvey:","code":"texas = nom.geocode(\"Texas\", geometry='wkt') florida = nom.geocode(\"Florida\", geometry='wkt') AOI = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(texas.raw['geotext']), shapely.wkt.loads(florida.raw['geotext'])] }, crs = \"EPSG:4326\" ) harvey = climatePy.getGridMET( AOI = AOI, varname = \"pr\", startDate = \"2017-08-20\", endDate = \"2017-08-31\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"climate-projections","dir":"Articles","previous_headings":"","what":"Climate Projections","title":"Welcome to climatePy","text":"sources downscaled Global Climate Models (GCMs). allow query forecasted ensemble members different models /climate scenarios. One example MACA dataset: Getting multiple models results also quite simple: don’t know models, can always grab random set specifying number:","code":"AOI = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(florida.raw['geotext'])]}, crs = \"EPSG:4326\" ) m = climatePy.getMACA( AOI = AOI, model = \"CCSM4\", varname = \"pr\", scenario = [\"rcp45\", \"rcp85\"], startDate = \"2080-06-29\", endDate = \"2080-06-30\", dopar = False ) AOI = gpd.GeoDataFrame({\"geometry\" : [shapely.wkt.loads(nom.geocode(\"Colorado\", geometry='wkt').raw['geotext'])]}, crs = \"EPSG:4326\" ) models = [\"BNU-ESM\",\"CanESM2\", \"CCSM4\"] temp = climatePy.getMACA( AOI = AOI, varname = \"tasmin\", model = models, startDate = \"2080-11-29\", dopar = False ) # calculate average Data Array avg = temp['tasmin'].mean(dim = \"time\") avg = avg.expand_dims(time = xr.DataArray([\"tasmin_Ensemble_mean\"], dims='time')).transpose('x', 'y', 'time') # Concatonate original data arrays with average data array temp['tasmin'] = xr.concat([temp['tasmin'], avg], dim=\"time\") # AOI (Michigan, USA) AOI = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(nom.geocode(\"Michigan, USA\", geometry='wkt').raw['geotext'])] }, crs = \"EPSG:4326\" ) # get 3 random MACA models random_models = climatePy.getMACA( AOI = AOI, model = 3, varname = \"tasmin\", startDate = \"2050-10-29\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"global-datasets","dir":"Articles","previous_headings":"","what":"Global Datasets","title":"Welcome to climatePy","text":"datasets USA focused either. 
TerraClimate offers global, monthly data current year many variables, CHIRPS provides daily rainfall data:","code":"kenya = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(nom.geocode(\"Kenya\", geometry='wkt').raw['geotext'])] }, crs = \"EPSG:4326\" ) # TerraClim PET tc = climatePy.getTerraClim( AOI = kenya, varname = \"pet\", startDate = \"2018-01-01\", dopar = False ) # CHIRPS precip chirps = climatePy.getCHIRPS( AOI = kenya, startDate = \"2018-01-01\", endDate = \"2018-01-01\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"point-based-data","dir":"Articles","previous_headings":"","what":"Point Based Data","title":"Welcome to climatePy","text":"Finally, data gathering limited areal extents can retrieved time series locations.","code":"# Create a DataFrame with 'lng' and 'lat' columns df = pd.DataFrame({'lng': [-105.0668], 'lat': [40.55085]}) pt = (gpd.GeoDataFrame(geometry=gpd.points_from_xy(df['lng'], df['lat']), crs='EPSG:4326')) ts = climatePy.getGridMET( AOI = pt, varname = [\"pr\", 'srad'], startDate = \"2021-01-01\", endDate = \"2021-12-31\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"point-based-ensemble","dir":"Articles","previous_headings":"","what":"Point Based Ensemble","title":"Welcome to climatePy","text":"","code":"# Point Based Ensemble future = climatePy.getMACA( AOI = pt, model = 5, varname = \"tasmax\", startDate = \"2050-01-01\", endDate = \"2050-01-31\", dopar = False )"},{"path":"/articles/03-intro-climatepy.html","id":"multi-site-extraction","dir":"Articles","previous_headings":"","what":"Multi Site extraction","title":"Welcome to climatePy","text":"Extracting data set points interesting challenge. turns much efficient grab underlying raster stack extract time series opposed iterating locations: Starting set 50 random points Colorado. climatePy grab DataArray underlying bounding area points Use extract_sites extract times series locations. id parameter unique identifier site data names resulting columns. Providing stack DataArrays extract_sites points_df extract raster values point across time.","code":"# Colorado state polygon AOI = gpd.GeoDataFrame({ \"geometry\" : [shapely.wkt.loads(nom.geocode(\"Colorado\", geometry='wkt').raw['geotext'])] }, crs = \"EPSG:4326\" ) # create 10 random Lat/lon points within the AOI bounding box points = [shapely.geometry.Point(random.uniform(AOI.bounds.minx[0], AOI.bounds.maxx[0]), random.uniform(AOI.bounds.miny[0], AOI.bounds.maxy[0])) for _ in range(50) ] # make geopandas dataframe from points points_df = gpd.GeoDataFrame(geometry=points, crs = \"EPSG:4326\") # create a unique identifier column points_df[\"uid\"] = [\"uid_\" + str(i) for i in range(len(points_df))] sites_stack = climatePy.getTerraClim( AOI = points_df, varname = \"tmax\", startDate = \"2018-01-01\", endDate = \"2018-12-31\" ) # extract wide sites data sites_wide = climatePy.extract_sites(r = sites_stack[\"tmax\"], pts = points_df, id = \"uid\")"},{"path":"/articles/04-stream-morph.html","id":"examples","dir":"Articles","previous_headings":"","what":"Examples","title":"Continental Stream Morphology Research empowered by open data","text":"can agree access tools perform spatial operations revolutionized field hydrologic sciences offering powerful platforms access satellite imagery, reanalysis products, diverse datasets crucial spatial analysis hydrologic modeling. 
tools facilitate retrieval processing vast amounts geospatial data, allowing researchers practitioners perform comprehensive analyses various spatial temporal scales, turn greatly benefits field hydrology. team Lynker developed climateR climatePy. key advantages using platforms like climateR accessibility wealth satellite imagery spanning multiple decades. archives satellite data readily available, hydrologists can track changes land cover, monitor hydrologic phenomena, assess impacts climate change water resources. ability access analyze historical data allows identification long-term trends, facilitating better understanding prediction hydrologic processes. Furthermore, climateR foster collaboration knowledge sharing within hydrologic community. provide platform scientists researchers across globe access standardized datasets, share methodologies, collaborate solving complex hydrologic challenges. Also, puts forth easy accessible way perform large spatiotemporal operations support NOAA effort. collaborative environment encourages development innovative models techniques water resource management decision-making. demonstrate several examples access databases using climateR perform massive spatial temporal aggregations.","code":""},{"path":"/articles/04-stream-morph.html","id":"massive-spatial-aggregation-of-terraclimate","dir":"Articles","previous_headings":"Examples","what":"Massive Spatial Aggregation of TerraClimate","title":"Continental Stream Morphology Research empowered by open data","text":"integration reanalysis products various datasets platform enables users perform sophisticated spatial operations analyses. Hydrologists can aggregate data specific points polygons, allowing extraction critical information regarding water resources, precipitation patterns, evapotranspiration rates, soil moisture content. facilitates characterization watersheds, assessment water availability, prediction potential flood drought events. want extract long term historical mean value TerraClimate bands NOAA Next Generation (NextGen) National Hydrologic Geospatial Fabric (hydrofabric) divides entire CONUS. doubt surmised, expensive task go monthly TerraClimate dataset past 20 years average but climateR easy straightforward task. One can access hydrofabric case NextGen hydrofabric Lynker-spatial s3 account: Now can extract divides layer given VPU extract data TerraClimate: just calculated 20 year average 9 different variables 882,945 divides cover CONUS hour (2472.777 seconds = .68 hours) normal laptop!! 
impressive.","code":"library(hydrofabric) library(lubridate) # Then specify the S3 bucket and file path bucket_name <- \"lynker-spatial\" file_key <- \"v20/gpkg/nextgen_12.gpkg\" # Now download the GeoPackage file from S3 to a temporary file temp_file <- tempfile(fileext = \".gpkg\") s3read_using(file = temp_file, FUN = get_object, object = file_key, bucket = bucket_name) # Finally read the GeoPackage file into an sf object gpkg_sf <- st_read(temp_file) # List of VPU's for CONUS vpu_list = vpu_boundaries$VPUID[1:21] # Variables of Interest vars <- c(\"PDSI\",\"aet\",\"soil\",\"def\",\"ppt\",\"q\",\"tmin\",\"tmax\",\"pet\") # Loop through the VPU's and extract data and time the execution system.time({ for (vpu in vpu_list) { # Read the file file_key <- paste0(\"v20/gpkg/nextgen_\", vpu, \".gpkg\") # Download the GeoPackage file from S3 to a temporary file temp_file <- tempfile(fileext = \".gpkg\") s3read_using(file = temp_file, FUN = get_object, object = file_key, bucket = bucket_name) # Just read the divides divides = read_sf(temp_file, \"divides\") # Use climateR to extract the variables between 2000-2021 out_raster <- getTerraClim(AOI = divides, varname = vars, startDate = \"2000-01-01\", endDate = \"2021-01-01\") # Use rast() to do a temporal mean aggregation and zonal to do a spatial aggregation using divide_id output = execute_zonal(data = rast(lapply(out_raster, mean)), geom = div, fun = \"mean\", ID = \"divide_id\", join = FALSE) # Finally write the data frame to a parquet file write_parquet(output, sprintf(\"/your_path/conus_terraclimate_vpu_%s.parquet\", vpu)) } })"},{"path":"/articles/04-stream-morph.html","id":"comparison-to-gee","dir":"Articles","previous_headings":"Examples","what":"Comparison to GEE","title":"Continental Stream Morphology Research empowered by open data","text":"Now lets compare well known frequently used Google Earth Engine (GEE). start, process 882,945 divides time GEE personal experience shown batches 200 divides ideal size avoid infamous Computation Timed Error. can write script perform batch operation . Breaking batches 200 set two batchs takes 1-3 hours complete (see figure ). 
Based , scale application, GEE require weeks finsih!!","code":"// This requires uploading the divides into EE assets // A for loop to execute 100 batches of 200 divides as an example for (var i=1; i<100; i++){ runExtract(divides, i, 'last'); } runExtract(divides, 100, 'first'); function runExtract(data, num, first){ var list_feature = data.toList(data.size()); var batch = num; switch(first){ case 'first': var data = ee.FeatureCollection(list_feature.slice(0, 2000-(batch-1)*200)); break; case 'last': var data = ee.FeatureCollection(list_feature.slice(2000-batch*200, 2000-(batch-1)*200)); break; case 'custom': var data = ee.FeatureCollection(list_feature); break; } batch = batch.toString(); // Load TerraClimate var dataset = ee.ImageCollection('IDAHO_EPSCOR/TERRACLIMATE') .filter(ee.Filter.date('2000-01-01', '2022-01-01')); // Performs a temporal mean var aet = dataset.mean().select('aet'); var soil = dataset.mean().select('soil'); var pet = dataset.mean().select('pet'); var def = dataset.mean().select('def'); var pdsi = dataset.mean().select('pdsi'); var ro = dataset.mean().select('ro'); var tmmn = dataset.mean().select('tmmn'); var tmmx = dataset.mean().select('tmmx'); // _______Extract data_________________ function updateDivides(img_dataset, old_dataset, bandname, newname, reducer) { function dataExtract(feat) { var stats = img_dataset.reduceRegion({ reducer: reducer, geometry: feat.geometry(), scale: 4638.3, bestEffort: true }); return ee.Algorithms.If(ee.Number(stats.size()).eq(0), feat.set(newname, ee.Number(999999999)), feat.set(newname, stats.first().get(bandname))); } var new_dataset = old_dataset.map(dataExtract); return new_dataset; } data = updateStation(aet, data,'aet', 'aet', ee.Reducer.mean()); data = updateStation(soil, data,'soil', 'soil', ee.Reducer.mean()); data = updateStation(pet, data,'pet', 'pet', ee.Reducer.mean()); data = updateStation(def, data,'def', 'def', ee.Reducer.mean()); data = updateStation(pdsi, data,'pdsi', 'pdsi', ee.Reducer.mean()); data = updateStation(ro, data,'ro', 'ro', ee.Reducer.mean()); data = updateStation(tmmn, data,'tmmn', 'tmmn', ee.Reducer.mean()); data = updateStation(tmmx, data,'tmmx', 'tmmx', ee.Reducer.mean()); var exp_name = 'TerraClimate_divide_b'+batch; Export.table.toDrive(data, exp_name, 'TerraClimate_exports', exp_name, 'CSV'); } knitr::include_graphics(\"../man/figures/ee_task.png\")"},{"path":"/articles/04-stream-morph.html","id":"massive-temporal-and-spatial-aggregation-with-gldas","dir":"Articles","previous_headings":"Examples","what":"Massive Temporal and Spatial Aggregation with GLDAS","title":"Continental Stream Morphology Research empowered by open data","text":"Now lets say even computationally demanding task try find historical mean daily product GLDAS. 
case can break period chunks (e.g., 4 years) extract data.","code":"# Define start and end dates start_date <- ymd(\"2004-01-01\") end_date <- ymd(\"2021-01-01\") # Create a sequence of dates with a step of 4 years date_seq <- seq(start_date, end_date, by = \"4 years\") # New names for the columns vars <- c(\"qsb_tavg\", \"qs_tavg\", \"gws_tavg\", \"esoil_tavg\", \"ecanop_tavg\", \"canopint_tavg\", \"avgsurft_tavg\") # Loop through the VPU's and extract data and time the execution system.time({ for (vpu in vpu_list) { # Read the file file_key <- paste0(\"v20/gpkg/nextgen_\", vpu, \".gpkg\") # Download the GeoPackage file from S3 to a temporary file temp_file <- tempfile(fileext = \".gpkg\") s3read_using(file = temp_file, FUN = get_object, object = file_key, bucket = bucket_name) # Just read the divides divides = read_sf(temp_file, \"divides\") for (i in seq_along(date_seq)) { current_start <- date_seq[i] current_end <- current_start + years(4) - days(1) current_start <- format(current_start, \"%Y-%m-%d\") current_end <- format(current_end, \"%Y-%m-%d\") print(paste(\"initiated batch > \", current_start)) # Use climateR to extract the variables between 2004-21 out_raster <- getGLDAS(AOI = div, varname = vars, model = \"CLSM025_DA1_D.2.2\", startDate = current_start, endDate. = current_end) output = execute_zonal(data = rast(lapply(out_raster, mean)), geom = div, fun = \"mean\", ID = \"divide_id\", join = FALSE) current_start_year <- as.character(year(current_start)) current_end_year <- as.character(year(current_end)) write_parquet(output, sprintf(\"/your_path/conus_gldas_vpu_%s_date_%s_%s.parquet\", vpu, current_start_year, current_end_year)) } } })"},{"path":"/articles/04-stream-morph.html","id":"custom-data","dir":"Articles","previous_headings":"Examples","what":"Custom Data","title":"Continental Stream Morphology Research empowered by open data","text":"can also use custom datasets form local drive s3 bucket perform different aggregations. example can access POLARIS soil dataset just spatial average multiple virtual rasters divide polygons. collection POLARIS data resampled native 30m resolution 300m COG.","code":"vars = c(\"alpha\", \"om\", \"ph\") data = rast(glue('/vsis3/lynker-spatial/gridded-resources/polaris300/{vars}_mean_0_5.tif')) system.time({ for (vpu in vpu_list) { # Read the file file_key <- paste0(\"v20/gpkg/nextgen_\", vpu, \".gpkg\") # Download the GeoPackage file from S3 to a temporary file temp_file <- tempfile(fileext = \".gpkg\") s3read_using(file = temp_file, FUN = get_object, object = file_key, bucket = bucket_name) # Just read the divides divides = read_sf(temp_file, \"divides\") polaris = execute_zonal(data = data, geom = divides, fun = \"mean\", ID = \"divide_id\", join = FALSE) # Finally write the data frame to a parquet file write_parquet(output, sprintf(\"/your_path/conus_polaris_vpu_%s.parquet\", vpu)) } })"},{"path":"/articles/04-stream-morph.html","id":"conclusion","dir":"Articles","previous_headings":"","what":"Conclusion","title":"Continental Stream Morphology Research empowered by open data","text":"summary, using climateR significantly benefits hydrological sciences providing unprecedented access diverse datasets. 
tools empower researchers, policymakers, water resource managers conduct -depth spatial analyses, ultimately enhancing understanding hydrological processes improving water resource management strategies sustainable future.","code":""},{"path":"/articles/05-mros-climateR.html","id":"rain-or-snow","dir":"Articles","previous_headings":"","what":"Rain or snow?","title":"A one-platform approach to processing citizen science data with climateR","text":"Mountain Rain Snow citizen science project goal better predict precipitation phase funded NASA’s Citizen Science Earth Systems Program (CSESP). Citizen scientists across Continental United States respond ‘falling sky?’ reporting precipitation observations rain, mixed, snow using mobile app. several years, project collected around 40,000 observations (Figure 1). observation easy human observers make challenging atmospheric models predict. valuable dataset provides basis improve models. Citizen science observations 2021-2023.","code":""},{"path":"/articles/05-mros-climateR.html","id":"a-data-assimilation-challenge","dir":"Articles","previous_headings":"Rain or snow?","what":"A data assimilation challenge","title":"A one-platform approach to processing citizen science data with climateR","text":"observation point requires rigorous data processing eventually model outputs air, dew point, wet-bulb temperatures, relative humidity value observation point. Additionally, part project’s mission improve satellite-based algorithms, observation point associated probability liquid precipitation (pLP) IMERG, gridded NASA product. raw outputs observation timestamp, location report (latitude/longitude), reported precipitation phase. Ancillary information like elevation station data meteorological networks near observation critical inputs temperature modeling. Previous processing workflows collected data various platforms providers, brought R analysis. meant accessing elevation pLP data via external platform observation point, exporting data bind dataframe. process required intermediate file storage maintaining code unique data provider. ensure workflow reproducibility simplify processing chain, Mountain Rain Snow team integrated climateR organize workflow. climateR large (growing) catalog data providers. benefit approach allows future changes processing phase observations. New data products model output can quickly subset included without adding additional dependencies codebase writing code robustly access large, gridded files. processing now kept single language (R), seamless retrieval elevation pLP data external providers integration original dataframe. summary process illustrated Figure 2. data Elevation data observation point extracted USGS 3DEP 1/3 arc-second (10-meter) dataset, highest resolution USGS DEM available (see get3DEP function). IMERG pLP data accessed dap function, allows consistent data retrieval NASA’s Goddard Earth Sciences Data Information Services Center (GES DISC). Illustration current versus previous workflow. current workflow uses climateR functions create organized workflow within R. previous workflow steps utilizing external data collection platforms collect data, making difficult reproduce workflow. functions climateR provide solutions may unique Mountain Rain Snow project, integrating multiple data products common problem many projects face. Adding issue, reproducibility core practicing good science, ordered workflows often difficult establish. 
single platform approach data collection processing climateR solves issues.","code":""},{"path":"/articles/05-mros-climateR.html","id":"acknowledgments","dir":"Articles","previous_headings":"Rain or snow?","what":"Acknowledgments","title":"A one-platform approach to processing citizen science data with climateR","text":"Thank Dillon Ragar (Lynker) Rachel Bash (Lynker) review article. mentioned article, Mountain Rain Snow funded NASA’s Citizen Science Earth Systems Program. Co-PIs Mountain Rain Snow Dr. Keith Jennings (Lynker), Meghan Collins (DRI, UNR), Dr. Monica Arienzo (DRI, UNR).","code":""},{"path":"/authors.html","id":null,"dir":"","previous_headings":"","what":"Authors","title":"Authors and Citation","text":"Mike Johnson. Author, maintainer. Justin Singh. Contributor. Angus Watters. Contributor. . Funder. . Funder.","code":""},{"path":"/authors.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Authors and Citation","text":"Johnson M (2023). climateR: climateR. R package version 0.3.1.4, https://github.com/mikejohnson51/climateR.","code":"@Manual{, title = {climateR: climateR}, author = {Mike Johnson}, year = {2023}, note = {R package version 0.3.1.4}, url = {https://github.com/mikejohnson51/climateR}, }"},{"path":"/index.html","id":"welcome","dir":"","previous_headings":"","what":"climateR","title":"climateR","text":"climateR simplifies steps needed get climate data R. core provides three main things: catalog 100,000 datasets 2,000 data providers/archives. See (climateR::params) catalog evolving, federated collection datasets can accessed data access utilities. general toolkit accessing remote local gridded data files bounded space, time, variable constraints (dap, dap_crop, read_dap_file) set shortcuts implement methods core set selected catalog elements ⚠️ Python Users: Data catalog access available USGS gdptools package. Directly analogous climateR functionality can found climatePy","code":"nrow(params) #> [1] 107857 length(unique(params$id)) #> [1] 2075 length(unique(params$asset)) #> [1] 4653"},{"path":"/index.html","id":"installation","dir":"","previous_headings":"","what":"Installation","title":"climateR","text":"","code":"remotes::install_github(\"mikejohnson51/AOI\") # suggested! remotes::install_github(\"mikejohnson51/climateR\")"},{"path":"/index.html","id":"basic-usage","dir":"","previous_headings":"","what":"Basic Usage","title":"climateR","text":"Finding rainfall Colorado October 29, 1991 - November 6, 1991. 
source dataset example uses getGridMET shortcut.","code":"library(AOI) library(terra) library(climateR) AOI = aoi_get(state = \"CO\", county = \"all\") system.time({ d = getGridMET(AOI, varname = \"pr\", startDate = \"1991-10-29\", endDate = \"1991-11-06\") }) #> user system elapsed #> 0.245 0.054 0.982"},{"path":"/index.html","id":"basic-animation","dir":"","previous_headings":"","what":"Basic Animation","title":"climateR","text":"","code":"animation(d$precipitation_amount, AOI = AOI, outfile = \"man/figures/rast_gif.gif\")"},{"path":"/index.html","id":"integration-with-zonal","dir":"","previous_headings":"","what":"Integration with zonal","title":"climateR","text":"","code":"library(zonal) system.time({ county = execute_zonal(d, geom = AOI, ID = \"fip_code\") }) #> user system elapsed #> 0.328 0.018 0.366 animation(county, feild_pattern = \"pr_\", outfile = \"man/figures/vect_gif.gif\")"},{"path":"/reference/animation.html","id":null,"dir":"Reference","previous_headings":"","what":"Animate Object as GIF — animation","title":"Animate Object as GIF — animation","text":"Animate SpatRaster object gif.","code":""},{"path":"/reference/animation.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Animate Object as GIF — animation","text":"","code":"animation(data, AOI = NULL, feild_pattern = NULL, outfile, colors = blues9)"},{"path":"/reference/animation.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Animate Object as GIF — animation","text":"data SpatVect sf object AOI optional AOI sf SpatVect object overlay gif feild_pattern optional string vector filter desired attributes outfile path write gif file, must .gif extenstion colors colors plot ","code":""},{"path":"/reference/animation.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Animate Object as GIF — animation","text":"file.path","code":""},{"path":[]},{"path":"/reference/animation_raster.html","id":null,"dir":"Reference","previous_headings":"","what":"Animate SpatRast as GIF — animation_raster","title":"Animate SpatRast as GIF — animation_raster","text":"Animate SpatRaster object gif.","code":""},{"path":"/reference/animation_raster.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Animate SpatRast as GIF — animation_raster","text":"","code":"animation_raster(data, AOI = NULL, outfile, colors = blues9)"},{"path":"/reference/animation_raster.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Animate SpatRast as GIF — animation_raster","text":"data single SpatRast object AOI optional AOI sf SpatVect object overlay gif outfile path write gif file, must .gif extenstion colors colors plot ","code":""},{"path":"/reference/animation_raster.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Animate SpatRast as GIF — animation_raster","text":"file.path","code":""},{"path":[]},{"path":"/reference/animation_vector.html","id":null,"dir":"Reference","previous_headings":"","what":"Animate vector as GIF — animation_vector","title":"Animate vector as GIF — animation_vector","text":"Animate sf SpatVect object gif.","code":""},{"path":"/reference/animation_vector.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Animate vector as GIF — animation_vector","text":"","code":"animation_vector(data, feild_pattern = NULL, outfile, colors = 
blues9)"},{"path":"/reference/animation_vector.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Animate vector as GIF — animation_vector","text":"data SpatVect sf object feild_pattern optional string vector filter desired attributes outfile path write gif file, must .gif extenstion colors colors plot ","code":""},{"path":"/reference/animation_vector.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Animate vector as GIF — animation_vector","text":"file.path","code":""},{"path":[]},{"path":"/reference/catalog.html","id":null,"dir":"Reference","previous_headings":"","what":"ClimateR Catalog — catalog","title":"ClimateR Catalog — catalog","text":"ClimateR Catalog","code":""},{"path":"/reference/catalog.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"ClimateR Catalog — catalog","text":"","code":"catalog"},{"path":"/reference/catalog.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"ClimateR Catalog — catalog","text":"object class tbl_df (inherits tbl, data.frame) 37010 rows 29 columns.","code":""},{"path":[]},{"path":"/reference/checkDodsrc.html","id":null,"dir":"Reference","previous_headings":"","what":"Check dodsrc file — checkDodsrc","title":"Check dodsrc file — checkDodsrc","text":"Check netrc file valid entry urs.earthdata.nasa.gov.","code":""},{"path":"/reference/checkDodsrc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Check dodsrc file — checkDodsrc","text":"","code":"checkDodsrc(dodsrcFile = getDodsrcPath(), netrcFile = getNetrcPath())"},{"path":"/reference/checkDodsrc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Check dodsrc file — checkDodsrc","text":"dodsrcFile File path dodsrc file check. netrcFile File path netrc file check.","code":""},{"path":"/reference/checkDodsrc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Check dodsrc file — checkDodsrc","text":"logical","code":""},{"path":[]},{"path":"/reference/checkNetrc.html","id":null,"dir":"Reference","previous_headings":"","what":"Check netrc file — checkNetrc","title":"Check netrc file — checkNetrc","text":"Check netrc file valid entry urs.earthdata.nasa.gov.","code":""},{"path":"/reference/checkNetrc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Check netrc file — checkNetrc","text":"","code":"checkNetrc(netrcFile = getNetrcPath(), machine = \"urs.earthdata.nasa.gov\")"},{"path":"/reference/checkNetrc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Check netrc file — checkNetrc","text":"netrcFile character. File path netrc file check. 
machine machine logging ","code":""},{"path":"/reference/checkNetrc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Check netrc file — checkNetrc","text":"logical","code":""},{"path":[]},{"path":"/reference/climater_dap.html","id":null,"dir":"Reference","previous_headings":"","what":"ClimateR dry run — climater_dap","title":"ClimateR dry run — climater_dap","text":"ClimateR dry run","code":""},{"path":"/reference/climater_dap.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"ClimateR dry run — climater_dap","text":"","code":"climater_dap(id, args, verbose, dryrun, print.arg = FALSE)"},{"path":"/reference/climater_dap.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"ClimateR dry run — climater_dap","text":"id resource name, agency, catalog identifier args parent function arguments verbose messages emited? dryrun Return summary data prior retrieving print.arg arguments printed? Usefull debugging","code":""},{"path":"/reference/climater_dap.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"ClimateR dry run — climater_dap","text":"data.frame","code":""},{"path":[]},{"path":"/reference/climater_filter.html","id":null,"dir":"Reference","previous_headings":"","what":"ClimateR Catalog Filter — climater_filter","title":"ClimateR Catalog Filter — climater_filter","text":"Filter climateR catalog based set constraints","code":""},{"path":"/reference/climater_filter.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"ClimateR Catalog Filter — climater_filter","text":"","code":"climater_filter( id = NULL, asset = NULL, AOI = NULL, startDate = NULL, endDate = NULL, varname = NULL, model = NULL, scenario = NULL, ensemble = NULL )"},{"path":"/reference/climater_filter.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"ClimateR Catalog Filter — climater_filter","text":"id resource, agency, catalog identifier asset subdataset asset given resource AOI sf SpatVect point polygon extract data startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data varname variable name extract (e.g. 
tmin) model GCM model name generating scenario climate modeling scenario ensemble model ensemble member used generate data","code":""},{"path":"/reference/climater_filter.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"ClimateR Catalog Filter — climater_filter","text":"data.frame","code":""},{"path":[]},{"path":"/reference/dap.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Data (Data Access Protocol) — dap","title":"Get Data (Data Access Protocol) — dap","text":"function provides consistent data access protocol (dap) wide range local remote resources including VRT, TDS, NetCDF Define get data DAP resource","code":""},{"path":"/reference/dap.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Data (Data Access Protocol) — dap","text":"","code":"dap( URL = NULL, catalog = NULL, AOI = NULL, startDate = NULL, endDate = NULL, varname = NULL, grid = NULL, start = NULL, end = NULL, toptobottom = FALSE, verbose = TRUE )"},{"path":"/reference/dap.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Data (Data Access Protocol) — dap","text":"URL local file path URL catalog subset open.dap catalog AOI sf SpatVect point polygon extract data startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data varname variable name extract (e.g. tmin) grid list containing extent (), crs start non \"dated\" items, start can called index end non \"dated\" items, end can called index toptobottom data inverse? verbose dap_summary printed?","code":""},{"path":"/reference/dap.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Data (Data Access Protocol) — dap","text":"data.frame","code":""},{"path":"/reference/dap.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Get Data (Data Access Protocol) — dap","text":"Wraps dap_get dap_crop one. AOI NULL spatial crop executed. startDate endDate NULL, temporal crop executed. just endDate NULL defaults startDate.","code":""},{"path":[]},{"path":"/reference/dap_crop.html","id":null,"dir":"Reference","previous_headings":"","what":"Crop DAP file — dap_crop","title":"Crop DAP file — dap_crop","text":"Crop DAP file","code":""},{"path":"/reference/dap_crop.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Crop DAP file — dap_crop","text":"","code":"dap_crop( URL = NULL, catalog = NULL, AOI = NULL, startDate = NULL, endDate = NULL, start = NULL, end = NULL, varname = NULL, verbose = TRUE )"},{"path":"/reference/dap_crop.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Crop DAP file — dap_crop","text":"URL local file path URL catalog subset open.dap catalog AOI sf SpatVect point polygon extract data startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data start non \"dated\" items, start can called index end non \"dated\" items, end can called index varname variable name extract (e.g. 
tmin) verbose dap_summary printed?","code":""},{"path":"/reference/dap_crop.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Crop DAP file — dap_crop","text":"data.frame","code":""},{"path":[]},{"path":"/reference/dap_get.html","id":null,"dir":"Reference","previous_headings":"","what":"Get DAP resource data — dap_get","title":"Get DAP resource data — dap_get","text":"Get DAP resource data","code":""},{"path":"/reference/dap_get.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get DAP resource data — dap_get","text":"","code":"dap_get(dap, varname = NULL)"},{"path":"/reference/dap_get.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get DAP resource data — dap_get","text":"dap data.frame catalog dap_crop varname name variable extract. NULL, get ","code":""},{"path":"/reference/dap_get.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get DAP resource data — dap_get","text":"SpatRaster","code":""},{"path":[]},{"path":"/reference/dap_meta.html","id":null,"dir":"Reference","previous_headings":"","what":"Find DAP Metadata — dap_meta","title":"Find DAP Metadata — dap_meta","text":"Find DAP Metadata","code":""},{"path":"/reference/dap_meta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Find DAP Metadata — dap_meta","text":"","code":"dap_meta(raw)"},{"path":"/reference/dap_meta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Find DAP Metadata — dap_meta","text":"raw data.frame","code":""},{"path":"/reference/dap_meta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Find DAP Metadata — dap_meta","text":"data.frame","code":""},{"path":[]},{"path":"/reference/dap_summary.html","id":null,"dir":"Reference","previous_headings":"","what":"Print Summary Information About a OpenDAP Resource — dap_summary","title":"Print Summary Information About a OpenDAP Resource — dap_summary","text":"Print summary information DAP summary","code":""},{"path":"/reference/dap_summary.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Print Summary Information About a OpenDAP Resource — dap_summary","text":"","code":"dap_summary(dap = NULL, url = NULL)"},{"path":"/reference/dap_summary.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Print Summary Information About a OpenDAP Resource — dap_summary","text":"dap data.frame catalog dap_crop url Unique Resource Identifier (http local)","code":""},{"path":[]},{"path":"/reference/dap_to_local.html","id":null,"dir":"Reference","previous_headings":"","what":"Convert OpenDAP to start/count call — dap_to_local","title":"Convert OpenDAP to start/count call — dap_to_local","text":"Convert OpenDAP start/count call","code":""},{"path":"/reference/dap_to_local.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Convert OpenDAP to start/count call — dap_to_local","text":"","code":"dap_to_local(dap, get = TRUE)"},{"path":"/reference/dap_to_local.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Convert OpenDAP to start/count call — dap_to_local","text":"dap dap description get data collected?","code":""},{"path":"/reference/dap_to_local.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Convert OpenDAP to start/count call — 
dap_to_local","text":"numeric array","code":""},{"path":[]},{"path":"/reference/dap_xyzv.html","id":null,"dir":"Reference","previous_headings":"","what":"Get XYTV data from DAP URL — dap_xyzv","title":"Get XYTV data from DAP URL — dap_xyzv","text":"Get XYTV data DAP URL","code":""},{"path":"/reference/dap_xyzv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get XYTV data from DAP URL — dap_xyzv","text":"","code":"dap_xyzv(obj, varname = NULL, varmeta = FALSE)"},{"path":"/reference/dap_xyzv.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get XYTV data from DAP URL — dap_xyzv","text":"obj OpenDap URL NetCDF object varname name variable extract. NULL, get varmeta variable metadata appended?","code":""},{"path":"/reference/dap_xyzv.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get XYTV data from DAP URL — dap_xyzv","text":"data.frame (varname, X_name, Y_name, T_name)","code":""},{"path":[]},{"path":"/reference/dot-resource_grid.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract grid metadata from NC Pointer — .resource_grid","title":"Extract grid metadata from NC Pointer — .resource_grid","text":"Extract grid metadata NC Pointer","code":""},{"path":"/reference/dot-resource_grid.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract grid metadata from NC Pointer — .resource_grid","text":"","code":".resource_grid(URL, X_name = NULL, Y_name = NULL, stopIfNotEqualSpaced = TRUE)"},{"path":"/reference/dot-resource_grid.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract grid metadata from NC Pointer — .resource_grid","text":"URL location data process X_name Name X diminsion. NULL found Y_name Name Y diminsion. NULL found stopIfNotEqualSpaced stop equal space grid","code":""},{"path":"/reference/dot-resource_grid.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract grid metadata from NC Pointer — .resource_grid","text":"list","code":""},{"path":[]},{"path":"/reference/dot-resource_time.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract time metadata from NC Pointer — .resource_time","title":"Extract time metadata from NC Pointer — .resource_time","text":"Extract time metadata NC Pointer","code":""},{"path":"/reference/dot-resource_time.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract time metadata from NC Pointer — .resource_time","text":"","code":".resource_time(URL, T_name = NULL)"},{"path":"/reference/dot-resource_time.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract time metadata from NC Pointer — .resource_time","text":"URL location data process T_name Name T dimension. 
NULL found","code":""},{"path":"/reference/dot-resource_time.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract time metadata from NC Pointer — .resource_time","text":"list","code":""},{"path":[]},{"path":"/reference/extract_sites.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract Sites — extract_sites","title":"Extract Sites — extract_sites","text":"extract timeseries values raster stack set points","code":""},{"path":"/reference/extract_sites.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract Sites — extract_sites","text":"","code":"extract_sites(r, pts, id)"},{"path":"/reference/extract_sites.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract Sites — extract_sites","text":"r SpatRaster object pts point extract id unique identifier point (column name pts)","code":""},{"path":"/reference/extract_sites.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract Sites — extract_sites","text":"data.frame columes representing points, rows time periods","code":""},{"path":[]},{"path":"/reference/get3DEP.html","id":null,"dir":"Reference","previous_headings":"","what":"Get USGS 3DEP DEMs — get3DEP","title":"Get USGS 3DEP DEMs — get3DEP","text":"Get USGS 3DEP DEMs","code":""},{"path":"/reference/get3DEP.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get USGS 3DEP DEMs — get3DEP","text":"","code":"get3DEP(AOI, resolution = \"30m\")"},{"path":"/reference/get3DEP.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get USGS 3DEP DEMs — get3DEP","text":"AOI sf SpatVect point polygon extract data resolution DEM resolution (10m 30m (default))","code":""},{"path":"/reference/get3DEP.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get USGS 3DEP DEMs — get3DEP","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getBCCA.html","id":null,"dir":"Reference","previous_headings":"","what":"Get BCCA data — getBCCA","title":"Get BCCA data — getBCCA","text":"Get BCCA data","code":""},{"path":"/reference/getBCCA.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get BCCA data — getBCCA","text":"","code":"getBCCA( AOI, varname, model = \"CCSM4\", scenario = \"rcp45\", ensemble = NULL, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getBCCA.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get BCCA data — getBCCA","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating scenario climate modeling scenario ensemble model ensemble member used generate data startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getBCCA.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get BCCA data — getBCCA","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getCHIRPS.html","id":null,"dir":"Reference","previous_headings":"","what":"Get CHIRPS data — getCHIRPS","title":"Get CHIRPS data — getCHIRPS","text":"Get CHIRPS data","code":""},{"path":"/reference/getCHIRPS.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get CHIRPS data — getCHIRPS","text":"","code":"getCHIRPS( AOI, varname = NULL, timeRes = \"daily\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getCHIRPS.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get CHIRPS data — getCHIRPS","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) timeRes \"Pentad\", \"Annual\", \"Daily\" (default), \"Monthly\" startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getCHIRPS.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get CHIRPS data — getCHIRPS","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getDaymet.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Daymet Climate Data for an Area of Interest — getDaymet","title":"Get Daymet Climate Data for an Area of Interest — getDaymet","text":"dataset provides Daymet Version 4 model output data gridded estimates daily weather parameters North America. Daymet output variables include following parameters: minimum temperature, maximum temperature, precipitation, shortwave radiation, vapor pressure, snow water equivalent, day length. dataset covers period January 1, 1980 December 31 recent full calendar year. subsequent year processed individually close calendar year allowing adequate time input weather station data archive quality. Daymet variables continuous surfaces provided individual files, year, 1-km x 1-km spatial resolution daily temporal resolution. Data Lambert Conformal Conic projection North America netCDF file format compliant Climate Forecast (CF) metadata conventions.","code":""},{"path":"/reference/getDaymet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Daymet Climate Data for an Area of Interest — getDaymet","text":"","code":"getDaymet( AOI, varname = NULL, startDate = NULL, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getDaymet.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Daymet Climate Data for an Area of Interest — getDaymet","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getDaymet.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Daymet Climate Data for an Area of Interest — getDaymet","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getDodsrcPath.html","id":null,"dir":"Reference","previous_headings":"","what":"Get a default dodsrc file path — getDodsrcPath","title":"Get a default dodsrc file path — getDodsrcPath","text":"Get default dodsrc file path","code":""},{"path":"/reference/getDodsrcPath.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get a default dodsrc file path — getDodsrcPath","text":"","code":"getDodsrcPath()"},{"path":"/reference/getDodsrcPath.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get a default dodsrc file path — getDodsrcPath","text":"character vector containing default netrc file path","code":""},{"path":[]},{"path":"/reference/getDodsrcPath.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get a default dodsrc file path — getDodsrcPath","text":"","code":"getDodsrcPath() #> [1] \"/Users/mjohnson/.dodsrc\""},{"path":"/reference/getGLDAS.html","id":null,"dir":"Reference","previous_headings":"","what":"Get GLDAS data — getGLDAS","title":"Get GLDAS data — getGLDAS","text":"Get GLDAS data","code":""},{"path":"/reference/getGLDAS.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get GLDAS data — getGLDAS","text":"","code":"getGLDAS( AOI, varname = NULL, model = NULL, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getGLDAS.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get GLDAS data — getGLDAS","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getGLDAS.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get GLDAS data — getGLDAS","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getGridMET.html","id":null,"dir":"Reference","previous_headings":"","what":"Get GridMet Climate Data for an Area of Interest — getGridMET","title":"Get GridMet Climate Data for an Area of Interest — getGridMET","text":"gridMET dataset daily high-spatial resolution (~4-km, 1/24th degree) surface meteorological data covering contiguous US 1979-yesterday. data updated daily.","code":""},{"path":"/reference/getGridMET.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get GridMet Climate Data for an Area of Interest — getGridMET","text":"","code":"getGridMET( AOI, varname, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getGridMET.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get GridMet Climate Data for an Area of Interest — getGridMET","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. 
tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getGridMET.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get GridMet Climate Data for an Area of Interest — getGridMET","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLCMAP.html","id":null,"dir":"Reference","previous_headings":"","what":"Get USGS LCMAP — getLCMAP","title":"Get USGS LCMAP — getLCMAP","text":"Land Change Monitoring, Assessment, Projection","code":""},{"path":"/reference/getLCMAP.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get USGS LCMAP — getLCMAP","text":"","code":"getLCMAP(AOI, year = 2019, type = \"primary landcover\")"},{"path":"/reference/getLCMAP.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get USGS LCMAP — getLCMAP","text":"AOI sf SpatVect point polygon extract data year Land cover product year 1985 - 2019 (default = 2019) type product type (primary landcover (default), secondary landcover, primary confidence, secondary confidence, cover change, change day, change magniture, model cquality, spectral stability, spectral lastchance)","code":""},{"path":"/reference/getLCMAP.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get USGS LCMAP — getLCMAP","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLOCA.html","id":null,"dir":"Reference","previous_headings":"","what":"Get LOCA Climate Data for an Area of Interest — getLOCA","title":"Get LOCA Climate Data for an Area of Interest — getLOCA","text":"LOCA statistical downscaling technique uses past history add improved fine-scale detail global climate models. LOCA used downscale 32 global climate models CMIP5 archive 1/16th degree spatial resolution, covering North America central Mexico Southern Canada. historical period 1950-2005, two future scenarios available: RCP 4.5 RCP 8.5 period 2006-2100 (although models stop 2099). variables currently available daily minimum maximum temperature, daily precipitation. information visit: http://loca.ucsd.edu/.","code":""},{"path":"/reference/getLOCA.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get LOCA Climate Data for an Area of Interest — getLOCA","text":"","code":"getLOCA( AOI, varname, model = \"CCSM4\", scenario = \"rcp45\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getLOCA.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get LOCA Climate Data for an Area of Interest — getLOCA","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating scenario climate modeling scenario startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getLOCA.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get LOCA Climate Data for an Area of Interest — getLOCA","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLOCA_hydro.html","id":null,"dir":"Reference","previous_headings":"","what":"Get LOCA Hydrology data — getLOCA_hydro","title":"Get LOCA Hydrology data — getLOCA_hydro","text":"Get LOCA Hydrology data","code":""},{"path":"/reference/getLOCA_hydro.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get LOCA Hydrology data — getLOCA_hydro","text":"","code":"getLOCA_hydro( AOI, varname, model = \"CCSM4\", scenario = \"rcp45\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getLOCA_hydro.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get LOCA Hydrology data — getLOCA_hydro","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating scenario climate modeling scenario startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getLOCA_hydro.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get LOCA Hydrology data — getLOCA_hydro","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLivneh.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Livneh data — getLivneh","title":"Get Livneh data — getLivneh","text":"Get Livneh data","code":""},{"path":"/reference/getLivneh.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Livneh data — getLivneh","text":"","code":"getLivneh( AOI, varname = NULL, startDate, endDate = NULL, timeRes = \"daily\", verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getLivneh.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Livneh data — getLivneh","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data timeRes daily monthly verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getLivneh.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Livneh data — getLivneh","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getLivneh_fluxes.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Livneh Flux data — getLivneh_fluxes","title":"Get Livneh Flux data — getLivneh_fluxes","text":"Get Livneh Flux data","code":""},{"path":"/reference/getLivneh_fluxes.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Livneh Flux data — getLivneh_fluxes","text":"","code":"getLivneh_fluxes( AOI, varname = NULL, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getLivneh_fluxes.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Livneh Flux data — getLivneh_fluxes","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getLivneh_fluxes.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Livneh Flux data — getLivneh_fluxes","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getMACA.html","id":null,"dir":"Reference","previous_headings":"","what":"Get MACA Climate Data for an Area of Interest — getMACA","title":"Get MACA Climate Data for an Area of Interest — getMACA","text":"Multivariate Adaptive Constructed Analogs (MACA) statistical method downscaling Global Climate Models (GCMs) native coarse resolution higher spatial resolution captures reflects observed patterns daily near-surface meteorology simulated changes GCMs experiments.","code":""},{"path":"/reference/getMACA.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get MACA Climate Data for an Area of Interest — getMACA","text":"","code":"getMACA( AOI, varname, timeRes = \"day\", model = \"CCSM4\", scenario = \"rcp45\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getMACA.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get MACA Climate Data for an Area of Interest — getMACA","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) timeRes daily monthly model GCM model name generating scenario climate modeling scenario startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getMACA.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get MACA Climate Data for an Area of Interest — getMACA","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getMODIS.html","id":null,"dir":"Reference","previous_headings":"","what":"Get MODIS data — getMODIS","title":"Get MODIS data — getMODIS","text":"Get MODIS data","code":""},{"path":"/reference/getMODIS.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get MODIS data — getMODIS","text":"","code":"getMODIS( AOI, asset = NULL, varname = NULL, startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getMODIS.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get MODIS data — getMODIS","text":"AOI sf SpatVect point polygon extract data asset MODIS sensor varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getMODIS.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get MODIS data — getMODIS","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getNASADEM.html","id":null,"dir":"Reference","previous_headings":"","what":"Get NASA Global DEM — getNASADEM","title":"Get NASA Global DEM — getNASADEM","text":"Get NASA Global DEM","code":""},{"path":"/reference/getNASADEM.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get NASA Global DEM — getNASADEM","text":"","code":"getNASADEM(AOI)"},{"path":"/reference/getNASADEM.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get NASA Global DEM — getNASADEM","text":"AOI sf SpatVect point polygon extract data ","code":""},{"path":"/reference/getNASADEM.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get NASA Global DEM — getNASADEM","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getNLCD.html","id":null,"dir":"Reference","previous_headings":"","what":"Get USGS National Land Cover Dataset — getNLCD","title":"Get USGS National Land Cover Dataset — getNLCD","text":"Get USGS National Land Cover Dataset","code":""},{"path":"/reference/getNLCD.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get USGS National Land Cover Dataset — getNLCD","text":"","code":"getNLCD(AOI, year = 2019, type = \"land cover\")"},{"path":"/reference/getNLCD.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get USGS National Land Cover Dataset — getNLCD","text":"AOI sf SpatVect point polygon extract data year Landcover product year (2001, 2011,2016,2019) type product type","code":""},{"path":"/reference/getNLCD.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get USGS National Land Cover Dataset — getNLCD","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getNLDAS.html","id":null,"dir":"Reference","previous_headings":"","what":"Get NLDAS data — getNLDAS","title":"Get NLDAS data — 
getNLDAS","text":"Get NLDAS data","code":""},{"path":"/reference/getNLDAS.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get NLDAS data — getNLDAS","text":"","code":"getNLDAS( AOI, varname = NULL, model = \"FORA0125_H.002\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getNLDAS.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get NLDAS data — getNLDAS","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getNLDAS.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get NLDAS data — getNLDAS","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getNetrcPath.html","id":null,"dir":"Reference","previous_headings":"","what":"Get the default netrc file path — getNetrcPath","title":"Get the default netrc file path — getNetrcPath","text":"Get default netrc file path","code":""},{"path":"/reference/getNetrcPath.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get the default netrc file path — getNetrcPath","text":"","code":"getNetrcPath()"},{"path":"/reference/getNetrcPath.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get the default netrc file path — getNetrcPath","text":"character vector containing default netrc file path","code":""},{"path":[]},{"path":"/reference/getNetrcPath.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get the default netrc file path — getNetrcPath","text":"","code":"getNetrcPath() #> [1] \"/Users/mjohnson/.netrc\""},{"path":"/reference/getPRISM.html","id":null,"dir":"Reference","previous_headings":"","what":"Get PRISM data — getPRISM","title":"Get PRISM data — getPRISM","text":"Get PRISM data","code":""},{"path":"/reference/getPRISM.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get PRISM data — getPRISM","text":"","code":"getPRISM( AOI, varname = NULL, startDate, endDate = NULL, timeRes = \"daily\", verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getPRISM.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get PRISM data — getPRISM","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data timeRes daily monthly verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getPRISM.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get PRISM data — getPRISM","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getTerraClim.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Terra Climate Data for an Area of Interest — getTerraClim","title":"Get Terra Climate Data for an Area of Interest — getTerraClim","text":"Get Terra Climate Data Area Interest","code":""},{"path":"/reference/getTerraClim.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Terra Climate Data for an Area of Interest — getTerraClim","text":"","code":"getTerraClim( AOI, varname = NULL, startDate = NULL, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getTerraClim.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Terra Climate Data for an Area of Interest — getTerraClim","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getTerraClim.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Terra Climate Data for an Area of Interest — getTerraClim","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getTerraClimNormals.html","id":null,"dir":"Reference","previous_headings":"","what":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","title":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","text":"layers TerraClimate creating using climatically aided interpolation monthly anomalies CRU Ts4.0 Japanese 55-year Reanalysis (JRA-55) datasets WorldClim v2.0 climatologies.","code":""},{"path":"/reference/getTerraClimNormals.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","text":"","code":"getTerraClimNormals( AOI, varname, scenario = \"19812010\", month = 1:12, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getTerraClimNormals.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) scenario climate modeling scenario month numeric. month vector months access. Default 1:12 verbose messages emited? 
dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getTerraClimNormals.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get Terra Climate Normals for an Area of Interest — getTerraClimNormals","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getVIC.html","id":null,"dir":"Reference","previous_headings":"","what":"Get VIC data — getVIC","title":"Get VIC data — getVIC","text":"Get VIC data","code":""},{"path":"/reference/getVIC.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get VIC data — getVIC","text":"","code":"getVIC( AOI, varname, model = \"CCSM4\", scenario = \"rcp45\", startDate, endDate = NULL, verbose = FALSE, dryrun = FALSE )"},{"path":"/reference/getVIC.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get VIC data — getVIC","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating scenario climate modeling scenario startDate start date given \"YYYY-MM-DD\" extract data endDate end date given \"YYYY-MM-DD\" extract data verbose messages emited? dryrun Return summary data prior retrieving ","code":""},{"path":"/reference/getVIC.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get VIC data — getVIC","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/getWorldClim.html","id":null,"dir":"Reference","previous_headings":"","what":"Get WorlClim gridded weather and climate data for historical (near current) conditions. — getWorldClim","title":"Get WorlClim gridded weather and climate data for historical (near current) conditions. — getWorldClim","text":"WorldClim database high spatial resolution global weather climate data. data can used mapping spatial modeling.","code":""},{"path":"/reference/getWorldClim.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get WorlClim gridded weather and climate data for historical (near current) conditions. — getWorldClim","text":"","code":"getWorldClim( AOI = NULL, varname = NULL, model = \"wc2.1_5m\", month = 1:12, verbose = TRUE )"},{"path":"/reference/getWorldClim.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get WorlClim gridded weather and climate data for historical (near current) conditions. — getWorldClim","text":"AOI sf SpatVect point polygon extract data varname variable name extract (e.g. tmin) model GCM model name generating month numeric. month vector months access. Default 1:12 verbose messages emited?","code":""},{"path":"/reference/getWorldClim.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get WorlClim gridded weather and climate data for historical (near current) conditions. 
— getWorldClim","text":"AOI polygon list SpatRasters, AOI point data.frame modeled records.","code":""},{"path":[]},{"path":"/reference/get_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Get DAP Array — get_data","title":"Get DAP Array — get_data","text":"Get DAP Array","code":""},{"path":"/reference/get_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get DAP Array — get_data","text":"","code":"get_data(dap)"},{"path":"/reference/get_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get DAP Array — get_data","text":"dap dap description","code":""},{"path":"/reference/get_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get DAP Array — get_data","text":"SpatRast","code":""},{"path":[]},{"path":"/reference/go_get_dap_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Read formated DAP URL as SpatRast — go_get_dap_data","title":"Read formated DAP URL as SpatRast — go_get_dap_data","text":"Read formated DAP URL SpatRast","code":""},{"path":"/reference/go_get_dap_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Read formated DAP URL as SpatRast — go_get_dap_data","text":"","code":"go_get_dap_data(dap)"},{"path":"/reference/go_get_dap_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Read formated DAP URL as SpatRast — go_get_dap_data","text":"dap output dap_crop","code":""},{"path":"/reference/go_get_dap_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Read formated DAP URL as SpatRast — go_get_dap_data","text":"SpatRast","code":""},{"path":[]},{"path":"/reference/grid_meta.html","id":null,"dir":"Reference","previous_headings":"","what":"Find DAP grid metadata — grid_meta","title":"Find DAP grid metadata — grid_meta","text":"Find DAP grid metadata","code":""},{"path":"/reference/grid_meta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Find DAP grid metadata — grid_meta","text":"","code":"grid_meta(raw)"},{"path":"/reference/grid_meta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Find DAP grid metadata — grid_meta","text":"raw data.frame","code":""},{"path":"/reference/grid_meta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Find DAP grid metadata — grid_meta","text":"data.frame","code":""},{"path":[]},{"path":"/reference/make_ext.html","id":null,"dir":"Reference","previous_headings":"","what":"Convert catalog entry to extent — make_ext","title":"Convert catalog entry to extent — make_ext","text":"Convert catalog entry extent","code":""},{"path":"/reference/make_ext.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Convert catalog entry to extent — make_ext","text":"","code":"make_ext(cat)"},{"path":"/reference/make_ext.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Convert catalog entry to extent — make_ext","text":"cat catalog entry (data.frame Xn, X1, Yn, Y1, crs)","code":""},{"path":"/reference/make_ext.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Convert catalog entry to extent — make_ext","text":"SpatExtent","code":""},{"path":[]},{"path":"/reference/make_vect.html","id":null,"dir":"Reference","previous_headings":"","what":"Make Vector — 
make_vect","title":"Make Vector — make_vect","text":"Make Vector","code":""},{"path":"/reference/make_vect.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Make Vector — make_vect","text":"","code":"make_vect(cat)"},{"path":"/reference/make_vect.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Make Vector — make_vect","text":"cat catalog entry (data.frame Xn, X1, Yn, Y1, crs)","code":""},{"path":"/reference/make_vect.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Make Vector — make_vect","text":"SpatVect","code":""},{"path":[]},{"path":"/reference/merge_across_time.html","id":null,"dir":"Reference","previous_headings":"","what":"Merge List of SpatRaster's across time — merge_across_time","title":"Merge List of SpatRaster's across time — merge_across_time","text":"Given list SpatRasters possibly shared names, merge across time","code":""},{"path":"/reference/merge_across_time.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Merge List of SpatRaster's across time — merge_across_time","text":"","code":"merge_across_time(data)"},{"path":"/reference/merge_across_time.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Merge List of SpatRaster's across time — merge_across_time","text":"data list names SpatRasters","code":""},{"path":"/reference/merge_across_time.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Merge List of SpatRaster's across time — merge_across_time","text":"data.frame (varname, X_name, Y_name, T_name)","code":""},{"path":[]},{"path":"/reference/params.html","id":null,"dir":"Reference","previous_headings":"","what":"ClimateR Catalog — params","title":"ClimateR Catalog — params","text":"ClimateR Catalog","code":""},{"path":"/reference/params.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"ClimateR Catalog — params","text":"","code":"params"},{"path":"/reference/params.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"ClimateR Catalog — params","text":"object class data.table (inherits data.frame) 107857 rows 28 columns.","code":""},{"path":[]},{"path":"/reference/parse_date.html","id":null,"dir":"Reference","previous_headings":"","what":"Parse Dates from duration and interval — parse_date","title":"Parse Dates from duration and interval — parse_date","text":"Parse Dates duration interval","code":""},{"path":"/reference/parse_date.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Parse Dates from duration and interval — parse_date","text":"","code":"parse_date(duration, interval)"},{"path":"/reference/parse_date.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Parse Dates from duration and interval — parse_date","text":"duration time duration interval time interval","code":""},{"path":[]},{"path":"/reference/read_dap_file.html","id":null,"dir":"Reference","previous_headings":"","what":"Read from a OpenDAP landing page — read_dap_file","title":"Read from a OpenDAP landing page — read_dap_file","text":"Reads OpenDap resources returns metadata","code":""},{"path":"/reference/read_dap_file.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Read from a OpenDAP landing page — read_dap_file","text":"","code":"read_dap_file(URL, varname = NULL, id, varmeta = 
TRUE)"},{"path":"/reference/read_dap_file.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Read from a OpenDAP landing page — read_dap_file","text":"URL URL OpenDap resource varname name variable extract. NULL, get id character. Uniquely named dataset identifier varmeta variable metadata appended?","code":""},{"path":"/reference/read_dap_file.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Read from a OpenDAP landing page — read_dap_file","text":"data.frame","code":""},{"path":[]},{"path":"/reference/read_ftp.html","id":null,"dir":"Reference","previous_headings":"","what":"Read from FTP — read_ftp","title":"Read from FTP — read_ftp","text":"Read FTP","code":""},{"path":"/reference/read_ftp.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Read from FTP — read_ftp","text":"","code":"read_ftp(URL, cat, lyrs = 1, AOI, ext = NULL, crs = NULL, dates = NULL)"},{"path":"/reference/read_ftp.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Read from FTP — read_ftp","text":"URL Unique Resource Identifier (http local) cat catalog element lyrs lyrs extract AOI Area Interest ext extent source (needed) crs crs source (needed) dates dates data","code":""},{"path":"/reference/read_ftp.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Read from FTP — read_ftp","text":"SpatRaster","code":""},{"path":[]},{"path":"/reference/read_live_catalog.html","id":null,"dir":"Reference","previous_headings":"","what":"Read Live Catalog from Github release — read_live_catalog","title":"Read Live Catalog from Github release — read_live_catalog","text":"Every month, data catalog refreshed. function reads current catalog Github release.","code":""},{"path":"/reference/read_live_catalog.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Read Live Catalog from Github release — read_live_catalog","text":"","code":"read_live_catalog( url = \"https://github.com/mikejohnson51/climateR-catalogs/releases/latest/download/catalog.parquet\" )"},{"path":"/reference/read_live_catalog.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Read Live Catalog from Github release — read_live_catalog","text":"url URL read","code":""},{"path":"/reference/read_live_catalog.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Read Live Catalog from Github release — read_live_catalog","text":"data.frame","code":""},{"path":[]},{"path":"/reference/time_meta.html","id":null,"dir":"Reference","previous_headings":"","what":"Find DAP time metadata — time_meta","title":"Find DAP time metadata — time_meta","text":"Find DAP time metadata","code":""},{"path":"/reference/time_meta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Find DAP time metadata — time_meta","text":"","code":"time_meta(raw)"},{"path":"/reference/time_meta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Find DAP time metadata — time_meta","text":"raw data.frame","code":""},{"path":"/reference/time_meta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Find DAP time metadata — time_meta","text":"data.frame","code":""},{"path":[]},{"path":"/reference/try_att.html","id":null,"dir":"Reference","previous_headings":"","what":"TryCatch around RNetCDF::att.get.nc() — 
try_att","title":"TryCatch around RNetCDF::att.get.nc() — try_att","text":"TryCatch around RNetCDF::att.get.nc()","code":""},{"path":"/reference/try_att.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"TryCatch around RNetCDF::att.get.nc() — try_att","text":"","code":"try_att(nc, variable, attribute)"},{"path":"/reference/try_att.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"TryCatch around RNetCDF::att.get.nc() — try_att","text":"nc \"NetCDF\" object points NetCDF dataset. Found RNetCDF::open.nc. variable ID name variable attribute read, \"NC_GLOBAL\" global attribute. attribute Attribute name ID.","code":""},{"path":"/reference/try_att.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"TryCatch around RNetCDF::att.get.nc() — try_att","text":"Vector data type depends NetCDF variable. NetCDF variables type NC_CHAR, R type either character raw, specified argument rawchar. NC_STRING, R type character. Numeric variables read double precision default, smallest R type exactly represents external type used fitnum TRUE.","code":""},{"path":[]},{"path":"/reference/var_to_terra.html","id":null,"dir":"Reference","previous_headings":"","what":"Variable Array to SpatRast — var_to_terra","title":"Variable Array to SpatRast — var_to_terra","text":"Variable Array SpatRast","code":""},{"path":"/reference/var_to_terra.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Variable Array to SpatRast — var_to_terra","text":"","code":"var_to_terra(var, dap)"},{"path":"/reference/var_to_terra.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Variable Array to SpatRast — var_to_terra","text":"var numeric array dap dap description","code":""},{"path":"/reference/var_to_terra.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Variable Array to SpatRast — var_to_terra","text":"SpatRast","code":""},{"path":[]},{"path":"/reference/variable_meta.html","id":null,"dir":"Reference","previous_headings":"","what":"Find DAP variable metadata — variable_meta","title":"Find DAP variable metadata — variable_meta","text":"Find DAP variable metadata","code":""},{"path":"/reference/variable_meta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Find DAP variable metadata — variable_meta","text":"","code":"variable_meta(raw, verbose = TRUE)"},{"path":"/reference/variable_meta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Find DAP variable metadata — variable_meta","text":"raw data.frame verbose emit messages","code":""},{"path":"/reference/variable_meta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Find DAP variable metadata — variable_meta","text":"data.frame","code":""},{"path":[]},{"path":"/reference/vrt_crop_get.html","id":null,"dir":"Reference","previous_headings":"","what":"VRT Crop — vrt_crop_get","title":"VRT Crop — vrt_crop_get","text":"VRT Crop","code":""},{"path":"/reference/vrt_crop_get.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"VRT Crop — vrt_crop_get","text":"","code":"vrt_crop_get( URL = NULL, catalog = NULL, AOI = NULL, grid = NULL, varname = NULL, start = NULL, end = NULL, toptobottom = FALSE, verbose = TRUE 
)"},{"path":"/reference/vrt_crop_get.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"VRT Crop — vrt_crop_get","text":"URL local file path URL catalog subset open.dap catalog AOI sf SpatVect point polygon extract data grid list containing extent (), crs varname variable name extract (e.g. tmin) start non \"dated\" items, start can called index end non \"dated\" items, end can called index toptobottom data inverse? verbose dap_summary printed?","code":""},{"path":"/reference/vrt_crop_get.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"VRT Crop — vrt_crop_get","text":"SpatRaster","code":""},{"path":[]},{"path":"/reference/writeDodsrc.html","id":null,"dir":"Reference","previous_headings":"","what":"Write dodsrc file — writeDodsrc","title":"Write dodsrc file — writeDodsrc","text":"Write dodsrc file valid netrc file","code":""},{"path":"/reference/writeDodsrc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Write dodsrc file — writeDodsrc","text":"","code":"writeDodsrc(netrcFile = getNetrcPath(), dodsrcFile = \".dodsrc\")"},{"path":"/reference/writeDodsrc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Write dodsrc file — writeDodsrc","text":"netrcFile character. path netrc file . dodsrcFile path dodsrc file want write default go home directory, advised","code":""},{"path":"/reference/writeDodsrc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Write dodsrc file — writeDodsrc","text":"character vector containing netrc file path","code":""},{"path":[]},{"path":"/reference/writeNetrc.html","id":null,"dir":"Reference","previous_headings":"","what":"Write netrc file — writeNetrc","title":"Write netrc file — writeNetrc","text":"Write netrc file valid accessing urs.earthdata.nasa.gov","code":""},{"path":"/reference/writeNetrc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Write netrc file — writeNetrc","text":"","code":"writeNetrc( login, password, machine = \"urs.earthdata.nasa.gov\", netrcFile = getNetrcPath(), overwrite = FALSE )"},{"path":"/reference/writeNetrc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Write netrc file — writeNetrc","text":"login character. Email address used logging earthdata password character. Password associated login. machine machine logging netrcFile character. path netrc file written. default go home directory, advised overwrite logical. overwrite existing netrc file?","code":""},{"path":"/reference/writeNetrc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Write netrc file — writeNetrc","text":"character vector containing netrc file path","code":""},{"path":"/reference/writeNetrc.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Write netrc file — writeNetrc","text":"database accessed user's credentials. netrc file storing login password information required. See . 
set must following (1) Login EarthData (2) Go Applications > Authorized Apps (3) NASA GESDISC DATA ARCHIVE Approved Applications list, select APPROVE APPLICATIONS (4) Find NASA GESDISC DATA ARCHIVE click AUTHORIZE instruction register set DataSpace credential.","code":""},{"path":[]},{"path":"/reference/writeNetrc.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Write netrc file — writeNetrc","text":"","code":"if (FALSE) { writeNetrc( login = \"XXX@email.com\", password = \"yourSecretPassword\" ) }"}]
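For context, the live catalog referenced throughout these reference pages can be inspected directly. A minimal sketch, assuming only the `arrow` package and the release URL documented for `read_live_catalog()` (the expected dimensions come from the `params` documentation above):

```r
# Sketch: download the latest climateR catalog release and inspect it with arrow.
# The URL is the documented default for read_live_catalog().
library(arrow)

url <- "https://github.com/mikejohnson51/climateR-catalogs/releases/latest/download/catalog.parquet"
tmp <- tempfile(fileext = ".parquet")
download.file(url, tmp, mode = "wb")

catalog <- read_parquet(tmp)
dim(catalog)    # expected roughly 107,857 rows x 28 columns
names(catalog)  # metadata columns available for filtering
```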
diff --git a/docs/sitemap.xml b/docs/sitemap.xml
index 6d4044a..8e6752f 100644
--- a/docs/sitemap.xml
+++ b/docs/sitemap.xml
@@ -18,6 +18,12 @@
/articles/03-intro-climatepy.html
+
+ /articles/04-stream-morph.html
+
+
+ /articles/05-mros-climateR.html
+
/articles/index.html
diff --git a/vignettes/01-intro.Rmd b/vignettes/01-intro.Rmd
index 76462ca..be31d52 100644
--- a/vignettes/01-intro.Rmd
+++ b/vignettes/01-intro.Rmd
@@ -74,7 +74,7 @@ ggplot() +
```{r}
AOI = st_as_sfc(st_bbox(c(xmin = -112, xmax = -105, ymax = 39, ymin = 34), crs = 4326))
-g = getGridMET(AOI,
+g = getGridMET(st_as_sf(AOI),
varname = "vs",
startDate = "2018-09-01")
```
@@ -223,7 +223,7 @@ Finally, data gathering is not limited to areal extents and can be retrieved as
```{r}
ts = data.frame(lng = -105.0668, lat = 40.55085) %>%
- sf::st_as_sf(coords = c('lng', 'lat'), crs = 4326) %>%
+ st_as_sf(coords = c('lng', 'lat'), crs = 4326) %>%
getGridMET(varname = c("pr", 'srad'),
startDate = "2021-01-01",
endDate = "2021-12-31")
@@ -248,12 +248,13 @@ ggplot(data = ts, aes(x = date, y = cumsum(pr))) +
```{r}
future = getMACA(geocode("Fort Collins", pt = TRUE),
- model = 5, varname = "tasmax",
- startDate = "2050-01-01", endDate = "2050-01-31")
+ model = 5,
+ varname = "tasmax",
+ startDate = "2050-01-01",
+ endDate = "2050-01-31")
future_long = pivot_longer(future, -date)
-
ggplot(data = future_long, aes(x = date, y = value, col = name)) +
geom_line() +
theme_linedraw() +
diff --git a/vignettes/examples.Rmd b/vignettes/04-stream-morph.Rmd
similarity index 57%
rename from vignettes/examples.Rmd
rename to vignettes/04-stream-morph.Rmd
index 6439d1a..01bd3ac 100644
--- a/vignettes/examples.Rmd
+++ b/vignettes/04-stream-morph.Rmd
@@ -21,61 +21,57 @@ options(width=100)
knitr::opts_chunk$set(
collapse = TRUE,
comment = "#>",
- fig.path = 'man/figures/',
out.width = "100%",
- #warning = FALSE,
+ warning = FALSE,
message = TRUE
)
```
# Examples
-We all can agree that access to tools to perform spatial operations has revolutionized the field of hydrological sciences by offering a powerful platform to access satellite imagery, reanalysis products, and diverse datasets crucial for spatial analysis and hydrological modeling. These tools facilitate the retrieval and processing of vast amounts of geospatial data, enabling researchers and practitioners to perform comprehensive analyses at various spatial and temporal scales, which in turn greatly benefits the field of hydrology.
+We all can agree that access to tools to perform spatial operations has revolutionized the field of hydrologic sciences by offering powerful platforms to access satellite imagery, reanalysis products, and diverse datasets crucial for spatial analysis and hydrologic modeling. These tools facilitate the retrieval and processing of vast amounts of geospatial data, allowing researchers and practitioners to perform comprehensive analyses at various spatial and temporal scales, which in turn greatly benefits the field of hydrology.
-Our team at Lynker have developed [climateR](https://github.com/mikejohnson51/climateR.git) and [climatePy](https://github.com/anguswg-ucsb/climatePy).The key advantages of using platforms like climateR is the accessibility to a wealth of satellite imagery spanning multiple decades. With archives of satellite data readily available, hydrologists can track changes in land cover, monitor hydrological phenomena, and assess the impacts of climate change on water resources. The ability to access and analyze historical data allows for the identification of long-term trends, facilitating better understanding and prediction of hydrological processes.
+Our team at Lynker has developed [climateR](https://github.com/mikejohnson51/climateR.git) and [climatePy](https://github.com/LynkerIntel/climatePy). A key advantage of using platforms like climateR is access to a wealth of satellite imagery spanning multiple decades. With archives of satellite data readily available, hydrologists can track changes in land cover, monitor hydrologic phenomena, and assess the impacts of climate change on water resources. The ability to access and analyze historical data allows for the identification of long-term trends, facilitating better understanding and prediction of hydrologic processes.
-Furthermore, climateR foster collaboration and knowledge sharing within the hydrological community. It provide a platform for scientists and researchers across the globe to access standardized datasets, share methodologies, and collaborate on solving complex hydrological challenges. Also, puts forth an easy and accessible way to perform large spatiotemporal operations that support any NOAA effort. This collaborative environment encourages the development of innovative models and techniques for water resource management and decision-making.
+Furthermore, climateR fosters collaboration and knowledge sharing within the hydrologic community. It provides a platform for scientists and researchers across the globe to access standardized datasets, share methodologies, and collaborate on solving complex hydrologic challenges. It also puts forth an easy and accessible way to perform large spatiotemporal operations that support any NOAA effort. This collaborative environment encourages the development of innovative models and techniques for water resource management and decision-making.
Here we demonstrate several examples of how to access these databases using climateR and perform massive spatial and temporal aggregations.
-## Massive Spatial Aggregation with TerraClimate
+## Massive Spatial Aggregation of TerraClimate
The integration of reanalysis products and various datasets in this platform enables users to perform sophisticated spatial operations and analyses. Hydrologists can aggregate data over specific points or polygons, allowing for the extraction of critical information regarding water resources, such as precipitation patterns, evapotranspiration rates, and soil moisture content. This facilitates the characterization of watersheds, the assessment of water availability, and the prediction of potential flood or drought events.
-Here I want to extract long term historical mean value of TerraClimate bands for all NOAA Next Generation (NextGen) National Hydrologic Geospatial Fabric (hydrofabric) divides over the entire CONUS. As you no doubt surmised, this is a very expensive task to go over all monthly TerraClimate dataset for the past 20 years and and average all the byt with climateR this will be an easy and strait forward task.
+Here I want to extract the long-term historical mean of the TerraClimate bands for all NOAA Next Generation (NextGen) National Hydrologic Geospatial Fabric (hydrofabric) divides over the entire CONUS. As you no doubt surmised, going through the full monthly TerraClimate record for the past 20 years and averaging it all is a very expensive task, but with climateR it becomes easy and straightforward.
-One can access the hydrofabric in this case NextGen hydrofabric form
+One can access the hydrofabric, in this case the NextGen hydrofabric, from the Lynker-spatial S3 bucket:
-```r
+```{r, eval = FALSE}
library(hydrofabric)
library(lubridate)
# Then specify the S3 bucket and file path
bucket_name <- "lynker-spatial"
-file_key <- "v20/gpkg/nextgen_12.gpkg"
+file_key <- "v20/gpkg/nextgen_12.gpkg"
# Now download the GeoPackage file from S3 to a temporary file
temp_file <- tempfile(fileext = ".gpkg")
-aws.s3::s3read_using(file = temp_file, FUN = get_object, object = file_key, bucket = bucket_name)
+s3read_using(file = temp_file,
+ FUN = get_object,
+ object = file_key,
+ bucket = bucket_name)
# Finally read the GeoPackage file into an sf object
gpkg_sf <- st_read(temp_file)
```
-Now we can extract individual divide files for given VPU and extract data from TerraClimate
-
-
-```r
+Now we can read the divides layer for a given VPU and extract data from TerraClimate:
+```{r, eval = FALSE}
# List of VPU's for CONUS
-vpu_list = list("01","02","03S","03W","03N","04","05","06","07","08","09","10U","10L","11",
- "12","13","14","15","16","17","18")
-
-# List of columns to be renamed
-columns_to_rename <- c("mean,PDSI", "mean,aet", "mean,soil", "mean,def", "mean,ppt", "mean,q", "mean,tmin", "mean,tmax", "mean,pet")
+vpu_list = vpu_boundaries$VPUID[1:21]
-# New names for the columns
-new_column_names <- c("PDSI","aet","soil","def","ppt","q","tmin","tmax","pet")
+# Variables of Interest
+vars <- c("PDSI","aet","soil","def","ppt","q","tmin","tmax","pet")
# Loop through the VPU's and extract data and time the execution
system.time({
@@ -85,32 +81,44 @@ system.time({
# Download the GeoPackage file from S3 to a temporary file
temp_file <- tempfile(fileext = ".gpkg")
- aws.s3::s3read_using(file = temp_file, FUN = get_object, object = file_key, bucket = bucket_name)
+ s3read_using(file = temp_file,
+ FUN = get_object,
+ object = file_key,
+ bucket = bucket_name)
# Just read the divides
divides = read_sf(temp_file, "divides")
- # Use climateR to extract the variables between 2000-21
- out_raster <- getTerraClim(AOI = divides,
- varname = c(new_column_names),
- startDate = "2000-01-01",
- endDate = "2021-01-01")
+ # Use climateR to extract the variables between 2000-2021
+ out_raster <- getTerraClim(AOI = divides,
+ varname = vars,
+ startDate = "2000-01-01",
+ endDate = "2021-01-01")
- # Use rast to do a temporal mean aggregation and Zonal to do a spatial aggregation using divide_id
- output = execute_zonal(data = rast(lapply(out_raster, mean)), geom = div, fun = "mean", ID = "divide_id", join = FALSE)
+ # Use rast() to do a temporal mean aggregation and zonal to do a spatial aggregation using divide_id
+ output = execute_zonal(data = rast(lapply(out_raster, mean)),
+ geom = div,
+ fun = "mean",
+ ID = "divide_id",
+ join = FALSE)
# Finally write the data frame to a parquet file
write_parquet(output, sprintf("/your_path/conus_terraclimate_vpu_%s.parquet", vpu))
}
})
```
-We just calculated 20 year average of 9 different bands of TerraClimate over 882,945 divides that cover CONUS and it took under an hour to complete (2472.777 seconds) on my normal laptop!! This is very impressive.
+
+We just calculated the 20-year average of 9 different variables over the 882,945 divides that cover CONUS in under an hour (2472.777 seconds, roughly 0.69 hours) on my normal laptop! This is very impressive.
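+
+For reference, the per-VPU parquet files written above could later be combined into a single CONUS table. A minimal sketch, assuming the `arrow` and `dplyr` packages and the file naming used in the loop:
+
+```{r, eval = FALSE}
+# Combine the per-VPU TerraClimate means written above (path and pattern assumed)
+files <- list.files("/your_path",
+                    pattern = "^conus_terraclimate_vpu_.*\\.parquet$",
+                    full.names = TRUE)
+
+conus_terraclimate <- dplyr::bind_rows(lapply(files, arrow::read_parquet))
+```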
## Comparison to GEE
-Now lets compare this to the very well known and frequently used Google Earth Engine (GEE). But one can not process all 882,945 divides at the same time in GEE and my personal experience showed that batches of 200 divides is the ideal size not to get the infamous "Computation Timed Out Error". So we can write a script to perform batch operation such as below.
+Now let's compare this to the well-known and frequently used Google Earth Engine (GEE).
+
+To start, we cannot process all 882,945 divides at the same time in GEE, and my personal experience has shown that batches of 200 divides are the ideal size to avoid the infamous `Computation Timed Out Error`.
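+
+Before turning to the GEE script itself, here is a minimal sketch of how an sf object of divides could be split into batches of roughly 200 features ahead of uploading them as Earth Engine assets (illustrative only; it assumes a `divides` object like the one read earlier):
+
+```{r, eval = FALSE}
+# Split the divides into batches of ~200 features before exporting them
+# as Earth Engine assets (purely illustrative).
+batch_id <- ceiling(seq_len(nrow(divides)) / 200)
+batches  <- split(divides, batch_id)
+length(batches)
+```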
-```javascript
+So we can write a script to perform the batch operations, such as the one below.
+
+```{js, eval = FALSE}
// This requires uploading the divides into EE assets
// A for loop to execute 100 batches of 200 divides as an example
for (var i=1; i<100; i++){
@@ -186,18 +194,19 @@ function runExtract(data, num, first){
Export.table.toDrive(data, exp_name, 'TerraClimate_exports', exp_name, 'CSV');
}
```
-**Breaking this into batches 200 each two batch takes about 1-3 hours to complete (see figure below) then it will takes weeks to extract all data for 882,945 divides using GEE!! whereas we have done it in less than a hour with climateR.**
-
-
-
+**Breaking this into batches of 200, each set of two batches takes about 1-3 hours to complete (see figure below). Based on this, for the scale of our application, GEE would require weeks to finish!**
-## Massive Temporal and Spatial Aggregation with GLDAS
-Now lets say we have even more computationally demanding task as we try to do a historical mean over a daily product form GLDAS. In this case we can break our period into chunks (e.g., 4 years) and extract data.
+```{r, echo = FALSE}
+knitr::include_graphics("../man/figures/ee_task.png")
+```
+
+## Massive Temporal and Spatial Aggregation with GLDAS
+Now let's say we have an even more computationally demanding task, such as finding a historical mean over a daily product from GLDAS. In this case we can break our period into chunks (e.g., 4 years) and extract the data.
-```r
+```{r, eval = FALSE}
# Define start and end dates
start_date <- ymd("2004-01-01")
end_date <- ymd("2021-01-01")
@@ -205,11 +214,8 @@ end_date <- ymd("2021-01-01")
# Create a sequence of dates with a step of 4 years
date_seq <- seq(start_date, end_date, by = "4 years")
-# List of columns to be renamed
-columns_to_rename <- c("mean,qsb_tavg", "mean,qs_tavg", "mean,gws_tavg", "mean,esoil_tavg", "mean,ecanop_tavg", "mean,canopint_tavg", "mean,avgsurft_tavg")
-
# New names for the columns
-new_column_names <- c("qsb_tavg", "qs_tavg", "gws_tavg", "esoil_tavg", "ecanop_tavg", "canopint_tavg", "avgsurft_tavg")
+vars <- c("qsb_tavg", "qs_tavg", "gws_tavg", "esoil_tavg", "ecanop_tavg", "canopint_tavg", "avgsurft_tavg")
# Loop through the VPU's and extract data and time the execution
system.time({
@@ -219,7 +225,11 @@ system.time({
# Download the GeoPackage file from S3 to a temporary file
temp_file <- tempfile(fileext = ".gpkg")
- aws.s3::s3read_using(file = temp_file, FUN = get_object, object = file_key, bucket = bucket_name)
+
+ s3read_using(file = temp_file,
+ FUN = get_object,
+ object = file_key,
+ bucket = bucket_name)
# Just read the divides
divides = read_sf(temp_file, "divides")
@@ -234,10 +244,10 @@ system.time({
# Use climateR to extract the variables between 2004-21
out_raster <- getGLDAS(AOI = div,
- varname = c(new_column_names),
- model = "CLSM025_DA1_D.2.2",
- startDate = current_start,
- endDate = current_end)
+ varname = vars,
+ model = "CLSM025_DA1_D.2.2",
+ startDate = current_start,
+ endDate = current_end)
output = execute_zonal(data = rast(lapply(out_raster, mean)), geom = div, fun = "mean", ID = "divide_id", join = FALSE)
current_start_year <- as.character(year(current_start))
@@ -252,9 +262,11 @@ system.time({
We can also use custom datasets form our local drive or s3 bucket to perform different aggregations. Here as an example we can access POLARIS soil dataset and do just a spatial average of multiple virtual rasters over all our divide polygons.
-```r
+This collection of POLARIS data has been resampled from its native 30m resolution to a 300m COG.
+
+```{r, eval = FALSE}
vars = c("alpha", "om", "ph")
-data = rast(glue::glue('/vsis3/lynker-spatial/gridded-resources/polaris300/{vars}_mean_0_5.tif'))
+data = rast(glue('/vsis3/lynker-spatial/gridded-resources/polaris300/{vars}_mean_0_5.tif'))
system.time({
for (vpu in vpu_list) {
@@ -263,12 +275,18 @@ system.time({
# Download the GeoPackage file from S3 to a temporary file
temp_file <- tempfile(fileext = ".gpkg")
- aws.s3::s3read_using(file = temp_file, FUN = get_object, object = file_key, bucket = bucket_name)
+
+ s3read_using(file = temp_file,
+ FUN = get_object, object = file_key,
+ bucket = bucket_name)
# Just read the divides
divides = read_sf(temp_file, "divides")
- polaris = execute_zonal(data = data, geom = divides, fun = "mean", ID = "divide_id", join = FALSE)
+ polaris = execute_zonal(data = data,
+ geom = divides, fun = "mean",
+ ID = "divide_id",
+ join = FALSE)
# Finally write the data frame to a parquet file
write_parquet(output, sprintf("/your_path/conus_polaris_vpu_%s.parquet", vpu))
@@ -276,27 +294,7 @@ system.time({
})
```
-## Extract at Points
-
-We can also extract using coordinates of point data e.g., locations of stations to extract values from POLARIS
-
-```r
-r = rast(
- c(
- '/vsicurl/http://hydrology.cee.duke.edu/POLARIS/PROPERTIES/v1.0/vrt/theta_r_mean_0_5.vrt',
- '/vsicurl/http://hydrology.cee.duke.edu/POLARIS/PROPERTIES/v1.0/vrt/theta_s_mean_0_5.vrt'
- )
-)
-
-# Read datafarme contain lat and long coordinates
-pts = read_parquet('your_path/data.parquet') %>%
- st_as_sf(coords = c('X', "Y"), crs = 4326, remove = FALSE) %>%
- st_transform(st_crs(r))
-
-system.time({ t = extract(r, pts) })
-write_parquet(t, "your_path/polaris_data.parquet")
-```
# Conclusion
-In summary, the utilization of climateR and climatePy significantly benefits hydrological sciences by providing unprecedented access to diverse datasets and satellite imagery. These tools empower researchers, policymakers, and water resource managers to conduct in-depth spatial analyses, ultimately enhancing our understanding of hydrological processes and improving water resource management strategies for a more sustainable future.
\ No newline at end of file
+In summary, using climateR significantly benefits hydrologic sciences by providing unprecedented access to diverse datasets. It empowers researchers, policymakers, and water resource managers to conduct in-depth spatial analyses, ultimately enhancing our understanding of hydrologic processes and improving water resource management strategies for a more sustainable future.
\ No newline at end of file
diff --git a/vignettes/mros-climateR.Rmd b/vignettes/05-mros-climateR.Rmd
similarity index 99%
rename from vignettes/mros-climateR.Rmd
rename to vignettes/05-mros-climateR.Rmd
index 35d69a5..1b50a37 100644
--- a/vignettes/mros-climateR.Rmd
+++ b/vignettes/05-mros-climateR.Rmd
@@ -6,7 +6,7 @@ author:
affiliation_url: https://lynker.com
output: distill::distill_article
vignette: >
- %\VignetteIndexEntry{intro}
+ %\VignetteIndexEntry{mros}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---