## Introduction
The geoknife package was created to support web-based geoprocessing of large gridded datasets according to their overlap with landscape (or aquatic/ocean) features that are often irregularly shaped. geoknife creates data access and subsequent geoprocessing requests for the USGS’s Geo Data Portal to carry out on a web server. The results of these requests are available for download after the processes have been completed. This type of workflow has three main advantages: 1) it allows the user to avoid downloading large datasets, 2) it avoids reinventing the wheel for the creation and optimization of complex geoprocessing algorithms, and 3) computing resources are dedicated elsewhere, so geoknife operations do not have much of an impact on a local computer.
Because communication with web resources is central to geoknife operations, users must have an active internet connection. geoknife interacts with a remote server to discover processing capabilities, find already available geospatial areas of interest (these are normally user-uploaded shapefiles), get gridded dataset characteristics, execute geoprocessing requests, and get geoprocessing results.
The main elements of setting up and carrying out a geoknife 'job' (a `geojob` object) include defining the feature of interest 'stencil' (a `webgeom` or `simplegeom` object), the gridded web dataset 'fabric' to be processed (a `webdata` object), and the processing algorithm 'knife' parameters (a `webprocess` object). The status of the geojob can be checked with `check`, and output can be loaded into a data.frame with `result`. See below for more details.
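Putting those pieces together, a minimal end-to-end job looks like the sketch below. This assumes an active internet connection and that the Geo Data Portal services are reachable; the PRISM dataset and the state boundary are built-in shortcuts described later in this document:

```r
library(geoknife)

# fabric: a web-hosted gridded dataset (PRISM, a built-in shortcut)
fabric <- webdata('prism')

# stencil: the spatial feature(s) to summarize over (a built-in state boundary)
stencil <- webgeom('state::Wisconsin')

# knife omitted: the default webprocess() settings are used;
# wait = TRUE makes R block until the remote job finishes
job <- geoknife(stencil, fabric, wait = TRUE)

# load the finished output into a data.frame
data <- result(job)
```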
## Installation

To install the stable version of the geoknife package with dependencies:

```r
install.packages("geoknife",
                 repos = c("https://owi.usgs.gov/R", "https://cran.rstudio.com/"),
                 dependencies = TRUE)
```

Or to install the current development version of the package:

```r
install.packages("devtools")
devtools::install_github('USGS-R/geoknife')
```
## geoknife concepts

geoknife has abstractions for web-available gridded data, geospatial features, and geoprocessing details. These abstractions are the basic geoknife arguments of `fabric`, `stencil`, and `knife`.
 * `fabric` defines the web data that will be accessed, subset, and processed (see the `fabric` section for more details). These data are limited to gridded datasets that are web-accessible through the definitions presented in the OPeNDAP section. Metadata for `fabric` include time, the URL for the data, and variables.
 * `stencil` is the geospatial feature (or set of features) that will be used to delineate specific regions of interest on the `fabric` (see the `stencil` section for more details). `stencil` can include point or polygon groupings of various forms (including classes from the sp R package).
 * `knife` defines the way the analysis will be performed, including the algorithm and version used, the URL that receives the processing request, the statistics returned, and the format of the results (see the `knife` section for more details).
 * The `geoknife()` function takes the `fabric`, `stencil`, and `knife`, and returns a `geojob`, which is a live geoprocessing request that will be carried out on a remote web server (see the `geojob` section for more details). The `geojob` can be checked by users, and results can be parsed and loaded into the R environment for analyses.
## Remote processing basics

Because geoknife executes geospatial computations on a remote web server, the workflow for geoprocessing operations may feel a bit foreign to users who usually perform their analyses on a local computer. To find available datasets and their details (variables, time range, etc.), geoknife must query remote servers, because data for use with geoknife are typically hosted on open access servers near the processing service. These operations are covered in detail below, but this section is designed to provide a quick overview.
Interactions with web resources may take on the following forms, and each involves separate requests to various web servers:

 * using the `query` function to figure out what data exist for `fabric`. This function will request data from a CSW (catalog service for the web) resource and return results, or, if a dataset is already specified, it can be used to query for the variables or time dimension.
 * using the `query` function to use a web resource for the geometry of `stencil`, including a US State, Level III Ecoregion, and many others.
 * submitting a `geojob` to be processed externally
 * checking the status of a `geojob`
 * loading the results from a successful `geojob`
## Quick start guide

There are various ways to get up and running quickly with geoknife. See sections below for additional details on any of the following operations. As mentioned above, geoknife has the basic arguments of `fabric`, `stencil`, and `knife`. `knife` is an optional argument, and if it is not used, a default `knife` will be used to specify the processing details.
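As a sketch of what passing an explicit `knife` looks like (the values here are illustrative; omitting the argument is equivalent to passing a default `webprocess()`):

```r
library(geoknife)

# a knife with non-default settings: wait for completion and email the result
knife <- webprocess(wait = TRUE, email = 'fake.email@gmail.com')

# the knife is the optional third argument to geoknife()
job <- geoknife(stencil = webgeom('state::Wisconsin'),
                fabric = webdata('prism'),
                knife = knife)
```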
There are many different ways to specify geometry (`stencil`) for geoknife. The two basic functions that support building `stencil` objects are `simplegeom` and `webgeom`:
```r
library(geoknife)
```
Use a single longitude latitude pair as the geometry with the `simplegeom` function:

```r
stencil <- simplegeom(c(-89, 46.23))
```
Or specify a collection of named points in a data.frame (note that naming is important for multi-features because it specifies how the results are filtered):

```r
stencil <- simplegeom(data.frame(
    'point1' = c(-89, 46),
    'point2' = c(-88.6, 45.2)))
```
Use a web-available geometry dataset with the `webgeom` function to specify state boundaries:

```r
stencil <- webgeom('state::New Hampshire')
stencil <- webgeom('state::New Hampshire,Wisconsin,Alabama')
```
or HUC8s (hydrologic unit codes):

```r
stencil <- webgeom('HUC8::09020306,14060009')
```
Display the stencil:

```r
stencil
## An object of class "webgeom":
## url: http://cida.usgs.gov/gdp/geoserver/wfs
## geom: derivative:wbdhu8_alb_simp
## attribute: HUC_8
## values: 09020306, 14060009
## wfs version: 1.1.0
```
See what other HUCs could be used via the `query` function:

```r
HUCs <- query(stencil, 'values')
```

There are thousands of results, but `head()` will only display a few of them:

```r
head(HUCs)
## [1] "11060006" "11060005" "11060001" "11060004" "11060003"
## [6] "11060002"
```
The Geo Data Portal's web data catalog is quite extensive, and includes many datasets that can all be processed with geoknife. Check it out at cida.usgs.gov/gdp. Even that catalog is not a complete list of all the relevant datasets that can be accessed and processed. The geoknife package has a number of quick access datasets built in (similar to quick start `webgeom` objects).
An example of a quick start dataset:

```r
fabric <- webdata('prism')
fabric
## An object of class "webdata":
## times: 1895-01-01T00:00:00Z, 1899-01-01T00:00:00Z
## url: https://cida.usgs.gov/thredds/dodsC/prism_v2
## variables: ppt
```
which can be a starting point for the PRISM dataset, as the fields can be modified:

```r
times(fabric) <- c('2002-01-01', '2010-01-01')
variables(fabric) <- c('tmx')
fabric
## An object of class "webdata":
## times: 2002-01-01T00:00:00Z, 2010-01-01T00:00:00Z
## url: https://cida.usgs.gov/thredds/dodsC/prism_v2
## variables: tmx
```
```r
job <- geoknife(stencil, fabric)
```
Use convenience functions to check on the job:

```r
check(job)
running(job)
error(job)
successful(job)
```
Cancel a running job:

```r
job <- cancel(job)
```
Run the job again, but have R wait until the process is finished:

```r
job <- geoknife(stencil, fabric, wait = TRUE)
```
Load up the output and plot it:

```r
data <- result(job)
plot(data[, 1:2], ylab = variables(fabric))
```
For long-running processes, it often makes sense to use an email listener:

```r
job <- geoknife(webgeom('state::Wisconsin'), fabric = 'prism', email = 'fake.email@gmail.com')
```
## The feature of interest (`stencil`)

The stencil concept in geoknife represents the area(s) of interest for geoprocessing. `stencil` can be represented by two classes in geoknife: `simplegeom` and `webgeom`. Any other classes that can be coerced into either of these two classes (such as `data.frame`) can also be used.
### The `simplegeom` object

The simplegeom class is designed to hold spatial information from the R environment and make it available to the processing engine. `simplegeom` is effectively a wrapper for the sp package's `SpatialPolygons` class, but it also coerces a number of other types into this class. For example:
Points can be specified as longitude latitude pairs:

```r
stencil <- simplegeom(c(-89, 45.43))
```
or as a data.frame:

```r
stencil <- simplegeom(data.frame(
    'point1' = c(-89, 46),
    'point2' = c(-88.6, 45.2)))
```
Also, a `SpatialPolygons` object can be used as well (example from the sp package):

```r
library(sp)

Sr1 = Polygon(cbind(c(2, 4, 4, 1, 2), c(2, 3, 5, 4, 2)))
Sr2 = Polygon(cbind(c(5, 4, 2, 5), c(2, 3, 2, 2)))
Sr3 = Polygon(cbind(c(4, 4, 5, 10, 4), c(5, 3, 2, 5, 5)))
Sr4 = Polygon(cbind(c(5, 6, 6, 5, 5), c(4, 4, 3, 3, 4)), hole = TRUE)

Srs1 = Polygons(list(Sr1), "s1")
Srs2 = Polygons(list(Sr2), "s2")
Srs3 = Polygons(list(Sr3, Sr4), "s3/4")

stencil <- simplegeom(Srl = list(Srs1, Srs2, Srs3),
                      proj4string = CRS("+proj=longlat +datum=WGS84"))
## Warning in value[[3L]](cond): SpatialPolygons support is deprecated.
```
### The `webgeom` object

The webgeom class is designed to hold references to web feature service (WFS) details and make them available to the processing engine.

Similar to `webdata` (see below), the `webgeom` class has public fields that can be set and accessed using simple methods. Public fields in `webgeom`:

 * `url`: the WFS endpoint to use for the data
 * `geom`: the feature collection name (can be namespaced shapefile names)
 * `attribute`: the attribute of the feature collection to be used
 * `values`: the values of the chosen attribute to use (or `NA` for all)
 * `version`: the WFS version to use

To create a default webgeom object:

```r
stencil <- webgeom()
```
The user-level information in webgeom is all available with the webgeom "show" method (or print):

```r
stencil
## An object of class "webgeom":
## url: https://cida.usgs.gov/gdp/geoserver/wfs
## geom: NA
## attribute: NA
## values: NA
## wfs version: 1.1.0
```
The public fields can be accessed by using the field name:

```r
geom(stencil) <- "sample:CONUS_states"
attribute(stencil) <- "STATE"
values(stencil) <- c("Wisconsin", "Maine")
```
There are some built-in `webgeom` templates that can be used to figure out the pattern, or to use these datasets for analysis. Currently, the package only supports US States, Level III Ecoregions, or HUC8s:

```r
stencil <- webgeom('state::Wisconsin')
webgeom('state::Wisconsin,Maine')
webgeom('HUC8::09020306,14060009')
webgeom('ecoregion::Colorado Plateaus,Driftless Area')
```
### The `query` function for `webgeom`

The `query` function on `webgeom` can be used to find possible inputs for each public field (other than `version` and `url` currently):
```r
query(stencil, 'geoms')
## [1] "sample:Alaska"
## [2] "upload:CIDA_TEST_"
## [3] "sample:CONUS_Climate_Divisions"
## [4] "sample:CONUS_states"
## [5] "sample:CSC_Boundaries"
## [6] "sample:Landscape_Conservation_Cooperatives"
## [7] "sample:FWS_LCC"
## [8] "sample:simplified_huc8"
## [9] "sample:Ecoregions_Level_III"
## [10] "sample:Counties"
## [11] "sample:nps_boundary_2013"
## [12] "upload:nrhu_selection"
## [13] "upload:nrhu_selection_Gallatin"
## [14] "sample:simplified_HUC8s"
## [15] "draw:test"
```
```r
query(stencil, 'attributes')
## [1] "STATE"
```
## The dataset of interest (`fabric`)

The fabric concept in geoknife represents the gridded dataset that will be operated on by the tool. `fabric` can be a time-varying dataset (such as PRISM) or a spatial snapshot coverage dataset (such as the NLCD). At present, `fabric` is limited to datasets that can be accessed using the OPeNDAP protocol or WMS (web map service). Most helper functions in geoknife, including `query(fabric, 'variables')`, tend to work better for OPeNDAP datasets.
### The `webdata` object

The webdata class holds all the important information for web datasets in order to make them available for processing by geoknife's outsourced geoprocessing engine, the Geo Data Portal. Public fields in `webdata`:

 * `times`: a POSIXct vector of length 2. This specifies the start and end time of the process request. If `times()[1]` is `NA`, the start time will be the beginning of the dataset. If `times()[2]` is `NA`, the end time will be the end of the dataset. `times` must be `as.POSIXct(c(NA, NA))` for datasets without a time dimension.
 * `url`: a character for the location of a web-available dataset. This URL will be queried for data access and used for the processing task.
 * `variables`: a character vector for the variables of the dataset to use. Must be valid variables that exist within the dataset specified with `url`.

To create a default webdata object:

```r
fabric <- webdata()
```
The user-level information in webdata is all available with the webdata "show" method (or print):

```r
fabric
## An object of class "webdata":
## times: NA, NA
## url: NA
## variables: NA
```
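Rather than starting from an empty object and setting each field, the same fields can also be supplied when the object is created. This sketch assumes the PRISM OPeNDAP endpoint shown elsewhere in this document:

```r
library(geoknife)

# supply url, variables, and times at creation instead of setting them one by one
fabric <- webdata(url = 'https://cida.usgs.gov/thredds/dodsC/prism',
                  variables = 'tmx',
                  times = as.POSIXct(c('1990-01-01', '2005-01-01')))
```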
The public fields can be accessed by using the field name:

```r
times(fabric)
## [1] NA NA
url(fabric) <- 'https://cida.usgs.gov/thredds/dodsC/prism'
variables(fabric) <- 'tmx'
times(fabric)[1] <- as.POSIXct('1990-01-01')
```
The `fabric` is specified using the `webdata` function. geoknife can access a catalog of webdata by using the `query` function:

```r
webdatasets = query('webdata')
length(webdatasets)
## [1] 190
```
Interrogating datasets can be done by printing the returned dataset list, which displays the title and the url of each dataset by default (this example truncates the 190 datasets to display 5):

```r
webdatasets[61:65]
## An object of class "datagroup":
## [1] Eighth degree-CONUS Daily Downscaled Climate Projections Minimum and Maximum Temperature
##    url: http://cida.usgs.gov/thredds/dodsC/dcp/conus_t
## [2] Eighth degree-CONUS Daily Downscaled Climate Projections Precipitation
##    url: http://cida.usgs.gov/thredds/dodsC/dcp/conus_pr
## [3] Future California Basin Characterization Model Downscaled Climate and Hydrology
##    url: http://cida.usgs.gov/thredds/dodsC/CA-BCM-2014/future
## [4] GLDAS Version 2.0 Noah 0.25 degree monthly data
##    url: http://hydro1.sci.gsfc.nasa.gov/dods/GLDAS_NOAH025_M.020
## [5] GLDAS Version 2.0 Noah 1.0 degree 3-hourly data
##    url: http://hydro1.sci.gsfc.nasa.gov/dods/GLDAS_NOAH10_3H.020
```
Finding additional information about a particular dataset is supported by `title()` and `abstract()`, which return the dataset titles and abstracts respectively:

```r
title(webdatasets[87])
## [1] "North Central River Forecasting Center - Quantitative Precipitation Estimate Archive"
abstract(webdatasets[87])
## [1] "Radar indicated-rain gage verified and corrected hourly precipitation estimate on a corrected ~4km HRAP grid."
```
Indexing datasets by order or by title is equivalent:

```r
fabric <- webdata(webdatasets[99])
evapotran <- webdata(webdatasets['Monthly Conterminous U.S. actual evapotranspiration data'])
```
To modify the times in `fabric`, use `times()`:

```r
times(fabric) <- c('1990-01-01', '2005-01-01')
```
Similar to `webgeom`, the query method can be used on `webdata` objects:

```r
query(fabric, 'times')
query(fabric, 'variables')
```
There are hundreds (or potentially thousands) of additional OPeNDAP datasets that will work with geoknife, but they need to be found through web searches or catalogs (e.g., www.esrl.noaa.gov/psd/thredds/dodsC/Datasets, apdrc.soest.hawaii.edu/data/data.php). One such example is Sea Surface Temperature from the Advanced Very High Resolution Radiometer (AVHRR) temperature sensing system. Specifying datasets such as this requires finding out the OPeNDAP endpoint (URL) for the dataset, and specifying it as the `url` to webdata (we found this example in the extensive apdrc.soest.hawaii.edu/data/data.php catalog):

```r
fabric = webdata(url = 'dods://apdrc.soest.hawaii.edu/dods/public_data/satellite_product/AVHRR/avhrr_mon')
```
`query` for `variables` doesn't work for this dataset, because it actually doesn't have units and therefore "valid" variables are not returned (instead you get an empty list). From the OPeNDAP endpoint, it is clear that this dataset has one variable of interest, which is called 'sst':

```r
variables(fabric) <- 'sst'
query(fabric, 'times')
## [1] "1985-01-01 UTC" "2003-05-01 UTC"
times(fabric) <- c('1990-01-01', '1999-12-31')
```
Plotting the July surface temperature of a spot on the Caspian Sea is done by:

```r
sst = result(geoknife(data.frame('caspian.sea' = c(51, 40)), fabric, wait = TRUE))
head(sst)
##     DateTime caspian.sea variable statistic
## 1 1990-01-01      11.250      sst      MEAN
## 2 1990-02-01      10.575      sst      MEAN
## 3 1990-03-01      10.350      sst      MEAN
## 4 1990-04-01      11.400      sst      MEAN
## 5 1990-05-01      14.925      sst      MEAN
## 6 1990-06-01      19.800      sst      MEAN

july.idx <- months(sst$DateTime) == 'July'
plot(sst$DateTime[july.idx], sst$caspian.sea[july.idx],
     type = 'l', lwd = 2, col = 'dodgerblue',
     ylab = 'Sea Surface Temperature (degC)', xlab = NA)
```
### The `query` function for `webdata`

The `query` function works on `webdata`, similar to how it works for `webgeom` objects. For the PRISM dataset specified above, the time range of the dataset can come from `query` with `times`:

```r
fabric = webdata('prism')
variables(fabric) <- 'ppt'
query(fabric, 'times')
## [1] "1895-01-01 UTC" "2013-02-01 UTC"
```
Likewise, variables with `variables`:

```r
query(fabric, 'variables')
## [1] "ppt" "tmx" "tmn"
```

Note that a variable has to be specified to use the `times` query:

```r
variables(fabric) <- NA
```

This will fail:

```r
query(fabric, 'times')
## Error in times_query(fabric, knife) :
##   variables cannot be NA for fabric argument
```
At present, the geoknife package does not have a query method for dataset urls.
## The `knife` object

The webprocess class holds all the important information for geoknife processing details for the outsourced geoprocessing engine, the Geo Data Portal. Public fields in `webprocess`:

 * `url`: a character for the location of the web processing service to be used
 * `algorithm`: a list specifying the algorithm to be used for processing. Defaults to Area Grid Statistics (weighted).
 * `version`: a character specifying the version of the web processing service to be used. Defaults to 1.0.0.
 * `processInputs`: a list of processing details for the specified `algorithm`. These details vary depending on `algorithm`, and this field is automatically reset when the `algorithm` field is set.
 * `wait`: a boolean that specifies whether to have R wait until the process is finished. Defaults to `FALSE`.
 * `email`: a character that specifies an email address to send the finished process result to.
send the finished process result to.webprocess
The query
function works on webprocess
,
similar to how it works for webgeom
and
webdata
objects. For a default webprocess
object, the available algorithms can be queried by:
<- webprocess()
knife query(knife, 'algorithms')
## $`Categorical Coverage Fraction`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCategoricalGridCoverageAlgorithm"
##
## $`OPeNDAP Subset`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageOPeNDAPIntersectionAlgorithm"
##
## $`Area Grid Statistics (unweighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureGridStatisticsAlgorithm"
##
## $`Area Grid Statistics (weighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm"
Changing the `webprocess` url will modify the endpoint for the query, and different algorithms may be available:

```r
url(knife) <- 'https://cida.usgs.gov/gdp/process/WebProcessingService'
query(knife, 'algorithms')
## $`Categorical Coverage Fraction`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCategoricalGridCoverageAlgorithm"
##
## $`OPeNDAP Subset`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageOPeNDAPIntersectionAlgorithm"
##
## $`Area Grid Statistics (unweighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureGridStatisticsAlgorithm"
##
## $`Area Grid Statistics (weighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm"
```
As noted above, the `algorithm` field in `webprocess` is a list, specifying the algorithm name and relative path to the algorithm endpoint. To access or change the algorithm:

```r
knife <- webprocess()
algorithm(knife) <- query(knife, 'algorithms')[1]
# -- or --
algorithm(knife) <- list('Area Grid Statistics (weighted)' =
    "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm")
```
Getting and setting `processInputs` for geoknife is currently in development. Check back later.
The `url` field in `webprocess` can be accessed and set as expected:

```r
url(knife) <- 'https://cida.usgs.gov/gdp/process/WebProcessingService'
```
### `wait`

The `wait` boolean in `webprocess` can be set during creation:

```r
knife <- webprocess(wait = TRUE)
knife
## An object of class "webprocess":
## url: https://cida.usgs.gov/gdp/process/WebProcessingService
## algorithm: Area Grid Statistics (weighted)
## web processing service version: 1.0.0
## process inputs:
##    SUMMARIZE_FEATURE_ATTRIBUTE: false
##    SUMMARIZE_TIMESTEP: false
##    REQUIRE_FULL_COVERAGE: true
##    DELIMITER: COMMA
##    STATISTICS:
##    GROUP_BY:
## wait: TRUE
## email: NA
```
### `email`

The `email` field in `webprocess` can be accessed and set as expected:

```r
knife <- webprocess(email = 'fake.email@gmail.com')
knife
## An object of class "webprocess":
## url: https://cida.usgs.gov/gdp/process/WebProcessingService
## algorithm: Area Grid Statistics (weighted)
## web processing service version: 1.0.0
## process inputs:
##    SUMMARIZE_FEATURE_ATTRIBUTE: false
##    SUMMARIZE_TIMESTEP: false
##    REQUIRE_FULL_COVERAGE: true
##    DELIMITER: COMMA
##    STATISTICS:
##    GROUP_BY:
## wait: FALSE
## email: fake.email@gmail.com
```
## `geojob` details

The `geojob` in the geoknife package contains all of the processing configuration details required to execute a processing request to the Geo Data Portal and check up on the state of that request. A `geojob` object is created using the high-level function `geoknife()` with the `stencil`, `fabric`, and optional `knife` arguments as described above.
### `geojob` class and details

The `geojob` public fields include:

 * `url`: the url where the processing job was sent to. It is defined by the `url` field of the `knife` argument used to create the job.
 * `xml`: the XML used in the POST to the web geoprocessing service. This XML includes configurations set up by the `fabric`, `stencil`, and `knife` arguments.
 * `id`: the url of the process that is currently running, or `"<no active job>"` when no job is active.

### Cancel a `geojob`

The geoknife package currently limits users to a single running process, so as to avoid creating thousands of requests in error, which could overwhelm the processing resources. If there is a reason to support additional jobs at one time, please email the package maintainers with your query.
To cancel a running job but retain the job details:

```r
id(job)
## [1] "https://cida.usgs.gov:80/gdp/process/RetrieveResultServlet?id=a264a88c-9672-4029-915b-a09b1403d26a"
job <- cancel(job)
id(job)
## [1] "<no active job>"
```

To cancel any running job without specifying the `geojob` reference:

```r
cancel()
```
## Dependencies

geoknife outsources all major geospatial processing tasks to a remote server. Because of this, users must have an active internet connection. Problems with connections to datasets or the processing resources are rare, but they do happen. When experiencing a connectivity problem, the best approach is often to try again later, or to email gdp@usgs.gov with any questions. The various web dependencies are described below.
The U.S. Geological Survey's "Geo Data Portal" (GDP) provides the data access and processing services that are leveraged by the geoknife package. See cida.usgs.gov/gdp for the GDP user interface.