03 Using Cache with SpaDES

Eliot J. B. McIntire

January 31, 2018

As part of a reproducible workflow, caching of various function calls is a critical component. Down the road, it is likely that an entire workflow, from raw data to publication, decision support, report writing, presentation building, etc., could be built and be reproducible anywhere, on demand. The reproducible::Cache function is built to work with any R function. However, it becomes very powerful in a SpaDES context because we can build large, powerful applications that are transparent and tied to raw data that may be many conceptual steps upstream in the workflow. To do this, we have built several customizations within the SpaDES package. An important part of this is dealing correctly with the simList, an object that has a slot which is an environment. But more important are the various tools that can be used at higher levels, i.e., not just for "standard" functions.

1 Caching as part of SpaDES

In a SpaDES context, there are several levels of caching that can be used as part of a reproducible workflow. Each level can be used to a modeler's advantage and all can be, and often are, used concurrently. The simList-specific features of the Cache function are described in the sections that follow.

1.1 At the spades level

An entire call to spades or experiment can be cached. This will have the effect of eliminating any stochasticity in the model, as the output will simply be the cached version of the simList. This is likely most useful in situations where reproducibility is more important than "new" stochasticity (e.g., building decision support systems, apps, or the final version of a manuscript).

This functionality can be achieved within a spades call.
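
The code that produced the timings below is not echoed here. A minimal sketch of such a call, assuming the randomLandscapes and fireSpread sample modules shipped with SpaDES.core, the .globals values used in their examples, and the cache argument of spades, might look like this:

library(SpaDES)

mySim <- simInit(
  times = list(start = 0.0, end = 5.0),
  params = list(
    .globals = list(stackName = "landscape", burnStats = "nPixelsBurned"),
    randomLandscapes = list(.plotInitialTime = NA),
    fireSpread = list(.plotInitialTime = NA)
  ),
  modules = list("randomLandscapes", "fireSpread"),
  # location of the sample modules is an assumption
  paths = list(modulePath = system.file("sampleModules", package = "SpaDES.core"))
)

# First call: run the simulation and cache the resulting simList
system.time(outSim <- spades(Copy(mySim), cache = TRUE))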

##    user  system elapsed 
##   3.077   0.301   3.588

Note that if there were any visualizations (here we turned them off with .plotInitialTime = NA) they will happen during the first run, but not during subsequent cached runs.
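
Repeating the identical call, a sketch continuing from the code above:

# Second call with the same inputs: the cached result is recovered
system.time(outSimCached <- spades(Copy(mySim), cache = TRUE))

all.equal(outSim, outSimCached)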

##   Using cached copy of burn event in fireSpread module
##    user  system elapsed 
##   0.086   0.009   0.096
## [1] TRUE

1.2 At the experiment level

This functionality can be achieved within an experiment call. This can be done in two ways: either "internally", through the cache argument, which will cache each spades call; or "externally", by wrapping the entire experiment call in Cache. If there are many spades calls, the former will be slow, because the simList will be digested once per spades call.

1.2.1 Using cache argument
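
A sketch of using the cache argument of experiment with the mySim from above (the number of replicates is an assumption, chosen to match the two cached-copy messages below):

# Each internal spades call (one per replicate) is cached
system.time(sims <- experiment(mySim, replicates = 2, cache = TRUE))

# Repeating the call recovers each spades result from the cache
system.time(simsCached <- experiment(mySim, replicates = 2, cache = TRUE))

all.equal(sims, simsCached)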

##    user  system elapsed 
##   2.589   0.230   2.816
##   Using cached copy of burn event in fireSpread module
##   Using cached copy of burn event in fireSpread module
##    user  system elapsed 
##   0.136   0.014   0.151
## [1] TRUE

1.2.2 Wrapping experiment with Cache

Here, the simList (and the other arguments to experiment) is hashed once and, if it is found to be the same as in a previous call, the returned list of simList objects is recovered. This means that even a very large experiment, with many replicates and combinations of parameters and modules, can be recovered very quickly. Here we show that you can save outputs to disk so that the list of simList objects doesn't get too big. Then, when we recover it from the cache, all the files are still there, and the list of simList objects is small and therefore very fast to recover.
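
A sketch of this, assuming the outputs mechanism of the simList is used to save the landscape stack at the end of each of three replicates (argument names such as clearSimEnv follow the experiment documentation; the exact call used to produce the timings is not shown):

# Save the 'landscape' object to disk at the end (year 5) of each replicate
outputs(mySim) <- data.frame(objectName = "landscape", saveTime = 5)

# Hash the simList and the other arguments once, and cache the whole list of
# simLists. clearSimEnv = TRUE keeps the returned simLists small, because their
# objects (already saved to disk) are removed from the simList environments.
system.time(
  sims <- Cache(experiment, mySim, replicates = 3, clearSimEnv = TRUE)
)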

##    user  system elapsed 
##   3.723   0.301   4.032

The second time is much faster, and we see the output files in the same location.
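
A sketch of rerunning the identical call and listing the saved files (outputPath is assumed to be where experiment wrote them):

# Recovered from the cache, rather than rerun
system.time(simsCached <- Cache(experiment, mySim, replicates = 3, clearSimEnv = TRUE))

all.equal(sims, simsCached)

# The files written during the original run are still on disk
dir(outputPath(mySim), recursive = TRUE)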

##   loading cached result from previous experiment call.
##    user  system elapsed 
##   0.028   0.003   0.032
## [1] TRUE
## [1] "experiment.RData"         "rep1/landscape_year5.rds"
## [3] "rep2/landscape_year5.rds" "rep3/landscape_year5.rds"

Notice that the speed-up can be enormous; in this case, roughly 100 times.

1.3 Module-level caching

If the parameter .useCache in the module’s metadata is set to TRUE, then every event in the module will be cached. That means that every time that module is called from within a spades or experiment call, Cache will be called. Only the objects inside the simList that correspond to the inputObjects or the outputObjects from the module metadata will be assessed for caching.

For general use, module-level caching would be mostly useful for modules that have no stochasticity, such as data-preparation modules, GIS modules etc.

In this example, we will use the cache on the randomLandscapes module. This means that each subsequent call to spades will result in identical outputs from the randomLandscapes module (only!). This would be useful when only one random landscape is needed, whether simply for trying something out or for putting into production code (e.g., publication, decision support, etc.).
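
A sketch of setting this up via the .useCache parameter, using the same sample modules as above (debug = TRUE is an assumption about how the event log below was printed):

parameters <- list(
  .globals = list(stackName = "landscape", burnStats = "nPixelsBurned"),
  randomLandscapes = list(.useCache = TRUE, .plotInitialTime = NA),
  fireSpread = list(.plotInitialTime = NA)
)

mySim <- simInit(
  times = list(start = 0.0, end = 5.0),
  params = parameters,
  modules = list("randomLandscapes", "fireSpread"),
  paths = list(modulePath = system.file("sampleModules", package = "SpaDES.core"))
)

# First run: the randomLandscapes events are executed and cached
system.time(randomSim <- spades(Copy(mySim), debug = TRUE))

# Second run: the cached copy of the randomLandscapes module is reused
system.time(randomSimCached <- spades(Copy(mySim), debug = TRUE))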

## This is the current event, printed as it is happening:
## eventTime moduleName eventType eventPriority
## 0         checkpoint init      5            
## 0         save       init      5            
## 0         progress   init      5            
## 0         load       init      5            
## 0         randomLandscapes init      5            
## 0         fireSpread       init      5            
## 1         fireSpread       burn      5            
## 1         fireSpread       stats     5            
## 2         fireSpread       burn      5            
## 2         fireSpread       stats     5            
## 3         fireSpread       burn      5            
## 3         fireSpread       stats     5            
## 4         fireSpread       burn      5            
## 4         fireSpread       stats     5            
## 5         fireSpread       burn      5            
## 5         fireSpread       stats     5            
## 5         save             spades    10
##    user  system elapsed 
##   1.330   0.114   1.448
## This is the current event, printed as it is happening:
## eventTime moduleName eventType eventPriority
## 0         checkpoint init      5            
## 0         save       init      5            
## 0         progress   init      5            
## 0         load       init      5            
## 0         randomLandscapes init      5            
##   Using cached copy of randomLandscapes module
## 0         fireSpread       init      5            
## 1         fireSpread       burn      5            
## 1         fireSpread       stats     5            
## 2         fireSpread       burn      5            
## 2         fireSpread       stats     5            
## 3         fireSpread       burn      5            
## 3         fireSpread       stats     5            
## 4         fireSpread       burn      5            
## 4         fireSpread       stats     5            
## 5         fireSpread       burn      5            
## 5         fireSpread       stats     5            
## 5         save             spades    10
##    user  system elapsed 
##   0.219   0.017   0.239

Test that only the layers produced by randomLandscapes are identical between the two runs, not those produced by fireSpread.
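
A sketch of that check, comparing the layers of the landscape stack between the two simLists from the sketch above (randomSim and randomSimCached; layer names follow the sample modules):

layers <- c("DEM", "forestAge", "habitatQuality", "percentPine", "Fires")
same <- lapply(layers, function(x)
  identical(randomSim$landscape[[x]], randomSimCached$landscape[[x]]))
names(same) <- layers
same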

## $DEM
## [1] TRUE
## 
## $forestAge
## [1] TRUE
## 
## $habitatQuality
## [1] TRUE
## 
## $percentPine
## [1] TRUE
## 
## $Fires
## [1] FALSE

1.4 Event-level caching

If the parameter .useCache in the module's metadata is set to a character string or character vector, then the event or events named by it will be cached. That means that every time such an event is called from within a spades or experiment call, Cache will be called. Only the objects inside the simList that correspond to the inputObjects or the outputObjects defined in the module metadata will be assessed for caching inputs or outputs, respectively. Because all and only the named inputObjects and outputObjects are cached and returned, this may be inefficient for individual events (i.e., it may cache more objects than are necessary).

Similar to module-level caching, event-level caching would be mostly useful for events that have no stochasticity, such as data-preparation events, GIS events, etc. Here, we don't change the module-level caching for randomLandscapes, but we add to it a cache for only the "init" event of fireSpread.
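
A sketch of the parameter change; only the .useCache entries differ from the module-level example above, and the simInit and spades calls would be repeated as before:

parameters <- list(
  .globals = list(stackName = "landscape", burnStats = "nPixelsBurned"),
  randomLandscapes = list(.useCache = TRUE, .plotInitialTime = NA),
  fireSpread = list(.useCache = "init", .plotInitialTime = NA)
)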

## This is the current event, printed as it is happening:
## eventTime moduleName eventType eventPriority
## 0         checkpoint init      5            
## 0         save       init      5            
## 0         progress   init      5            
## 0         load       init      5            
## 0         randomLandscapes init      5            
## 0         fireSpread       init      5            
## 1         fireSpread       burn      5            
## 1         fireSpread       stats     5            
## 2         fireSpread       burn      5            
## 2         fireSpread       stats     5            
## 3         fireSpread       burn      5            
## 3         fireSpread       stats     5            
## 4         fireSpread       burn      5            
## 4         fireSpread       stats     5            
## 5         fireSpread       burn      5            
## 5         fireSpread       stats     5            
## 5         save             spades    10
##    user  system elapsed 
##   1.473   0.143   1.627
## This is the current event, printed as it is happening:
## eventTime moduleName eventType eventPriority
## 0         checkpoint init      5            
## 0         save       init      5            
## 0         progress   init      5            
## 0         load       init      5            
## 0         randomLandscapes init      5            
##   Using cached copy of randomLandscapes module
## 0         fireSpread       init      5            
##   Using cached copy of init event in fireSpread module
## 1         fireSpread       burn      5            
## 1         fireSpread       stats     5            
## 2         fireSpread       burn      5            
## 2         fireSpread       stats     5            
## 3         fireSpread       burn      5            
## 3         fireSpread       stats     5            
## 4         fireSpread       burn      5            
## 4         fireSpread       stats     5            
## 5         fireSpread       burn      5            
## 5         fireSpread       stats     5            
## 5         save             spades    10
##    user  system elapsed 
##   0.258   0.020   0.279

1.5 Function-level caching

Any function can be cached using: Cache(FUN = functionName, ...).

This requires only a slight change to a function call; for example, projectRaster(raster, crs = crs(newRaster)) becomes Cache(projectRaster, raster, crs = crs(newRaster)).
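
The timings below come from caching a call to gaussMap (from SpaDES.tools). A sketch of that kind of call, with assumed raster dimensions:

library(raster)        # 2018-era SpaDES works with Raster* objects
library(SpaDES.tools)  # provides gaussMap()
library(reproducible)  # provides Cache()

ras <- raster(nrows = 100, ncols = 100, xmn = 0, xmx = 100, ymn = 0, ymx = 100)

# First call: gaussMap() runs and its result is cached
system.time(map1 <- Cache(gaussMap, ras))

# Second call with identical arguments: the cached result is loaded
system.time(map2 <- Cache(gaussMap, ras))

# Although gaussMap() is stochastic, the two maps are identical because the
# second one came from the cache
identical(map1, map2)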

##    user  system elapsed 
##   3.115   0.124   3.296
##   loading cached result from previous gaussMap call.
##    user  system elapsed 
##   0.084   0.006   0.091
## [1] TRUE

1.6 Working with the Cache manually

Since the cache is simply an archivist repository, all archivist functions will work as-is. In addition, there are several helpers in the reproducible package, including showCache, keepCache, and clearCache, that may be useful. One can also access cached items manually, rather than simply rerunning the same Cache call again.
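
For example, showCache lists the items in the cache associated with a simList, along with their tags (the output below shows only the function tags; the exact filtering used to produce it is not shown). A sketch:

# List cached items and their tags
showCache(mySim)

# Related helpers (not run here):
# keepCache(mySim, userTags = "spades")  # keep only the matching items
# clearCache(mySim)                      # remove everything from the cache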

##      tagKey   tagValue         createdDate
## 1: function     spades 2018-01-31 16:39:38
## 2: function experiment 2018-01-31 16:39:42
## 3: function    doEvent 2018-01-31 16:39:46
## 4: function     spades 2018-01-31 16:39:46
## 5: function   gaussMap 2018-01-31 16:39:51
## 6: function    doEvent 2018-01-31 16:39:46
## 7: function     spades 2018-01-31 16:39:46
## 8: function     spades 2018-01-31 16:39:34
## 9: function     spades 2018-01-31 16:39:36

2 Workflow

In general, we feel that liberal use of Cache will make a workflow both reusable and reproducible. Shiny apps can be built that take advantage of Cache. Indeed, much of the difficulty in managing data sets and saving them for future use can be accommodated by caching.