Abstract
This vignette shows working examples of how to fit (multi-)species distribution models with cito. Training neural networks is trickier than training other ML algorithms, which (for various reasons) converge more easily. The purpose of this vignette is to provide an example workflow and to point out common caveats when training a neural network. The goal is to build an SDM for the African elephant. A pre-processed dataset from Angelov, 2020 can be found in the EcoData package, which is only available on GitHub:
set.seed(1337)
if(!require(EcoData)) devtools::install_github(repo = "TheoreticalEcology/EcoData",
dependencies = FALSE, build_vignettes = FALSE)
#> Loading required package: EcoData
library(EcoData)
df = EcoData::elephant$occurenceData
head(df)
#> Presence bio1 bio2 bio3 bio4 bio5 bio6 bio7 bio8 bio9 bio10 bio11 bio12
#> 3364 0 -0.4981747 -0.2738045 0.5368968 -0.5409999 -0.36843571 0.2947850 -0.5260099 -1.2253960 0.2494100 -0.64527314 -0.06267842 0.6285371
#> 6268 0 0.6085908 -0.5568352 1.0340686 -1.2492050 -0.11835651 0.8221087 -0.8938475 0.4233787 0.7746249 0.09168503 0.94419518 1.1121516
#> 10285 0 -0.7973005 1.4648130 -1.0540532 2.0759423 0.07614953 -1.5860029 1.6284678 0.2768209 -1.5153122 -0.03648161 -1.44165748 -1.2351482
#> 2247 0 0.6385034 1.3435141 -0.1591439 -0.5107148 1.10425291 -0.1622288 0.8577603 0.4600181 0.5855475 0.54026827 0.68153250 0.5951165
#> 9821 0 0.6684160 -0.6781341 0.6363311 -0.9906170 0.15950927 0.9099960 -0.8062671 0.3867393 0.8586593 0.31597665 0.94419518 1.1003561
#> 1351 0 0.9675418 -0.6781341 -0.3580126 -0.3748202 0.77081398 0.8748411 -0.3858812 0.3134604 1.0477367 0.98885151 0.94419518 0.7287986
#> bio13 bio14 bio15 bio16 bio17 bio18 bio19
#> 3364 0.6807958 -0.29703736 -0.008455252 0.7124535 -0.2949994 -1.06812752 1.96201807
#> 6268 0.5918442 0.01619202 -0.884507980 0.5607328 0.3506918 1.22589281 -0.36205814
#> 10285 -1.3396742 -0.50585695 0.201797403 -1.3499999 -0.5616980 -0.42763181 -0.62895735
#> 2247 0.8714061 -0.55806185 0.236839512 1.1012378 -0.5616980 -0.20541902 -0.58378979
#> 9821 0.5537222 0.59044589 -1.024676416 0.6413344 0.7437213 0.06254347 -0.05409751
#> 1351 1.1255533 -0.50585695 0.236839512 1.2956300 -0.4494038 -0.90473576 2.47939193
Presence is our response variable and we have the 19 bioclim variables as features/predictors.
Let’s split part of the data away so that we can use it at the end to evaluate our model:
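For example, a minimal sketch of such a split (the 10% holdout size is our assumption; the objects test and df are the ones used again below):
test_indices = sample.int(nrow(df), round(0.1 * nrow(df)))
test = df[test_indices, ] # held back for the final evaluation
df = df[-test_indices, ] # used for training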
We will first try a simple DNN with default values and the binomial likelihood. We use 30% of the data as a validation holdout to check for overfitting:
library(cito)
model = dnn(Presence~., data = df,
batchsize = 100L,
validation = 0.3, loss = "binomial",
verbose = FALSE)
We see that the training and validation losses were still decreasing, which means we didn't train the model long enough. We could now either increase the number of epochs or increase the learning rate so that the model trains faster:
model = dnn(Presence~., data = df,
batchsize = 100L,
lr = 0.05,
validation = 0.3, loss = "binomial",
verbose = FALSE)
Much better! But still not enough epochs. Also, let's see whether we can further decrease the loss by using a wider and deeper neural network:
model = dnn(Presence~., data = df,
batchsize = 100L,
hidden = c(100L, 100L, 100L),
lr = 0.05,
validation = 0.3, loss = "binomial",
verbose = FALSE)
At the end of the training, the losses start to get jumpy, which can be a sign of potential overfitting. We can control that by adding a weak regularization (since we want an L2 regularization, we set alpha to 1.0):
model = dnn(Presence~., data = df,
batchsize = 100L,
epochs = 150L,
hidden = c(100L, 100L, 100L),
lr = 0.05,
lambda = 0.001,
alpha = 1.0,
validation = 0.3, loss = "binomial",
verbose = FALSE)
We will now turn on advanced features that help with convergence and reduce overfitting:
learning rate scheduler - reduces the learning rate during training when the validation loss stops improving
early stopping - stops training when the validation loss starts to increase
model = dnn(Presence~., data = df,
batchsize = 100L,
epochs = 150L,
hidden = c(100L, 100L, 100L),
lr = 0.05,
lambda = 0.001,
alpha = 1.0,
validation = 0.3, loss = "binomial",
verbose = FALSE,
lr_scheduler = config_lr_scheduler("reduce_on_plateau", patience = 7), # reduce the learning rate if the validation loss hasn't decreased for 7 epochs
early_stopping = 14 # stop training if the validation loss hasn't decreased for 14 epochs
)
Great! We have now found a model architecture and training procedure that fit and train well. Let's proceed to our final model.
We haven't started with bootstrapping right away because it complicates adjusting the training procedure. Uncertainties can be obtained by bootstrapping; be aware that this can be computationally expensive:
model_boot = dnn(Presence~., data = df,
batchsize = 100L,
epochs = 150L,
hidden = c(100L, 100L, 100L),
lr = 0.05,
lambda = 0.001,
alpha = 1.0,
validation = 0.3, loss = "binomial",
verbose = FALSE,
lr_scheduler = config_lr_scheduler("reduce_on_plateau", patience = 7), # reduce the learning rate if the validation loss hasn't decreased for 7 epochs
early_stopping = 14, # stop training if the validation loss hasn't decreased for 14 epochs
bootstrap = 20L,
bootstrap_parallel = 5L
)
We can now use the model for predictions:
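A minimal sketch of the prediction step (using type = "response" and reduce = "none" here is our assumption, based on how predictions is indexed below):
predictions = predict(model_boot, test, type = "response", reduce = "none")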
Because of the bootstrapping, the predictions are three-dimensional (bootstrap sample × observation × output) instead of the usual two-dimensional matrix. Calculate the AUC for each bootstrap sample:
hist(sapply(1:20, function(i) Metrics::auc(test$Presence, predictions[i,,])),
xlim = c(0, 1), main = "AUC of ensemble model", xlab = "AUC")
We can now predict the habitat suitability of the elephant (note that the additional spatial packages loaded below are required):
library(raster)
#> Loading required package: sp
library(sp)
library(rsample)
library(latticeExtra)
#> Loading required package: lattice
#>
#> Attaching package: 'lattice'
#> The following object is masked from 'package:EcoData':
#>
#> melanoma
library(ggplot2)
#> Need help getting started? Try the R Graphics Cookbook: https://r-graphics.org
#>
#> Attaching package: 'ggplot2'
#> The following object is masked from 'package:latticeExtra':
#>
#> layer
library(maptools)
#> Please note that 'maptools' will be retired during October 2023,
#> plan transition at your earliest convenience (see
#> https://r-spatial.org/r/2023/05/15/evolution4.html and earlier blogs
#> for guidance);some functionality will be moved to 'sp'.
#> Checking rgeos availability: FALSE
customPredictFun = function(model, data) {
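# predict() with reduce = "none" returns an array [bootstrap sample, observation, output];
# apply(..., 2:3, mean) averages over the bootstrap dimension, [,1] keeps the single output column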
return(apply(predict(model, data, reduce = "none"), 2:3, mean)[,1])
}
normalized_raster = EcoData::elephant$predictionData
predictions =
raster::predict(normalized_raster,
model_boot,
fun = customPredictFun)
habitat_plot =
spplot(predictions, colorkey = list(space = "left") )
habitat_plot
Moreover, we can visualize the uncertainty of our model: instead of calculating the average occurrence probability, we calculate the standard deviation of the bootstrap predictions for each cell and visualize it:
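A minimal sketch, mirroring customPredictFun above but summarising the bootstrap dimension with sd() (the helper and object names are ours):
customPredictFunSD = function(model, data) {
# standard deviation across bootstrap samples instead of the mean
return(apply(predict(model, data, reduce = "none"), 2:3, sd)[,1])
}
uncertainty = raster::predict(normalized_raster, model_boot, fun = customPredictFunSD)
spplot(uncertainty, colorkey = list(space = "left"))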
Neural networks are often called black-box models, but the tools of explainable AI (xAI) allow us to understand them - and actually to infer properties similar to what a linear regression model can provide (the calculation can take some time…):
results = summary(model_boot)
results
#> Summary of Deep Neural Network Model
#>
#> ── Feature Importance
#> Importance Std.Err Z value Pr(>|z|)
#> bio1 → 0.5671 0.2955 1.92 0.05493 .
#> bio2 → 0.3368 0.2757 1.22 0.22188
#> bio3 → 0.1971 0.1093 1.80 0.07128 .
#> bio4 → 0.2896 0.1992 1.45 0.14594
#> bio5 → 0.6996 0.3356 2.09 0.03707 *
#> bio6 → 0.5888 0.4589 1.28 0.19940
#> bio7 → 0.1762 0.1117 1.58 0.11486
#> bio8 → 0.3758 0.1940 1.94 0.05270 .
#> bio9 → 2.5090 1.4337 1.75 0.08012 .
#> bio10 → 0.3349 0.2130 1.57 0.11585
#> bio11 → 0.2795 0.2177 1.28 0.19921
#> bio12 → 1.9087 0.6461 2.95 0.00314 **
#> bio13 → 0.3197 0.1760 1.82 0.06924 .
#> bio14 → 0.4169 0.2696 1.55 0.12200
#> bio15 → 0.8942 0.2640 3.39 0.00071 ***
#> bio16 → 0.3672 0.3013 1.22 0.22299
#> bio17 → 0.1733 0.1445 1.20 0.23038
#> bio18 → 0.2410 0.1145 2.10 0.03536 *
#> bio19 → 0.0645 0.0355 1.82 0.06914 .
#> ---
#> Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
#>
#> ── Average Conditional Effects
#> ACE Std.Err Z value Pr(>|z|)
#> bio1 → 0.09733 0.03226 3.02 0.00256 **
#> bio2 → -0.05851 0.03811 -1.54 0.12465
#> bio3 → -0.00413 0.03759 -0.11 0.91261
#> bio4 → 0.03089 0.03336 0.93 0.35445
#> bio5 → 0.11839 0.03709 3.19 0.00141 **
#> bio6 → 0.09144 0.04549 2.01 0.04443 *
#> bio7 → 0.00310 0.03884 0.08 0.93644
#> bio8 → -0.04861 0.03167 -1.54 0.12475
#> bio9 → -0.18590 0.05309 -3.50 0.00046 ***
#> bio10 → -0.03716 0.04174 -0.89 0.37333
#> bio11 → 0.02729 0.04083 0.67 0.50397
#> bio12 → -0.19637 0.03947 -4.97 6.5e-07 ***
#> bio13 → 0.07646 0.03118 2.45 0.01420 *
#> bio14 → 0.09271 0.04582 2.02 0.04302 *
#> bio15 → -0.09483 0.02640 -3.59 0.00033 ***
#> bio16 → -0.05575 0.04044 -1.38 0.16803
#> bio17 → 0.03714 0.04335 0.86 0.39166
#> bio18 → 0.02949 0.01743 1.69 0.09069 .
#> bio19 → 0.00413 0.02446 0.17 0.86604
#> ---
#> Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
#>
#> ── Standard Deviation of Conditional Effects
#> ACE Std.Err Z value Pr(>|z|)
#> bio1 → 0.1238 0.0321 3.86 0.00012 ***
#> bio2 → 0.0827 0.0331 2.50 0.01243 *
#> bio3 → 0.0548 0.0230 2.38 0.01734 *
#> bio4 → 0.0717 0.0258 2.77 0.00553 **
#> bio5 → 0.1366 0.0359 3.81 0.00014 ***
#> bio6 → 0.1116 0.0438 2.55 0.01081 *
#> bio7 → 0.0565 0.0206 2.74 0.00618 **
#> bio8 → 0.0946 0.0224 4.22 2.5e-05 ***
#> bio9 → 0.2123 0.0601 3.53 0.00041 ***
#> bio10 → 0.0792 0.0235 3.37 0.00076 ***
#> bio11 → 0.0859 0.0205 4.18 2.9e-05 ***
#> bio12 → 0.2167 0.0356 6.08 1.2e-09 ***
#> bio13 → 0.0877 0.0267 3.28 0.00102 **
#> bio14 → 0.1134 0.0380 2.98 0.00285 **
#> bio15 → 0.1093 0.0189 5.79 7.0e-09 ***
#> bio16 → 0.0808 0.0343 2.35 0.01856 *
#> bio17 → 0.0671 0.0291 2.30 0.02123 *
#> bio18 → 0.0949 0.0265 3.59 0.00033 ***
#> bio19 → 0.0380 0.0152 2.50 0.01257 *
#> ---
#> Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
bio9, bio12, bio14, and bio15 have large, significant average conditional effects (≈ linear effects). We can visualize them using accumulated local effect (ALE) plots:
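A minimal sketch, assuming cito's ALE() helper (the choice of variables shown is illustrative):
ALE(model_boot, variable = "bio9")
ALE(model_boot, variable = "bio12")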
Cito supports many different loss functions, which we can use to build multi-species distribution models (MSDM). MSDMs are multi-label models, i.e. they model and predict many responses simultaneously. We will use the eucalypts data from Pollock et al., 2014. The dataset contains occurrences of 12 species at 458 sites.
load(url("https://github.com/TheoreticalEcology/s-jSDM/raw/master/sjSDM/data/eucalypts.rda"))
# Environment
head(eucalypts$env)
#> Rockiness Sandiness VallyBotFlat PPTann Loaminess cvTemp T0
#> 1 60 1 0 785 0 142 6124.01
#> 2 75 1 0 785 0 142 6124.01
#> 3 70 1 0 780 0 142 3252.96
#> 4 40 1 0 778 0 142 1636.63
#> 5 15 1 0 772 0 142 1352.08
#> 6 80 1 0 841 0 142 5018.48
# PA
head(eucalypts$PA)
#> ALA ARE BAX CAM GON MEL OBL OVA WIL ALP VIM ARO.SAB
#> [1,] 0 0 0 0 0 0 0 0 1 1 0 0
#> [2,] 0 0 0 0 0 0 1 0 1 1 0 0
#> [3,] 0 0 1 0 0 0 0 0 1 1 0 0
#> [4,] 0 0 1 0 0 0 0 0 1 0 0 0
#> [5,] 0 0 1 0 0 0 1 0 0 0 0 0
#> [6,] 0 0 0 0 0 0 0 0 1 1 0 0
Bring data into a format that is usable by cito:
df = cbind(eucalypts$PA, scale(eucalypts$env))
head(df)
#> ALA ARE BAX CAM GON MEL OBL OVA WIL ALP VIM ARO.SAB Rockiness Sandiness VallyBotFlat PPTann Loaminess cvTemp T0
#> [1,] 0 0 0 0 0 0 0 0 1 1 0 0 1.0315338 0.5716827 -0.5939667 -0.005981517 -0.2134535 -1.056073 0.5378148
#> [2,] 0 0 0 0 0 0 1 0 1 1 0 0 1.4558834 0.5716827 -0.5939667 -0.005981517 -0.2134535 -1.056073 0.5378148
#> [3,] 0 0 1 0 0 0 0 0 1 1 0 0 1.3144335 0.5716827 -0.5939667 -0.045456081 -0.2134535 -1.056073 -0.3404551
#> [4,] 0 0 1 0 0 0 0 0 1 0 0 0 0.4657344 0.5716827 -0.5939667 -0.061245907 -0.2134535 -1.056073 -0.8348993
#> [5,] 0 0 1 0 0 0 1 0 0 0 0 0 -0.2415148 0.5716827 -0.5939667 -0.108615385 -0.2134535 -1.056073 -0.9219447
#> [6,] 0 0 0 0 0 0 0 0 1 1 0 0 1.5973333 0.5716827 -0.5939667 0.436133605 -0.2134535 -1.056073 0.1996271
We will use the binomial likelihood - each species' occurrence data will be modelled by a binomial likelihood. Build a simple model:
model = dnn(cbind(ALA, ARE, BAX, CAM, GON, MEL, OBL, OVA, WIL, ALP, VIM, ARO.SAB)~.,
data = df,
lr = 0.1,
verbose = FALSE,
loss = "binomial")
Plot model:
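A minimal sketch of the (not shown) call; plotting a cito model visualizes the network architecture:
plot(model)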
Our NN now has 12 output nodes, one for each species.
head(predict(model))
#> [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10] [,11] [,12]
#> [1,] -1.836887 -4.988176 0.32807234 -5.222610 -5.308745 -4.067790 -0.867894590 -4.736314 -0.61683136 -1.0782812 -3.910664 -3.400055
#> [2,] -1.657390 -4.839282 0.08797990 -5.386761 -4.938032 -4.516629 -1.182702303 -4.994690 -0.84761620 -0.3660365 -4.247718 -4.020841
#> [3,] -2.085413 -5.099712 0.15239546 -5.287928 -5.096430 -4.624372 -1.044999719 -4.999919 -0.66758049 -0.6565972 -4.395789 -3.780074
#> [4,] -2.725661 -5.414782 0.62847829 -4.861598 -5.769506 -3.869004 -0.428146392 -4.516877 -0.22675626 -2.3427727 -3.797922 -2.450565
#> [5,] -3.021702 -5.423952 1.02425063 -4.336437 -6.150292 -2.892931 -0.006421283 -3.893203 0.08195837 -3.6905608 -3.047264 -1.228144
#> [6,] -1.672060 -4.829151 -0.08096267 -5.475241 -4.819751 -5.006217 -1.395022869 -5.176019 -0.99880135 0.4147617 -4.579223 -4.475727
model_boot = dnn(cbind(ALA, ARE, BAX, CAM, GON, MEL, OBL, OVA, WIL, ALP, VIM, ARO.SAB)~.,
data = df,
loss = "binomial",
epochs = 200L,
hidden = c(50L, 50L),
batchsize = 50L,
lr = 0.1,
lambda = 0.001,
alpha = 1.0,
validation = 0.2,
verbose = FALSE,
lr_scheduler = config_lr_scheduler("reduce_on_plateau", patience = 7), # reduce the learning rate if the validation loss hasn't decreased for 7 epochs
early_stopping = 14, # stop training if the validation loss hasn't decreased for 14 epochs
bootstrap = 20L,
bootstrap_parallel = 5L)
We haven’t really adjusted the training procedure, so let’s check the convergence first:
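A minimal sketch, assuming cito's analyze_training() helper is available in your version for inspecting the loss curves:
analyze_training(model_boot)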
results = summary(model_boot)
results
#> Summary of Deep Neural Network Model
#>
#> ── Feature Importance
#> Importance Std.Err Z value Pr(>|z|)
#> Rockiness → ALA 0.05591 0.03510 1.59 0.11123
#> Sandiness → ALA 0.01819 0.02898 0.63 0.53031
#> VallyBotFlat → ALA 0.07302 0.04190 1.74 0.08140 .
#> PPTann → ALA 0.01665 0.02027 0.82 0.41153
#> Loaminess → ALA 0.01670 0.01386 1.20 0.22830
#> cvTemp → ALA 0.03050 0.02471 1.23 0.21724
#> T0 → ALA 0.05909 0.03446 1.71 0.08643 .
#>
#> Rockiness → ARE 0.03587 0.02774 1.29 0.19595
#> Sandiness → ARE 0.04953 0.05938 0.83 0.40421
#> VallyBotFlat → ARE 0.06833 0.04268 1.60 0.10935
#> PPTann → ARE 0.03399 0.03270 1.04 0.29861
#> Loaminess → ARE 0.03420 0.01897 1.80 0.07134 .
#> cvTemp → ARE 1.25189 0.46022 2.72 0.00652 **
#> T0 → ARE 0.03928 0.03372 1.16 0.24413
#>
#> Rockiness → BAX 0.09881 0.05129 1.93 0.05403 .
#> Sandiness → BAX 0.09485 0.03736 2.54 0.01113 *
#> VallyBotFlat → BAX 0.17888 0.05286 3.38 0.00071 ***
#> PPTann → BAX 0.01467 0.01240 1.18 0.23667
#> Loaminess → BAX 0.00538 0.00588 0.92 0.35944
#> cvTemp → BAX 0.18118 0.06824 2.66 0.00793 **
#> T0 → BAX 0.00563 0.00475 1.18 0.23606
#>
#> Rockiness → CAM 0.06633 0.03476 1.91 0.05635 .
#> Sandiness → CAM 0.25162 0.10085 2.50 0.01259 *
#> VallyBotFlat → CAM 0.36023 0.14501 2.48 0.01298 *
#> PPTann → CAM 0.02242 0.02065 1.09 0.27754
#> Loaminess → CAM 0.02145 0.02064 1.04 0.29849
#> cvTemp → CAM 0.00418 0.00657 0.64 0.52476
#> T0 → CAM 0.01695 0.01211 1.40 0.16144
#>
#> Rockiness → GON 0.23601 0.12185 1.94 0.05276 .
#> Sandiness → GON 0.02487 0.04504 0.55 0.58078
#> VallyBotFlat → GON 0.10855 0.05694 1.91 0.05660 .
#> PPTann → GON 0.15407 0.11052 1.39 0.16330
#> Loaminess → GON 0.05577 0.03523 1.58 0.11346
#> cvTemp → GON 2.14793 0.74598 2.88 0.00399 **
#> T0 → GON 0.02103 0.02102 1.00 0.31709
#>
#> Rockiness → MEL 0.17740 0.07674 2.31 0.02079 *
#> Sandiness → MEL 0.02014 0.01699 1.19 0.23577
#> VallyBotFlat → MEL 0.01139 0.01082 1.05 0.29247
#> PPTann → MEL 0.04085 0.02205 1.85 0.06398 .
#> Loaminess → MEL 0.02822 0.01559 1.81 0.07029 .
#> cvTemp → MEL 0.02804 0.02145 1.31 0.19109
#> T0 → MEL 0.03865 0.04112 0.94 0.34727
#>
#> Rockiness → OBL 0.11460 0.06456 1.78 0.07586 .
#> Sandiness → OBL 0.02254 0.01224 1.84 0.06553 .
#> VallyBotFlat → OBL 0.09752 0.04155 2.35 0.01891 *
#> PPTann → OBL 0.01894 0.01465 1.29 0.19605
#> Loaminess → OBL 0.06726 0.04543 1.48 0.13879
#> cvTemp → OBL 0.22548 0.08677 2.60 0.00936 **
#> T0 → OBL 0.00581 0.00586 0.99 0.32135
#>
#> Rockiness → OVA 0.11048 0.03845 2.87 0.00406 **
#> Sandiness → OVA 0.16389 0.09224 1.78 0.07561 .
#> VallyBotFlat → OVA 0.11062 0.05946 1.86 0.06283 .
#> PPTann → OVA 0.01255 0.01587 0.79 0.42927
#> Loaminess → OVA 0.03811 0.01947 1.96 0.05027 .
#> cvTemp → OVA 0.00684 0.01811 0.38 0.70577
#> T0 → OVA 0.00574 0.00688 0.83 0.40387
#>
#> Rockiness → WIL 0.03170 0.02511 1.26 0.20670
#> Sandiness → WIL 0.02764 0.01915 1.44 0.14895
#> VallyBotFlat → WIL 0.05395 0.02662 2.03 0.04267 *
#> PPTann → WIL 0.02771 0.01524 1.82 0.06902 .
#> Loaminess → WIL 0.04921 0.03098 1.59 0.11220
#> cvTemp → WIL 0.64483 0.17280 3.73 0.00019 ***
#> T0 → WIL 0.00978 0.00968 1.01 0.31246
#>
#> Rockiness → ALP 1.33898 0.45270 2.96 0.00310 **
#> Sandiness → ALP 0.02993 0.02998 1.00 0.31820
#> VallyBotFlat → ALP 0.07775 0.06459 1.20 0.22870
#> PPTann → ALP 0.83178 0.25272 3.29 0.00100 ***
#> Loaminess → ALP 0.03192 0.03546 0.90 0.36808
#> cvTemp → ALP 0.11859 0.06806 1.74 0.08144 .
#> T0 → ALP 0.02064 0.01572 1.31 0.18925
#>
#> Rockiness → VIM 0.11096 0.03613 3.07 0.00214 **
#> Sandiness → VIM 0.11823 0.05786 2.04 0.04102 *
#> VallyBotFlat → VIM 0.03927 0.03977 0.99 0.32345
#> PPTann → VIM 0.01356 0.01253 1.08 0.27921
#> Loaminess → VIM 0.05022 0.01879 2.67 0.00751 **
#> cvTemp → VIM 0.01314 0.01189 1.11 0.26914
#> T0 → VIM 0.01542 0.01302 1.18 0.23648
#>
#> Rockiness → ARO.SAB 0.30750 0.07678 4.01 6.2e-05 ***
#> Sandiness → ARO.SAB 0.02214 0.01923 1.15 0.24977
#> VallyBotFlat → ARO.SAB 0.16434 0.07468 2.20 0.02776 *
#> PPTann → ARO.SAB 0.07733 0.04191 1.85 0.06501 .
#> Loaminess → ARO.SAB 0.00689 0.00914 0.75 0.45086
#> cvTemp → ARO.SAB 0.23335 0.08098 2.88 0.00396 **
#> T0 → ARO.SAB 0.01032 0.00978 1.06 0.29141
#>
#> ---
#> Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
#>
#> ── Average Conditional Effects
#> ACE Std.Err Z value Pr(>|z|)
#> Rockiness → ALA 0.022009 0.011727 1.88 0.06055 .
#> Sandiness → ALA 0.000475 0.015970 0.03 0.97628
#> VallyBotFlat → ALA -0.040763 0.016232 -2.51 0.01203 *
#> PPTann → ALA 0.010853 0.010901 1.00 0.31945
#> Loaminess → ALA -0.023536 0.012049 -1.95 0.05078 .
#> cvTemp → ALA 0.022284 0.012757 1.75 0.08068 .
#> T0 → ALA 0.024279 0.009628 2.52 0.01167 *
#>
#> Rockiness → ARE 0.004097 0.014343 0.29 0.77515
#> Sandiness → ARE 0.016287 0.010944 1.49 0.13669
#> VallyBotFlat → ARE -0.027409 0.011461 -2.39 0.01678 *
#> PPTann → ARE -0.006271 0.009138 -0.69 0.49252
#> Loaminess → ARE -0.027573 0.012705 -2.17 0.02999 *
#> cvTemp → ARE 0.088176 0.018124 4.87 1.1e-06 ***
#> T0 → ARE 0.011037 0.011154 0.99 0.32241
#>
#> Rockiness → BAX -0.087220 0.029893 -2.92 0.00353 **
#> Sandiness → BAX 0.089410 0.019960 4.48 7.5e-06 ***
#> VallyBotFlat → BAX -0.126835 0.019717 -6.43 1.3e-10 ***
#> PPTann → BAX -0.017420 0.022320 -0.78 0.43510
#> Loaminess → BAX -0.009810 0.018394 -0.53 0.59380
#> cvTemp → BAX -0.123235 0.023233 -5.30 1.1e-07 ***
#> T0 → BAX 0.003709 0.016590 0.22 0.82308
#>
#> Rockiness → CAM -0.033868 0.010889 -3.11 0.00187 **
#> Sandiness → CAM -0.039071 0.010464 -3.73 0.00019 ***
#> VallyBotFlat → CAM 0.042429 0.008675 4.89 1.0e-06 ***
#> PPTann → CAM -0.018928 0.010994 -1.72 0.08512 .
#> Loaminess → CAM -0.016836 0.007736 -2.18 0.02953 *
#> cvTemp → CAM 0.000116 0.006826 0.02 0.98640
#> T0 → CAM -0.004428 0.008660 -0.51 0.60915
#>
#> Rockiness → GON 0.026573 0.008300 3.20 0.00137 **
#> Sandiness → GON 0.006518 0.009327 0.70 0.48466
#> VallyBotFlat → GON -0.024043 0.010133 -2.37 0.01765 *
#> PPTann → GON -0.015254 0.009903 -1.54 0.12346
#> Loaminess → GON -0.024562 0.009682 -2.54 0.01118 *
#> cvTemp → GON 0.080506 0.014910 5.40 6.7e-08 ***
#> T0 → GON 0.003805 0.008977 0.42 0.67169
#>
#> Rockiness → MEL -0.079029 0.027386 -2.89 0.00390 **
#> Sandiness → MEL 0.017185 0.015587 1.10 0.27023
#> VallyBotFlat → MEL 0.001956 0.014723 0.13 0.89432
#> PPTann → MEL -0.037932 0.013843 -2.74 0.00614 **
#> Loaminess → MEL -0.035711 0.012857 -2.78 0.00548 **
#> cvTemp → MEL 0.022832 0.010397 2.20 0.02809 *
#> T0 → MEL 0.020194 0.013265 1.52 0.12792
#>
#> Rockiness → OBL -0.094420 0.029524 -3.20 0.00138 **
#> Sandiness → OBL -0.026955 0.014240 -1.89 0.05838 .
#> VallyBotFlat → OBL -0.077027 0.024505 -3.14 0.00167 **
#> PPTann → OBL -0.034164 0.021544 -1.59 0.11278
#> Loaminess → OBL 0.057495 0.027004 2.13 0.03324 *
#> cvTemp → OBL -0.128407 0.018304 -7.02 2.3e-12 ***
#> T0 → OBL 0.003082 0.018364 0.17 0.86670
#>
#> Rockiness → OVA -0.042656 0.011541 -3.70 0.00022 ***
#> Sandiness → OVA -0.031969 0.011213 -2.85 0.00436 **
#> VallyBotFlat → OVA 0.024976 0.008358 2.99 0.00280 **
#> PPTann → OVA -0.015456 0.012192 -1.27 0.20488
#> Loaminess → OVA -0.023076 0.007738 -2.98 0.00286 **
#> cvTemp → OVA 0.005134 0.010920 0.47 0.63826
#> T0 → OVA 0.005040 0.005497 0.92 0.35923
#>
#> Rockiness → WIL -0.039157 0.017691 -2.21 0.02687 *
#> Sandiness → WIL 0.030581 0.013444 2.27 0.02293 *
#> VallyBotFlat → WIL -0.048129 0.016968 -2.84 0.00456 **
#> PPTann → WIL -0.029462 0.013224 -2.23 0.02588 *
#> Loaminess → WIL 0.027987 0.010349 2.70 0.00684 **
#> cvTemp → WIL -0.144880 0.018816 -7.70 1.4e-14 ***
#> T0 → WIL -0.009177 0.011576 -0.79 0.42788
#>
#> Rockiness → ALP 0.067017 0.011881 5.64 1.7e-08 ***
#> Sandiness → ALP 0.011443 0.009825 1.16 0.24411
#> VallyBotFlat → ALP -0.026809 0.015801 -1.70 0.08975 .
#> PPTann → ALP 0.053667 0.008062 6.66 2.8e-11 ***
#> Loaminess → ALP -0.012781 0.013272 -0.96 0.33554
#> cvTemp → ALP -0.020158 0.008833 -2.28 0.02249 *
#> T0 → ALP 0.002295 0.008227 0.28 0.78033
#>
#> Rockiness → VIM -0.056519 0.016041 -3.52 0.00043 ***
#> Sandiness → VIM -0.033793 0.011963 -2.82 0.00473 **
#> VallyBotFlat → VIM 0.014227 0.013794 1.03 0.30235
#> PPTann → VIM -0.021759 0.012002 -1.81 0.06982 .
#> Loaminess → VIM -0.030142 0.008741 -3.45 0.00056 ***
#> cvTemp → VIM -0.013018 0.009845 -1.32 0.18609
#> T0 → VIM 0.010468 0.007531 1.39 0.16453
#>
#> Rockiness → ARO.SAB -0.151047 0.021186 -7.13 1.0e-12 ***
#> Sandiness → ARO.SAB 0.011755 0.020452 0.57 0.56544
#> VallyBotFlat → ARO.SAB 0.070048 0.017976 3.90 9.7e-05 ***
#> PPTann → ARO.SAB -0.071959 0.021524 -3.34 0.00083 ***
#> Loaminess → ARO.SAB -0.018187 0.016509 -1.10 0.27061
#> cvTemp → ARO.SAB -0.104956 0.017885 -5.87 4.4e-09 ***
#> T0 → ARO.SAB -0.010473 0.016703 -0.63 0.53064
#> ---
#> Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
#>
#> ── Standard Deviation of Conditional Effects
#> ACE Std.Err Z value Pr(>|z|)
#> Rockiness → ALA 0.01891 0.00767 2.47 0.01363 *
#> Sandiness → ALA 0.01296 0.00632 2.05 0.04016 *
#> VallyBotFlat → ALA 0.02929 0.01222 2.40 0.01655 *
#> PPTann → ALA 0.01091 0.00694 1.57 0.11573
#> Loaminess → ALA 0.01855 0.00729 2.54 0.01093 *
#> cvTemp → ALA 0.01852 0.00840 2.20 0.02753 *
#> T0 → ALA 0.01885 0.00785 2.40 0.01632 *
#>
#> Rockiness → ARE 0.02005 0.00741 2.70 0.00684 **
#> Sandiness → ARE 0.02258 0.01324 1.71 0.08815 .
#> VallyBotFlat → ARE 0.03524 0.01468 2.40 0.01639 *
#> PPTann → ARE 0.01595 0.00713 2.24 0.02521 *
#> Loaminess → ARE 0.03621 0.01260 2.87 0.00407 **
#> cvTemp → ARE 0.10603 0.02423 4.38 1.2e-05 ***
#> T0 → ARE 0.01785 0.00994 1.80 0.07257 .
#>
#> Rockiness → BAX 0.03826 0.01149 3.33 0.00087 ***
#> Sandiness → BAX 0.03835 0.00853 4.49 7.0e-06 ***
#> VallyBotFlat → BAX 0.05375 0.00939 5.72 1.0e-08 ***
#> PPTann → BAX 0.01688 0.00553 3.05 0.00226 **
#> Loaminess → BAX 0.01364 0.00317 4.31 1.7e-05 ***
#> cvTemp → BAX 0.05601 0.01324 4.23 2.3e-05 ***
#> T0 → BAX 0.01219 0.00255 4.77 1.8e-06 ***
#>
#> Rockiness → CAM 0.03871 0.01197 3.23 0.00122 **
#> Sandiness → CAM 0.04723 0.01351 3.50 0.00047 ***
#> VallyBotFlat → CAM 0.04881 0.01152 4.24 2.3e-05 ***
#> PPTann → CAM 0.02328 0.01255 1.85 0.06364 .
#> Loaminess → CAM 0.02208 0.00880 2.51 0.01209 *
#> cvTemp → CAM 0.01169 0.00437 2.68 0.00746 **
#> T0 → CAM 0.01263 0.00506 2.50 0.01258 *
#>
#> Rockiness → GON 0.03927 0.01030 3.81 0.00014 ***
#> Sandiness → GON 0.01565 0.00880 1.78 0.07521 .
#> VallyBotFlat → GON 0.03470 0.01351 2.57 0.01019 *
#> PPTann → GON 0.02440 0.01049 2.33 0.02004 *
#> Loaminess → GON 0.03487 0.01189 2.93 0.00336 **
#> cvTemp → GON 0.10688 0.01872 5.71 1.1e-08 ***
#> T0 → GON 0.01369 0.00601 2.28 0.02270 *
#>
#> Rockiness → MEL 0.05372 0.02164 2.48 0.01306 *
#> Sandiness → MEL 0.01630 0.00820 1.99 0.04694 *
#> VallyBotFlat → MEL 0.01257 0.00499 2.52 0.01186 *
#> PPTann → MEL 0.02526 0.01001 2.52 0.01162 *
#> Loaminess → MEL 0.02468 0.00994 2.48 0.01302 *
#> cvTemp → MEL 0.01867 0.00783 2.38 0.01709 *
#> T0 → MEL 0.01580 0.00952 1.66 0.09697 .
#>
#> Rockiness → OBL 0.04964 0.01679 2.96 0.00310 **
#> Sandiness → OBL 0.01974 0.00581 3.40 0.00068 ***
#> VallyBotFlat → OBL 0.04075 0.01305 3.12 0.00180 **
#> PPTann → OBL 0.02174 0.00956 2.27 0.02301 *
#> Loaminess → OBL 0.03039 0.01270 2.39 0.01672 *
#> cvTemp → OBL 0.06780 0.01532 4.43 9.6e-06 ***
#> T0 → OBL 0.01289 0.00387 3.33 0.00087 ***
#>
#> Rockiness → OVA 0.04019 0.01224 3.28 0.00102 **
#> Sandiness → OVA 0.03335 0.01264 2.64 0.00834 **
#> VallyBotFlat → OVA 0.02442 0.00920 2.65 0.00794 **
#> PPTann → OVA 0.01711 0.00832 2.06 0.03980 *
#> Loaminess → OVA 0.02404 0.00812 2.96 0.00308 **
#> cvTemp → OVA 0.01284 0.00711 1.81 0.07096 .
#> T0 → OVA 0.00869 0.00312 2.79 0.00529 **
#>
#> Rockiness → WIL 0.03189 0.01090 2.93 0.00344 **
#> Sandiness → WIL 0.02465 0.01029 2.39 0.01663 *
#> VallyBotFlat → WIL 0.03782 0.01254 3.02 0.00257 **
#> PPTann → WIL 0.02472 0.00869 2.85 0.00443 **
#> Loaminess → WIL 0.02268 0.00897 2.53 0.01144 *
#> cvTemp → WIL 0.10938 0.02074 5.27 1.3e-07 ***
#> T0 → WIL 0.01300 0.00424 3.06 0.00218 **
#>
#> Rockiness → ALP 0.09195 0.01761 5.22 1.8e-07 ***
#> Sandiness → ALP 0.01901 0.00862 2.21 0.02741 *
#> VallyBotFlat → ALP 0.03707 0.02078 1.78 0.07452 .
#> PPTann → ALP 0.07306 0.01311 5.57 2.5e-08 ***
#> Loaminess → ALP 0.02372 0.01312 1.81 0.07063 .
#> cvTemp → ALP 0.03036 0.01170 2.59 0.00946 **
#> T0 → ALP 0.01335 0.00490 2.73 0.00641 **
#>
#> Rockiness → VIM 0.04343 0.01141 3.81 0.00014 ***
#> Sandiness → VIM 0.03041 0.00952 3.20 0.00139 **
#> VallyBotFlat → VIM 0.01537 0.00997 1.54 0.12330
#> PPTann → VIM 0.01771 0.00829 2.14 0.03258 *
#> Loaminess → VIM 0.02493 0.00719 3.46 0.00053 ***
#> cvTemp → VIM 0.01504 0.00519 2.90 0.00376 **
#> T0 → VIM 0.01094 0.00482 2.27 0.02316 *
#>
#> Rockiness → ARO.SAB 0.10781 0.01900 5.67 1.4e-08 ***
#> Sandiness → ARO.SAB 0.01962 0.00995 1.97 0.04867 *
#> VallyBotFlat → ARO.SAB 0.05147 0.01240 4.15 3.3e-05 ***
#> PPTann → ARO.SAB 0.05223 0.01751 2.98 0.00286 **
#> Loaminess → ARO.SAB 0.01985 0.00692 2.87 0.00412 **
#> cvTemp → ARO.SAB 0.07545 0.01460 5.17 2.4e-07 ***
#> T0 → ARO.SAB 0.01658 0.00702 2.36 0.01813 *
#> ---
#> Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
cvTemp is significant for many species. Let's visualize its effect:
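For example, again assuming the ALE() helper:
ALE(model_boot, variable = "cvTemp")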
In recent years, joint species distribution models (JSDMs) have emerged as a new class of models capable of modelling species jointly. JSDMs account for co-occurrences between species that cannot be explained by the environment alone through biotic associations (Pollock et al., 2014). Technically, biotic associations are encoded by a covariance matrix that absorbs the species co-occurrences “left over” in the residuals. Two common JSDM formulations are the latent variable model (Warton et al., 2015) and the multivariate probit model (MVP) (Pollock et al., 2014).
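For intuition, the MVP model for site $i$ and species $j$ is commonly written as (our sketch of the standard formulation):
$$ y_{ij} = \mathbb{1}(z_{ij} > 0), \qquad (z_{i1}, \dots, z_{iS}) \sim \mathcal{N}\big(f_\theta(x_i), \Sigma\big) $$
where $f_\theta(x_i)$ is the environmental response (here a neural network) and the covariance matrix $\Sigma$ encodes the biotic associations.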
In cito we provide an experimental likelihood for the multivariate probit model, which is based on a Monte-Carlo approximation (Chen et al., 2018). However, cito is not a JSDM-focused package, which means that many interesting features of JSDMs, such as community assembly analyses, are not available in cito. If you want to perform an in-depth JSDM analysis, for example to reveal the internal metacommunity structure, we recommend the sjSDM package.
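As a quick illustration of the experimental MVP likelihood mentioned above, a minimal sketch (the loss name "mvp" is an assumption and should be checked against the cito documentation):
model_mvp = dnn(cbind(ALA, ARE, BAX, CAM, GON, MEL, OBL, OVA, WIL, ALP, VIM, ARO.SAB)~.,
data = df,
epochs = 200L,
lr = 0.1,
verbose = FALSE,
loss = "mvp")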