`gum()` constructs Generalised Exponential Smoothing, a pure additive state-space model. It is part of the `smooth` package.

In this vignette we will use data from the `Mcomp` package, so it is advised to install it.

Let’s load the necessary packages:

```
require(smooth)
require(Mcomp)
```

You may note that `Mcomp` depends on the `forecast` package, and if you load both `forecast` and `smooth`, you will get a message that the `forecast()` function is masked. There is nothing to worry about: `smooth` uses this function for consistency purposes and provides exactly the same `forecast()` as the `forecast` package. The function was included in `smooth` only so that `forecast` does not have to be listed among the dependencies of the package.

Generalised Exponential Smoothing is the next step from CES. It is a state-space model in which all the matrices and vectors are estimated. It is very demanding in terms of sample size, but it is also insanely flexible.

A simple call by default constructs GUM\((1^1,1^m)\), where \(m\) is the frequency of the data. So for our example with the monthly series N2457, we will have GUM\((1^1,1^{12})\):

`gum(M3$N2457$x, h=18, holdout=TRUE)`

```
## Time elapsed: 0.25 seconds
## Model estimated: GUM(1[1],1[12])
## Persistence vector g:
## [,1] [,2]
## [1,] 0.273 0.005
## Transition matrix F:
## [,1] [,2]
## [1,] 0.88 0.728
## [2,] 0.04 0.882
## Measurement vector w: 0.386, 0.079
## Initial values were optimised.
## 22 parameters were estimated in the process
## Residuals standard deviation: 1499.584
## Cost function type: MSE; Cost function value: 1738726.863
##
## Information criteria:
## AIC AICc BIC BICc
## 1713.034 1726.710 1769.678 1800.959
## Forecast errors:
## MPE: 22.5%; sCE: -1760.7%; Bias: 86.2%; MAPE: 38.1%
## MASE: 2.782; sMAE: 113.5%; RelMAE: 1.188; sMSE: 221.7%
```
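Rather than only printing the summary, the fitted model can be saved and reused. A minimal sketch, assuming the standard `smooth` workflow, in which the returned object (here named `ourModel`, a hypothetical name) has `forecast()` and `plot()` methods:

```r
# Fit the model quietly and keep the returned object for later use
# (silent=TRUE suppresses the printed output and the plot)
ourModel <- gum(M3$N2457$x, h=18, holdout=TRUE, silent=TRUE)

# Produce the forecast with the forecast() method shipped with smooth
# and plot the result
plot(forecast(ourModel, h=18))
```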

Different orders and lags can also be specified. For example:

`gum(M3$N2457$x, h=18, holdout=TRUE, orders=c(2,1), lags=c(1,12))`

```
## Time elapsed: 0.27 seconds
## Model estimated: GUM(2[1],1[12])
## Persistence vector g:
## [,1] [,2] [,3]
## [1,] 0.116 -0.064 -0.081
## Transition matrix F:
## [,1] [,2] [,3]
## [1,] 0.810 0.035 0.018
## [2,] 0.258 0.829 0.076
## [3,] 0.282 0.021 1.000
## Measurement vector w: 0.94, 0.963, 0.53
## Initial values were optimised.
## 30 parameters were estimated in the process
## Residuals standard deviation: 1508.931
## Cost function type: MSE; Cost function value: 1572684.312
##
## Information criteria:
## AIC AICc BIC BICc
## 1719.299 1747.480 1796.540 1861.002
## Forecast errors:
## MPE: 24.2%; sCE: -1794%; Bias: 87.2%; MAPE: 38.4%
## MASE: 2.792; sMAE: 113.9%; RelMAE: 1.193; sMSE: 219.3%
```

The function `auto.gum()` is also implemented in `smooth`, but it works slowly, as it needs to check a large number of models:

`auto.gum(M3$N2457$x, h=18, holdout=TRUE, intervals=TRUE, silent=FALSE)`

```
## Starting preliminary loop: 12 out of 12. Done.
## Searching for appropriate lags: We found them!
## Searching for appropriate orders: Orders found.
## Reestimating the model. Done!
```

```
## Time elapsed: 15.54 seconds
## Model estimated: GUM(1[1],1[8])
## Persistence vector g:
## [,1] [,2]
## [1,] 0.005 0.245
## Transition matrix F:
## [,1] [,2]
## [1,] 1.000 0.975
## [2,] 0.014 0.024
## Measurement vector w: 0.983, 0
## Initial values were produced using backcasting.
## 9 parameters were estimated in the process
## Residuals standard deviation: 1341.911
## Cost function type: MSE; Cost function value: 1633647.19
##
## Information criteria:
## AIC AICc BIC BICc
## 1680.988 1683.057 1704.160 1708.892
## 95% parametric prediction intervals were constructed
## 61% of values are in the prediction interval
## Forecast errors:
## MPE: 7.3%; sCE: -1166.2%; Bias: 55.7%; MAPE: 35.9%
## MASE: 2.403; sMAE: 98%; RelMAE: 1.026; sMSE: 171%
```

In addition to the standard arguments that other functions accept, GUM accepts predefined values for the transition matrix and for the measurement and persistence vectors. For example, a more conventional structure can be passed to the function:

```
transition <- matrix(c(1,0,0,1,1,0,0,0,1), 3, 3)
measurement <- c(1,1,1)
gum(M3$N2457$x, h=18, holdout=TRUE, orders=c(2,1), lags=c(1,12),
    transition=transition, measurement=measurement)
```

```
## Time elapsed: 0.14 seconds
## Model estimated: GUM(2[1],1[12])
## Persistence vector g:
## [,1] [,2] [,3]
## [1,] 0.172 0.004 0.083
## Transition matrix F:
## [,1] [,2] [,3]
## [1,] 1 1 0
## [2,] 0 1 0
## [3,] 0 0 1
## Measurement vector w: 1, 1, 1
## Initial values were optimised.
## 18 parameters were estimated in the process
## 12 parameters were provided
## Residuals standard deviation: 1559.733
## Cost function type: MSE; Cost function value: 1981326.617
##
## Information criteria:
## AIC AICc BIC BICc
## 1717.704 1726.473 1764.049 1784.107
## Forecast errors:
## MPE: 14.4%; sCE: -1419.2%; Bias: 74.6%; MAPE: 34.5%
## MASE: 2.456; sMAE: 100.2%; RelMAE: 1.049; sMSE: 183.6%
```

The resulting model is equivalent to ETS(A,A,A). However, due to the different initialisation of the optimisers and the different way of counting the number of estimated parameters, the `gum()` call above and `es(y, "AAA", h=h, holdout=TRUE)` will lead to different models.
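To see this difference in practice, the two fits can be compared side by side. A sketch, assuming the `AICc()` method (from `greybox`, which `smooth` imports) works on both returned objects, with `gumModel` and `esModel` as hypothetical names:

```r
# The ETS(A,A,A)-like structure from the example above
transition <- matrix(c(1,0,0,1,1,0,0,0,1), 3, 3)
measurement <- c(1,1,1)

# Fit the same structure via gum() and via es()
gumModel <- gum(M3$N2457$x, h=18, holdout=TRUE, orders=c(2,1), lags=c(1,12),
                transition=transition, measurement=measurement, silent=TRUE)
esModel <- es(M3$N2457$x, "AAA", h=18, holdout=TRUE, silent=TRUE)

# The information criteria will differ, because the number of estimated
# parameters is counted differently in the two models
AICc(gumModel)
AICc(esModel)
```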