Some good free textbooks are Rob Hyndman's online book https://otexts.com/fpp2/ and Brockwell and Davis' old textbook https://link.springer.com/book/10.1007/978-3-319-29854-2. They focus mostly on ARIMA and exponential smoothers, because most time series data sets are pretty small (a few dozen to at most a few thousand samples), so there's really not much else that can be done.
Most of Hyndman's textbook approaches (mostly ARIMA and various exponential smoothers) are implemented in his 'forecast' R package.
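If you work in Python rather than R, statsmodels has rough analogues of the main forecast-package workhorses. A minimal sketch, assuming a hypothetical monthly.csv with a monthly series and seasonal period 12 (this is statsmodels, not Hyndman's package itself; pmdarima's auto_arima is the closest thing to auto.arima if you want automatic order selection):

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical monthly series with a date index and a 'value' column.
y = pd.read_csv("monthly.csv", index_col=0, parse_dates=True)["value"]

# Seasonal ARIMA with hand-picked orders (roughly what auto.arima would search over).
arima_fc = ARIMA(y, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12)).fit().forecast(12)

# Holt-Winters exponential smoothing, the rough ets() analogue.
ets_fc = ExponentialSmoothing(y, trend="add", seasonal="add",
                              seasonal_periods=12).fit().forecast(12)

print(arima_fc)
print(ets_fc)
```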
ARIMA and exponential smoothers tend to be a bit hard to get working well on daily data (they come from an era when most data was monthly or quarterly). A modern take on classical frequency-domain Fourier regression is Facebook Prophet (https://facebook.github.io/prophet/), which tends to work pretty well if you have a few years of daily data.
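For the Prophet route, the fitting API is small. A minimal sketch, assuming a hypothetical daily CSV with date and sales columns (the pip package is called prophet these days, formerly fbprophet):

```python
import pandas as pd
from prophet import Prophet

# Prophet expects a dataframe with columns 'ds' (dates) and 'y' (values).
df = pd.read_csv("daily_sales.csv").rename(columns={"date": "ds", "sales": "y"})

m = Prophet(weekly_seasonality=True, yearly_seasonality=True)
m.fit(df)

# Forecast 90 days ahead (the returned frame also covers the fitted history).
future = m.make_future_dataframe(periods=90)
forecast = m.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```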
FPP is great, but limited to the simplest possible timeseries: a single number recorded at evenly-spaced intervals.
Anyone know of good resources for multivariate, multimodal, irregular timeseries forecasting? I know some great practical tools and tutorials (prophet, fast.ai), but I'd love to inject some statistical knowledge like FPP offers.
- Multi-variate: textbook treatments tend to focus mainly on Vector Auto Regression (VAR) models. Unrestricted VARs scale very badly in vector dimension, so they often end up in some regularized form (dimension reduced by PCA or Bayesian priors). Lütkepohl's textbook is the standard reference; there's a quick VAR sketch after this list.
VAR-type models are, in my view, not very practical for most business time series. You should probably not waste too much time on them unless you're really into macro-economic forecasting, in which case you're wasting your time anyway :). (VAR forecast accuracy in macro-economics is not great, to put it mildly, but we have nothing really better.)
An alternative to VARs for multivariate time series is state space models, which are described mostly in Durbin & Koopman's and Andrew Harvey's time series textbooks. This model class was recently popularized in tech circles by Google's CausalImpact R package (though I think that package only implements the univariate model); a structural state space sketch also follows the list.
- Multi-modal: if you need to model some generic non-Gaussian time series process, you'll probably end up with some slow generic simulation method (MCMC, particle filtering); a toy particle filter is sketched after this list. I can't recommend a good reference since I haven't kept up with the literature for about 15 years; I only remember a bunch of dense journal papers from that era (e.g. https://en.wikipedia.org/wiki/Particle_filter#Bibliography).
- Irregular: if the irregularity is mild (filling in a relatively small number of gaps/missing values), you can use LOESS, smoothing splines, or Kalman filtering, which should all give you pretty similar results (see the gap-filling sketch at the end). If your time series are extremely irregular, no generic method will likely do well and you'll probably need to invest days/weeks/months into a fairly problem/data-specific method (probably some heavily tuned smoothing spline).
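On the VAR point: statsmodels ships a plain unrestricted VAR if you want to see how far it gets you. A minimal sketch, assuming a hypothetical macro.csv with one evenly spaced column per series:

```python
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical multivariate data: one column per series, a date index, no gaps.
df = pd.read_csv("macro.csv", index_col=0, parse_dates=True)

model = VAR(df)
results = model.fit(maxlags=8, ic="aic")   # lag order chosen by AIC
print(results.summary())

# Forecast 4 steps ahead from the last observed lags.
lag_order = results.k_ar
fc = results.forecast(df.values[-lag_order:], steps=4)
print(fc)
```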
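For the state space alternative, statsmodels also has structural ("unobserved components") models in the Durbin & Koopman / Harvey style. A univariate local-linear-trend sketch (this is not CausalImpact itself, just the same family of models):

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical monthly univariate series.
y = pd.read_csv("monthly.csv", index_col=0, parse_dates=True)["value"]

# Local linear trend plus a seasonal component, estimated by MLE via the Kalman filter.
mod = sm.tsa.UnobservedComponents(y, level="local linear trend", seasonal=12)
res = mod.fit()
print(res.summary())

# Forecast 12 steps ahead with prediction intervals.
pred = res.get_forecast(steps=12)
print(pred.predicted_mean)
print(pred.conf_int())
```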
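On the simulation-method point for non-Gaussian processes, here is a toy bootstrap particle filter in plain numpy, just to show the propagate/weight/resample loop; the model (AR(1) state with Student-t noise, Gaussian observations) is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-Gaussian state space model:
#   state:       x_t = 0.9 * x_{t-1} + 0.5 * Student-t(3) noise
#   observation: y_t = x_t + N(0, 1) noise
T = 200
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + 0.5 * rng.standard_t(df=3)
    y[t] = x[t] + rng.normal()

# Bootstrap particle filter: propagate particles through the state equation,
# weight them by the observation likelihood, then resample.
N = 2000
particles = rng.normal(size=N)
filtered_mean = np.zeros(T)
for t in range(T):
    particles = 0.9 * particles + 0.5 * rng.standard_t(df=3, size=N)
    log_w = -0.5 * (y[t] - particles) ** 2          # Gaussian log-likelihood up to a constant
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    filtered_mean[t] = np.sum(w * particles)
    particles = particles[rng.choice(N, size=N, p=w)]   # multinomial resampling

print(np.corrcoef(filtered_mean, x)[0, 1])   # filtered mean should track the latent state
```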
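And for mild irregularity/gap filling, pandas interpolation and a Kalman smoother give you two quick baselines (statsmodels state space models accept NaN observations, so the smoothed level fills the gaps). A small sketch on a synthetic daily series:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Synthetic daily series with a block of missing days.
idx = pd.date_range("2022-01-01", periods=365, freq="D")
y = pd.Series(np.sin(np.arange(365) * 2 * np.pi / 7) + 0.3 * rng.standard_normal(365),
              index=idx)
y.iloc[50:60] = np.nan

# Option 1: simple time-weighted interpolation (splines via scipy are an option too).
y_interp = y.interpolate(method="time")

# Option 2: Kalman smoothing with a local level model; the smoothed state
# gives estimates for the missing days.
res = sm.tsa.UnobservedComponents(y, level="local level").fit(disp=False)
y_kalman = pd.Series(res.smoothed_state[0], index=idx)

print(y_interp[48:62])
print(y_kalman[48:62])
```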