An Introduction to Time Series Forecasting
A sequence of observations of a metric recorded at constant time intervals is known as a Time Series .
Based on the recording frequency, a Time Series can be classified into the following categories:
- Yearly (For example, Annual Budget)
- Quarterly (For example, Expenses)
- Monthly (For example, Air Traffic)
- Weekly (For example, Sale Quantity)
- Daily (For example, Weather)
- Hourly (For example, Stocks Price)
- Minute-wise (For example, Inbound Calls in a Call Centre)
- Second-wise (For example, Web Traffic)
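As a brief sketch (assuming the pandas library; the dates and period counts are arbitrary), several of these recording intervals map directly onto pandas frequency aliases:

```python
import pandas as pd

# Illustrative only: pandas offset aliases for a few of the
# frequencies listed above.
monthly = pd.date_range("2020-01-01", periods=12, freq="MS")  # month start
weekly = pd.date_range("2020-01-01", periods=4, freq="W")     # weekly
daily = pd.date_range("2020-01-01", periods=7, freq="D")      # calendar day

print(len(monthly), len(weekly), len(daily))
```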
Once we are done with Time Series Analysis, the next step is forecasting: predicting the values that the series will take in the future.
However, why do we need forecasting?
Because forecasting a Time Series, such as Sales or Demand, is often of immense commercial value.
Time Series Forecasting is widely used in manufacturing companies, as it drives the primary business planning, procurement, and production activities. Any forecast errors will ripple through the supply chain, or through any business framework, for that matter. Thus, getting accurate forecasts is significant for saving costs and is critical to success.
The concepts and techniques behind Time Series Forecasting can be applied in any business, not just manufacturing.
Time Series Forecasting can be broadly classified into two categories:
- Univariate Time Series Forecasting: forecasting where only the previous values of the time series are used to predict the future values.
- Multivariate Time Series Forecasting: forecasting where predictors other than the series itself, also known as exogenous variables, are used as well.
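A minimal sketch of the difference in model inputs, using hypothetical sales and ad-spend numbers (all values here are made up for illustration):

```python
import numpy as np

# Hypothetical monthly sales plus one exogenous predictor (ad spend).
sales = np.array([112.0, 118.0, 132.0, 129.0, 121.0, 135.0])
ad_spend = np.array([10.0, 12.0, 15.0, 14.0, 11.0, 16.0])

# Univariate forecasting uses only the series' own history:
univariate_inputs = sales[:-1]  # lags of sales

# Multivariate forecasting adds exogenous variables as extra columns:
multivariate_inputs = np.column_stack([sales[:-1], ad_spend[:-1]])

print(univariate_inputs.shape, multivariate_inputs.shape)
```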
In the following tutorial, we will focus on a specific type of method known as ARIMA modeling .
Auto-Regressive Integrated Moving Average , abbreviated as ARIMA , is a forecasting algorithm based on the idea that the previous values of a time series can alone be used to predict the future values.
Let us understand the ARIMA Models in detail.
An Introduction to ARIMA Models
ARIMA , short for ‘Auto-Regressive Integrated Moving Average’ , is a class of models that ‘explains’ a given time series based on its own previous values, that is, its lags and its lagged forecast errors, so that the resulting equation can be used to forecast future values.
Any non-seasonal Time Series that exhibits patterns and is not random white noise can be modeled with ARIMA models .
An ARIMA model is characterized by three terms:
p, q, and d
where,
- p = the order of the AR term
- q = the order of the MA term
- d = the number of differences required to make the time series stationary
If a Time Series has seasonal patterns, we have to add seasonal terms, and it becomes SARIMA , short for ‘Seasonal ARIMA’ .
Now, before understanding " the order of the AR term ", let us discuss the ‘d’ term.
What are ‘p’, ‘q’, and ‘d’ in the ARIMA model?
The first step in building an ARIMA model is to make the time series stationary. This is because the term ‘Auto-Regressive’ in ARIMA implies a Linear Regression model that uses its own lags as predictors. And as we already know, Linear Regression models work best when the predictors are independent and not correlated with each other.
The most common approach to making a series stationary is differencing: subtracting the previous value from the present value. Sometimes, depending on the complexity of the series, more than one round of differencing may be required.
Therefore, the value of d is the minimum number of differencing operations required to make the series stationary. If the time series is already stationary, then d = 0.
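This can be sketched on a toy example (all numbers simulated): a random walk is the classic non-stationary series, and differencing it once yields white noise, so d = 1 here. A simple lag-1 autocorrelation serves as a rough stationarity check:

```python
import numpy as np

rng = np.random.default_rng(1)

# A random walk is non-stationary; its first difference is white noise.
walk = np.cumsum(rng.standard_normal(300))
diff1 = np.diff(walk)  # present value minus past value

def lag1_acf(x):
    """Lag-1 autocorrelation: near 1 for a random walk, near 0 for noise."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

print(round(lag1_acf(walk), 2), round(lag1_acf(diff1), 2))
```

In practice, a formal unit-root test (such as the Augmented Dickey-Fuller test) is usually used instead of eyeballing autocorrelations.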
Now, let us understand the terms ‘p’ and ‘q’.
‘p’ is the order of the ‘AR’ (Auto-Regressive) term , that is, the number of lags of Y to be used as predictors. Meanwhile, ‘q’ is the order of the ‘MA’ (Moving Average) term , that is, the number of lagged forecast errors to be used in the ARIMA model.
Now, let us understand what ‘AR’ and ‘MA’ models are in detail.
Understanding Auto-Regressive (AR) and Moving Average (MA) Models
In the following section, we will discuss the AR and MA models and the actual mathematical formula for these models.
A pure AR (Auto-Regressive only) model is a model that relies only on its own lags. Hence, Yt is a function of the ‘lags of Yt’:

Yt = α + β1*Yt-1 + β2*Yt-2 + ... + βp*Yt-p + ϵt

where Yt-1 is lag 1 of the series, β1 is the coefficient of lag 1, and α is the intercept term, both estimated by the model.
Similarly, a pure MA (Moving Average only) model is a model where Yt relies only on the lagged forecast errors:

Yt = α + ϵt + φ1*ϵt-1 + φ2*ϵt-2 + ... + φq*ϵt-q

where the error terms ϵ are the errors of the AR models of the respective lags. The errors ϵt and ϵt-1 are the errors from the following equations:

Yt = β1*Yt-1 + β2*Yt-2 + ... + β0*Y0 + ϵt
Yt-1 = β1*Yt-2 + β2*Yt-3 + ... + β0*Y0 + ϵt-1
These are the Auto-Regressive (AR) and Moving Average (MA) models, respectively.
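The contrasting behavior of the two models can be simulated with a short sketch (the coefficients 0.7 and 0.5 are arbitrary illustrative choices): an AR(1) series stays correlated over many lags, while an MA(1) series' autocorrelation cuts off after lag 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Pure AR(1): Yt = 0.7 * Yt-1 + et
e = rng.standard_normal(n)
ar = np.zeros(n)
for t in range(1, n):
    ar[t] = 0.7 * ar[t - 1] + e[t]

# Pure MA(1): Yt = et + 0.5 * et-1
e2 = rng.standard_normal(n)
ma = e2.copy()
ma[1:] += 0.5 * e2[:-1]

# AR(1) autocorrelation decays gradually; MA(1) autocorrelation is
# noticeable at lag 1 but essentially zero from lag 2 onward.
print(round(acf(ar, 1), 2), round(acf(ma, 1), 2), round(acf(ma, 2), 2))
```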
Let us now understand the equation of an ARIMA Model.
An ARIMA model is a model where the time series is differenced at least once to make it stationary, and the Auto-Regressive (AR) and Moving Average (MA) terms are combined. Hence, we get the following equation:

Yt = α + β1*Yt-1 + β2*Yt-2 + ... + βp*Yt-p + ϵt + φ1*ϵt-1 + φ2*ϵt-2 + ... + φq*ϵt-q
ARIMA Model in words:
Forecasted Yt = Constant + Linear combination of lags of Y (up to p lags) + Linear combination of lagged forecast errors (up to q lags)
Thus, the objective of this model is to find the values of p , q , and d . However, how do we find them?
Let us begin with finding the ‘d’ in the ARIMA Model.
Finding the order of differencing ‘d’ in the ARIMA Model
The primary purpose of differencing in the ARIMA model is to make the Time Series stationary.
However, we have to take care not to over-difference the series: an over-differenced series may still be stationary, but over-differencing will affect the model parameters later.
Now, let us understand the appropriate differencing order.
The most appropriate order of differencing is the minimum differencing needed to obtain a near-stationary series that wanders around a defined mean, with an ACF plot that reaches zero fairly quickly.
If the autocorrelations are positive for many lags (generally, ten or more), the series needs further differencing. In contrast, if the lag-1 autocorrelation itself is strongly negative, the series is probably over-differenced.
If we cannot decide between two orders of differencing, we choose the order that gives the smaller standard deviation in the differenced series.
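This tie-breaking rule can be sketched on a simulated random walk (numbers are illustrative): differencing twice when once is enough inflates the variance, so the smaller-standard-deviation order wins.

```python
import numpy as np

rng = np.random.default_rng(3)
walk = np.cumsum(rng.standard_normal(400))  # toy non-stationary series

d1 = np.diff(walk, n=1)  # differenced once (enough for a random walk)
d2 = np.diff(walk, n=2)  # differenced twice (likely over-differenced)

# Over-differencing inflates the standard deviation of the series,
# so we prefer the order with the smaller one (d=1 here).
print(round(d1.std(), 2), round(d2.std(), 2))
```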