Time Series Analysis

A time series is an ordered sequence of values of a variable observed at equally spaced time intervals. Time series occur frequently in industrial data. The essential difference between time series methods and other modelling approaches is that time series analysis accounts for the internal structure that data points taken over time may have, such as autocorrelation, trend, or seasonal variation. A time series model explains a variable in terms of its own past and a random disturbance term.

Special attention is paid to exploring the historical trends and patterns (such as seasonality) of the series involved, and to predicting its future values based on the trends and patterns identified in the model. Since time series models require only historical observations of a variable, data collection and model estimation are less costly. Time series models can be broadly categorized into linear and nonlinear models. Linear models depend linearly on previous data points.

They include the autoregressive (AR) models, the integrated (I) models, and the moving average (MA) models.

The general autoregressive model of order p, AR(p), can be written as

$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \dots + \phi_p X_{t-p} + \varepsilon_t,$$

and the moving average model of order q, MA(q), as

$$X_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \dots + \theta_q \varepsilon_{t-q},$$

where $\varepsilon_t$ is a white noise disturbance term. The autoregressive (AR) models were first introduced by Yule (1927), while the moving average process was developed by Slutzky (1937). Combinations of these ideas produce autoregressive moving average (ARMA) and autoregressive integrated moving average (ARIMA) models. A process $\{X_t\}$ is an autoregressive moving average process of order (p,q), denoted ARMA(p,q), if it is stationary and if for every t

$$X_t = \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \dots + \theta_q \varepsilon_{t-q},$$

so that $X_t$ is linearly related to the p most recent observations $X_{t-1}, \dots, X_{t-p}$, the q most recent forecast errors $\varepsilon_{t-1}, \dots, \varepsilon_{t-q}$, and the current disturbance $\varepsilon_t$. A non-stationary ARMA(p,q) process that requires differencing d times before it becomes stationary is said to follow an autoregressive integrated moving average model of order (p,d,q), abbreviated ARIMA(p,d,q). The difference operator $\nabla$, when applied to the entry $X_t$, yields the difference $\nabla X_t = X_t - X_{t-1}$.
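As an illustration of the ARMA(p,q) recursion above, the following sketch simulates an ARMA(1,1) series directly from its defining equation. It assumes Python with NumPy; the function name simulate_arma11 and the parameter values (phi = 0.6, theta = 0.3) are illustrative choices, not part of the original text.

```python
import numpy as np

def simulate_arma11(n, phi=0.6, theta=0.3, sigma=1.0, seed=0):
    """Simulate X_t = phi*X_{t-1} + e_t + theta*e_{t-1} with Gaussian white noise e_t."""
    rng = np.random.default_rng(seed)
    e = rng.normal(0.0, sigma, size=n)   # white-noise disturbances
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t] + theta * e[t - 1]
    return x

series = simulate_arma11(500)
diffed = np.diff(series)  # one application of the difference operator (the d step in ARIMA)
```

The final line shows one application of the difference operator; repeating it d times corresponds to the d in an ARIMA(p,d,q) specification.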

Nonlinear time series models are able to exhibit cyclicity and asymmetry and to capture higher moments such as skewness and kurtosis. They include the bilinear models introduced by Subba Rao and Gabr (1984), the exponential autoregressive (EAR) models introduced by Ozaki and Oda (1978), and the autoregressive conditional heteroscedastic (ARCH) models introduced by Engle (1982). The general bilinear model is given by

$$X_t = \sum_{i=1}^{p} a_i X_{t-i} + \sum_{j=1}^{q} c_j e_{t-j} + \sum_{i=1}^{m}\sum_{j=1}^{k} b_{ij} X_{t-i} e_{t-j} + e_t,$$

where $\{e_t\}$ is a sequence of i.i.d. random variables, usually but not always with zero mean and variance $\sigma_e^2$, and $a_i$, $c_j$, and $b_{ij}$ are model parameters.
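A minimal sketch of the bilinear idea, assuming Python with NumPy: it simulates the simple special case X_t = a·X_{t-1} + b·X_{t-1}·e_{t-1} + e_t, where the cross term between the past observation and the past disturbance is what makes the model nonlinear. The function name and coefficient values are hypothetical.

```python
import numpy as np

def simulate_bilinear(n, a=0.4, b=0.3, sigma=1.0, seed=0):
    """Simulate the bilinear special case X_t = a*X_{t-1} + b*X_{t-1}*e_{t-1} + e_t."""
    rng = np.random.default_rng(seed)
    e = rng.normal(0.0, sigma, size=n)  # i.i.d. disturbances
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a * x[t - 1] + b * x[t - 1] * e[t - 1] + e[t]
    return x

series = simulate_bilinear(500)
```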

The EAR model of order p is given by

$$X_t = \left(\phi_1 + \pi_1 e^{-\gamma X_{t-1}^2}\right) X_{t-1} + \dots + \left(\phi_p + \pi_p e^{-\gamma X_{t-1}^2}\right) X_{t-p} + \varepsilon_t,$$

and the autoregressive conditional heteroscedastic (ARCH) regression model by

$$y_t = x_t'\beta + \varepsilon_t, \qquad \varepsilon_t \mid \psi_{t-1} \sim N(0, h_t), \qquad h_t = \alpha_0 + \sum_{i=1}^{q} \alpha_i \varepsilon_{t-i}^2,$$

for t = 1, ..., T, where $x_t$ is a k×1 vector of exogenous variables and $\beta$ is a k×1 vector of regression parameters.

2.4 Autoregressive Moving Average (ARMA) Models

The ARMA model is expressed as ARMA(p,q), where p is the number of autoregressive parameters and q is the number of moving average parameters. It is defined as

$$X_t = \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \dots + \theta_q \varepsilon_{t-q}.$$

The basic assumption in estimating the ARMA coefficients is that the data are stationary, that is, the mean and variance are constant over time and are not affected by trend or seasonality.
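To make the stationarity assumption concrete, the sketch below applies an augmented Dickey-Fuller test to a simulated random walk and to its first difference. It assumes Python with NumPy and the statsmodels package; the simulated series is hypothetical and only meant to show how one would check whether differencing yields a series suitable for ARMA estimation.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=500))  # random walk: non-stationary in levels

for name, series in [("levels", y), ("first difference", np.diff(y))]:
    stat, pvalue = adfuller(series)[:2]
    print(f"ADF test on {name}: statistic={stat:.2f}, p-value={pvalue:.3f}")
```

A small p-value on the differenced series (with a large one on the levels) would suggest that one round of differencing is enough, i.e. d = 1 in ARIMA(p,d,q) terms.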
