Time Series Analysis is the mathematical art of looking backward to see forward. Unlike standard statistics, which often treats data points as independent snapshots, Time Series recognizes that the order of data matters. Whether you are tracking the fluctuating price of coffee on the Nairobi Securities Exchange or predicting the next peak in seasonal flu cases, you are dealing with data where today’s value is almost always linked to yesterday’s.
Below is the exam paper download link
SMS-3251-SMS-3476-TIME-SERIES-ANALYSIS-TIME-SERIES-ANALYSIS
For students in economics, statistics, and data science, this unit is a true test of analytical logic. We have gathered the most frequent “stumbling blocks” from past examinations to help you refine your revision.
What is ‘Stationarity’ and why is it the first thing we check?
In Time Series, a process is Stationary if its statistical properties—like the mean, variance, and autocorrelation—do not change over time. Think of it like a heartbeat: it might go up and down, but it stays within a consistent range. Most forecasting models (like ARIMA) require the data to be stationary to work correctly. If your data has a clear upward trend, it isn’t stationary, and you’ll need to apply “differencing” to stabilize the mean before you can proceed.
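To make the idea of differencing concrete, here is a minimal numpy sketch (the slope, noise level, and seed are illustrative values, not from the text). It builds a trended series, applies first differencing, and compares the mean of the early and late halves before and after:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(200)

# A series with a clear upward trend is non-stationary: its mean drifts.
trended = 0.5 * t + rng.normal(0, 1.0, size=t.size)

# First differencing, y'_t = y_t - y_{t-1}, removes a linear trend.
differenced = np.diff(trended)

# Before differencing, the later half sits far above the earlier half;
# afterwards, the two halves have nearly the same mean.
trend_gap = trended[100:].mean() - trended[:100].mean()
diff_gap = differenced[100:].mean() - differenced[:100].mean()
print(round(trend_gap, 1), round(diff_gap, 2))
```

The large gap collapses to roughly zero after one round of differencing, which is exactly the "stabilized mean" that models like ARIMA require.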
How do we identify the components of a Time Series?
When you look at a raw graph of data over ten years, you are actually looking at four distinct components layered on top of each other:

- Trend: The long-term "direction" (up or down).
- Seasonality: Patterns that repeat at fixed intervals (e.g., high soda sales every December).
- Cyclic: Fluctuations without a fixed period, often linked to economic "booms" or "busts."
- Irregular/Residual: The "noise" or random shocks that cannot be predicted.
What is the difference between ‘AR’ and ‘MA’ models?
- Auto-Regressive (AR): This model suggests that the current value is a linear function of its own previous values. For example, today's value might be 0.8 times yesterday's value plus a fresh random shock.
- Moving Average (MA): This model focuses on "shocks." It suggests the current value is related to the error terms (unpredicted noise) of previous periods. When you combine these with differencing, you get the famous ARIMA model, the workhorse of modern forecasting.
What are ACF and PACF plots used for?
The Autocorrelation Function (ACF) and Partial Autocorrelation Function (PACF) are the “fingerprints” of a time series.
- The ACF helps identify the order of the Moving Average (MA) component.
- The PACF helps identify the order of the Auto-Regressive (AR) component.

In an exam, you will often be given these plots and asked to "identify the model." If the PACF cuts off after two lags, you are likely looking at an AR(2) process.
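To see the ACF "cut-off" pattern in practice, here is a minimal numpy sketch that computes the sample autocorrelation of a simulated MA(1) series (the coefficient 0.7 and the sample size are arbitrary choices for illustration). An MA(1) process has a theoretical ACF that is non-zero at lag 1 and zero beyond it:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate an MA(1) process: y_t = eps_t + 0.7 * eps_{t-1}.
eps = rng.normal(0, 1.0, size=5000)
y = eps[1:] + 0.7 * eps[:-1]

def sample_acf(x, max_lag):
    """Sample autocorrelation: lagged covariance divided by the variance."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return [np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)]

acf = sample_acf(y, 3)
# Lag 1 is large (theory: 0.7 / (1 + 0.7**2) ≈ 0.47);
# lags 2 and 3 are close to zero, i.e. the ACF "cuts off" after lag 1.
print([round(r, 2) for r in acf])
```

Reading such a cut-off from a plot is exactly the "identify the model" exercise: an ACF that dies after lag q suggests MA(q), just as a PACF that dies after lag p suggests AR(p).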
How do we handle ‘Random Walks’?
A Random Walk is a series where the next value is just the current value plus a random shock of noise. It is famously used to describe stock prices. The key takeaway for your revision is that a Random Walk is non-stationary. It has "infinite memory," meaning a shock that happened three years ago still impacts the price today.
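A quick numpy sketch makes the non-stationarity visible (path counts and step counts here are arbitrary). A random walk is just the cumulative sum of shocks, and the variance across many simulated paths grows with time rather than staying constant:

```python
import numpy as np

rng = np.random.default_rng(3)

# y_t = y_{t-1} + eps_t, so y_t is the running sum of all past shocks:
# every shock stays in the level forever ("infinite memory").
n_paths, n_steps = 500, 400
shocks = rng.normal(0, 1.0, size=(n_paths, n_steps))
walks = np.cumsum(shocks, axis=1)

# Non-stationarity in action: for unit-variance shocks, Var(y_t) = t,
# so the spread across paths at step 400 is about four times that at 100.
print(round(walks[:, 99].var(), 1), round(walks[:, 399].var(), 1))
```

A variance that grows without bound is precisely why a random walk fails the stationarity check and must be differenced (its first difference is just the white-noise shocks, which are stationary).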
What is ‘Exponential Smoothing’?
While ARIMA is great for complex data, Exponential Smoothing is often better for data with clear trends or seasonality. It gives more “weight” to recent observations and less weight to older ones. Holt-Winters is the specific version of this that handles both a steady trend and a repeating seasonal pattern simultaneously.
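The "more weight to recent observations" idea reduces to a one-line recurrence. Below is a minimal simple-exponential-smoothing pass in plain Python (the sales figures and the smoothing factor alpha are made-up illustrative values; Holt-Winters extends this with extra trend and seasonal equations):

```python
def simple_exp_smooth(series, alpha):
    """Each new level = alpha * newest observation + (1 - alpha) * old level,
    so older observations decay geometrically in influence."""
    level = series[0]
    smoothed = [level]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        smoothed.append(level)
    return smoothed

# Hypothetical weekly sales; alpha = 0.5 splits the weight evenly
# between the newest point and everything that came before it.
sales = [10, 12, 11, 15, 14, 16, 18, 17]
print(simple_exp_smooth(sales, alpha=0.5))
```

Raising alpha toward 1 makes the forecast chase the latest observation; lowering it toward 0 makes the forecast a long, stable average, which is the core tuning decision in exponential smoothing.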

Conclusion
Time Series Analysis is about finding the signal within the noise. It requires a balance of visual intuition—looking at a graph and “seeing” the trend—and mathematical rigor. The only way to get comfortable with shifting from the time domain to the frequency domain is to get your hands dirty with real data patterns.
To help you master the art of forecasting and ace your finals, we have compiled a full set of practice problems and solutions below.
Last updated on: March 24, 2026