Time Series Lesson 9 – ACF | Dataplexa

Autocorrelation in Time Series (ACF)

So far, we have learned how a time series behaves over time, how it changes, how smoothing works, and how trends and seasonality appear.

Now we ask a deeper question:

Does the past influence the future?

Autocorrelation helps us answer exactly that.


Real-World Intuition First

Think about daily sales of an online store.

If sales were high yesterday, is it likely they’ll be high today?

If traffic drops on Monday, does it also drop on Tuesday?

These relationships between past values and current values are what autocorrelation measures.


What Is Autocorrelation?

Autocorrelation measures how strongly a time series is related to its own past values.

In simple terms:

  • Lag 1 → today vs yesterday
  • Lag 7 → today vs last week
  • Lag 30 → today vs last month

If values repeat or depend on earlier values, autocorrelation will be high.
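A quick way to make lags concrete is to line the series up next to shifted copies of itself. Here is a small sketch using pandas, with made-up sales numbers purely for illustration:

```python
import pandas as pd

# Hypothetical daily sales figures, just for illustration
sales = pd.Series([100, 120, 90, 110, 130, 95, 105, 115])

frame = pd.DataFrame({
    "today": sales,
    "lag_1": sales.shift(1),   # yesterday's value aligned with today
    "lag_7": sales.shift(7),   # same day last week aligned with today
})
print(frame)
```

Each row now pairs today's value with its lag-1 and lag-7 counterparts; correlating those columns is exactly what autocorrelation does.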


Why ACF Is Extremely Important

Autocorrelation tells us:

  • If a series has memory
  • If past values help predict future values
  • What lag lengths matter most

Almost all classical forecasting models (AR, ARIMA) are built directly on autocorrelation.


Creating a Realistic Time Series

We’ll use a realistic daily dataset with:

  • Upward trend
  • Weekly seasonality
  • Random noise

Python: Time Series Data
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

np.random.seed(5)

time = np.arange(300)
trend = time * 0.2
seasonal = 10 * np.sin(2 * np.pi * time / 7)
noise = np.random.normal(0, 4, size=300)

series = trend + seasonal + noise

This is what the raw time series looks like:
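You can reproduce the plot yourself. This sketch repeats the data generation from above so it runs standalone; the figure filename is an arbitrary choice:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless-safe backend so the script runs without a display
import matplotlib.pyplot as plt

np.random.seed(5)
time = np.arange(300)
series = time * 0.2 + 10 * np.sin(2 * np.pi * time / 7) + np.random.normal(0, 4, size=300)

plt.figure(figsize=(10, 4))
plt.plot(time, series, linewidth=1)
plt.title("Daily series: trend + weekly seasonality + noise")
plt.xlabel("Day")
plt.ylabel("Value")
plt.tight_layout()
plt.savefig("series.png")  # hypothetical output path
```

The upward drift comes from the trend term, the regular wiggle from the weekly sine, and the jitter from the noise.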


Understanding Lag Visually

Before plotting ACF, let’s visually understand lag.

Here we compare the series with a 7-day shifted version.

Python: Lagged Series
lag_7 = np.roll(series, 7)
lag_7[:7] = np.nan  # np.roll wraps the last 7 values to the front, so blank them out

Below you see both curves overlapping:

What this tells us:

  • Patterns align after shifting
  • Weekly dependency exists
  • Seasonality causes correlation
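We can put a number on that visual alignment. pandas has a built-in `Series.autocorr` that correlates a series with a shifted copy of itself; in this sketch (which regenerates the series so it runs standalone) the on-cycle lag 7 scores higher than an off-cycle lag like 3:

```python
import numpy as np
import pandas as pd

np.random.seed(5)
time = np.arange(300)
series = time * 0.2 + 10 * np.sin(2 * np.pi * time / 7) + np.random.normal(0, 4, size=300)

s = pd.Series(series)
print(f"lag-7 autocorrelation: {s.autocorr(lag=7):.3f}")  # strong: weekly cycle lines up
print(f"lag-3 autocorrelation: {s.autocorr(lag=3):.3f}")  # weaker: off the weekly cycle
```

Both values are positive because the trend correlates the series with itself at every lag, but the weekly cycle gives lag 7 an extra boost.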

Autocorrelation Function (ACF)

ACF computes correlation between the series and its lagged versions for many lag values.

Instead of guessing lags, ACF shows them clearly.

Python: Compute ACF
def autocorrelation(x, lag):
    # Pearson correlation between the series and itself shifted by `lag`
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

lags = range(1, 40)
acf_values = [autocorrelation(series, lag) for lag in lags]

Here is the actual ACF plot:
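You can draw the ACF chart from the values computed above. This sketch is self-contained (it regenerates the series), and the output filename is an arbitrary choice:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless-safe backend
import matplotlib.pyplot as plt

np.random.seed(5)
time = np.arange(300)
series = time * 0.2 + 10 * np.sin(2 * np.pi * time / 7) + np.random.normal(0, 4, size=300)

def autocorrelation(x, lag):
    # Pearson correlation between the series and itself shifted by `lag`
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

lags = range(1, 40)
acf_values = [autocorrelation(series, lag) for lag in lags]

plt.figure(figsize=(10, 4))
plt.stem(list(lags), acf_values)  # one vertical bar per lag
plt.title("ACF of the simulated series")
plt.xlabel("Lag (days)")
plt.ylabel("Autocorrelation")
plt.tight_layout()
plt.savefig("acf.png")  # hypothetical output path
```

In practice you would often reach for statsmodels' `plot_acf` (in `statsmodels.graphics.tsaplots`), which produces the same chart and adds confidence bands.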


How to Read an ACF Plot

Each vertical bar represents correlation at a lag.

  • High bar → strong dependency
  • Low bar → weak dependency
  • Repeating peaks → seasonality

In our plot:

  • Lag 1 → strong correlation
  • Lag 7 → strong weekly pattern
  • Slow decay → trend exists

Real-World Meaning

  • Slow decay → trend present
  • Peaks at fixed intervals → seasonality
  • Sudden drop → weak memory
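The "sudden drop" case is easy to demonstrate: pure white noise has essentially no memory, so its autocorrelation is near zero at every lag, while our trending, seasonal series stays high. A sketch (regenerating the series so it runs standalone):

```python
import numpy as np

np.random.seed(5)
time = np.arange(300)
series = time * 0.2 + 10 * np.sin(2 * np.pi * time / 7) + np.random.normal(0, 4, size=300)
white_noise = np.random.normal(0, 4, size=300)  # no trend, no seasonality

def autocorrelation(x, lag):
    # Pearson correlation between the series and itself shifted by `lag`
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

for lag in (1, 7, 14):
    print(f"lag {lag:2d}: trending series = {autocorrelation(series, lag):.2f}, "
          f"white noise = {autocorrelation(white_noise, lag):.2f}")
```

The trending series keeps high values across all three lags; the white noise hovers near zero, which is exactly the "weak memory" pattern in the table above.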

Why ACF Matters Later

Later models rely heavily on ACF:

  • AR → uses ACF directly
  • ARIMA → selects parameters from ACF
  • SARIMA → seasonal ACF patterns

If ACF doesn’t make sense to you, those models will feel impossible.


Practice Questions

Q1. What does a high ACF at lag 1 mean?

The current value strongly depends on the previous value.

Q2. Why do seasonal series show repeated ACF peaks?

Because values repeat at fixed intervals.

Key Takeaways

  • ACF measures dependency on past values
  • Helps detect trend and seasonality
  • Guides model selection
  • Foundation for ARIMA models

Next Lesson

In the next lesson, we’ll learn about PACF and how it differs from ACF.