Partial Autocorrelation (PACF)
In the previous lesson, we learned about Autocorrelation (ACF), which showed us how the past influences the present.
But ACF has a problem.
It mixes direct influence and indirect influence.
PACF exists to separate those two.
Real-World Analogy (Very Important)
Imagine daily sales of a grocery store.
- Yesterday’s sales influence today → direct effect
- Sales from two days ago influence today through yesterday → indirect effect
ACF measures both.
PACF measures only the direct effect.
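The analogy can be checked numerically. Here is a minimal sketch (the 0.8 coefficient and the seed are arbitrary choices for illustration): we simulate an AR(1) process, where only yesterday has a direct effect on today, and watch the lag-2 autocorrelation come out near 0.8 × 0.8 — the indirect path through yesterday.

```python
import numpy as np

# Simulate an AR(1) process: x_t = 0.8 * x_{t-1} + noise.
# By construction, only lag 1 has a DIRECT effect on today.
rng = np.random.default_rng(0)
phi = 0.8
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = phi * x[t - 1] + rng.normal()

def autocorr(x, lag):
    # Sample autocorrelation at the given lag
    c = x - x.mean()
    return np.dot(c[lag:], c[:-lag]) / np.dot(c, c)

print(autocorr(x, 1))  # ~0.8  -> the direct effect
print(autocorr(x, 2))  # ~0.64 -> mostly the indirect path, 0.8 * 0.8
```

Even though lag 2 has no direct influence at all, ACF still reports a sizeable correlation there — exactly the mixing problem PACF solves.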
Why PACF Is Needed
ACF answers:
“How much does the past affect the present?”
PACF answers:
“How much does a specific past point affect the present, after removing all shorter lags?”
This distinction is critical for building AR models.
Creating a Realistic Time Series
We’ll reuse a realistic dataset with:
- Trend
- Weekly seasonality
- Noise
Python: Time Series Data
```python
import numpy as np
import matplotlib.pyplot as plt

np.random.seed(7)
time = np.arange(300)
trend = time * 0.15
seasonal = 8 * np.sin(2 * np.pi * time / 7)
noise = np.random.normal(0, 3, size=300)
series = trend + seasonal + noise
```
This is the time series we are working with:
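If you want to reproduce the figure yourself, a standalone sketch follows (it rebuilds the series from above; the `Agg` backend and the file name `series.png` are assumptions for non-interactive use — call `plt.show()` instead in a notebook):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # assumption: render off-screen to a file
import matplotlib.pyplot as plt

np.random.seed(7)
time = np.arange(300)
series = time * 0.15 + 8 * np.sin(2 * np.pi * time / 7) + np.random.normal(0, 3, size=300)

plt.figure(figsize=(10, 4))
plt.plot(time, series)
plt.title("Trend + weekly seasonality + noise")
plt.xlabel("Day")
plt.ylabel("Value")
plt.savefig("series.png")
```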
ACF vs PACF – Visual Difference
Before computing PACF, let’s remember what ACF looks like.
ACF shows correlation for all lags, but each lag includes influence from previous lags.
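As a quick refresher, the ACF over the first 20 lags can be sketched like this (a standalone snippet that rebuilds the series; `acf` is our own helper, not a library function):

```python
import numpy as np

np.random.seed(7)
time = np.arange(300)
series = time * 0.15 + 8 * np.sin(2 * np.pi * time / 7) + np.random.normal(0, 3, size=300)

def acf(series, max_lag):
    # Sample autocorrelation for lags 1..max_lag
    c = series - series.mean()
    denom = np.dot(c, c)
    return [np.dot(c[lag:], c[:-lag]) / denom for lag in range(1, max_lag + 1)]

acf_values = acf(series, 20)
print([round(v, 2) for v in acf_values[:5]])
```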
ACF Plot
Notice:
- Correlation decays slowly
- Multiple lags look important
- Hard to tell which lag truly matters
What PACF Actually Does
PACF removes the effect of shorter lags.
So when we look at lag 3:
- ACF includes influence of lag 1 and lag 2
- PACF removes lag 1 and lag 2 effects
What remains is the true direct influence.
Python: Calculating PACF
```python
def pacf(series, max_lag):
    """Partial autocorrelation via successive AR regressions.

    For each lag k, regress the series on its first k lags (plus an
    intercept); the coefficient on the deepest lag is the PACF at lag k.
    """
    n = len(series)
    pacf_vals = []
    for lag in range(1, max_lag + 1):
        y = series[lag:]
        cols = [np.ones(n - lag)]          # intercept term
        for i in range(lag):
            # Column of values shifted back by (i + 1) steps
            cols.append(series[lag - i - 1:n - i - 1])
        X = np.column_stack(cols)
        coef = np.linalg.lstsq(X, y, rcond=None)[0]
        pacf_vals.append(coef[-1])         # coefficient of the deepest lag
    return pacf_vals

pacf_values = pacf(series, 20)
```
Here is the actual PACF plot:
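To reproduce the stem plot, here is a standalone sketch (it rebuilds the data and a compact version of the `pacf` helper so it runs on its own; the `Agg` backend and `pacf.png` file name are assumptions for script use):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # assumption: save to a file; use plt.show() interactively
import matplotlib.pyplot as plt

np.random.seed(7)
time = np.arange(300)
series = time * 0.15 + 8 * np.sin(2 * np.pi * time / 7) + np.random.normal(0, 3, size=300)

def pacf(series, max_lag):
    # Compact form: for each lag, regress on an intercept plus all shorter
    # lags and keep the coefficient of the deepest lag.
    n = len(series)
    vals = []
    for lag in range(1, max_lag + 1):
        cols = [np.ones(n - lag)] + [series[lag - i - 1:n - i - 1] for i in range(lag)]
        coef = np.linalg.lstsq(np.column_stack(cols), series[lag:], rcond=None)[0]
        vals.append(coef[-1])
    return vals

pacf_values = pacf(series, 20)

plt.figure(figsize=(8, 3))
plt.stem(range(1, 21), pacf_values)
plt.axhline(0, color="black", linewidth=0.5)
plt.xlabel("Lag")
plt.ylabel("Partial autocorrelation")
plt.title("PACF")
plt.savefig("pacf.png")
```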
How to Read a PACF Plot
Each bar shows direct dependency.
- High bar → strong direct influence
- Bars drop suddenly → AR order cutoff
In this plot:
- Lag 1 is very strong
- Lag 2 is weak
- Most later lags are near zero
This suggests the series behaves, at least in its short-run dynamics, like an AR(1) process.
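That reading can be sanity-checked on a synthetic example (a sketch; the 0.8 coefficient and seed are arbitrary choices): for a pure AR(1) process, the PACF is large at lag 1 and drops to near zero afterwards — the cutoff pattern described above.

```python
import numpy as np

# Simulate a pure AR(1) process
rng = np.random.default_rng(1)
x = np.zeros(3000)
for t in range(1, 3000):
    x[t] = 0.8 * x[t - 1] + rng.normal()

def pacf_at(x, lag):
    # Regress x_t on an intercept plus lags 1..lag; the last coefficient
    # is the partial autocorrelation at that lag.
    y = x[lag:]
    cols = [np.ones(len(y))] + [x[lag - k:len(x) - k] for k in range(1, lag + 1)]
    coef = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)[0]
    return coef[-1]

print(round(pacf_at(x, 1), 2))  # close to 0.8 -> strong direct lag-1 effect
print(round(pacf_at(x, 2), 2))  # close to 0   -> the cutoff
```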
Why PACF Is Crucial for AR Models
| Pattern | Meaning |
|---|---|
| PACF cuts off at lag p | AR(p) model |
| ACF cuts off, PACF decays | MA model |
| Both decay | ARMA |
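The middle row of the table can be illustrated with a simulated MA(1) process (the 0.7 coefficient is an arbitrary choice): its ACF cuts off after lag 1, the mirror image of the AR signature.

```python
import numpy as np

# MA(1): x_t = e_t + 0.7 * e_{t-1}, built from white noise e
rng = np.random.default_rng(2)
e = rng.normal(size=5001)
x = e[1:] + 0.7 * e[:-1]

def autocorr(x, lag):
    c = x - x.mean()
    return np.dot(c[lag:], c[:-lag]) / np.dot(c, c)

print(round(autocorr(x, 1), 2))  # ~0.47 (theory: 0.7 / (1 + 0.7**2))
print(round(autocorr(x, 2), 2))  # ~0 -> ACF cuts off after lag 1
```

For this process the PACF, by contrast, decays gradually with alternating signs rather than cutting off — which is why the two plots together identify the model family.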
Real-World Interpretation
If PACF shows:
- Strong lag 1 → yesterday matters
- Weak lag 2 → day-before-yesterday doesn’t add new info
Then your system has short memory.
This is common in:
- Sales data
- Traffic counts
- Sensor readings
Practice Questions
Q1. Why does PACF remove indirect effects?
Q2. What does a sharp PACF cutoff indicate?
Key Takeaways
- ACF mixes direct and indirect effects
- PACF isolates direct dependency
- PACF determines AR order
- Essential for ARIMA modeling
Next Lesson
In the next lesson, we will move into Moving Averages and understand MA behavior visually.