Mathematics Lesson 56 – Projections | Dataplexa

Projections

Projections explain how one vector can be expressed as a shadow or component along another vector or subspace.

This idea connects geometry, algebra, optimization, and machine learning. Whenever we approximate, predict, or minimize error, we are using projections.


Why Projections Are Important

In real life, we often need the “best approximation” rather than an exact value.

Projections help us:

  • Approximate vectors
  • Remove unwanted components
  • Minimize error
  • Understand orthogonality

Many AI algorithms are projection-based.


What Is a Projection? (Intuition)

Imagine shining a light on a vector.

The shadow cast on another direction is the projection.

It tells us how much of one vector lies along another.


Projection Onto a Vector

Given two vectors:

  • v → vector being projected
  • u → direction vector

The projection of v onto u is the component of v that lies along u.


Formula for Projection Onto a Vector

The projection of v onto u is:

proj_u(v) = ( (v·u) / (u·u) ) u

This formula is extremely important for exams and applications.


Understanding the Projection Formula

Each part of the formula has meaning:

  • v·u → measures alignment
  • u·u → magnitude squared of u
  • Multiplying by u → gives direction

The result is a vector along u.


Example: Projection Onto a Vector

Let:

v = (3, 4), u = (1, 0)

Then:

v·u = (3)(1) + (4)(0) = 3
u·u = (1)(1) + (0)(0) = 1

proj_u(v) = (3/1)(1, 0) = (3, 0)

This is the horizontal component of v.
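The worked example above can be checked with a short NumPy sketch (the function name `project` is just an illustrative choice):

```python
import numpy as np

def project(v, u):
    """Projection of v onto u: ( (v·u) / (u·u) ) u."""
    v = np.asarray(v, dtype=float)
    u = np.asarray(u, dtype=float)
    return (v @ u) / (u @ u) * u

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])

p = project(v, u)
print(p)  # [3. 0.] — the horizontal component of v
```

Note that the result is unchanged if u is rescaled: dividing by u·u makes the formula independent of the length of u.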


Orthogonal Component

The part of v that is not along u is called the orthogonal component.

It is given by:

v − proj_u(v)

This component is perpendicular to u.
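A minimal sketch of this decomposition, continuing the same example, splits v into a component along u and a remainder, then confirms the remainder is perpendicular (dot product zero):

```python
import numpy as np

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])

# Component of v along u
proj = (v @ u) / (u @ u) * u

# Orthogonal component: what remains after removing the projection
orth = v - proj

print(orth)      # [0. 4.]
print(orth @ u)  # 0.0 — perpendicular to u
```

Adding the two pieces recovers v exactly: proj + orth = v.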


Orthogonal Projection (Key Idea)

An orthogonal projection is the closest vector (in distance) to v that lies along u.

This makes projections essential for minimizing error.


Projection Onto a Line

Projecting onto a vector is the same as projecting onto the line spanned by that vector.

The projection gives the closest point on that line to the original vector.


Projection Onto a Subspace

Projections can also be done onto entire subspaces.

If a subspace has an orthonormal basis, projection becomes very simple.

Each basis vector contributes independently.
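The independent contributions can be sketched as a sum over the basis vectors; this assumes the basis is orthonormal, as stated above (the helper name `project_onto_subspace` is illustrative):

```python
import numpy as np

def project_onto_subspace(v, basis):
    """Project v onto the span of an ORTHONORMAL basis.
    Each basis vector e contributes (v·e) * e independently."""
    v = np.asarray(v, dtype=float)
    return sum((v @ e) * e for e in map(np.asarray, basis))

# Project a 3D vector onto the xy-plane, spanned by e1 and e2
v = np.array([2.0, 3.0, 5.0])
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

p = project_onto_subspace(v, [e1, e2])
print(p)  # [2. 3. 0.]
```

If the basis were orthogonal but not normalized, each term would need the 1/(e·e) factor from the single-vector formula.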


Orthogonal Basis and Projections

If basis vectors are orthogonal:

  • Projections are easy to compute
  • No interference between directions

This is why orthogonality is so powerful.


Projection in Geometry

Geometrically, projections represent:

  • Shadows
  • Perpendicular drops
  • Closest points

They appear naturally in shapes and measurements.


Projection in Physics

Physics uses projections constantly:

  • Resolving forces
  • Motion along inclined planes
  • Component analysis

Each force is projected onto the coordinate directions of interest, such as the axes along and perpendicular to a surface.


Projection in Data Science

In data science:

  • Data is projected onto features
  • Noise is removed using projections
  • Dimensionality is reduced

PCA is fundamentally projection-based.
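As a sketch of that claim, the snippet below builds toy 2D data, finds the first principal direction via SVD, and reduces the data by projecting each point onto that direction (the variable names and the random toy data are illustrative, not a standard PCA API):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy 2D data, stretched much more along x than y
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0],
                                          [0.0, 0.5]])
X = X - X.mean(axis=0)  # center the data first

# Rows of Vt are the principal directions
_, _, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = Vt[0]  # first principal component (unit vector)

# One-component PCA = projecting each row onto pc1
coords = X @ pc1                   # 1D coordinate along pc1
X_approx = np.outer(coords, pc1)   # projected points, back in 2D
```

Each row of `X_approx` is the orthogonal projection of the corresponding data point onto the line spanned by `pc1` — exactly the single-vector projection formula applied row by row.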


Projection in Machine Learning

Machine learning uses projections in:

  • Linear regression
  • Least squares problems
  • Feature extraction

Predictions are projections onto model spaces.


Least Squares and Projections

Least squares finds the vector in a subspace that is closest to a given data vector.

That closest vector is an orthogonal projection.

This is a core ML idea.
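A minimal least-squares sketch makes the projection explicit: fitting a line via the normal equations, the fitted values are the orthogonal projection of y onto the column space of the design matrix A, so the residual is orthogonal to every column (the toy data here is illustrative):

```python
import numpy as np

# Fit y ≈ c0 + c1·x by least squares
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

A = np.column_stack([np.ones_like(x), x])  # columns: [1, x]

# Normal equations: (AᵀA) c = Aᵀy
c = np.linalg.solve(A.T @ A, A.T @ y)
print(c)  # [1. 2.] — intercept 1, slope 2 (this data is exactly linear)

# Fitted values = orthogonal projection of y onto col(A)
y_hat = A @ c
residual = y - y_hat
print(A.T @ residual)  # ≈ [0, 0]: residual ⟂ every column of A
```

The orthogonality of the residual to the columns of A is precisely what makes the fitted vector the closest point in the subspace.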


Projections in Competitive Exams

Exams often test:

  • Projection formula
  • Geometric interpretation
  • Orthogonality conditions

Understanding beats memorization here.


Common Mistakes to Avoid

Students often make these mistakes:

  • Forgetting dot product in formula
  • Projecting onto wrong vector
  • Confusing projection with component subtraction

Always check direction and formula.


Practice Questions

Q1. What does a projection represent?

Component of one vector along another

Q2. What operation is central to projection?

Dot product

Q3. Why are projections important in ML?

They minimize error and approximate data

Quick Quiz

Q1. Is an orthogonal projection the closest point?

Yes

Q2. Are projections used in regression?

Yes

Quick Recap

  • Projections measure components along directions
  • Computed using dot products
  • Orthogonal projection minimizes distance
  • Core idea in least squares and ML
  • Connects geometry to data science

With projections mastered, you are now ready to learn orthogonality, which explains perpendicular structure in spaces.