Building a CNN with TensorFlow & Keras
This lesson brings everything together.
Until now, you have learned how convolution, pooling, activation functions, regularization, optimization, and architectures work individually.
Here, we combine those concepts and build a complete Convolutional Neural Network (CNN) using TensorFlow and Keras.
Why Keras?
Keras is a high-level API built on top of TensorFlow.
It allows us to focus on architecture and learning behavior instead of low-level tensor operations.
This makes it ideal for building, testing, and deploying CNN models efficiently.
Problem Statement (Capstone)
We will build a CNN that classifies images into multiple categories.
The focus here is not dataset size, but understanding the full pipeline:
model design, compilation, training, and evaluation.
Import Required Libraries
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D
from tensorflow.keras.layers import Flatten, Dense, Dropout
These modules allow us to define layers, assemble the network, and control training behavior.
Building the CNN Architecture
model = Sequential()

# Convolutional block 1: 32 filters of size 3x3, expects 128x128 RGB images
model.add(Conv2D(32, (3,3), activation='relu', input_shape=(128,128,3)))
model.add(MaxPooling2D(pool_size=(2,2)))

# Convolutional block 2: 64 filters
model.add(Conv2D(64, (3,3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2,2)))

# Convolutional block 3: 128 filters
model.add(Conv2D(128, (3,3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2,2)))

# Classifier head: flatten the feature maps, apply dropout, output 10 class probabilities
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))
The network gradually increases depth while reducing spatial dimensions.
This allows the model to learn both low-level and high-level visual patterns.
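To verify this pattern, print a summary of the model. With the 128x128x3 input above and the default 'valid' padding, each convolution trims the feature map slightly, each pooling layer halves its width and height, and the number of filters grows from 32 to 128.
# Inspect layer output shapes and parameter counts
model.summary()
# Expected feature-map progression for a 128x128x3 input:
# 126x126x32 -> 63x63x32 -> 61x61x64 -> 30x30x64 -> 28x28x128 -> 14x14x128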
Compiling the Model
model.compile(
optimizer='adam',
loss='categorical_crossentropy',
metrics=['accuracy']
)
The Adam optimizer adapts the learning rate of each parameter automatically.
Categorical crossentropy is the standard loss for multi-class classification with one-hot encoded labels.
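Note that categorical_crossentropy expects one-hot encoded labels. If your labels are integer class indices, either convert them or use the sparse variant of the loss. A minimal sketch (the labels array below is a hypothetical example):
from tensorflow.keras.utils import to_categorical

# Hypothetical integer class indices for four samples
labels = [0, 3, 7, 1]

# Convert to one-hot vectors so they match categorical_crossentropy
one_hot_labels = to_categorical(labels, num_classes=10)

# Alternatively, keep integer labels and compile with:
# loss='sparse_categorical_crossentropy'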
Training the Model
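The fit call below expects train_data and val_data to already exist, for example as tf.data.Dataset objects that yield batches of images and one-hot labels. A minimal sketch of how they could be created, assuming your images live in class-named subdirectories under a hypothetical data/train folder:
# Hypothetical layout: data/train/<class_name>/*.jpg
train_data = tf.keras.utils.image_dataset_from_directory(
    'data/train',
    label_mode='categorical',   # one-hot labels to match categorical_crossentropy
    image_size=(128, 128),      # matches the input_shape of the first Conv2D layer
    batch_size=32,
    validation_split=0.2,
    subset='training',
    seed=42
)
val_data = tf.keras.utils.image_dataset_from_directory(
    'data/train',
    label_mode='categorical',
    image_size=(128, 128),
    batch_size=32,
    validation_split=0.2,
    subset='validation',
    seed=42
)
In practice you would also rescale pixel values to the 0-1 range (for example with a Rescaling(1./255) layer), which is omitted here to keep the sketch minimal.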
history = model.fit(
train_data,
validation_data=val_data,
epochs=20
)
During training, the model learns filter weights that best separate image classes.
Validation data helps monitor generalization performance.
Understanding the Output
At the end of training, you will observe:
training accuracy increasing, validation accuracy stabilizing, and loss gradually decreasing.
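These trends are recorded in the History object returned by model.fit (assigned to history above), so you can inspect them per epoch:
# Per-epoch metrics collected during training
print(history.history['accuracy'])      # training accuracy
print(history.history['val_accuracy'])  # validation accuracy
print(history.history['loss'])          # training loss
print(history.history['val_loss'])      # validation loss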
If validation accuracy drops while training accuracy rises, overfitting is occurring.
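One common safeguard is an EarlyStopping callback, which halts training once the validation metric stops improving and can restore the best weights seen so far. A minimal sketch:
from tensorflow.keras.callbacks import EarlyStopping

# Stop if validation loss has not improved for 3 consecutive epochs
early_stop = EarlyStopping(monitor='val_loss', patience=3, restore_best_weights=True)

history = model.fit(
    train_data,
    validation_data=val_data,
    epochs=20,
    callbacks=[early_stop]
)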
Model Evaluation
test_loss, test_accuracy = model.evaluate(test_data)
print("Test Accuracy:", test_accuracy)
This step confirms how well the model performs on unseen data.
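You can also use the trained model to classify a single image. A minimal sketch, assuming image is a preprocessed NumPy array of shape (128, 128, 3) (a hypothetical variable):
import numpy as np

# Add a batch dimension: (128, 128, 3) -> (1, 128, 128, 3)
batch = np.expand_dims(image, axis=0)

# The softmax output gives one probability per class
probabilities = model.predict(batch)[0]
predicted_class = np.argmax(probabilities)
print("Predicted class index:", predicted_class)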
Capstone Project – Explained
You have now completed a full CNN pipeline:
architecture design, optimization, regularization, training, and evaluation.
This is the same workflow used in real-world deep learning systems.
Additional Project Ideas
To deepen your understanding, try these projects:
1. Medical image classification (X-ray or MRI scans)
2. Traffic sign recognition system
3. Plant disease detection using leaf images
4. Face emotion classification system
Each project reinforces CNN design decisions and improves real-world intuition.