DEEP LEARNING AI23531
ASSIGNMENT - 1
Digit Classification using a Multi-Layer Perceptron (MLP)
Aim:
To design, train, and evaluate a simple Multi-Layer Perceptron (MLP)
deep learning model using the MNIST handwritten digit dataset.
The objective is to normalize the input images, build a neural network
with one hidden layer of 128 neurons using ReLU activation, and classify
digits (0–9) with high accuracy.
Step-by-step algorithm :
1. Set up environment & reproducibility
Import libraries and fix random seeds so results are reproducible.
2. Load dataset
Load MNIST (already split into train/test). Note shapes.
3. Preprocess inputs
Scale pixel values to [0,1] by dividing by 255.0. (No manual flattening is
needed if the model begins with a Flatten layer.)
4. Prepare validation set
Either use validation_split in model.fit() or carve out a
validation set from the training data (e.g., 90% train / 10% val). This helps
monitor overfitting.
5. Decide label format
For SparseCategoricalCrossentropy, keep labels as integer class indices
(no one-hot encoding). If you use CategoricalCrossentropy, convert the
labels to one-hot vectors.
CODE:
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
import matplotlib.pyplot as plt
# Load dataset
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
# Normalize images
x_train = x_train / 255.0
x_test = x_test / 255.0
# Build a simple model
model = keras.Sequential([
    layers.Flatten(input_shape=(28, 28)),
    layers.Dense(128, activation='relu'),
    layers.Dense(10, activation='softmax')
])
# Compile model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
# Train the model and store history
history = model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))
# Evaluate
test_loss, test_acc = model.evaluate(x_test, y_test)
print("Test accuracy:", test_acc)
# Plot training & validation accuracy
plt.figure(figsize=(8, 4))
plt.plot(history.history['accuracy'], label='Training Accuracy', marker='o')
plt.plot(history.history['val_accuracy'], label='Validation Accuracy', marker='o')
plt.title('Model Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend()
plt.grid(True)
plt.show()
# Plot training & validation loss
plt.figure(figsize=(8, 4))
plt.plot(history.history['loss'], label='Training Loss', marker='o')
plt.plot(history.history['val_loss'], label='Validation Loss', marker='o')
plt.title('Model Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend()
plt.grid(True)
plt.show()
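To read an individual classification off the network, take the argmax of the 10-way softmax output. The snippet below is a self-contained sketch (not part of the original listing): it rebuilds the same architecture, so the predicted digit is only meaningful once the model has been trained as above.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Load and normalize the test images, as in the main listing
(_, _), (x_test, y_test) = keras.datasets.mnist.load_data()
x_test = x_test / 255.0

# Same architecture as above (untrained here; fit it first for real use)
model = keras.Sequential([
    layers.Flatten(input_shape=(28, 28)),
    layers.Dense(128, activation='relu'),
    layers.Dense(10, activation='softmax')
])

probs = model.predict(x_test[:1], verbose=0)   # shape (1, 10): class probabilities
predicted_digit = int(np.argmax(probs, axis=1)[0])
print("Predicted digit:", predicted_digit, "| actual label:", y_test[0])
```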
OUTPUT :
The model trained successfully on the MNIST dataset for 3 epochs and achieved a
test accuracy of approximately 97% (the exact value varies from run to run).