COVID-19 Pneumonia Detection

This project implements a deep learning system for automatic detection of COVID-19 from chest X-rays. The system classifies each image into one of three categories (COVID-19, Normal, or Viral Pneumonia) using convolutional neural networks and ResNet50 transfer learning.

Project Overview

Dataset

COVID-19_Radiography_Dataset containing COVID, Normal, and Viral Pneumonia X-ray images

Technology Stack

TensorFlow/Keras, CNN Architecture, ResNet50 Transfer Learning

Data Processing

Image Preprocessing Function

import numpy as np
from PIL import Image

def preprocessor(img_path):
    # Open image, convert to RGB, resize to 192x192
    img = Image.open(img_path).convert("RGB").resize((192, 192))

    # Min-max normalization: scale uint8 pixels from [0, 255] to [0, 1]
    img = np.float32(img) / 255.0

    # Shape is already (192, 192, 3); Keras adds the batch axis at fit/predict time
    return img
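The preprocessing can be sanity-checked without the dataset by pushing a synthetic in-memory image through the same steps (the random image below is purely illustrative):

```python
import numpy as np
from PIL import Image

# Synthetic grayscale "X-ray" standing in for a dataset file
fake_xray = Image.fromarray(np.random.randint(0, 256, (512, 512), dtype=np.uint8))

# Mirror the preprocessing pipeline: RGB conversion, resize, scale into [0, 1]
img = fake_xray.convert("RGB").resize((192, 192))
x = np.float32(img) / 255.0

print(x.shape)  # (192, 192, 3)
```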

Input Dimensions

192 × 192 × 3 (RGB channels)

Normalization

Min-Max scaling to [0,1] range

Data Loading & Label Encoding

import os
from glob import glob
from itertools import repeat

import pandas as pd

# Dataset path configuration
base_path = 'COVID-19_Radiography_Dataset'
categories = ['COVID/images', 'Normal/images', 'Viral Pneumonia/images']

# Gather image paths per category (the dataset ships PNG files)
fnames = [sorted(glob(os.path.join(base_path, c, '*.png'))) for c in categories]

# Create labels
covid = list(repeat("COVID", len(fnames[0])))
normal = list(repeat("NORMAL", len(fnames[1])))
pneumonia = list(repeat("PNEUMONIA", len(fnames[2])))
y_labels = covid + normal + pneumonia

# One-hot encoding
y = pd.get_dummies(y_labels)
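pd.get_dummies orders its output columns alphabetically, which fixes the class order used everywhere downstream; a toy run makes this concrete:

```python
import pandas as pd

labels = ["COVID", "NORMAL", "PNEUMONIA", "COVID"]
y = pd.get_dummies(labels).astype(int)

print(list(y.columns))   # ['COVID', 'NORMAL', 'PNEUMONIA']
print(y.values.tolist())  # [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0]]
```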

Model Architecture

Baseline CNN Model

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Conv2D, BatchNormalization, MaxPooling2D,
                                     Dropout, Flatten, Dense)

def create_baseline_cnn(input_shape=(192, 192, 3), num_classes=3):
    model = Sequential([
        # First convolutional block
        Conv2D(32, (3, 3), activation='relu', input_shape=input_shape),
        BatchNormalization(),
        MaxPooling2D(2, 2),
        Dropout(0.25),

        # Second convolutional block
        Conv2D(64, (3, 3), activation='relu'),
        BatchNormalization(),
        MaxPooling2D(2, 2),
        Dropout(0.25),

        # Third convolutional block
        Conv2D(128, (3, 3), activation='relu'),
        BatchNormalization(),
        MaxPooling2D(2, 2),
        Dropout(0.25),

        # Dense layers
        Flatten(),
        Dense(256, activation='relu'),
        BatchNormalization(),
        Dropout(0.5),
        Dense(128, activation='relu'),
        BatchNormalization(),
        Dropout(0.5),

        # Output layer
        Dense(num_classes, activation='softmax')
    ], name='Baseline_CNN')
    
    return model

Convolutional Layers

3 conv blocks
32→64→128 filters

Regularization

BatchNormalization
Dropout(0.25/0.5)

Dense Layers

256→128→3
ReLU + Softmax
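With 'valid' 3×3 convolutions and 2×2 pooling, the spatial size at each block, and hence the flatten width feeding Dense(256), follows from simple arithmetic:

```python
def conv_pool(size, kernel=3, pool=2):
    # 'valid' convolution shrinks by kernel-1, then 2x2 pooling halves (floor)
    return (size - kernel + 1) // pool

size = 192
for filters in (32, 64, 128):
    size = conv_pool(size)
    print(f"{filters} filters -> {size}x{size}")  # 95, 46, 22

# Features seen by the first dense layer
print(size * size * 128)  # 61952
```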

Training Configuration

import tensorflow as tf
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.callbacks import EarlyStopping, ReduceLROnPlateau

# Model compilation
baseline_model.compile(
    optimizer=Adam(learning_rate=0.001),
    loss='categorical_crossentropy',
    metrics=['accuracy',
             tf.keras.metrics.Precision(name='precision'),
             tf.keras.metrics.Recall(name='recall')]
)

# Callback functions
callbacks = [
    EarlyStopping(monitor='val_accuracy', patience=8, restore_best_weights=True),
    ReduceLROnPlateau(monitor='val_accuracy', factor=0.5, patience=5)
]
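ReduceLROnPlateau with factor=0.5 halves the learning rate each time val_accuracy stalls for 5 epochs, so the possible learning rates form a short geometric schedule:

```python
initial_lr = 0.001
factor = 0.5

# Learning rate after n plateau-triggered reductions
for n in range(4):
    print(f"after {n} reductions: {initial_lr * factor ** n:.6f}")
```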

Training Results

Model Evaluation Metrics

# Test set evaluation
baseline_test_loss, baseline_test_acc, baseline_test_prec, baseline_test_recall = baseline_model.evaluate(X_test, y_test)

print(f"Test Accuracy: {baseline_test_acc:.4f}")
print(f"Test Precision: {baseline_test_prec:.4f}")
print(f"Test Recall: {baseline_test_recall:.4f}")
# Micro-averaged F1 from the aggregate precision/recall reported by Keras
print(f"Test F1-Score: {2 * (baseline_test_prec * baseline_test_recall) / (baseline_test_prec + baseline_test_recall):.4f}")
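The inline F1 expression divides by zero when precision and recall are both zero; a small guarded helper (hypothetical, not part of the original notebook) avoids that edge case:

```python
def f1_score(precision, recall):
    # Harmonic mean of precision and recall, guarded against 0/0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(f"{f1_score(0.95, 0.93):.4f}")  # 0.9399
```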

Training Curve Visualization

import numpy as np
import matplotlib.pyplot as plt

# Moving-average smoothing for training curves
def smooth(xs, w=3):
    if w <= 1:
        return xs
    xs = np.array(xs, dtype=float)
    return np.convolve(xs, np.ones(w) / w, mode='same')

metrics = [
    ('loss', 'val_loss'),
    ('accuracy', 'val_accuracy'),
    ('precision', 'val_precision'),
    ('recall', 'val_recall'),
]

plt.figure(figsize=(10, 10))
for i, (m, vm) in enumerate(metrics, 1):
    plt.subplot(2, 2, i)
    train_curve = baseline_history.history.get(m, [])
    val_curve = baseline_history.history.get(vm, [])
    plt.plot(smooth(train_curve, w=3), label=f'train_{m}')
    plt.plot(smooth(val_curve, w=3), label=f'val_{m}')
    plt.title(m)
    plt.xlabel('Epoch')
    plt.legend()
plt.tight_layout()
plt.show()

Monitoring Metrics

Loss, Accuracy, Precision, Recall

Smoothing

Moving average filter to reduce curve noise
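The moving-average filter can be verified on a toy oscillating sequence (values chosen for illustration):

```python
import numpy as np

def smooth(xs, w=3):
    if w <= 1:
        return xs
    xs = np.array(xs, dtype=float)
    return np.convolve(xs, np.ones(w) / w, mode='same')

print(smooth([0.0, 3.0, 0.0, 3.0, 0.0]).tolist())  # [1.0, 1.0, 2.0, 1.0, 1.0]
```

Note that mode='same' zero-pads at the boundaries, which slightly damps the first and last points of a curve.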

Transfer Learning (ResNet50)

ResNet50 Transfer Learning Model

from tensorflow.keras.applications import ResNet50
from tensorflow.keras.layers import GlobalAveragePooling2D

def create_resnet_model(input_shape=(192, 192, 3), num_classes=3, fine_tune=False):
    """
    Create a ResNet50 transfer learning model.

    fine_tune=False: the whole base model is frozen.
    fine_tune=True:  the first 140 base layers stay frozen; the rest train.
    """
    # Load pre-trained ResNet50 without its ImageNet classification head
    base_model = ResNet50(
        weights='imagenet',
        include_top=False,
        input_shape=input_shape
    )

    # Frozen feature extractor by default; unfrozen as a whole when fine-tuning
    base_model.trainable = fine_tune

    if fine_tune:
        # Re-freeze the first 140 layers; only the top of the backbone trains
        for layer in base_model.layers[:140]:
            layer.trainable = False

    # Build model
    model = Sequential([
        base_model,
        GlobalAveragePooling2D(),
        Dense(512, activation='relu'),
        BatchNormalization(),
        Dropout(0.5),
        Dense(256, activation='relu'),
        BatchNormalization(),
        Dropout(0.5),
        Dense(num_classes, activation='softmax')
    ], name='ResNet50_Transfer')

    return model

Pre-trained Weights

ImageNet weights, 50-layer deep residual network

Fine-tuning Strategy

Freeze first 140 layers, unfreeze top layers

Custom Head

GAP + Dense layers 512→256→3
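GlobalAveragePooling2D collapses each ResNet50 feature map (6×6 spatial at 192×192 input, 2048 channels) to a single number per channel; the same reduction in NumPy, on simulated features for one image:

```python
import numpy as np

# Simulated ResNet50 output for one image: 6x6 spatial grid, 2048 channels
features = np.random.rand(6, 6, 2048)

# Average over the two spatial axes only, keeping channels
pooled = features.mean(axis=(0, 1))
print(pooled.shape)  # (2048,)
```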

Training Configuration

# ResNet model compilation
resnet_model.compile(
    optimizer=Adam(learning_rate=0.0001),  # Lower learning rate
    loss='categorical_crossentropy',
    metrics=['accuracy',
             tf.keras.metrics.Precision(name='precision'),
             tf.keras.metrics.Recall(name='recall')]
)

# Stage 1 of two-stage training: fit the new head with the ResNet50 base frozen
# (stage 2 would rebuild with fine_tune=True and continue at a lower learning rate)
print("Training ResNet50 (frozen base)...")
resnet_history = resnet_model.fit(
    X_train_split, y_train_split,
    validation_data=(X_val, y_val),
    epochs=30,
    batch_size=32,
    callbacks=callbacks,
    verbose=1
)

Tech Stack & Tools

Deep Learning Framework

TensorFlow 2.x
Keras High-level API
NumPy & Pandas

Image Processing

PIL (Pillow)
OpenCV (cv2)
Matplotlib

Machine Learning

Scikit-learn
Transfer Learning
Data Augmentation

Development Environment

Jupyter Notebook
Python 3.8+
GPU Acceleration (CUDA)

Project Highlights

Multi-class Medical Image Classification

Accurate distinction between COVID-19, Normal, and Viral Pneumonia

Transfer Learning Optimization

Enhanced performance using ResNet50 pre-trained model

Data Augmentation Techniques

Improved model generalization and robustness

Comprehensive Evaluation System

Accuracy, Precision, Recall, F1-Score metrics