
SMS Spam Detection - Classical NLP vs Transformers

A comparative study of classical TF-IDF models and transformer-based approaches for SMS spam detection, with a focus on performance, cost, and deployment trade-offs.

Problem Statement

Spam detection is a highly imbalanced text classification problem where evaluation metrics, threshold selection, and operational costs matter more than raw accuracy. This project explores whether modern transformer models meaningfully outperform classical NLP approaches on a real-world SMS dataset.

Dataset

  • UCI SMS Spam Collection
  • ~5,500 SMS messages
  • Binary labels: ham (legitimate) vs spam
  • Strong class imbalance (~13% spam)
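The raw UCI file (`SMSSpamCollection`) is distributed as tab-separated lines with the label first, then the message text. A minimal parsing sketch, using an in-memory stand-in for the file (the three messages below are illustrative, not the actual dataset):

```python
import io
import pandas as pd

# Stand-in for the UCI SMSSpamCollection file: tab-separated,
# label ("ham"/"spam") first, then the raw message text.
sample = io.StringIO(
    "ham\tOk lar... Joking wif u oni...\n"
    "spam\tWINNER!! You have won a prize. Call now!\n"
    "ham\tI'll see you at lunch then.\n"
)

df = pd.read_csv(sample, sep="\t", header=None, names=["label", "text"])
df["is_spam"] = (df["label"] == "spam").astype(int)

# Class balance drives everything downstream (metrics, thresholds).
print(df["is_spam"].mean())
```

On the full dataset the same `mean()` check is what surfaces the ~13% spam imbalance noted above.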

Models Evaluated

Classical Models

  • TF-IDF + Logistic Regression
  • TF-IDF + Multinomial Naive Bayes
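Both classical baselines can be sketched as scikit-learn pipelines. The hyperparameters shown (bigrams, balanced class weights) and the toy messages are illustrative assumptions, not necessarily what the notebooks use:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy stand-in for the real training split.
texts = [
    "WINNER!! Claim your free prize now, call today",
    "Ok lar... Joking wif u oni...",
    "URGENT! Your mobile number has won a cash award",
    "I'll see you at lunch then.",
    "Free entry to a weekly comp to win FA Cup tickets",
    "Sorry, I'll call later.",
]
labels = [1, 0, 1, 0, 1, 0]  # 1 = spam, 0 = ham

# class_weight="balanced" compensates for the ~13% spam imbalance.
logreg = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), sublinear_tf=True),
    LogisticRegression(class_weight="balanced", max_iter=1000),
)
logreg.fit(texts, labels)

nb = make_pipeline(TfidfVectorizer(), MultinomialNB())
nb.fit(texts, labels)

# Probabilities, not hard labels, feed the threshold analysis later.
print(logreg.predict_proba(["You have won a free prize, call now"])[0, 1])
```

Working with `predict_proba` rather than `predict` is what makes the cost-sensitive threshold tuning in the evaluation section possible.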

Transformer Model

  • DistilBERT (fine-tuned for binary classification)

Evaluation Strategy

  • ROC and Precision-Recall curves
  • Threshold tuning to reflect cost-sensitive decisions
  • Error analysis of false positives and false negatives
  • Comparison of performance gains vs computational cost
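The threshold-tuning step above can be sketched with `sklearn.metrics.precision_recall_curve` and an explicit cost model. The scores, labels, and the 5:1 false-positive-to-false-negative cost ratio below are made-up values for illustration:

```python
import numpy as np
from sklearn.metrics import precision_recall_curve

# Hypothetical predicted spam probabilities and true labels (1 = spam).
y_true = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 0])
y_score = np.array([0.05, 0.10, 0.20, 0.15, 0.30, 0.40, 0.80, 0.90, 0.60, 0.70])

precision, recall, thresholds = precision_recall_curve(y_true, y_score)

# Cost-sensitive choice: suppose flagging a legitimate message (false
# positive) costs 5x as much as letting a spam message through (false
# negative). Pick the threshold that minimizes total expected cost.
fp_cost, fn_cost = 5.0, 1.0
costs = []
for t in thresholds:
    pred = (y_score >= t).astype(int)
    fp = ((pred == 1) & (y_true == 0)).sum()
    fn = ((pred == 0) & (y_true == 1)).sum()
    costs.append(fp_cost * fp + fn_cost * fn)

best_t = thresholds[int(np.argmin(costs))]
print(best_t)
```

The same loop run with a different cost ratio yields a different operating point, which is why threshold selection matters more than the default 0.5 cutoff for this problem.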

Key Results

  • Classical TF-IDF models already achieve very strong performance due to clear lexical signals in spam messages.
  • DistilBERT improves recall and F1-score for spam detection, but the gains are incremental.
  • Precision-Recall analysis highlights the importance of threshold selection over default accuracy metrics.

Takeaway

This project demonstrates that:

  • Model complexity should be justified by problem complexity.
  • Classical models often provide superior performance-to-cost ratios for simple text classification tasks.
  • Transformers are most valuable when contextual understanding is essential, not by default.

Reproducibility

Model artifacts are not committed to the repository. All results can be reproduced by running the provided notebooks.

Directory structure

text-classification/
├── data/
├── notebooks/
│   ├── 01_eda.ipynb
│   ├── 02_baselines.ipynb
│   └── 03_transformers.ipynb
├── README.md
└── requirements.txt
