
Logistic Regression and Classification

A practical course in probabilistic classification, decision boundaries, model evaluation, and real-world classifier design

Quick Course Facts

  • 18 self-paced, online lessons
  • 18 videos and/or narrated presentations
  • Approximately 6.2 hours of course media
About the Logistic Regression and Classification Course

Build a strong foundation in Data Science with Logistic Regression and Classification, a practical course in probabilistic classification, decision boundaries, model evaluation, and real-world classifier design. You will learn how classification models turn data into predictive decisions, how to evaluate their performance, and how to communicate results with confidence.

Apply Logistic Regression and Classification To Real Data Science Problems

  • Learn how logistic regression converts features into probability scores for practical classification tasks.
  • Understand decision boundaries, sigmoid functions, odds, log-odds, and coefficient interpretation.
  • Evaluate classifiers using confusion matrices, precision, recall, F1, ROC curves, AUC, and threshold selection.
  • Design more reliable real-world models with preprocessing, regularisation, calibration, imbalance handling, and an end-to-end case study.

This Data Science course teaches the core theory and applied workflow behind Logistic Regression and Classification.

Logistic Regression and Classification begins with the foundations of classification problems, showing how predictive decisions differ from continuous prediction and why probability-based outputs matter. You will move from linear models to probability scores, explore the sigmoid function, and learn how decision boundaries separate classes in a clear and practical way.

The course then explains the mechanics behind logistic regression, including odds, log-odds, coefficient interpretation, maximum likelihood, cross-entropy loss, and gradient descent. These lessons help you understand not just how to run a model, but why it behaves the way it does and how training choices affect performance.

You will also develop a practical modelling workflow for Data Science projects, including feature scaling, encoding, preprocessing, binary logistic regression, and multiclass classification. Dedicated lessons on model evaluation cover confusion matrices, precision, recall, F1, cost-aware evaluation, ROC curves, AUC, and threshold selection so you can judge classifier performance in context.

Real-world classifier design often involves imperfect data, unequal class distributions, and the need for reliable probability scores. This course addresses those challenges through class imbalance strategies, L1 and L2 regularisation, probability calibration, interpretation, explanation, and communication of results. By the end, you will be able to build, evaluate, improve, and explain classification models with the practical judgment expected in applied Data Science work.

Course Lessons

Full lesson breakdown

Lessons are organized by topic area, and each includes a short description of what it covers.

Foundations of Classification

3 lessons

This lesson introduces classification as the task of making predictive decisions about categories rather than predicting continuous quantities. It frames classifiers as decision systems that use input…

Lesson 2: From Linear Models to Probability Scores

20 min
This lesson explains how a familiar linear model can be adapted for classification by turning a raw score into a probability-like output. Instead of predicting a numeric quantity such as price or temp…

Lesson 3: The Sigmoid Function and Decision Boundaries

19 min
This lesson introduces the sigmoid function as the link between a linear score and a probability in binary classification. Learners will see why raw linear outputs are not suitable as probabilities, h…
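As a small preview of the idea, here is an illustrative Python sketch (not course code) of the sigmoid and the decision rule it implies: a probability threshold of 0.5 is the same as checking whether the linear score is at least zero.

```python
import math

def sigmoid(z):
    # Map any real-valued score to the (0, 1) interval.
    return 1.0 / (1.0 + math.exp(-z))

def predict(z, threshold=0.5):
    # The default 0.5 probability threshold is equivalent to checking z >= 0,
    # which is exactly the decision boundary of the linear score.
    return 1 if sigmoid(z) >= threshold else 0

# A score of 0 sits exactly on the boundary: probability 0.5.
# Large positive scores approach 1; large negative scores approach 0.
print(sigmoid(0.0), sigmoid(4.0), sigmoid(-4.0))
```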

Model Mechanics

3 lessons

Lesson 4: Odds, Log-Odds, and Interpreting Coefficients

21 min
This lesson explains how logistic regression turns a linear score into an interpretable probability by using odds and log-odds. Learners connect probability, odds, and the logit scale, then see why lo…
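To make the scales concrete, a minimal sketch of the probability–odds–logit relationship (the coefficient value here is made up for illustration):

```python
import math

def odds(p):
    # Odds compare the probability of the event to the probability of no event.
    return p / (1.0 - p)

def log_odds(p):
    # The logit scale: logistic regression is linear on this scale.
    return math.log(odds(p))

p = 0.8           # probability 0.8 -> odds of 4-to-1 -> log-odds of about 1.386
# A coefficient b in a fitted model means that a one-unit increase in that
# feature multiplies the odds by exp(b), holding the other features fixed.
b = 0.7           # hypothetical coefficient
odds_ratio = math.exp(b)   # about 2.01: the odds roughly double
```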

Lesson 5: Maximum Likelihood for Logistic Regression

22 min
This lesson explains how logistic regression learns its coefficients through maximum likelihood estimation. Instead of minimizing squared error, logistic regression chooses parameters that make the o…

Lesson 6: Cross-Entropy Loss and Model Training

20 min
This lesson explains why logistic regression is trained with cross-entropy loss rather than squared error, and how that loss connects directly to probability, likelihood, and practical model fitting. …
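The core computation can be sketched in a few lines: cross-entropy is the average negative log-likelihood of the true labels under the predicted probabilities, so confident wrong predictions are penalised far more heavily than confident correct ones.

```python
import math

def cross_entropy(y_true, p_pred, eps=1e-12):
    # Average negative log-likelihood of the labels under the predictions.
    total = 0.0
    for y, p in zip(y_true, p_pred):
        p = min(max(p, eps), 1 - eps)  # guard against log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Confident, correct predictions give a small loss...
good = cross_entropy([1, 0], [0.9, 0.1])
# ...while confident, wrong predictions are penalised heavily.
bad = cross_entropy([1, 0], [0.1, 0.9])
```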

Model Training

2 lessons

Lesson 7: Gradient Descent and Optimisation Basics

21 min
Gradient descent is the practical engine that turns logistic regression from a model formula into a fitted classifier. In this lesson, Professor Christina Ross explains how optimisation searches for p…
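A minimal from-scratch sketch of that search, on a toy one-dimensional dataset (the learning rate and iteration count are arbitrary choices for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy 1-D data: the label is 1 exactly when x > 0.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    # For cross-entropy loss, the gradient is the prediction error
    # (predicted probability minus label) times the input.
    gw = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / len(xs)
    gb = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * gw
    b -= lr * gb

# After training, the fitted model separates the two sides of x = 0.
```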

Lesson 8: Feature Scaling, Encoding, and Preprocessing

19 min
This lesson explains how preprocessing choices affect logistic regression training. Students learn why scaling numerical features improves optimisation and makes regularisation behave more fairly, how…
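Two of those preprocessing steps can be sketched directly (the values and category names below are invented for illustration):

```python
def standardise(values):
    # Zero-mean, unit-variance scaling: helps gradient descent converge and
    # puts features on a comparable footing for regularisation.
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / std for v in values]

def one_hot(value, categories):
    # Encode a categorical value as a 0/1 indicator vector.
    return [1 if value == c else 0 for c in categories]

scaled = standardise([10.0, 20.0, 30.0])
encoded = one_hot("red", ["red", "green", "blue"])
```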

Applied Modelling Workflow

1 lesson

Lesson 9: Binary Logistic Regression in Practice

20 min
This lesson turns binary logistic regression into a practical modelling workflow. Students learn how to frame a two-class prediction problem, prepare data, fit a logistic model, inspect coefficients, …
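In practice much of this workflow is a few library calls. A hedged sketch using scikit-learn (assuming it is installed; the tiny dataset here is invented for illustration):

```python
from sklearn.linear_model import LogisticRegression

# Tiny illustrative dataset: one feature, binary label.
X = [[-2.0], [-1.0], [-0.5], [0.5], [1.0], [2.0]]
y = [0, 0, 0, 1, 1, 1]

model = LogisticRegression()
model.fit(X, y)

# Inspect the fitted coefficient and intercept...
print(model.coef_, model.intercept_)

# ...and get probability scores and hard predictions for new inputs.
probs = model.predict_proba([[1.5], [-1.5]])[:, 1]
preds = model.predict([[1.5], [-1.5]])
```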

Model Evaluation

3 lessons

Lesson 10: Confusion Matrices and Core Classification Metrics

22 min
This lesson teaches how to evaluate a binary classifier using a confusion matrix and the core metrics derived from it: accuracy, precision, recall, specificity, F1 score, and error rates. Students lea…
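All of those metrics fall out of the four confusion-matrix counts, as this small sketch shows (the labels and predictions are made up for illustration):

```python
def confusion_counts(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 0, 0, 0, 1, 0]

tp, tn, fp, fn = confusion_counts(y_true, y_pred)
accuracy    = (tp + tn) / (tp + tn + fp + fn)
precision   = tp / (tp + fp)
recall      = tp / (tp + fn)   # also called sensitivity / true positive rate
specificity = tn / (tn + fp)
f1 = 2 * precision * recall / (precision + recall)
```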

Lesson 11: Precision, Recall, F1, and Cost-Aware Evaluation

21 min
This lesson explains how to evaluate classifiers when accuracy is not enough. Students learn precision, recall, and F1 from the confusion matrix, connect each metric to false positives and false negat…

Lesson 12: ROC Curves, AUC, and Threshold Selection

23 min
This lesson explains how ROC curves and AUC evaluate a classifier across many possible probability thresholds. Students learn how true positive rate and false positive rate change as the threshold mov…
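The underlying computation is simple enough to sketch: sweep a threshold over the scores and record the resulting (false positive rate, true positive rate) pairs. The scores and thresholds below are invented for illustration.

```python
def roc_points(y_true, scores, thresholds):
    # For each threshold, compute (false positive rate, true positive rate).
    pos = sum(y_true)
    neg = len(y_true) - pos
    points = []
    for t in thresholds:
        tp = sum(1 for y, s in zip(y_true, scores) if y == 1 and s >= t)
        fp = sum(1 for y, s in zip(y_true, scores) if y == 0 and s >= t)
        points.append((fp / neg, tp / pos))
    return points

y_true = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]
# Lower thresholds predict more positives (top-right of the ROC curve);
# higher thresholds predict fewer (bottom-left).
pts = roc_points(y_true, scores, thresholds=[0.0, 0.3, 0.5, 0.9])
```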

Practical Classification Challenges

1 lesson

Lesson 13: Handling Class Imbalance

20 min
Class imbalance occurs when one class is much rarer than another, such as fraud cases, disease positives, churn events, or safety incidents. In logistic regression and other classifiers, imbalance can…
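One common remedy is to reweight the training loss so that errors on the rare class count for more. A sketch of the inverse-frequency heuristic (the same idea behind scikit-learn's "balanced" class-weight option):

```python
from collections import Counter

def balanced_class_weights(y):
    # Weight each class inversely to its frequency:
    # n_samples / (n_classes * n_class_samples),
    # so every class contributes equally to the total loss.
    counts = Counter(y)
    n, k = len(y), len(counts)
    return {cls: n / (k * c) for cls, c in counts.items()}

y = [0] * 90 + [1] * 10   # a 9:1 imbalance
weights = balanced_class_weights(y)
# The rare class gets a much larger weight than the common class.
```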

Improving Generalisation

1 lesson

Lesson 14: Regularisation with L1 and L2 Penalties

22 min
This lesson explains how L1 and L2 regularisation improve the generalisation of logistic regression classifiers by discouraging overly large coefficients. Students learn how penalties change the train…
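The two penalties themselves are one-liners; the illustrative weights and penalty strength below are made up:

```python
def l1_penalty(weights, lam):
    # L1 adds lam * sum(|w|); its gradient pushes small weights to exactly
    # zero, which is why L1 tends to produce sparse models.
    return lam * sum(abs(w) for w in weights)

def l2_penalty(weights, lam):
    # L2 adds lam * sum(w^2); it shrinks all weights smoothly toward zero
    # without typically zeroing any of them out.
    return lam * sum(w * w for w in weights)

w = [3.0, -0.5, 0.0]
data_loss = 1.2                                # hypothetical cross-entropy value
total_loss = data_loss + l2_penalty(w, lam=0.1)  # penalty is added to the data loss
```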

Extending the Model

1 lesson

Lesson 15: Multiclass Logistic Regression

21 min
This lesson extends binary logistic regression to classification problems with more than two classes. It focuses on two practical strategies: one-vs-rest classifiers and the multinomial softmax model.…
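At the heart of the multinomial model is the softmax function, which generalises the sigmoid to a probability distribution over several classes. A minimal sketch with invented scores:

```python
import math

def softmax(scores):
    # Subtract the max score for numerical stability, then normalise the
    # exponentials so the outputs form a probability distribution.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Three class scores: the largest score gets the largest probability,
# and the probabilities sum to 1.
probs = softmax([2.0, 1.0, 0.1])
```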

Model Reliability

1 lesson

Lesson 16: Probability Calibration and Reliable Scores

20 min
This lesson explains what it means for a classifier score to be reliable, with a focus on probability calibration. A logistic regression model produces probabilities by design, but those probabilities…
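A simple calibration check is a reliability table: bin the predicted probabilities and compare each bin's average prediction to its observed positive rate. A sketch with invented predictions:

```python
def reliability_table(probs, labels, n_bins=5):
    # Group predictions into probability bins, then compare the average
    # predicted probability to the observed positive rate in each bin.
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        idx = min(int(p * n_bins), n_bins - 1)
        bins[idx].append((p, y))
    table = []
    for b in bins:
        if b:
            mean_pred = sum(p for p, _ in b) / len(b)
            frac_pos = sum(y for _, y in b) / len(b)
            table.append((mean_pred, frac_pos, len(b)))
    return table

# A perfectly calibrated model has mean_pred close to frac_pos in every bin.
probs  = [0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.9]
labels = [0,   0,   1,   1,   1,   1,   0  ]
table = reliability_table(probs, labels)
```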

Model Interpretation

1 lesson

Lesson 17: Interpreting, Explaining, and Communicating Results

19 min
This lesson focuses on turning a fitted logistic regression model into results that other people can understand and use. Learners interpret coefficients, odds ratios, predicted probabilities, and feat…

Capstone Application

1 lesson

Lesson 18: End-to-End Classification Case Study

24 min
In this capstone lesson, students work through an end-to-end classification case study using logistic regression as the core model. The lesson connects problem framing, data preparation, feature desig…

About Your Instructor

Professor Christina Ross

Professor Christina Ross guides this AI-built Virversity course with a clear, practical teaching style.