👋 Introduction
Welcome to the Machine Learning Course!
This course introduces the foundational concepts and techniques of machine learning, focusing on supervised learning methods like linear regression, logistic regression, and ensemble models, along with a brief exploration of unsupervised learning. You’ll learn how to prepare data, build predictive models, and evaluate their performance—all while applying these skills to real-world problems in domains like business, healthcare, and education.
📋 Course Overview
🗝️ Key Topics Covered
- Foundations of Machine Learning
  - What is Machine Learning?
  - Supervised vs. Unsupervised Learning
  - The ML workflow: data → model → evaluation
- Supervised Learning
  - Linear Regression: predicting continuous outcomes
  - Logistic Regression: binary classification
  - Model Evaluation: accuracy, precision, recall, F1-score
  - Overfitting & Regularization: bias-variance tradeoff
- Ensemble Learning
  - Decision Trees: intuitive models for classification
  - Random Forests: bagging and feature importance
  - Gradient Boosting: boosting weak learners
  - Comparison of ensemble methods
- Unsupervised Learning (Intro)
  - Clustering: k-Means and Hierarchical Clustering
  - Dimensionality Reduction: PCA basics
  - Applications in customer segmentation and pattern discovery
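The workflow above (data → model → evaluation) can be sketched end to end with scikit-learn. This is a minimal illustration on synthetic data, not a course assignment; the dataset and all variable names here are made up for the example.

```python
# Minimal sketch of the data -> model -> evaluation workflow
# using scikit-learn on a synthetic dataset (not a course dataset).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Data: a noisy linear relationship y = 3x + 2 + noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3 * X.ravel() + 2 + rng.normal(0, 1, size=200)

# Split into training and held-out test sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Model: fit ordinary least-squares linear regression
model = LinearRegression().fit(X_train, y_train)

# Evaluation: score on the held-out data
print(f"R^2 on test set: {r2_score(y_test, model.predict(X_test)):.3f}")
```

The same three-step shape (split, fit, score) carries over to every model in the course; only the estimator and the metric change.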
By the end of this course, you’ll be able to build and evaluate basic machine learning models, understand their strengths and limitations, and apply them to structured datasets with confidence.
📝 Course Criteria
| Criteria | Percentage |
|---|---|
| Attendance | 10% |
| Participation & quizzes | 30% |
| Midterm Exam | 30% |
| Final Project & Presentation / Practical labs | 30% |
💻 Programming:
You are free to use your favorite programming language: Python or ….
🗺️ Course progress
| Topic | Lab | Solution | Remark |
|---|---|---|---|
| … | … | … | … |
| Linear Regression | Lab1 | Solution1 | …Loading |
| Logistic Regression | Lab2 | Solution2 | …Loading |
📄 Midterms, Exams and Projects
In this section, you will find all the information related to the midterms, exams, and projects, including instructions, start dates, and deadlines.
📄 Midterm & Exam
- A possible midterm date: ...Loading
📄 Project:
- Deadline for the report: ...Loading
- Where to submit: Canvas
Your report should be submitted in PDF format and include the following sections:
- Introduction
  - Objective: Define the ML problem (e.g., classification, regression, clustering).
  - Dataset: Describe the source, size, and features (e.g., Kaggle, UCI).
  - Relevance: Why is this problem important in a machine learning context?
- Data Preprocessing
  - Cleaning: Handle missing values, outliers, duplicates.
  - Transformation: Scaling, encoding, feature engineering.
  - Selection: Feature importance, PCA, correlation analysis.
- Exploratory Data Analysis (EDA)
  - Statistics & Visuals: Summary stats, distributions, pair plots.
  - Insights: Identify trends, anomalies, class imbalance.
- Model Development
  - Algorithms: Justify choices (e.g., Logistic Regression, Random Forest, SVM).
  - Training Strategy: Train/test split, cross-validation.
  - Tuning: Grid search, random search, Bayesian optimization.
  - Evaluation Metrics:
    - Classification: Accuracy, Precision, Recall, F1, ROC-AUC.
    - Regression: RMSE, MAE, R².
    - Clustering: Silhouette Score, Davies-Bouldin Index.
  - Visuals: Confusion matrix, learning curves, residual plots.
- Discussion & Challenges
  - Limitations: Data quality, overfitting, interpretability.
  - Implications: How results apply to real-world scenarios.
- Conclusion & Future Work
  - Summary: Key takeaways and model performance.
  - Next Steps: Improvements, alternative models, deployment ideas.
- References
  - Cite datasets, libraries (e.g., scikit-learn, TensorFlow), and relevant papers.
- Appendix (Optional)
  - Code snippets, extended results, additional plots.
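The model-development steps above (train/test split, tuning via grid search with cross-validation, and classification metrics) can be combined in a few lines with scikit-learn. This is a hedged sketch on a synthetic dataset; the parameter grid and sample sizes are illustrative assumptions, not project requirements.

```python
# Sketch of the model-development checklist: split, tune with
# cross-validated grid search, then report classification metrics.
# Uses scikit-learn with synthetic data (not a course dataset).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Synthetic binary-classification data
X, y = make_classification(n_samples=400, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Tuning: grid search over the regularization strength C with 5-fold CV
grid = GridSearchCV(
    LogisticRegression(max_iter=1000), {"C": [0.01, 0.1, 1, 10]}, cv=5
)
grid.fit(X_train, y_train)

# Evaluation: compute the classification metrics on held-out data
pred = grid.predict(X_test)
for name, metric in [("accuracy", accuracy_score), ("precision", precision_score),
                     ("recall", recall_score), ("f1", f1_score)]:
    print(f"{name}: {metric(y_test, pred):.3f}")
```

Swapping `LogisticRegression` for a `RandomForestClassifier` (with its own parameter grid) changes nothing else in this pattern, which is why the report asks you to justify the algorithm choice separately from the training strategy.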
Presentation:
- Possible dates: ...Loading
📚 Resources and Further Reading
Explore these resources to deepen your understanding of Machine Learning:
- Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow – Aurélien Géron
- The Elements of Statistical Learning – Hastie, Tibshirani, Friedman
- Deep Learning – Ian Goodfellow, Yoshua Bengio, Aaron Courville
- An Introduction to Statistical Learning – James, Witten, Hastie, Tibshirani