Fundamentals of Machine Learning

Course info:

Semester: 4

Course type: General Foundation

ECTS: 6

Hours per week: 3

Professor: T.B.D.

Teaching style: Face-to-face, with use of specialized software

Grading: Homework / Projects (60%), Final written exam (40%)

Workload (hours)
  • Lectures: 36
  • Class assignments / projects: 42
  • Independent study: 72
  • Course total: 150

Learning outcomes - Skills acquired

Learning outcomes

Upon successful completion of the course, students will be able to:

  • Design and implement machine learning and pattern recognition systems for a wide range of applications
  • Estimate parametric probability distributions of data features from labelled data, using maximum likelihood estimation and the EM algorithm
  • Implement and train different machine learning models
  • Critically assess the characteristics, capabilities and limitations of different machine learning techniques and select the appropriate ones for complex problem-solving tasks

Skills acquired

  • Data and information retrieval, analysis and synthesis
  • Decision making
  • Individual work
  • Teamwork
  • Generation of new research ideas
  • Promotion of free, creative and inductive thinking
Course content

Review of probability. Bayesian rules of inference. Distribution functions.
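For orientation, the identity underlying the Bayesian inference material is Bayes' rule; in generic notation (not taken from the course notes), the posterior probability of class ω_i given an observation x is

    \[
    P(\omega_i \mid x) = \frac{p(x \mid \omega_i)\, P(\omega_i)}{\sum_j p(x \mid \omega_j)\, P(\omega_j)} .
    \]

A Bayes classifier assigns x to the class with the largest posterior; the minimum-cost variant covered below weights these posteriors by misclassification costs.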

  • Introduction to Machine Learning (ML): ML methods, types of learning, terminology, example of an ML process.
  • Statistical decision models for regression (function approximation) and classification problems. Bayes classifiers and minimum-cost Bayes classifiers. Distance functions and minimum-distance classifiers. Linear and non-linear decision functions.
  • Estimation of the probability density of input patterns: Parzen windows; the k-nearest neighbors (k-NN) classification algorithm (a short illustrative sketch follows this list).
  • Supervised classification methods and cost functions. Feature extraction, feature selection and dimensionality reduction: Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Independent Component Analysis (ICA).
  • Artificial Neural Networks (ANNs). Supervised ANNs: the perceptron, Multi-Layer Perceptrons (MLPs), gradient descent and the back-propagation algorithm for training MLPs, resilient back-propagation (Rprop), Extreme Learning Machines (ELM), Radial Basis Function (RBF) networks. Unsupervised ANNs: Self-Organizing Maps.
  • Advanced topics in ML: Restricted Boltzmann Machines, output activation and loss functions, hyperparameter tuning, improving generalization, bagging, boosting and ensemble classifiers, Belief Networks, Gaussian Mixture Models, the Expectation-Maximization (EM) algorithm, Markov Chains and Hidden Markov Models.
  • Applications in the classification of different types of data (visual, audio, spatio-temporal, etc.) and in various fields (e.g. computer vision, remote sensing, photogrammetry, energy, telecommunications, biomedicine).
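As a concrete illustration of one of the listed techniques, the sketch below implements a minimal k-NN classifier with plain NumPy and a made-up toy dataset; it is an illustrative sketch only, and the course does not prescribe this (or any) particular implementation.

    # Minimal k-nearest neighbors (k-NN) classifier: illustrative sketch only,
    # using plain NumPy, Euclidean distance and a made-up toy dataset.
    import numpy as np

    def knn_predict(X_train, y_train, X_test, k=3):
        """Label each row of X_test by majority vote among the k closest training points."""
        preds = []
        for x in X_test:
            dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to every training point
            nearest = np.argsort(dists)[:k]               # indices of the k nearest neighbors
            labels, counts = np.unique(y_train[nearest], return_counts=True)
            preds.append(labels[np.argmax(counts)])       # majority vote
        return np.array(preds)

    # Toy two-class data (made up for illustration)
    X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
    y_train = np.array([0, 0, 1, 1])
    X_test = np.array([[0.05, 0.1], [1.05, 0.9]])
    print(knn_predict(X_train, y_train, X_test, k=3))     # expected output: [0 1]

The minimum-distance classifiers listed above correspond to the special case of 1-NN applied to one prototype (e.g. the class mean) per class.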

Python tutorials for Machine Learning
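As a flavour of what such a tutorial session might look like, the sketch below chains two of the course topics, PCA for dimensionality reduction and an MLP classifier trained by back-propagation, on a standard handwritten-digits dataset. scikit-learn is assumed here purely for illustration; the syllabus itself only refers to specialized software and does not name a library.

    # Illustrative sketch (assumes scikit-learn): PCA for dimensionality reduction
    # followed by an MLP classifier, evaluated on a held-out test split.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)                   # 8x8 handwritten-digit images
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    model = make_pipeline(
        StandardScaler(),                                  # feature scaling
        PCA(n_components=30),                              # dimensionality reduction (PCA)
        MLPClassifier(hidden_layer_sizes=(64,),            # MLP trained by back-propagation
                      max_iter=500, random_state=0),
    )
    model.fit(X_tr, y_tr)
    print("test accuracy:", model.score(X_te, y_te))

Swapping the PCA step for LDA, or the MLP for an ensemble classifier, exercises other items from the course content with the same pipeline structure.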

Recommended bibliography
  1. Pattern Recognition and Machine Learning, C. M. Bishop, Springer, 2006.
  2. Machine Learning: An Algorithmic Perspective, S. Marsland, Chapman & Hall, 2nd ed., 2015.
  3. Pattern Recognition, S. Theodoridis and K. Koutroumbas, Academic Press, Elsevier, 4th ed., 2009.
  4. Pattern Classification, R. O. Duda, P. E. Hart and D. G. Stork, Wiley, 2nd ed., 2000.
  5. Deep Learning, I. Goodfellow, Y. Bengio and A. Courville, MIT Press, 2016, http://www.deeplearningbook.org.
  6. Introduction to Statistical Pattern Recognition, K. Fukunaga, Academic Press, 2nd ed., 1990.

Related scientific journals:

  1. IEEE Transactions on Pattern Analysis and Machine Intelligence
  2. Pattern Recognition, Elsevier
  3. Pattern Recognition Letters, Elsevier
  4. Journal of Machine Learning Research
  5. IEEE Transactions on Neural Networks and Learning Systems