Semester: 4
General Foundation
ECTS: 6
Hours per week: 3
Professor: T.B.D.
Teaching style: Face-to-face, use of specialized software
Grading: Homework / Projects (60%), Final written exam (40%)
| Activity | Workload (hours) |
|---|---|
| Lectures | 36 |
| Class assignments / projects | 42 |
| Independent study | 72 |
| **Course total** | **150** |
Upon successful completion of the course, students will be able to:
Probabilities review. Bayesian rules of inference. Distribution functions.
Introduction to Machine Learning (ML): ML methods, types of learning, terminology, an example ML process.
Statistical decision models for regression (function approximation) and classification problems. Bayes classifiers and minimum-cost Bayes classifiers. Distance functions and minimum-distance classifiers. Linear and non-linear decision functions.
Estimation of the probability density of input patterns: Parzen windows; the k-nearest neighbors (k-NN) classification algorithm. Supervised classification methods. Cost functions.
Feature extraction, feature selection, and dimensionality reduction: Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Independent Component Analysis (ICA).
Artificial Neural Networks (ANNs). Supervised ANNs: the perceptron; Multi-Layer Perceptrons (MLPs); gradient descent and the back-propagation algorithm for training MLPs; resilient back-propagation (Rprop); Extreme Learning Machines (ELMs); Radial Basis Function (RBF) networks. Unsupervised ANNs: Self-Organizing Maps.
Advanced topics in ML: Restricted Boltzmann Machines; output activation and loss functions; hyperparameter tuning; improving generalization; bagging, boosting, and ensemble classifiers; Belief Networks; Gaussian Mixture Models; the Expectation-Maximization (EM) algorithm; Markov Chains and Hidden Markov Models.
Applications in the classification of different types of data (visual, audio, spatio-temporal, etc.) and in various fields (e.g. computer vision, remote sensing, photogrammetry, energy, telecommunications, biomedicine).
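As an illustrative sketch of one of the topics above, the k-nearest neighbors (k-NN) classifier can be written in a few lines of plain NumPy (this is a minimal teaching example, not course material; function and variable names are our own):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every training sample
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest samples
    nearest = np.argsort(dists)[:k]
    # Majority vote over their labels
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Two well-separated 2-D clusters as toy training data
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
y = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X, y, np.array([0.15, 0.15])))  # near cluster 0 -> predicts 0
```

A query point near the first cluster is assigned label 0 by majority vote; in practice, libraries such as scikit-learn provide optimized implementations of the same idea.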
Python tutorials for Machine Learning
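In the same tutorial spirit, PCA (one of the dimensionality-reduction methods in the course content) can be sketched via eigendecomposition of the sample covariance matrix (a minimal NumPy sketch; names are our own, not from the course):

```python
import numpy as np

def pca(X, n_components):
    """Project X onto its top principal components (eigenvectors of the covariance)."""
    Xc = X - X.mean(axis=0)                 # center the data
    cov = np.cov(Xc, rowvar=False)          # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]       # sort components by descending variance
    W = eigvecs[:, order[:n_components]]    # keep the top n_components directions
    return Xc @ W

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))  # toy data: 100 samples, 3 features
Z = pca(X, 2)
print(Z.shape)  # (100, 2)
```

By construction, the first projected coordinate carries at least as much variance as the second.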
Related scientific journals: