Semester: 5
General Foundation
ECTS: 6
Hours per week: 3
Professor: T.B.D.
Teaching style: Face-to-face, with use of specialized software
Grading: Homework / Projects (60%), Final written exam (40%)
Activity | Workload (hours) |
---|---|
Lectures | 36 |
Class assignments / projects | 42 |
Independent study | 72 |
Course total | 150 |
Upon successful completion of the course, students will have acquired knowledge of:
• Review of Neural Networks, Multi-Layer Perceptrons, Backpropagation
• Loss functions, hyperparameter tuning, regularization, model selection, weight decay, dropout; optimization methods (SGD, Rprop, Adam, RMSProp)
• Deep Neural Networks
• Convolutional Neural Networks (CNNs): LeNet/AlexNet, Deep Residual Networks (ResNet). Applications of CNNs (single-image super-resolution, object detection)
• CNN variations and other solutions for object detection: R-CNN, Fast R-CNN, Faster R-CNN, Mask R-CNN, SSD, YOLO
• Recurrent Neural Networks, Long Short-Term Memory Networks, Gated Recurrent Units, Bidirectional LSTM
• Transformers, sequence-to-sequence (seq2seq) learning, attention
• Generative Models: Restricted Boltzmann Machines, Deep Boltzmann Machines, Deep Belief Networks; Autoencoders, Stacked Denoising Autoencoders, Variational Autoencoders; Generative Adversarial Networks
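The opening topics of the list above (neural networks, multi-layer perceptrons, backpropagation, SGD) can be illustrated with a minimal sketch: a tiny 2-4-1 MLP trained by hand-coded backpropagation and plain SGD on the XOR problem. This is not course material, just an illustrative pure-Python example; the hidden-layer size, learning rate, and epoch count are arbitrary choices.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# XOR: the classic task a single-layer perceptron cannot solve
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

H = 4                                   # hidden units (illustrative choice)
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0
lr = 0.5                                # SGD learning rate

def forward(x):
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(H)]
    y = sigmoid(sum(W2[j] * h[j] for j in range(H)) + b2)
    return h, y

def total_loss():
    # Sum of squared errors over the four XOR patterns
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

loss_before = total_loss()

for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Backpropagation: chain rule through the sigmoid output and hidden layer
        dy = (y - t) * y * (1 - y)                       # error at output pre-activation
        dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(H)]
        # SGD update: step each weight against its gradient
        for j in range(H):
            W2[j] -= lr * dy * h[j]
            W1[j][0] -= lr * dh[j] * x[0]
            W1[j][1] -= lr * dh[j] * x[1]
            b1[j] -= lr * dh[j]
        b2 -= lr * dy

loss_after = total_loss()
```

After training, `loss_after` is markedly smaller than `loss_before`; swapping SGD for Adam or RMSProp, or adding weight decay, gives hands-on versions of the other topics above.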
Related scientific journals: