COMP47650 Deep Learning

Academic Year 2021/2022

Recent advances in machine learning have been dominated by neural network approaches broadly described as deep learning. This module provides an overview of the most important deep learning techniques, covering both theoretical foundations and practical applications. The applications focus on problems in image understanding and language modelling and use appropriate state-of-the-art deep learning libraries and tools, which are introduced in the module.

Prerequisites: Machine Learning (or similar), strong programming ability (Python or similar languages), strong mathematical ability (especially linear algebra, differential calculus, and optimisation)


Curricular information is subject to change

Learning Outcomes:

On completion of this module students should be able to:
– understand what deep learning means and differentiate it from other approaches to machine learning
– understand loss functions, optimisation and the gradient descent algorithm
– understand the backpropagation of error algorithm in detail and how it is used to train deep neural networks
– distinguish between the most important neural network architectures (e.g. feed-forward networks, convolutional neural networks, recurrent neural networks)
– apply appropriate deep learning techniques (e.g. convolutional neural networks, generative adversarial networks) to image understanding problems (e.g. classification, segmentation, and generation) using open source deep learning frameworks (e.g. TensorFlow, PyTorch)
– apply appropriate deep learning techniques (e.g. dimensional embeddings, recurrent neural networks, long short-term memory networks, and gated recurrent units) to language modelling problems (e.g. classification, machine translation, and generation) using open source deep learning frameworks (e.g. TensorFlow, PyTorch)

Indicative Module Content:

Introduction to DL

Neural Network Fundamentals:
- Learning task, experience and performance. Types of task. Generalisation. Basic neurons. Perceptron.
- MLP and gradient descent. Backpropagation and Computational Graphs. Softmax and relative entropy.
- Softmax and ReLU. Overfitting and ways to deal with it. Speeding up gradient descent. Hyperparameter tuning. (A minimal training-step sketch follows this list.)
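The sketch below is not module-provided code; it is a minimal illustration, assuming PyTorch and synthetic data, of the ideas listed above in one training step on a small MLP: a softmax/cross-entropy loss, backpropagation through the computational graph, and a single gradient-descent update.

    # Minimal sketch (assumes PyTorch; data is synthetic and purely illustrative)
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    x = torch.randn(32, 20)            # 32 samples, 20 features
    y = torch.randint(0, 3, (32,))     # 3 classes

    model = nn.Sequential(
        nn.Linear(20, 64),
        nn.ReLU(),
        nn.Linear(64, 3),              # outputs logits; softmax is folded into the loss
    )
    loss_fn = nn.CrossEntropyLoss()    # log-softmax + negative log-likelihood
    optimiser = torch.optim.SGD(model.parameters(), lr=0.1)

    loss = loss_fn(model(x), y)        # forward pass builds the computational graph
    optimiser.zero_grad()
    loss.backward()                    # backpropagation of error
    optimiser.step()                   # one gradient-descent update
    print(loss.item())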

CNN and Case Studies:
- Introduction to Image Understanding, Classification, Convolution, Stride and Padding, Pooling
- CNN, Regularisation, Dropout, Batch Norm
- AlexNet, VGG, GoogLeNet (inception module), ResNet (residual module).
- Introduction to DL libraries
- 1D and 3D CNN generalisation, Transfer learning, Segmentation and Detection Applications (a small CNN sketch follows this list)
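As a concrete, hypothetical illustration of the building blocks above, the sketch below assembles a small PyTorch image classifier using convolution with stride and padding, max pooling, batch normalisation and dropout; the 32x32 RGB input size and 10 classes are assumptions, not module specifications.

    # Minimal sketch (assumes PyTorch; input size and class count are illustrative)
    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        def __init__(self, num_classes: int = 10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, stride=1, padding=1),   # 'same' padding: 32x32 -> 32x32
                nn.BatchNorm2d(16),
                nn.ReLU(),
                nn.MaxPool2d(2),                                        # pooling: 32x32 -> 16x16
                nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # strided convolution: 16x16 -> 8x8
                nn.BatchNorm2d(32),
                nn.ReLU(),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Dropout(p=0.5),                                      # dropout regularisation
                nn.Linear(32 * 8 * 8, num_classes),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x))

    model = SmallCNN()
    print(model(torch.randn(4, 3, 32, 32)).shape)                       # torch.Size([4, 10])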

RNN, Sequence Models and Case Studies:
- Introduction to Language Modelling, Dimensional Embeddings (word2vec)
- RNN, Backprop through Time, LSTM and GRU, Sequence-to-Sequence Models
- Applications to Machine Translation, Sentiment Analysis
- Utilising DL frameworks for Language Modelling, Attention models (a sequence-classifier sketch follows this list)
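The sketch below is an illustrative (not module-provided) PyTorch sequence classifier of the kind used for sentiment analysis: learned word embeddings feed an LSTM whose final hidden state is mapped to class logits. The vocabulary size, dimensions and toy batch are assumptions.

    # Minimal sketch (assumes PyTorch; vocabulary size, dimensions and batch are illustrative)
    import torch
    import torch.nn as nn

    class LSTMClassifier(nn.Module):
        def __init__(self, vocab_size: int = 5000, embed_dim: int = 100,
                     hidden_dim: int = 128, num_classes: int = 2):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)   # learned word embeddings
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.fc = nn.Linear(hidden_dim, num_classes)

        def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
            embedded = self.embedding(token_ids)                   # (batch, seq_len, embed_dim)
            _, (h_n, _) = self.lstm(embedded)                      # h_n: final hidden state
            return self.fc(h_n[-1])                                # class logits

    model = LSTMClassifier()
    batch = torch.randint(0, 5000, (8, 25))                        # 8 sequences of 25 token ids
    print(model(batch).shape)                                      # torch.Size([8, 2])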


Reinforcement Learning:
- SARSA & Q-Learning
- Deep neural network RL (a tabular Q-learning sketch follows this list)
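The sketch below illustrates the tabular Q-learning update on a hypothetical toy environment (a short corridor with a reward at the far end); the environment, hyperparameters and episode count are assumptions used only to make the update rule concrete. SARSA differs only in bootstrapping from the action actually taken in the next state, and deep RL replaces the table with a neural network.

    # Minimal sketch (toy, hypothetical environment; illustrates the Q-learning update rule)
    import numpy as np

    n_states, n_actions = 5, 2          # corridor of 5 states; actions: 0 = stay, 1 = move right
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, epsilon = 0.1, 0.9, 0.2
    rng = np.random.default_rng(0)

    def step(state, action):
        """Move right on action 1; reward 1.0 on reaching the final state."""
        next_state = min(state + action, n_states - 1)
        reward = 1.0 if next_state == n_states - 1 else 0.0
        return next_state, reward

    for _ in range(500):                # episodes
        s = 0
        while s != n_states - 1:
            # epsilon-greedy action selection
            a = int(rng.integers(n_actions)) if rng.random() < epsilon else int(Q[s].argmax())
            s_next, r = step(s, a)
            # Q-learning bootstraps from the greedy value of the next state
            # (SARSA would instead use the action actually taken in s_next)
            Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
            s = s_next

    print(np.round(Q, 2))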


Tutorials:
- DL demo
- Logistic Regression, Jupyter Notebook
- Single Layer Feedforward Network, Jupyter Notebook
- Deep Feedforward Network, Jupyter Notebook
- Gradient Checking, Parameter Initialisation, Jupyter Notebook
- L2 Regularisation, Dropout, Jupyter Notebook
- Optimisers (mini-batch gradient descent, momentum, Adam), Jupyter Notebook (an optimiser set-up sketch follows this list)
- CNN, Jupyter Notebook

- Reinforcement learning demos
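As a rough companion to the optimiser tutorial, the sketch below shows how the optimisers named above (mini-batch SGD, SGD with momentum, Adam) are configured in PyTorch; the linear model, random data and learning rates are placeholders, not the tutorial notebooks.

    # Minimal sketch (assumes PyTorch; model, data and learning rates are illustrative)
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    model = nn.Linear(10, 1)
    data = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))
    loader = DataLoader(data, batch_size=32, shuffle=True)          # mini-batches

    optimisers = {
        "sgd": torch.optim.SGD(model.parameters(), lr=0.01),
        "momentum": torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9),
        "adam": torch.optim.Adam(model.parameters(), lr=0.001),
    }

    loss_fn = nn.MSELoss()
    optimiser = optimisers["adam"]                                  # pick one variant per run
    for x, y in loader:                                             # one epoch of mini-batch updates
        optimiser.zero_grad()
        loss_fn(model(x), y).backward()
        optimiser.step()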

Student Effort Hours:
Student Effort Type            Hours
Lectures                       24
Autonomous Student Learning    80
Total                          104

Approaches to Teaching and Learning:
A set of lectures discussing recent developments in deep learning, combined with a number of interactive tutorials that emphasise practical implementation.
Requirements, Exclusions and Recommendations

Not applicable to this module.


Module Requisites and Incompatibles
Not applicable to this module.
 
Assessment Strategy

Project: Deep learning project: dataset pre-processing and setup, modelling using DL frameworks (e.g. PyTorch, TensorFlow), training and evaluation of the model, short report.
Timing: Week 12 | Open Book Exam: n/a | Component Scale: Graded | Must Pass Component: No | % of Final Grade: 80

Attendance: Attendance and engagement in classes and weekly tutorials.
Timing: Throughout the Trimester | Open Book Exam: n/a | Component Scale: Graded | Must Pass Component: No | % of Final Grade: 20


Carry forward of passed components: No

Resit In Terminal Exam (Autumn): No

Please see the Student Jargon Buster for more information about remediation types and timing.
Feedback Strategy/Strategies

• Group/class feedback, post-assessment

How will my Feedback be Delivered?

Solutions to assignments will be derived as a collaborative exercise at the weekly tutorials.