Course info

In recent years we have witnessed huge progress in machine learning, especially in deep learning, which is driving a new technological revolution. These models improve search, apps, and social media, and open new doors in medicine, automation, self-driving cars, drones, and almost all fields of science. In this introductory deep learning class students will learn about neural networks, objectives, optimization algorithms, and different architectures. During the semester students will work on multiple projects: 2-3 mini projects, in which they try out different algorithms and architectures, and a more complex final project. The projects will focus on scientific applications of deep learning, for example in drug discovery, weather prediction, and astronomy. To successfully complete the class, prior knowledge of Python and NumPy (or the willingness to learn them quickly from the handed-out materials) is required. Over the course of the class students will learn about and get comfortable with popular deep learning frameworks such as TensorFlow and Keras.
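As a taste of the level expected, here is a minimal NumPy sketch of the kind of method covered early in the course (gradient descent for linear regression). This is illustrative only, not part of any assignment:

```python
import numpy as np

# Toy data: y = 3x + 2 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 2.0 + rng.normal(scale=0.1, size=100)

# Fit y = w*x + b by minimizing the mean squared error with gradient descent
w, b = 0.0, 0.0
lr = 0.1  # learning rate
for _ in range(500):
    y_pred = w * x + b
    grad_w = 2.0 * np.mean((y_pred - y) * x)  # d(MSE)/dw
    grad_b = 2.0 * np.mean(y_pred - y)        # d(MSE)/db
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should end up close to 3.0 and 2.0
```

Students comfortable reading a snippet like this are well prepared; the handed-out materials cover the rest.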

Technical info

Questions, problems

  • Open issue on GitHub
  • If you really don't want to do that, you can contact us via email: deeplearninginsciences at gmail dot com

Materials (@GitHub)

Course staff


Prerequisites

  • Basic linear algebra
  • Basic probability and statistics
  • Python (or motivation to learn it in the first two weeks from the handed-out materials)


Grading

There will be 2-4 mini-projects, a few assignments, and a final project in this course. The evaluation will be based on the projects and assignments. (Details will be decided and made available later.)


There is no fixed syllabus at this time, just a working draft below.

Planned schedule

| Week | Topics | Instructor | Materials | Assignment |
|------|--------|------------|-----------|------------|
| 1 | Course introduction | István Csabai, Bálint Pataki | slides1, slides2 | 01 |
| 2 | Maximum likelihood, linear regression, logistic regression, neural networks, gradient descent | Attila Bagoly | slides | - |
| 3 | Neural networks, backpropagation, optimizers | Attila Bagoly | slides | HW02 |
| 4 | KNN, decision trees (random forests), SVM; regularization; ensemble learning | Dezső Ribli | n1, n2, n3 | photoz |
| 5 | Convolution, convolutional neural networks (CNN) | Bálint Pataki | slides, n1, n2 | - |
| 6 | Google Cloud; practical CNN guides | Attila Bagoly, Dezső Ribli | slides | - |
| 7 | CNN architectures, deep learning tricks | Dezső Ribli | n1 | - |
| 8 | Reminders + project ideas + word embedding | Bálint Pataki | slides, n1, n2 | - |
| 9 | Sequence models (RNN, GRU, LSTM) | Attila Bagoly | slides | hw09 |
| 10 | Object detection | Dezső Ribli | - | - |
| 11 | Adversarial examples, face recognition | Bálint Pataki | slides, notebook | - |
| 12 | Autoencoders, variational autoencoder, generative adversarial networks | Attila Bagoly | slides, notebooks | HW+ |
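To give a flavor of the week 5 material, the convolution operation behind CNNs can be sketched in plain NumPy (an illustrative toy, not course code; real networks use framework implementations):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation (what deep learning calls 'convolution')."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Dot product of the kernel with the image patch under it
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A simple image: left half dark (0), right half bright (1)
img = np.zeros((5, 6))
img[:, 3:] = 1.0

# A tiny vertical-edge detector: responds where brightness jumps left-to-right
edge_kernel = np.array([[-1.0, 1.0]])
response = conv2d(img, edge_kernel)  # nonzero only along the dark/bright boundary
```

A CNN learns the kernel values by gradient descent instead of hand-crafting them, and stacks many such filters per layer.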


Kaggle competitions

Galaxy photoZ: photoz-kaggle

Happiness detection: happines-kaggle

Background materials and references