Course info

In recent years we have witnessed enormous progress in machine learning, especially in deep learning, which is driving a new technological revolution. These models improve search, apps and social media, and open new doors in medicine, automation, self-driving cars, drones and almost all fields of science. In this introductory deep learning class, students will learn about neural networks, objectives, optimization algorithms and different architectures. During the semester students will work on multiple projects: 2-3 mini-projects, in which they try out different algorithms and architectures, and a more complex final project. The projects will focus on scientific applications of deep learning, for example in drug discovery, weather prediction and astronomy. To successfully complete the class, prior knowledge of Python and numpy (or the willingness to learn them quickly from the handed-out materials) is required. Over the course of the class, students will learn about and get comfortable with popular deep learning frameworks such as TensorFlow and Keras.
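To give a sense of the numpy fluency the class expects, here is a minimal illustrative sketch (not course material; all names are made up) of a logistic-regression forward pass, one of the first models covered:

```python
import numpy as np

def sigmoid(z):
    # logistic activation: squashes any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def predict(X, w, b):
    # linear combination of the inputs, followed by the sigmoid
    return sigmoid(X @ w + b)

# two toy samples with three features each
X = np.array([[0.5, -1.2,  3.0],
              [1.0,  0.4, -0.5]])
w = np.array([0.1, -0.2, 0.3])  # one weight per feature
b = 0.0

probs = predict(X, w, b)  # class-1 probability for each sample
```

If code at this level looks unfamiliar, plan to work through the handed-out Python/numpy materials in the first weeks.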

Technical info

Questions, problems

  • Open issue on GitHub
  • If you really don't want to do that, you can contact us via email:
    • István Csabai: csabai at complex elte hu
    • Attila Bagoly: battila93 at gmail com
    • Dezső Ribli: dkribli at gmail com
    • Bálint Ármin Pataki: patbaa at gmail com

Materials (@GitHub)

Course staff


Prerequisites

  • Basic linear algebra
  • Basic probability and statistics
  • Python (or the motivation to learn it in the first two weeks from the handed-out materials)


Grading

There will be 2-4 mini-projects, some homework and a final project in this course. The evaluation will be based on the projects and the homework. (Details will be decided and made available later.)


There is no fixed syllabus yet, only the working draft below.

Planned schedule

Week   Topics                                                                 Materials        Homework
1      Course introduction                                                    slide1, slide2   01
3      Regression. Classification. Logistic regression. Clustering. Decision trees, random forests. SVM. Decision boundaries.
4      Shallow neural networks. Activation functions. Loss functions. Backpropagation. Optimization algorithms: batch gradient descent, momentum, RMSProp, Adam.
5      Mini-project 1: linear models, simple networks, backpropagation
6      Deep neural networks, convolutional neural networks, autoencoders. Variational autoencoders (VAE), generative adversarial networks (GAN). Keras tutorial.
7      Mini-project 2: DNN, CNN, VAE, GAN
8      Object detection, neural style transfer
9      Mini-project 3: object detection, neural style transfer
10-13  Recurrent neural networks, reinforcement learning. Other topics. Final project
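As a small taste of the Week 4 optimization topics, the toy sketch below (illustrative only, not course material) minimizes a one-dimensional quadratic with plain gradient descent and with momentum; both walk downhill along the negative gradient, but momentum accumulates a velocity across steps:

```python
def grad(x):
    # gradient of f(x) = (x - 3)^2, which is minimized at x = 3
    return 2.0 * (x - 3.0)

# plain (batch) gradient descent: step against the gradient
x = 0.0
for _ in range(200):
    x -= 0.1 * grad(x)

# gradient descent with momentum: keep a decaying running velocity
y, v = 0.0, 0.0
for _ in range(200):
    v = 0.9 * v - 0.1 * grad(y)
    y += v

# both x and y end up close to the minimum at 3
```

RMSProp and Adam refine this idea by additionally rescaling each step with a running estimate of the squared gradients; the course covers them in detail.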


One of the projects has already been launched; you may start getting familiar with it: link