Seminar Computational Intelligence B (708.112)

SS 2019

Institut für Grundlagen der Informationsverarbeitung (708)

Lecturers:

Assoc. Prof. Dr. Robert Legenstein

Office hours: by appointment (via e-mail)

E-mail: robert.legenstein@igi.tugraz.at
Homepage: https://www.tugraz.at/institute/igi/team/prof-legenstein/


DI Michael Müller

Office hours: by appointment (via e-mail)

E-mail: mueller@igi.tugraz.at
Homepage: https://www.tugraz.at/institute/igi/people/mueller/




Location: IGI-seminar room, Inffeldgasse 16b/I, 8010 Graz
Date: starting on Tuesday, March 12, 2019, 13:15 - 15:00

Content of the seminar: Neural Architecture Search

How should the architecture and hyperparameters of a deep neural network be chosen? There is no clear answer to this question, and often many architectures have to be cross-validated to find a good solution to the problem at hand. With the recent availability of large-scale computing power, researchers have proposed algorithms that search for good architectures automatically. In this seminar, we will discuss recent work in this direction.
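To make the cross-validation view of architecture search concrete, the following is a minimal sketch of random hyperparameter search (the approach of the first seminar paper, Bergstra et al., 2012). The objective function here is a toy stand-in, not a real training run; the hyperparameter names and ranges are illustrative assumptions.

```python
import random

# Hypothetical toy objective: pretend "validation error" of a network as a
# function of two hyperparameters. In practice each call would be a full
# training + cross-validation run, which is why search is expensive.
def validation_error(lr, width):
    return (lr - 0.01) ** 2 + (width - 64) ** 2 / 1e4

def random_search(n_trials, seed=0):
    rng = random.Random(seed)
    best_cfg, best_err = None, float("inf")
    for _ in range(n_trials):
        cfg = {
            "lr": 10 ** rng.uniform(-4, -1),        # log-uniform learning rate
            "width": rng.choice([16, 32, 64, 128]),  # layer width
        }
        err = validation_error(cfg["lr"], cfg["width"])
        if err < best_err:
            best_cfg, best_err = cfg, err
    return best_cfg, best_err

best_cfg, best_err = random_search(50)
print(best_cfg, best_err)
```

Each trial is independent, so the loop parallelizes trivially; the NAS methods covered in the seminar try to be smarter than this baseline by reusing information across trials.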

Prior knowledge of machine learning, and of neural networks in particular, is expected.


Topics:

Talks will be assigned at the first seminar meeting.

Talks should be no longer than 35 minutes, and they should be clear, interesting, and informative rather than a reprint of the material. Select which parts of the material you want to present and which to leave out, then present the selected material well (including definitions not given in the material: look them up on the web, or, if that is not successful, ask the seminar organizers). Diagrams and figures are often useful in a talk. On the other hand, referring in the talk to numbered references listed at the end does not work (a talk is an online process, not meant to be read). For the same reason, you may briefly repeat earlier definitions if you suspect that the audience does not remember them.



Papers/Literature:

Motivation: why does NAS matter?

Reinforcement learning

Evolutionary methods

Gradient-based methods

Surrogate model methods

Hypernetworks
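As a taste of the "Evolutionary methods" topic (in the spirit of Real et al., 2017), the following is a hedged sketch of tournament-based evolution. Real architectures are replaced here by bit strings encoding which layers are active, and the fitness function is an assumed toy surrogate for trained-network validation accuracy.

```python
import random

GENOME_LEN = 8  # number of on/off "layer" genes in a toy architecture

def fitness(genome):
    # Toy surrogate for validation accuracy: in the papers this would
    # require training the encoded network from scratch.
    return sum(genome) / GENOME_LEN

def mutate(genome, rng):
    g = list(genome)
    g[rng.randrange(len(g))] ^= 1  # flip one layer on/off bit
    return g

def evolve(pop_size=10, steps=200, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(pop_size)]
    for _ in range(steps):
        a, b = rng.sample(range(pop_size), 2)  # tournament of two
        winner = a if fitness(pop[a]) >= fitness(pop[b]) else b
        loser = b if winner == a else a
        pop[loser] = mutate(pop[winner], rng)  # mutated child replaces loser
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Since the tournament winner stays in the population, the best fitness never decreases; the expensive part in real NAS is that every fitness evaluation trains a network.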



General rules:

Participation in the seminar meetings is obligatory. We also request your courtesy and attention for the seminar speaker: no smartphones, laptops, etc. during a talk. Furthermore, your active attention, questions, and discussion contributions are expected.

After your talk (and after making any corrections), send a PDF of your slides to Charlotte Rumpf (charlotte.rumpf@tugraz.at), who will post it on the seminar webpage.




TALKS:

Date       #  Topic / paper title                                                                  Presenter    Presentation
7.5.2019   1  Bergstra et al., 2012, Random search for hyper-parameter optimization                Hadrovic     PDF
14.5.2019  2  Pham et al., 2018, Efficient neural architecture search via parameter sharing        Simon        PDF
           3  Real et al., 2017, Large-scale evolution of image classifiers                        Simic        PDF
21.5.2019  4  Zoph et al., 2017, Neural architecture search with reinforcement learning            Khodachenko  PDF
           5  Chen et al., 2018, Reinforced evolutionary neural architecture search                Mittendrein  PDF
28.5.2019  6  Liu et al., 2018, DARTS: differentiable architecture search                         Peter        PDF
           7  Luo et al., 2018, Neural architecture optimization                                   Lackner      PDF
4.6.2019   8  Liu et al., 2018, Progressive neural architecture search                             Weinrauch    PDF
           9  Brock et al., 2017, SMASH: one-shot model architecture search through hypernetworks  Martinelli   PDF