Master thesis - A Reward-based Algorithm for Hyperparameter Optimisation of Neural Networks

Vacancies 25 September 2019

Ann-Marie Rehnström +46 (13) 180365
Mattias Helsing +46 (586) 81989
Apply for job!

Your role

Machine learning offers techniques with immense modelling power. There is, however, no general scheme for optimising a machine learning algorithm: the optimal hyperparameter settings depend on the dataset on which the models are to be trained. A relatively recent topic called meta machine learning attempts to optimise certain machine learning algorithms using, e.g., reinforcement learning. Drawing inspiration from this topic, this thesis aims to investigate the possibility of implementing a reward-based algorithm for optimising a neural network's hyperparameter settings.
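As an illustration of the reward-based idea, one possible framing (an assumption, not something the posting specifies) treats each candidate hyperparameter setting as an arm of a multi-armed bandit, with the validation accuracy obtained after a short training run as the reward. A minimal epsilon-greedy sketch, with a synthetic stand-in for the reward:

```python
import math
import random

def epsilon_greedy(candidates, reward_fn, rounds=50, epsilon=0.1, seed=0):
    """Treat each candidate setting as a bandit arm; reward_fn plays the
    role of 'train briefly, then measure validation accuracy'."""
    rng = random.Random(seed)
    counts = {c: 1 for c in candidates}
    values = {c: reward_fn(c) for c in candidates}  # try every arm once
    for _ in range(rounds):
        if rng.random() < epsilon:
            arm = rng.choice(candidates)                    # explore
        else:
            arm = max(candidates, key=lambda c: values[c])  # exploit
        reward = reward_fn(arm)
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # running mean
    return max(candidates, key=lambda c: values[c])

def fake_validation_accuracy(lr):
    # Synthetic, illustrative reward that peaks at a learning rate of 1e-3;
    # a real implementation would train the network and evaluate it instead.
    return 1.0 - abs(math.log10(lr) + 3) / 10

best_lr = epsilon_greedy([1e-1, 1e-2, 1e-3, 1e-4], fake_validation_accuracy)
```

Here a single hyperparameter (the learning rate) is optimised; the thesis would extend this to several hyperparameters at once and replace the synthetic reward with real training runs.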

Suggested Work-plan:

  • A literature study
  • Familiarise yourself with the Keras API, and develop a dense network
  • Construct a reward-based algorithm that incorporates the dense network
  • Study whether the implementation can be made to optimise a few (e.g., three) of the dense network's hyperparameters
  • Devise a way of quantifying the reward-based algorithm's performance, and study how the algorithm scales to simultaneous optimisation of additional hyperparameters. Compare the implemented algorithm with other typical methods for hyperparameter optimisation.
  • Write master thesis report

Your profile

Prerequisites: programming, linear algebra, and familiarity with reading scientific articles. Prior knowledge of machine learning is preferable but not required.

This work can be done by one or two students. Possible variations of the thesis include the optimisation of dense networks or of convolutional networks.

At Saab, we constantly look ahead and push boundaries for what is considered technically possible. We collaborate with colleagues around the world who all share our challenge – to make the world a safer place.
