Oskar Allerbo awarded the Faculty of Science's Doctoral Thesis Award

Oskar Allerbo's thesis on the statistical method of regression has contributed new insights into how complex neural networks learn by linking them to classical statistical methods. Oskar has now been awarded the Faculty of Science's Doctoral Thesis Award 2024.

How does it feel to receive this year's thesis award?
“It feels great and is a real honour, of course. As a doctoral student, you are often a bit alone in your work. That's why it's very encouraging to get this confirmation that there are people who appreciate my work.”

Oskar Allerbo

What is your thesis about? 
“I have studied different aspects of regression. Regression is a statistical method with the goal of finding the function that best fits the observed data. The simplest example is linear regression, where you can basically plot your data on a piece of paper and use a ruler to draw the line that fits best, but regression also includes much more complex functions, such as automatic text translation based on artificial neural networks. A limitation of complex regression models is that they tend to be expensive, in terms of time and computational power, to fit to the data (which is commonly referred to as training the model), and they are difficult to interpret. In my research, I have developed various methods to increase the interpretability and reduce the training cost of complex regression models.” 
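To make the “ruler on a piece of paper” picture concrete, here is a minimal, illustrative sketch in Python (not code from the thesis): it generates noisy data scattered around a straight line and fits the ordinary least squares regression line with NumPy.

    # Minimal illustration of linear regression (not code from the thesis):
    # find the straight line that minimises the squared distance to the data.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=50)            # observed inputs
    y = 2.0 * x + 1.0 + rng.normal(0, 1, 50)   # noisy linear response

    # Design matrix with an intercept column; lstsq returns the least
    # squares coefficients, i.e. the best-fitting line.
    X = np.column_stack([np.ones_like(x), x])
    (intercept, slope), *_ = np.linalg.lstsq(X, y, rcond=None)

    print(f"fitted line: y = {slope:.2f} * x + {intercept:.2f}")

More complex regression models, such as neural networks, replace the straight line with a far more flexible function, which is precisely what makes them both powerful and costly to train and interpret.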

How can your research benefit society? 
“Regression comes up whenever you want to construct functions from data, and the number of examples is almost endless. It can be about predicting the risk of a heart attack from patient data, about automatic image recognition or about predicting the weather based on different factors. In some cases, it is very important to combine good predictive capability with high interpretability; a very hypothetical example could be automated courts – in this case, you definitely do not want a model whose reasoning you do not understand. In other cases, if, for example, hardware or time is limited, it may be critical to reduce the computational cost of a model.

“Hopefully, my research can help more people to train advanced, but intuitive, models faster and cheaper.”

What are you doing now? 
“I am now working as a postdoctoral fellow at the Royal Institute of Technology, KTH. I am involved in a collaboration with Karolinska Institutet, where we are trying to predict blood sugar levels based on the electrical signals in the vagus nerve.” 

Award motivation 

Oskar Allerbo's thesis reflects a deep scientific curiosity about the connection between machine learning and AI methods on the one hand and statistics on the other. In his thesis, Oskar has developed methods for increasing the interpretability of neural networks. He has also proposed Elastic Gradient Descent, a methodological contribution centred on the link between gradient descent with early stopping and ridge regression, and, correspondingly, between coordinate descent and lasso regression. Oskar has further investigated kernel ridge regression and introduced kernel gradient flow, showing that, with a time-varying bandwidth, one observes the double descent behaviour frequently seen for neural networks. Taken together, Oskar's thesis has contributed new insights into how complex neural networks learn by connecting them to classical statistical methods. Oskar is passionate about his research but is also dedicated to the academic environment as a whole. He has represented the PhD students on the Faculty of Science's research education board and has engaged in first-cycle education as a lecturer and course examiner.
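For readers unfamiliar with the link mentioned above, the correspondence can be written down explicitly in the simplest linear case. The display below is a standard textbook-style illustration, not a result reproduced from the thesis, and it assumes $X^\top X$ is invertible. Gradient flow (the continuous-time limit of gradient descent) on the least squares loss, started at zero and stopped at time $t$, gives

\[
\hat\beta_{\mathrm{gf}}(t) = \bigl(I - e^{-t\,X^\top X}\bigr)\,(X^\top X)^{-1} X^\top y,
\qquad
\hat\beta_{\mathrm{ridge}}(\lambda) = \bigl(X^\top X + \lambda I\bigr)^{-1} X^\top y .
\]

Along each eigendirection of $X^\top X$ with eigenvalue $s$, early-stopped gradient flow shrinks the ordinary least squares solution by the factor $1 - e^{-ts}$, while ridge regression shrinks it by $s/(s+\lambda)$; the two shrinkage profiles nearly coincide when $\lambda \approx 1/t$, which is the sense in which stopping gradient descent early acts like ridge regularisation.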

About the Doctoral Thesis Award  

The award is given for successful and innovative research presented in a well-written doctoral thesis. The recipient receives a diploma and a prize. The award ceremony will be held on November 6.