Contact person
Olof Mogren
Senior Researcher
At the RISE Learning Machines Seminar on August 31, 2023, we had the pleasure of listening to Zahra Taghiyarrenani, Halmstad University, give her talk: From domain adaptation to federated learning.
In recent years, data-driven methods have received increasing attention across many fields. These methods are generally based on machine learning: they learn predictive models from training samples and use them to predict the labels of previously unseen samples, or the value of a dependent variable. However, the ability of such models to generalize to test samples rests on the assumption that training and test samples are generated by independent and identically distributed random variables, the IID assumption.
The dynamic nature of the real world may violate this assumption, resulting in a mismatch between the samples available for training and the future samples the models are applied to. Statistically, training and test samples may originate from different distributions; we call each such dataset, together with its underlying distribution, a “domain”. Domain Adaptation (DA) alleviates these differences by reducing the distance between the distributions. In the scenario most often addressed in the literature, DA deals with two domains: source and target. The source dataset is fully labeled, and the goal is to adapt the two domains so that a model that generalizes to the target domain can be constructed from the labeled source samples. However, domain adaptation typically requires centralizing samples from all domains to facilitate the adaptation process.
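As an illustration of what “reducing the distance between distributions” can mean in practice, the sketch below aligns the second-order statistics of a source domain to those of a target domain (CORAL-style correlation alignment on synthetic data). This is one of many DA techniques and is chosen here only for its simplicity; it is not necessarily the method discussed in the talk.

```python
import numpy as np

def coral(source, target, eps=1e-5):
    """Align source features to the target distribution (CORAL-style).

    Whitens the source features using the source covariance, then
    re-colors them with the target covariance, so the aligned source
    matches the target's second-order statistics.
    """
    cs = np.cov(source, rowvar=False) + eps * np.eye(source.shape[1])
    ct = np.cov(target, rowvar=False) + eps * np.eye(target.shape[1])

    def sqrtm(m, inverse=False):
        # Matrix (inverse) square root via eigendecomposition;
        # covariances are symmetric positive semi-definite.
        vals, vecs = np.linalg.eigh(m)
        vals = np.clip(vals, eps, None)
        power = -0.5 if inverse else 0.5
        return (vecs * vals**power) @ vecs.T

    # Whiten with source statistics, re-color with target statistics.
    return source @ sqrtm(cs, inverse=True) @ sqrtm(ct)

# Synthetic domains with deliberately different feature scales.
rng = np.random.default_rng(0)
src = rng.normal(size=(500, 3)) @ np.diag([1.0, 2.0, 0.5])
tgt = rng.normal(size=(500, 3)) @ np.diag([3.0, 1.0, 1.5])
aligned = coral(src, tgt)
```

After alignment, the covariance of `aligned` matches that of `tgt`, so a classifier trained on the aligned labeled source data is a better fit for target samples than one trained on the raw source features.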
Privacy considerations often prevent data owners (clients) from sharing their data. Consequently, decentralized learning methods such as federated learning have gained attention as alternatives to centralized approaches: they enable collaborative learning of a shared global model without exchanging raw data. Since distributed domains often exhibit statistical heterogeneity, the global model’s effectiveness can suffer. Theoretically, we illustrate how heterogeneity among clients influences the global model’s performance on each individual client. This highlights the necessity for Decentralized Domain Adaptation approaches.
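To see why client heterogeneity can hurt an averaged global model, consider a hypothetical toy experiment: three clients hold regression data with different true slopes, each fits a local model, and a FedAvg-style server averages the models. This synthetic sketch is only an illustration of the general phenomenon, not the theoretical analysis presented in the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical heterogeneous clients: each has a different true slope,
# so no single global model can fit all of them well.
true_slopes = [1.0, 2.0, 5.0]
clients = []
for w in true_slopes:
    x = rng.uniform(-1, 1, size=200)
    y = w * x + rng.normal(scale=0.05, size=200)
    clients.append((x, y))

# Each client fits a local 1-D least-squares model: w = <x, y> / <x, x>.
local_models = [float(np.dot(x, y) / np.dot(x, x)) for x, y in clients]

# FedAvg-style aggregation: the server averages the client models.
global_model = float(np.mean(local_models))

def mse(w, x, y):
    return float(np.mean((w * x - y) ** 2))

local_errors = [mse(w, x, y) for w, (x, y) in zip(local_models, clients)]
global_errors = [mse(global_model, x, y) for x, y in clients]
```

Each client's local model fits its own data almost perfectly, while the averaged global model performs poorly on the clients whose slope is far from the average. This is exactly the kind of per-client degradation under heterogeneity that motivates decentralized domain adaptation.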
Zahra (Nasrin) Taghiyarrenani is in the final year of her Ph.D. at Halmstad University, where she specializes in Domain Adaptation and Federated Learning. In her research, she has proposed new Domain Adaptation and Federated Learning methods to address problems in computer network security and Predictive Maintenance.