UR-379 Combatting Data Heterogeneity in Federated Learning

Document Type

Event

Start Date

23-4-2023 5:00 PM

Description

Growing concern over data privacy has led to new paradigms in Machine Learning focused primarily on keeping data safe and secure. In our research project, we studied Federated Learning, specifically utilizing knowledge distillation and an autoencoder in an attempt to create a sustainable model that could be used in a field such as healthcare. We propose a Federated Model built with the Flower framework, trained on the MedMNIST2D dataset (Organ(A/C/S)MNIST), using Knowledge Distillation as the method of sharing the global model and a Variational Autoencoder to address the data heterogeneity that can arise on a distributed network. Our results on a cumulative model are tentative, but we hope to show that the approach can be applied to networks whose edge devices vary in size, usage, and type.
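
As an illustrative sketch only (the names TinyNet and KDClient, the model architecture, and all hyperparameters below are assumptions for illustration, not the project's code), the following shows how a Flower NumPyClient might wrap local training on an edge device, with comments marking where a knowledge-distillation term and VAE-based handling of heterogeneous client data could attach:

```python
# Minimal sketch: a Flower NumPyClient around a small PyTorch classifier.
# TinyNet, KDClient, and the training loop are illustrative placeholders.
from collections import OrderedDict

import flwr as fl
import torch
import torch.nn as nn


class TinyNet(nn.Module):
    """Stand-in classifier; Organ(A/C/S)MNIST images are 28x28 with 11 classes."""
    def __init__(self, num_classes: int = 11):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64),
                                 nn.ReLU(), nn.Linear(64, num_classes))

    def forward(self, x):
        return self.net(x)


class KDClient(fl.client.NumPyClient):
    def __init__(self, model, trainloader):
        self.model = model
        self.trainloader = trainloader

    def get_parameters(self, config):
        # Send local weights to the server as NumPy arrays.
        return [v.cpu().numpy() for v in self.model.state_dict().values()]

    def set_parameters(self, parameters):
        # Load the global (teacher) weights received from the server.
        keys = self.model.state_dict().keys()
        state = OrderedDict({k: torch.tensor(v) for k, v in zip(keys, parameters)})
        self.model.load_state_dict(state, strict=True)

    def fit(self, parameters, config):
        self.set_parameters(parameters)
        # Local training loop; a distillation term against the global model's
        # predictions, and any VAE-based treatment of non-IID local data,
        # would be added to this loss.
        criterion = nn.CrossEntropyLoss()
        optimizer = torch.optim.SGD(self.model.parameters(), lr=0.01)
        for images, labels in self.trainloader:
            optimizer.zero_grad()
            loss = criterion(self.model(images), labels.view(-1).long())
            loss.backward()
            optimizer.step()
        return self.get_parameters(config), len(self.trainloader.dataset), {}

    def evaluate(self, parameters, config):
        # Placeholder evaluation stub; real metrics would be computed here.
        self.set_parameters(parameters)
        return 0.0, len(self.trainloader.dataset), {}
```

In this kind of setup, a Flower server started with fl.server.start_server and a FedAvg-style strategy coordinates the training rounds, and each edge device connects with fl.client.start_numpy_client(server_address=..., client=KDClient(...)).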
