Executive Summary

This study explores the use of Federated Learning (FL) in IoT networks, where devices generate data locally. Rather than transferring raw data from each device to a central location, FL trains machine learning models on the devices themselves and aggregates only the local model updates at edge nodes, which alleviates privacy concerns and reduces communication costs. In multi-access edge computing networks, intermediate nodes can further assist with computation and communication. The project aims to characterize these communication loads and to design new coding schemes that reduce them.

One idea is to replace scalar codes with vector codes, which split the transmissions made by edge nodes into smaller messages and could reduce the loads further. Random coding, a technique used successfully in network coding, could also be explored in this setting: it performs well on probabilistic channels and adapts to the channel's erasure probability. The project additionally considers a more decentralized setting with only helper nodes and edge nodes, to which the techniques developed for the hierarchical architecture can be extended. The developed techniques will be simulated and validated using a network simulator.
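The aggregation of local updates at edge nodes described above can be sketched as a weighted average in the style of FedAvg. This is a minimal illustration, not the project's actual scheme; the function name `fedavg` and the toy parameter vectors are assumptions for the example.

```python
import numpy as np

def fedavg(updates, sizes):
    """Aggregate local model updates by a weighted average (FedAvg-style).

    updates: list of parameter vectors, one per edge device
    sizes:   number of local training samples behind each update (weights)
    """
    weights = np.asarray(sizes, dtype=float)
    weights /= weights.sum()                    # normalize to sum to 1
    stacked = np.stack(updates)                 # shape: (devices, params)
    return (weights[:, None] * stacked).sum(axis=0)

# Three devices send only their local updates; the raw training data
# never leaves the devices, which is the privacy benefit noted above.
local = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
global_update = fedavg(local, sizes=[10, 10, 20])  # → array([3.5, 4.5])
```

Devices with more local data receive proportionally more weight, so the aggregate reflects the overall data distribution without centralizing it.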
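The adaptivity of random coding to the channel's erasure probability can be illustrated with a small rateless sketch: the sender keeps emitting fresh random linear combinations of the source symbols, and the receiver decodes as soon as it has collected enough linearly independent ones, whatever the erasure rate. This is a toy model over the reals (practical schemes work over finite fields); the function name and parameters are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def transmit_random_coded(source, erasure_p=0.3, max_rounds=100):
    """Rateless random-coding sketch over a memoryless erasure channel.

    Each round sends one random linear combination of the k source
    symbols; the channel erases it with probability erasure_p. The
    receiver decodes once it holds k linearly independent combinations.
    """
    k = len(source)
    coeffs, received = [], []
    for _ in range(max_rounds):
        c = rng.standard_normal(k)          # fresh random coding vector
        if rng.random() < erasure_p:        # packet lost on the channel
            continue
        coeffs.append(c)
        received.append(c @ source)
        A = np.array(coeffs)
        if np.linalg.matrix_rank(A) == k:   # enough independent packets
            return np.linalg.lstsq(A, np.array(received), rcond=None)[0]
    return None                             # decoding failed in time

decoded = transmit_random_coded(np.array([1.0, 2.0, 3.0]))
# decoded recovers the source symbols up to floating-point error
```

No retransmission protocol is needed: a higher erasure probability simply means more rounds elapse before the receiver reaches full rank, which is why random coding adapts naturally to probabilistic channels.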
The project's outcome would benefit the practical realization of decentralized gradient descent, which underpins a wide range of machine learning applications, and the techniques developed here could be extended to general model updates during training. As with FL, privacy concerns are mitigated, which is particularly valuable in areas such as smart healthcare and smart industry.