In Physics in Medicine and Biology
OBJECTIVE : Federated learning (FL) is a computational paradigm that enables organizations to collaborate on machine learning (ML) and deep learning (DL) projects without sharing sensitive data, such as patient records, financial data, or classified secrets.
APPROACH : The Open Federated Learning (OpenFL) framework is an open-source, Python-based tool for training ML/DL algorithms using the data-private collaborative learning paradigm of FL, irrespective of the use case. OpenFL works with training pipelines built with both TensorFlow and PyTorch, and can be easily extended to other ML and DL frameworks.
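To illustrate the collaborative learning paradigm described above, the following is a minimal, framework-agnostic sketch of the federated averaging (FedAvg) aggregation step that underlies FL frameworks such as OpenFL; the function and variable names are illustrative assumptions, not part of the OpenFL API.

```python
# Hypothetical sketch of weighted federated averaging: each collaborator
# trains locally and shares only model parameters, never its raw data.

def federated_average(client_weights, client_sizes):
    """Weighted average of parameter vectors from several collaborators.

    client_weights: one list of float parameters per collaborator
    client_sizes:   number of local training samples per collaborator
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    aggregated = [0.0] * n_params
    for weights, size in zip(client_weights, client_sizes):
        # Each site's contribution is proportional to its local data size.
        for i, w in enumerate(weights):
            aggregated[i] += w * (size / total)
    return aggregated

# Two sites (e.g. hospitals) contribute locally trained parameters;
# only these parameters leave each site, keeping patient data private.
site_a = [0.2, 0.4]  # parameters after local training at site A
site_b = [0.6, 0.8]  # parameters after local training at site B
global_model = federated_average([site_a, site_b], client_sizes=[100, 300])
```

In a real federation, this aggregation runs on a central aggregator each round, after which the updated global model is redistributed to the collaborators for further local training.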
MAIN RESULTS : In this manuscript, we present OpenFL and summarize its motivation and development characteristics, with the intention of facilitating its application to existing ML/DL model training in a production environment. We further provide recommendations to secure a federation using trusted execution environments to ensure explicit model security and integrity, as well as maintain data confidentiality. Finally, we describe the first real-world healthcare federations that use the OpenFL library, and highlight how it can be applied to other non-healthcare use cases.
SIGNIFICANCE : The OpenFL library is designed for real-world scalability and trusted execution, and also prioritizes easy migration of centralized ML models into a federated training pipeline. Although OpenFL's initial use case was in healthcare, it is applicable beyond this domain and is now reaching wider adoption in both research and production settings. The tool is open-sourced at github.com/intel/openfl.
Foley Patrick, Sheller Micah J, Edwards Brandon, Pati Sarthak, Riviera Walter, Sharma Mansi, Moorthy Prakash Narayana, Wang Shi-Han, Martin Jason, Mirhaji Parsa, Shah Prashant, Bakas Spyridon
deep learning, federated learning, machine learning, open source, privacy, security