🔶 The typical system setup for an #AI project is to collect data from decentralized sources such as files, databases on servers, or even live data feeds. All the data is centralized in one place, usually in the cloud, and the model is then trained on this centralized data, for example to build a neural network.
Downfalls of centralized data sources include:
❌ It does not ensure data privacy by design.
❌ It is inconvenient for large data packages such as video files.
❌ Big data infrastructure is required to process the data.
❌ Training big data systems in the cloud produces high carbon emissions due to the energy needed to cool the servers.
🔶A solution to overcome these challenges is #federatedlearning 🚀💡.
🔶Federated learning brings the AI to the data instead of sending the data to the AI.
🔶AI is trained locally on the device where the data resides, such as smartphones or edge devices.
🔶 A server collects the locally trained AI models from different devices, aggregates them, and creates a global AI model. This global model is then sent back to all devices, such as smartphones, replacing each local model with the improved global one. The global AI contains the learnings from all local data but has never seen that data itself.
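The train-locally-then-aggregate loop above can be sketched in a few lines. This is a toy illustration in the spirit of federated averaging (FedAvg), not a production implementation: the linear model, gradient-descent trainer, and three simulated devices are all assumptions made for the example; only model weights ever reach the "server".

```python
import numpy as np

def local_train(weights, X, y, lr=0.1, epochs=20):
    """One device: a few gradient-descent steps on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def federated_average(local_weights, sizes):
    """Server: average the local models, weighted by dataset size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(local_weights, sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Private datasets on three simulated devices (never leave the device).
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=50)
    devices.append((X, y))

global_w = np.zeros(2)
for _ in range(10):  # communication rounds
    local_models = [local_train(global_w, X, y) for X, y in devices]
    global_w = federated_average(local_models, [len(y) for _, y in devices])

print(global_w)  # converges toward true_w without pooling any raw data
```

Note that only the weight vectors cross the device boundary; the raw `(X, y)` pairs stay where they were generated, which is exactly the privacy-by-design property described above.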
Federated learning has many advantages compared to centralized learning:
✅ It ensures data privacy by design.
✅ It shares learning between different instances.
✅ It does not require moving large data sets to a central server/cloud.
✅ It does not require a complicated big data infrastructure.
✅ It has a reduced carbon footprint compared to centralized learning, since edge devices do not need the active cooling that data-center servers do.
🤚 Federated learning frameworks have active developer communities that you can join to ask all your questions.
Contributing Editor: Dr. Maria Börner
Women in AI & Robotics core team member