
AI Carbon Footprint

#DidYouKnow that training an #AI model can produce as much carbon as five cars over their lifetimes?


🏭 Running a #NAS (Neural Architecture Search) emits around 280 t of carbon. For comparison, one car emits about 57 t over its lifetime, and a human life accounts for roughly 4 t per year. In contrast, the #training of an image classifier on #ImageNet, a dataset of 14 million images labeled with words or word phrases, emits in the worst case about 1.5 kg of carbon per training run.


🟢 NAS is an extreme case here: it is a process that automatically searches for optimal network structures, and this search is complex and requires vast computational resources (800 GPUs running for three to four weeks).


However, we should still try to keep any additional carbon emissions as low as possible. We therefore need to answer the following questions:

❓ What influences the carbon emission of AI training?

❓ How can we further reduce the carbon emission of AI training?


🌱 Geographical location of the training center The carbon emission of a data center depends on the energy sources used to generate electricity in its country. Countries that rely on #solar or #windpower have the lowest emission factors, while countries using coal-fired power stations have the highest. Simply choosing a data center in a different location can reduce carbon emissions by a factor of 10.
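The effect of the grid mix can be sketched in a few lines of Python. The emission factors below are illustrative assumptions for this sketch, not official figures for any specific country:

```python
# Sketch: how the grid's emission factor drives training emissions.
# Factors (kg CO2 per kWh) are ASSUMED, illustrative values only.
EMISSION_FACTORS = {
    "hydro/wind-heavy grid": 0.02,
    "mixed grid": 0.30,
    "coal-heavy grid": 0.80,
}

def training_emissions_kg(energy_kwh: float, factor_kg_per_kwh: float) -> float:
    """CO2 emitted (kg) by a training run drawing `energy_kwh` from the grid."""
    return energy_kwh * factor_kg_per_kwh

# The same 10,000 kWh training run in different locations:
for grid, factor in EMISSION_FACTORS.items():
    print(f"{grid}: {training_emissions_kg(10_000, factor):,.0f} kg CO2")
```

With these assumed factors, the identical workload emits 200 kg on a renewables-heavy grid and 8,000 kg on a coal-heavy one, which is where the "factor of 10 or more" from location alone comes from.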


🌱 Power usage effectiveness of the training facility The energy a training run draws in a data center depends on the Power Usage Effectiveness (PUE), the ratio between Total Facility Energy (e.g. lighting, cooling) and IT Equipment Energy (#GPUs). The average data center has a PUE of around 1.67; choosing your facility carefully can bring this down to about 1.1.


🌱 Training time Clearly, the longer a training run lasts, the higher its carbon emission. Always ask yourself whether your model really needs further improvement, or whether you can live with an accuracy of 90% instead of spending twice the training time to reach a maximum of 93%.
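The trade-off can be made concrete by combining training time, hardware, PUE, and grid factor into one rough estimate. All numbers below (GPU count, per-GPU power, PUE, emission factor) are assumed example values, not measurements:

```python
def training_carbon_kg(hours: float, n_gpus: int, gpu_kw: float,
                       pue_value: float, factor_kg_per_kwh: float) -> float:
    """Rough CO2 estimate: GPU energy, scaled by facility PUE and grid factor.

    All inputs are illustrative assumptions for a back-of-envelope estimate.
    """
    return hours * n_gpus * gpu_kw * pue_value * factor_kg_per_kwh

# Stopping at ~90% accuracy vs. training twice as long for ~93%
# (assumed setup: 8 GPUs at 0.3 kW each, PUE 1.67, 0.4 kg CO2/kWh):
short_run = training_carbon_kg(24, 8, 0.3, 1.67, 0.4)
long_run = training_carbon_kg(48, 8, 0.3, 1.67, 0.4)
print(f"stop at 90%:  {short_run:.1f} kg CO2")
print(f"push to 93%:  {long_run:.1f} kg CO2")
```

Doubling the training time doubles the emissions in this simple model, so the last few percentage points of accuracy carry a disproportionate carbon price.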


🟢 Another way to reduce training time is to choose your training data wisely. Instead of collecting as much data as possible, select only new and varied data that actually improve model performance, and avoid introducing unwanted bias through unhelpful data.


🟢 Before you start your next AI training, think about its carbon emission and try to follow the steps we have listed. Have fun with your new green machine learning project!


Contributing Editor: Dr. Maria Börner, Women in AI & Robotics team member from #Berlin


Follow Women in AI & Robotics to learn from our expert community members and be notified of our upcoming events and workshops.


🤫 Teaser: Stay tuned, we have a #workshop coming up very soon!

#machinelearning #noenergy #carbonfootprint #emissions #greenAI #artificialintelligence #carbonemission



