What do AI’s crowning jewels, from advanced image recognition to gaming champion AlphaGo and language models such as GPT-3, all have in common? Energy, of course.
Over their life cycles, large AI models such as these can emit over 626,000 pounds of carbon dioxide, equivalent to more than five times the lifetime emissions of the average American car. This has long been the consensus, and it often proves a detriment to adoption: large AI models consume a great deal of energy and therefore generate considerable emissions.
The MIT Technology Review reports: “since the first paper studying this technology’s impact on the environment was published three years ago, a movement has grown among researchers to self-report the energy consumed and emissions generated from their work. Having accurate numbers is an important step toward making changes, but actually gathering those numbers can be a challenge.”
Given this, any algorithmic adjustment that reduces energy consumption could prove a major boon to the future of deep learning. To this end, new research shows that scientists can now use cloud platforms to train algorithms in ways that dramatically cut energy consumption, and thereby the emissions it generates. Simple changes to cloud settings could prove to be the key.
Accordingly, the Seattle-based Allen Institute for AI has collaborated with Microsoft, the AI community Hugging Face, and three universities to develop a tool that measures the electricity usage of machine learning models running on Azure, Microsoft’s cloud service. With it, the MIT Tech Review reports:
“Azure users building new models can view the total electricity consumed by graphics processing units (GPUs)—computer chips specialized for running calculations in parallel—during every phase of their project, from selecting a model to training it and putting it to use. It’s the first major cloud provider to give users access to information about the energy impact of their machine-learning programs.”
Although multiple emission-tracking tools already exist, they do not work for workloads running on cloud services from providers such as Microsoft, Amazon, and Google, because these services do not give users direct visibility into the CPU, GPU, and memory resources their activities consume. Even the new Azure tool, in this regard, reports only energy usage, not emissions.
To bridge that gap, the researchers mapped energy consumption to emissions using a service called WattTime, estimating emissions based on “the zip codes of cloud servers running 11 machine-learning models.”
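The mapping itself is simple arithmetic: energy consumed, multiplied by the carbon intensity of the local grid at that time, gives estimated emissions. A minimal sketch follows; the intensity figures here are illustrative placeholders, not real WattTime data, which varies by region and hour.

```python
def estimate_emissions(energy_kwh, carbon_intensity_g_per_kwh):
    """Estimated CO2 in kilograms: energy (kWh) times grid intensity (gCO2/kWh)."""
    return energy_kwh * carbon_intensity_g_per_kwh / 1000.0

# The same 1,200 kWh training run on a fossil-heavy grid (~700 gCO2/kWh)
# versus a renewables-heavy one (~100 gCO2/kWh):
dirty_grid = estimate_emissions(1200, 700)   # 840.0 kg CO2
clean_grid = estimate_emissions(1200, 100)   # 120.0 kg CO2
```

The same energy draw can thus produce very different emissions depending on where and when the servers run, which is exactly the lever the researchers explored.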
“They found that emissions can be significantly reduced if researchers use servers in specific geographic locations and at certain times of day. Emissions from training small machine-learning models can be reduced up to 80% if the training starts at times when more renewable electricity is available on the grid, while emissions from large models can be reduced over 20% if the training work is paused when renewable electricity is scarce and restarted when it’s more plentiful.”
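The pause-and-resume strategy for large models described above can be sketched as a training loop that checks grid carbon intensity between epochs and waits while the grid is dirty. Everything here is a hypothetical illustration, assuming an intensity feed is available; a real job would checkpoint its model and query a live service such as WattTime rather than this stub.

```python
def carbon_aware_training(epochs, get_intensity, threshold_g_per_kwh,
                          train_epoch, wait):
    """Run each epoch only when grid intensity is at or below the threshold."""
    for epoch in range(epochs):
        # Pause while renewable electricity is scarce on the grid.
        while get_intensity() > threshold_g_per_kwh:
            wait()
        # Resume once the grid is cleaner.
        train_epoch(epoch)

# Simulated intensity feed (gCO2/kWh); each check consumes one reading.
readings = [650, 300, 500, 250]
trained = []
carbon_aware_training(
    epochs=2,
    get_intensity=lambda: readings.pop(0),
    threshold_g_per_kwh=400,
    train_epoch=trained.append,
    wait=lambda: None,
)
# Both epochs run, each after skipping one dirty-grid reading.
```

The design choice is deliberately coarse: checking only at epoch boundaries keeps the scheduling logic out of the inner training loop, at the cost of reacting to grid changes with some delay.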
Energy-conscious cloud users may thus be able to lower emissions by adjusting preference settings on the world’s three largest cloud service providers (Microsoft Azure, Google Cloud, and Amazon Web Services). This could be a crucial first step toward making machine learning more environmentally friendly. That said, Azure’s new tool calculates energy consumption only for the GPUs, leaving out CPU and memory usage as well as the energy needed to build and cool the physical servers. Hence, a lot of work remains to be done on this front.
More researchers than ever are now in the habit of reporting their energy usage. As this increasingly becomes a factor in planning future projects, it could start to reduce machine learning’s impact on climate change.