Machine learning models are growing increasingly powerful, whether in natural language processing (https://thenewstack.io/openais-gpt-3-makes-big-leap-forward-for-natural-language-processing/), computer vision (https://thenewstack.io/computer-vision-modeling-unlocks-new-use-cases/) or any number of other emerging applications (https://thenewstack.io/5-ai-trends-to-watch-out-for-in-2022/). But we are finding that as these models grow larger and larger, so do their corresponding carbon footprints, especially when it comes to training them (https://thenewstack.io/check-your-ml-carbon-footprint-with-the-machine-learning-emissions-calculator/).
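For readers who want to measure this themselves, a minimal sketch of the idea is below, using the open-source codecarbon Python package (one of several emissions trackers; not the specific tool used by the researchers). The train_model function here is a hypothetical stand-in for a real training loop.

```python
from codecarbon import EmissionsTracker


def train_model():
    # Hypothetical placeholder for an actual training loop.
    for _ in range(1_000):
        sum(i * i for i in range(10_000))


# Track the energy drawn (and emissions implied) while the job runs.
tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```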

This approach allowed the researchers to compare models’ energy consumption as it related to geography, time and type of energy generation.
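As a rough sketch of why that comparison matters: operational emissions are typically estimated as energy consumed multiplied by the carbon intensity of the local grid at the time of the run, so the same job can emit very different amounts of CO2 depending on where and when it executes. The intensity figures below are illustrative placeholders, not measurements from the study.

```python
# Same hypothetical training job (same kWh), different grids and times.
ENERGY_KWH = 500.0  # assumed energy use of one training run

# Illustrative grid carbon intensities in g CO2eq per kWh (made-up values).
grid_intensity = {
    ("region-a", "night"): 120.0,  # e.g. a hydro-heavy grid at low demand
    ("region-a", "day"): 180.0,
    ("region-b", "night"): 520.0,  # e.g. a coal-heavy grid
    ("region-b", "day"): 610.0,
}

for (region, time_of_day), intensity in sorted(grid_intensity.items()):
    emissions_kg = ENERGY_KWH * intensity / 1_000.0  # grams -> kilograms
    print(f"{region} ({time_of_day}): {emissions_kg:.1f} kg CO2eq")
```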

One can only imagine the emissions involved in training even larger models like OpenAI’s groundbreaking GPT-3 NLP model (https://thenewstack.io/openais-gpt-3-makes-big-leap-forward-for-natural-language-processing/), which consists of 175 billion parameters.
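A back-of-envelope calculation gives a sense of that scale. The 1,287 MWh training-energy figure below is a third-party estimate for GPT-3 from Patterson et al. (2021), and the grid intensity is an assumed average, so the result should be read as rough rather than definitive.

```python
# Rough estimate: training energy (MWh) x grid carbon intensity (g/kWh).
training_energy_mwh = 1_287.0          # Patterson et al. (2021) estimate for GPT-3
grid_intensity_g_per_kwh = 429.0       # assumed average grid intensity

# MWh -> kWh, then grams -> tonnes.
emissions_tonnes = training_energy_mwh * 1_000 * grid_intensity_g_per_kwh / 1e6
print(f"~{emissions_tonnes:.0f} tonnes CO2eq")  # roughly 550 tonnes
```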

While the team’s work focuses solely on the operational carbon emissions of training AI models, and doesn’t account for the emissions embodied in manufacturing the hardware, cooling data centers and so on, the team nevertheless pointed out that more comprehensive carbon-aware approaches will become vital to ensuring the future sustainability of machine learning models.
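One simple carbon-aware tactic is to defer work until the local grid is cleaner. The sketch below assumes a hypothetical get_grid_intensity feed (real-world options include services like ElectricityMaps or WattTime) rather than any specific provider’s client.

```python
import time

THRESHOLD_G_PER_KWH = 200.0   # start the job only below this intensity
POLL_SECONDS = 15 * 60        # re-check the grid every 15 minutes


def get_grid_intensity() -> float:
    """Hypothetical stand-in for a real carbon-intensity feed.

    Returns the current grid intensity in g CO2eq/kWh; the constant here
    is a placeholder so the sketch runs end to end.
    """
    return 150.0


def run_when_grid_is_clean(job) -> None:
    # Poll until the grid is clean enough, then start the job.
    while get_grid_intensity() > THRESHOLD_G_PER_KWH:
        time.sleep(POLL_SECONDS)
    job()


run_when_grid_is_clean(lambda: print("starting training..."))
```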
