Friday, November 8th

    AI drive brings Microsoft's 'green moonshot' down to ground in West London

    As AI creates new energy-hungry datacenters, a tech company's attempt to remove more CO2 than it produces is being tested.


    If you want proof of how hard Microsoft's environmental "moonshot" will be, look no further than west London. The company's Park Royal data centre is part of its commitment to the boom in artificial intelligence (AI), but that ambition is jarring with its target of becoming carbon negative by 2030. Microsoft says the centre will run entirely on renewable energy. However, the construction of data centres and the servers they are filled with means the company's Scope 3 emissions, such as the CO2 associated with its building materials and the electricity people consume when using products such as the Xbox, are more than 30% above their baseline level. As a result, the company is exceeding its overall emissions target by roughly the same amount.

    Microsoft co-founder Bill Gates said this week that AI would help fight climate change, because big tech was "seriously willing to pay extra" for clean electricity in order to be able to say it is using green energy.

    In the short term, AI has been problematic for Microsoft's green goals. Brad Smith, Microsoft's outspoken president, once called its carbon ambitions a "moonshot". In May, he stretched the metaphor to the extreme, admitting that "the moon has moved" because of the AI strategy. The company plans to spend £2.5bn over the next three years developing its AI data centre infrastructure in the UK, and earlier this year announced plans to open new data centres around the world, including in the US, Japan, Spain and Germany. Training and operating the artificial intelligence models that power products like OpenAI's ChatGPT and Google's Gemini require large amounts of electricity to power and cool the hardware, with additional carbon generated when that equipment is manufactured and shipped.

    "This technology increases energy consumption," said Alex de Vries, founder of Digiconomist, a website that monitors the environmental impact of new technologies. The International Energy Agency estimates that total electricity consumption in data centres could double from 2022 levels to 1,000 TWh (terawatt hours) by 2026, equivalent to Japan's energy needs. The research firm SemiAnalysis calculates that AI will push data centres to use 4.5% of global electricity generation by 2030. That means that amid concerns about AI's impact on jobs and human life, the environment has also become a focus.

    Last week, the International Monetary Fund said governments should consider carbon taxes to capture the environmental cost of AI, either in the form of a general carbon levy that includes server emissions within its scope, or through more targeted measures such as a tax on AI servers. The big AI players, Meta, Google, Amazon and Microsoft, are all hunting for renewable energy sources to hit their climate goals. In January, Amazon, the world's largest corporate buyer of renewable energy, announced it had bought more than half the output of an offshore windfarm in Scotland, while in May Microsoft announced a $10bn (£7.9bn) deal for renewable electricity. Google plans to run its data centres entirely on carbon-free energy by 2030.
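The projections above can be sanity-checked with rough arithmetic. The figures below for the IEA and SemiAnalysis estimates come from the article; the global electricity generation figure of roughly 30,000 TWh a year is an illustrative assumption, not a number from the article:

```python
# Back-of-envelope check on the data-centre energy projections cited above.
# From the article: IEA projects ~1,000 TWh by 2026, roughly double 2022
# levels; SemiAnalysis projects 4.5% of global generation by 2030.
# ASSUMPTION: ~30,000 TWh/year global generation is a round illustrative
# figure, not sourced from the article.

iea_2026_twh = 1_000                   # IEA projection for data centres, 2026
implied_2022_twh = iea_2026_twh / 2    # "could double from 2022 levels"

global_generation_twh = 30_000         # assumed global annual generation
ai_share_2030 = 0.045                  # SemiAnalysis: 4.5% by 2030
ai_2030_twh = global_generation_twh * ai_share_2030

print(f"Implied 2022 data-centre use: ~{implied_2022_twh:.0f} TWh")
print(f"4.5% of ~{global_generation_twh} TWh would be ~{ai_2030_twh:.0f} TWh in 2030")
```

On these assumed numbers, the 2030 projection would exceed the 1,000 TWh expected for 2026, i.e. continued growth rather than a plateau.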

    "We remain strongly committed to meeting our climate goals," a Microsoft spokesperson said. Microsoft co-founder Bill Gates, who stepped down in 2020 but retains a stake in the company through the Gates Foundation trust, believes artificial intelligence can directly help fight climate change. The extra electricity demand would be matched by new investments in green generation, he said on Thursday, which would more than compensate for the use.

    A recent UK government-backed report agreed, stating that the “carbon intensity of the energy source is a key variable” in calculating AI-related emissions, although it adds that “a significant portion of AI training globally still relies on high-carbon sources such as coal or natural gas”. The water needed to cool servers is also an issue, with one study estimating that AI could use up to 6.6 billion cubic meters of water by 2027, nearly two-thirds of the UK's annual consumption.

    De Vries believes that the pursuit of sustainable computing power is putting pressure on the demand for renewable energy, which will lead to fossil fuels filling the gap in other areas of the global economy. "Higher energy consumption means we don't have enough renewable energy to meet that growth," he said.

    NexGen Cloud, a UK company that offers sustainable cloud computing (the industry that relies on data centres to provide IT services such as data storage and computing power over the internet), uses renewable energy for its AI-enabled computing. Youlian Tzanev, co-founder of NexGen Cloud, said:

    “The industry norm is to build around economic centers rather than renewable energy.”

    That makes it harder for any tech company focused on artificial intelligence to hit carbon targets. Amazon, the world's largest cloud computing provider, aims to reach net-zero emissions by 2040, removing as much carbon as it emits, and to match its global electricity use with 100% renewable energy by 2025. Google and Meta are pursuing the same net-zero target by 2030.

    Large language models (the technology that powers chatbots like ChatGPT or Gemini) consume energy in two main ways. The first is the training phase, where the model is fed a huge amount of data from the internet and elsewhere and builds a statistical understanding of language itself, eventually allowing it to provide convincing answers to questions. The upfront energy cost of training is astronomical, which keeps smaller companies (and even smaller governments) from competing in the sector unless they have a spare $100m to throw at the problem. Yet it is dwarfed by the cost of actually running the model, known as "inference". According to Brent Thill, an analyst at the investment firm Jefferies, 90% of the energy cost of AI sits in that inference phase: the electricity used when people ask an AI system to respond to queries.

    The power used for training and inference runs through a vast and growing digital infrastructure. Data centres are filled with servers built from the ground up for the specific AI workload they serve. A single training server can have a central processing unit (CPU) barely more powerful than the one in your own computer, paired with dozens of specialised graphics processing units (GPUs) or tensor processing units (TPUs), chips designed to churn through the vast quantities of simple calculations that AI models are made of.
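The 90%-inference figure attributed to Jefferies' Brent Thill implies that a model's training energy is only about a tenth of its lifetime footprint. A minimal sketch of that arithmetic, using a hypothetical 10 GWh training cost purely for illustration:

```python
# Illustrative lifetime-energy split using the 90%-inference share the
# article attributes to Brent Thill of Jefferies.
# ASSUMPTION: the 10 GWh training figure is hypothetical, not a real model's.

def lifetime_energy_split(training_gwh: float, inference_share: float = 0.90):
    """Given training energy and inference's share of lifetime energy,
    return (total, inference) energy in the same units."""
    total = training_gwh / (1.0 - inference_share)
    return total, total * inference_share

total, inference = lifetime_energy_split(training_gwh=10.0)
print(f"Total lifetime energy: {total:.0f} GWh, of which inference: {inference:.0f} GWh")
```

In other words, for every unit of energy spent training, roughly nine more are spent answering queries over the model's lifetime.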

    If you have used a chatbot and watched it spit out responses word by word, that is a powerful GPU using about a quarter of the power needed to boil a kettle. All of this is hosted in a data centre, whether owned by the AI provider itself or by a third party, in which case it might be called "the cloud", a fancy name for someone else's computer.

    SemiAnalysis estimates that if generative AI were integrated into every Google search, it could result in annual energy consumption of 29.2 TWh, comparable to Ireland's annual consumption, although the financial cost to the company would be staggering. That has led some to speculate that search companies could start charging for certain AI tools.
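The 29.2 TWh figure can be turned into an implied per-search energy cost. The assumption of roughly 9 billion Google searches a day is an outside estimate used only for illustration, not a figure from the article:

```python
# Back out the implied energy per AI-assisted search from the SemiAnalysis
# 29.2 TWh/year figure cited above.
# ASSUMPTION: ~9 billion searches/day is an illustrative outside estimate.

annual_twh = 29.2
searches_per_day = 9e9
searches_per_year = searches_per_day * 365

wh_per_query = annual_twh * 1e12 / searches_per_year  # 1 TWh = 1e12 Wh
print(f"Implied energy per AI-assisted search: ~{wh_per_query:.1f} Wh")
```

Under these assumptions, each AI-assisted search would draw several watt-hours, an order of magnitude more than a conventional search is commonly estimated to use.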

    However, some argue that focusing on AI's energy overhead is the wrong lens. Consider instead the energy the new tools might save. A controversial peer-reviewed article published earlier this year in Nature's journal Scientific Reports claimed that AI writing and illustration carry a smaller carbon footprint than humans performing the same tasks: the researchers, at the University of California, Irvine, estimated that AI emits between 130 and 1,500 times less CO2 per page of text than a human writer, and up to 2,900 times less per image.

    Of course, that leaves the question of what the displaced human writers and illustrators would do instead. Redirecting and reskilling that labour into other areas, such as green jobs, could be another windfall.
