Beyond the Code: Carbon Impact of Large Language Models (LLMs)

I'm sure you've heard of ChatGPT, Copilot, or any of the other many (many, many, many) AI tools hitting the tech scene. But something a little more nuanced is coming to light, and that's the environmental cost. I mean, the 'L' in LLM doesn't exactly stand for 'low-impact'.

So let's dive into the environmental cost of that LinkedIn post you just generated, and the realities of training large language models.

## Measuring large language model energy consumption

Digging into this subject, I found that GPT-3, the model family from which GPT-3.5 descends, used [around 1,287 MWh to be trained](https://arxiv.org/ftp/arxiv/papers/2204/2204.05149.pdf), roughly the annual electricity consumption of about 120 average households in the United States. That figure rests on the estimate that training it in 2020 consumed about 405 machine-years of compute on V100 GPUs.
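To see where that 1,287 MWh figure comes from, here's a quick back-of-the-envelope sketch. The 405 GPU-years come from the linked paper; the ~330 W average per-accelerator draw and the ~1.1 datacenter PUE are assumptions I've plugged in, in line with the figures Patterson et al. report, so treat this as an illustration rather than the paper's exact methodology:

```python
# Back-of-the-envelope estimate of GPT-3's training energy.
# Assumptions: ~330 W average draw per V100 and a PUE of ~1.1,
# values consistent with those reported by Patterson et al.

GPU_YEARS = 405            # reported V100 machine-years of training
HOURS_PER_YEAR = 8760      # 24 h * 365 days
AVG_GPU_POWER_W = 330      # assumed average power per GPU, in watts
PUE = 1.1                  # assumed datacenter power usage effectiveness

gpu_hours = GPU_YEARS * HOURS_PER_YEAR
energy_mwh = gpu_hours * AVG_GPU_POWER_W * PUE / 1e6  # Wh -> MWh

# Compare against household usage (~10.7 MWh/year for an average
# US home, per EIA figures).
US_HOUSEHOLD_MWH_PER_YEAR = 10.7
households = energy_mwh / US_HOUSEHOLD_MWH_PER_YEAR

print(f"Estimated training energy: {energy_mwh:,.0f} MWh")           # ~1,288 MWh
print(f"Equivalent to ~{households:,.0f} US households for a year")  # ~120
```

The fact that this simple product of machine-hours, power, and datacenter overhead lands almost exactly on the published 1,287 MWh suggests that's essentially how the number was derived.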
Tags: Beyond Code