GPT-5's High Energy Cost: OpenAI Stays Silent on Consumption
OpenAI’s newly released GPT-5 model, which now powers the popular ChatGPT chatbot, is raising significant concerns among experts over its energy consumption. While the company has highlighted GPT-5’s advanced capabilities – including its ability to generate websites, answer PhD-level science questions, and tackle complex reasoning problems – these breakthroughs appear to come at a substantial environmental cost, which OpenAI has so far declined to disclose.
For context, a query to an earlier version of ChatGPT in mid-2023, such as a request for an artichoke pasta recipe or instructions for a ritual offering to the ancient Canaanite deity Moloch, might have consumed roughly 2 watt-hours of electricity, about what a 60-watt incandescent bulb uses in two minutes. Experts now estimate that generating a similar amount of text with GPT-5 could demand several times that energy, potentially up to 20 times more.
OpenAI, like many of its competitors, has not released official data on the power usage of its models since GPT-3 debuted in 2020. Although CEO Sam Altman shared some figures on ChatGPT’s resource consumption on his blog this June—citing 0.34 watt-hours and 0.000085 gallons of water per query—these numbers lacked specific model attribution and supporting documentation.
Professor Rakesh Kumar of the University of Illinois, whose research focuses on the energy consumption of computation and AI models, stated that a more complex model like GPT-5 would inherently consume more power during both its training and operational phases. He added that its design for “long thinking” strongly indicates a much higher power draw than its predecessor, GPT-4.
Indeed, on the day GPT-5 was released, researchers at the University of Rhode Island’s AI lab found that the model could use up to 40 watt-hours of electricity to produce a medium-length response of approximately 1,000 tokens (tokens are the building blocks of text for an AI model, roughly equivalent to words). A dashboard subsequently launched by the lab put GPT-5’s average energy consumption for such a response at just over 18 watt-hours. That figure surpasses all other models the lab has benchmarked except OpenAI’s o3 reasoning model, released in April, and R1, developed by the Chinese AI firm DeepSeek. Nidhal Jegham, a researcher in the group, confirmed that this represents “significantly more energy than GPT-4o,” OpenAI’s previous model.
To put this into perspective, 18 watt-hours is about what a 60-watt incandescent bulb uses in 18 minutes. Given recent reports that ChatGPT processes 2.5 billion requests daily, GPT-5 could be consuming on the order of 45 gigawatt-hours a day, enough to rival the daily electricity demand of 1.5 million US homes.
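The arithmetic behind that comparison is easy to check. A minimal sketch, assuming the URI lab’s 18 watt-hour average, the reported 2.5 billion daily requests, and a typical US household consumption of roughly 30 kilowatt-hours per day (that last figure is our own assumption, not part of the URI estimates):

```python
# Back-of-envelope check of the "1.5 million US homes" comparison.
WH_PER_QUERY = 18           # URI lab's average for a ~1,000-token GPT-5 response
QUERIES_PER_DAY = 2.5e9     # reported daily ChatGPT request volume
HOME_KWH_PER_DAY = 30       # assumed average US household use (~11,000 kWh/year)

total_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9   # watt-hours -> gigawatt-hours
homes = (total_gwh * 1e6) / HOME_KWH_PER_DAY       # gigawatt-hours -> households

print(f"{total_gwh:.0f} GWh/day, about {homes / 1e6:.1f} million homes")
# -> "45 GWh/day, about 1.5 million homes"
```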
Researchers in the field largely anticipated these high figures, as GPT-5 is believed to be substantially larger than OpenAI’s earlier models. OpenAI has not disclosed the parameter counts—which largely determine a model’s size—for any of its models since GPT-3, which featured 175 billion parameters. A study conducted this summer by the French AI company Mistral, based on its in-house systems, identified a strong correlation between a model’s size and its energy consumption, noting that a model ten times larger would generate impacts an order of magnitude greater for the same amount of generated tokens. Previous estimates widely suggested GPT-4 was ten times the size of GPT-3, and experts like Jegham, Kumar, and Shaolei Ren, a professor at the University of California, Riverside, who studies AI’s resource footprint, believe GPT-5 is likely significantly larger than GPT-4.
Leading AI companies, including OpenAI, contend that extremely large models are essential for achieving Artificial General Intelligence (AGI), an AI system capable of performing human jobs. Altman himself articulated this view in February, suggesting that “you can spend arbitrary amounts of money and get continuous and predictable gains,” though he has since acknowledged that GPT-5 has not surpassed human intelligence.
While the sheer scale of GPT-5 is a primary driver of its energy demands, other factors also influence its resource consumption. The model benefits from deployment on more efficient hardware than some previous iterations. Furthermore, GPT-5 appears to utilize a “mixture-of-experts” architecture, a streamlined design where not all parameters are activated for every query, potentially reducing energy use. Conversely, GPT-5’s multimodal capabilities, allowing it to process video and images in addition to text, and its “reasoning mode,” which entails longer computation times before generating a response, are likely to significantly increase its energy footprint. Ren estimates that using the reasoning mode could lead to a five to tenfold increase in resource expenditure for the same answer.
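OpenAI has not confirmed GPT-5’s architecture, but the general mixture-of-experts idea is easy to illustrate: a small router selects a few expert sub-networks per token, so compute (and therefore energy) scales with the parameters actually activated rather than with the model’s full size. A minimal, hypothetical sketch in PyTorch (all names and sizes here are illustrative, not GPT-5’s actual design):

```python
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    """Toy mixture-of-experts layer: a router activates only the top-k of
    n_experts per token, so most parameters sit idle on any given query."""
    def __init__(self, dim=64, n_experts=8, k=2):
        super().__init__()
        self.router = nn.Linear(dim, n_experts)
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_experts))
        self.k = k

    def forward(self, x):                          # x: (tokens, dim)
        weights, idx = self.router(x).topk(self.k, dim=-1)
        weights = weights.softmax(dim=-1)          # mixing weights for chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):                 # run only the k selected experts
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(4, 64)
print(TinyMoE()(tokens).shape)   # torch.Size([4, 64]); 2 of 8 experts ran per token
```

In a deployment built this way, per-query energy grows with the number of active experts rather than the total parameter count, which is why the design can partially offset the cost of a very large model.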
The University of Rhode Island team calculated their estimates by multiplying the average time a model takes to respond to a query by its average power draw during operation. Abdeltawab Hendawi, a professor of data science at the University of Rhode Island, noted the considerable effort required to estimate a model’s power draw, particularly due to the lack of information on how different models are deployed within data centers. Their final paper includes estimates for the chips used by specific models and how queries are distributed among various chips in a data center. Notably, Altman’s June blog post figure of 0.34 watt-hours per query for ChatGPT closely aligns with the group’s findings for GPT-4o, lending credence to their methodology.
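In other words, the core formula is simply energy per query ≈ average power draw × average response time; the hard part, as Hendawi notes, is pinning down the power term for hardware OpenAI does not describe. A toy version of the calculation (the figures below are illustrative placeholders chosen to land near the lab’s 18 watt-hour average, not the URI team’s measured values):

```python
def energy_per_query_wh(avg_power_watts: float, response_seconds: float) -> float:
    """Energy (Wh) = average power (W) x response time (hours)."""
    return avg_power_watts * response_seconds / 3600

# Hypothetical serving node drawing ~5 kW, streaming a ~1,000-token
# response in ~13 seconds:
print(round(energy_per_query_wh(5000, 13), 1))   # -> 18.1 Wh
```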
Hendawi, Jegham, and other members of their research group emphasize that their findings underscore an urgent need for greater transparency from AI companies as they continue to release increasingly larger and more powerful models. Marwan Abdelatti, another professor at URI, asserted, “It’s more critical than ever to address AI’s true environmental cost. We call on OpenAI and other developers to use this moment to commit to full transparency by publicly disclosing GPT-5’s environmental impact.”