UPDATED 20:24 EST / FEBRUARY 22 2026

AI

Sam Altman defends AI’s resource consumption and ridicules Musk’s plan to put data centers in space

OpenAI Group PBC Chief Executive Sam Altman fielded questions in India this week about the environmental impact of artificial intelligence at a special event hosted by The Indian Express. But even as he conceded that AI’s total energy consumption is a legitimate concern, Altman contended that ChatGPT doesn’t really consume any more resources over its lifetime than the average human being.

The executive also flatly rejected sensationalist claims around AI’s water usage. When asked if it’s accurate to say that a single ChatGPT query “consumes 17 gallons of water” and the energy equivalent of 1.5 iPhone battery charges, he replied that such claims are “completely untrue, totally insane and have no connection to reality.”

According to Altman, AI’s water consumption was once a legitimate worry with older data centers that used evaporative cooling systems. But modern data centers use much more efficient methods to cool their servers, and so the issue has evaporated, he insisted.

With regard to AI’s energy consumption, Altman conceded that the total amount of power used globally is troublesome, but said it should really just encourage us to accelerate the shift to nuclear, wind and solar energy sources.

However, many observers criticized Altman’s own apples-and-oranges framing. “This alone may have lost my trust in Sama to build a good AI company I understand the point he’s trying to make, but this is trying to break down people and models into cost for output and ignoring the value of humanity itself It’s a bad path imo,” Creative Strategies analyst Max Weinbach wrote on X. For another commenter, Altman’s comparison just reinforced the clueless-tech-bro reputation of tech leaders: “I assume OpenAI holds some kind of internal competition for worst comms idea in terms of likely perception by the general public and the winner gets to have their suggestions spoken out by sama.”

Chatbots vs. humans

Some of the AI industry’s biggest critics like to make what Altman said is an unfair “apples-to-oranges comparison” about AI’s energy usage, contrasting the massive amounts of electricity used to train models with the tiny amount of energy a human brain uses for inference tasks.

“But it also takes a lot of energy to train a human,” Altman contended. “It takes like 20 years of life and all of the food you can eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion-odd people that have ever lived to build our cumulative knowledge in survival, science, mathematics and more.”

Altman said a fairer comparison would be to look at the total energy used to train an AI model and respond to one question, and contrast this with the lifetime of energy used by humans to get them to where they need to be to perform the same task. “Most probably, AI has already caught up on an energy efficiency basis,” he insisted.
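Altman’s comparison can be sketched as a back-of-envelope calculation. Every figure below is an illustrative assumption (a typical human metabolic draw, a rough frontier-model training budget, an assumed query volume), not a number from the interview:

```python
# Back-of-envelope sketch of the "train a human vs. train a model"
# comparison. All figures are rough illustrative assumptions.

HUMAN_METABOLIC_W = 100          # assumed average human metabolic power draw
TRAINING_YEARS = 20              # Altman's "20 years of life" figure
HOURS_PER_YEAR = 365.25 * 24

# Energy a single human body consumes over 20 years, in kWh
human_kwh = HUMAN_METABOLIC_W / 1000 * TRAINING_YEARS * HOURS_PER_YEAR

MODEL_TRAINING_KWH = 50_000_000  # assumed ~50 GWh to train a frontier model
QUERIES_SERVED = 10_000_000_000  # assumed lifetime queries to amortize over

# Training cost amortized over every query the model ever answers
amortized_kwh_per_query = MODEL_TRAINING_KWH / QUERIES_SERVED

print(f"Human, 20 years of metabolism: {human_kwh:,.0f} kWh")
print(f"Model training amortized per query: {amortized_kwh_per_query:.4f} kWh")
```

Under these assumed numbers, one-time model training amortizes to a few watt-hours per query, while the human “training run” is tens of megawatt-hours — which is the shape of the argument Altman is making, whatever the true figures turn out to be.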

Space-based AI is ‘ridiculous’

Earlier in the discussion, Altman was asked about his thoughts on Elon Musk’s ambitions to send data centers into low Earth orbit. This month, Musk cited orbital data centers as one of the main reasons for merging his rocket company SpaceX Corp. with his AI firm xAI Corp. Google LLC has also explored the concept.

However, Altman dismissed the idea as “ridiculous” at the present time. He said the launch costs involved in getting a data center into orbit are likely to be prohibitive compared with terrestrial power generation. He also cited the near-impossible challenge of fixing things such as broken processors or storage arrays. “If you just do the rough math of launch costs relative to the cost of power we can do on Earth, we are not there yet,” he said.
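The “rough math” Altman alludes to can be sketched like this. Every number is an illustrative assumption (a heavy-lift launch price, an orbital power system’s specific power, a terrestrial industrial rate), not data from the article:

```python
# Rough sketch of the "launch cost vs. Earth power" comparison.
# All figures are illustrative assumptions, not reported values.

LAUNCH_COST_PER_KG = 1_500           # assumed $/kg to low Earth orbit
SOLAR_SPECIFIC_POWER_W_PER_KG = 100  # assumed orbital power system output
LIFETIME_YEARS = 10                  # assumed operating life in orbit
HOURS = LIFETIME_YEARS * 365.25 * 24

# Launch cost alone, spread over every kWh the system delivers in orbit
kwh_per_kg = SOLAR_SPECIFIC_POWER_W_PER_KG / 1000 * HOURS
orbital_cost_per_kwh = LAUNCH_COST_PER_KG / kwh_per_kg

EARTH_COST_PER_KWH = 0.05            # assumed terrestrial industrial rate, $/kWh

print(f"Orbit (launch cost only): ${orbital_cost_per_kwh:.3f}/kWh")
print(f"Earth grid power:         ${EARTH_COST_PER_KWH:.3f}/kWh")
```

Under these assumptions the launch bill alone makes orbital power several times the cost of grid power, before counting the hardware itself or the servicing problem Altman raises — consistent with his “we are not there yet.”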

Altman did say that space could one day be a useful environment for certain AI applications, but insisted that “orbital data centers are not something that’s going to matter at scale this decade.”


Photo: TechCrunch/Flickr
