The global rush to embrace artificial intelligence is no longer just about cool apps and clever chatbots. It is also about water use, power grids, chip shortages and, ultimately, the health of our planet. That is why China and Indonesia are moving fast to put hard limits on some of the most addictive and energy-hungry forms of AI.
In late 2025, China’s cyber regulator released draft rules for AI systems that mimic human personalities and build emotional bonds with users. The proposal would force providers of these human-like “companion” services to warn about excessive use, detect signs of addiction and step in when users show extreme emotions.
It also demands algorithm reviews, strong data protection and strict content red lines barring material that threatens national security or promotes rumors, violence or pornography.
Indonesia is following a different, but related, path. The government is finalizing a presidential regulation that will anchor a national AI roadmap and AI ethics rules. Officials describe it as a framework that ministries can adapt to their own sectors, from health to finance. In public remarks, Deputy Minister Nezar Patria has stressed that one guiding principle is sustainability and said that “AI must be developed with consideration for its impact on humans, the environment, and all living creatures.”
Why link emotional chatbots and roadmaps to the environment at all? Because the current AI boom is powered by a sprawling network of data centers that consume enormous amounts of electricity and water. The International Energy Agency (IEA) estimates that data centers already emit around 180 million tons of CO2 per year and that their electricity demand could more than double by 2030 if current trends continue. AI workloads are only a slice of that total today, but they are growing quickly.
Recent research led by Alex de Vries-Gao suggests that AI systems alone could soon have a carbon footprint comparable to that of New York City and consume as much water as all the bottled water drunk worldwide in a year. A separate United Nations-backed analysis warns that global AI demand could require 4.2 to 6.6 billion cubic meters of water by 2027, roughly four to six times Denmark’s annual water withdrawal.
To put that into everyday terms, a single medium-sized data center can “drink” as much water in a year as about one thousand households. Larger campuses can rival small cities. The more we lean on AI for search, work and entertainment, the more those server farms must be cooled, often with freshwater that could otherwise supply farms and households in regions already coping with summer heat waves and stressed rivers.
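To get a feel for those numbers, here is a minimal back-of-envelope sketch in Python. The per-household figure and Denmark’s withdrawal volume are illustrative assumptions, and the “medium data center equals roughly 1,000 households” rule of thumb is taken from the paragraph above; only the 4.2 to 6.6 billion cubic meter range comes from the analysis cited earlier.

```python
# Rough back-of-envelope scale check for the water figures cited above.
# All inputs below are illustrative assumptions, not values from the cited studies.

HOUSEHOLD_WATER_M3_PER_YEAR = 300          # assumed household use, ~0.8 cubic meters per day
MEDIUM_DC_WATER_M3_PER_YEAR = 300 * 1_000  # "medium" data center ~ 1,000 households (as in the text)
DENMARK_WITHDRAWAL_M3 = 1.0e9              # assumed annual withdrawal of roughly 1 billion cubic meters

# Projected global AI water demand by 2027, range cited in the UN-backed analysis above.
AI_WATER_2027_M3 = {"low": 4.2e9, "high": 6.6e9}

def equivalents(water_m3: float) -> dict:
    """Express a water volume in more familiar units."""
    return {
        "households": water_m3 / HOUSEHOLD_WATER_M3_PER_YEAR,
        "medium_data_centers": water_m3 / MEDIUM_DC_WATER_M3_PER_YEAR,
        "denmarks": water_m3 / DENMARK_WITHDRAWAL_M3,
    }

for label, volume in AI_WATER_2027_M3.items():
    eq = equivalents(volume)
    print(f"{label} estimate: {volume:.1e} m3 "
          f"~ {eq['households']:,.0f} households, "
          f"~ {eq['medium_data_centers']:,.0f} medium data centers, "
          f"~ {eq['denmarks']:.1f}x Denmark's annual withdrawal")
```

Under these assumptions, the projected 2027 range works out to the annual water use of roughly 14,000 to 22,000 medium-sized data centers, which is the kind of scale the comparisons above are trying to convey.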
The pressure does not stop at the tap or the power plant. The AI surge is also helping to trigger a global shortage of memory chips, as factories race to supply high-bandwidth components for AI servers and divert capacity away from phones and laptops. Analysts and chipmakers warn that this AI-driven squeeze is pushing up prices for consumer electronics and could last well into 2027.
For many people, the environmental cost of AI will first show up as a higher price tag on their next device and, later, as more electronic waste when older hardware is discarded sooner than planned.
Against this backdrop, Jakarta’s message that humans must not be “enslaved” by technology is not only about ethics in the abstract. The Indonesian roadmap aims to steer AI into priority sectors such as healthcare, education, smart cities and food security, while requiring accountability, transparency and respect for copyright. If it succeeds, AI tools might help farmers adapt to shifting rainfall or help public transport planners cut emissions, instead of simply feeding another round of mindless screen time.
China’s draft rules tackle a different risk. Emotional companion apps can feel endlessly patient and available, especially late at night when real friends are asleep. Regulators worry that users could grow dependent on these systems in ways that harm mental health or push people toward bad decisions. The proposal would require providers to monitor user emotions, flag risky behavior and avoid manipulative designs that keep people hooked at any cost.
Taken together, these policies show a growing recognition that AI is not an invisible cloud. It is a very physical industry that pulls on power grids, water supplies, rare minerals and, increasingly, people’s attention. Experts broadly agree that strong guardrails, better transparency and clear environmental targets will be needed if AI is to help deliver climate solutions instead of quietly adding to the problem.
At the end of the day, the question is simple. Do we want AI that quietly drains reservoirs and drives chip prices higher, or AI that helps societies save energy and protect ecosystems while keeping humans firmly in charge? Countries like China and Indonesia are starting to put their answer into law.
The study was published in Patterns.