Hassan Taher Examines the Hidden Infrastructure Costs Behind AI’s Growth

The artificial intelligence tools millions now use daily—ChatGPT, image generators, recommendation algorithms—depend on a physical infrastructure most users never consider. Behind each query processed in milliseconds sits a sprawling network of specialized facilities consuming resources at scales that challenge local power grids and water supplies. Hassan Taher, an AI consultant and author based in Los Angeles, has spent recent months analyzing how this infrastructure expansion intersects with environmental sustainability and community impact.

His focus reflects broader industry conversations about whether AI’s promised benefits justify its material costs. As founder of Taher AI Solutions since 2019, Taher advises organizations on implementing AI systems while considering their broader societal implications. His recent commentary on AI data centers builds on years spent examining how emerging technologies reshape industries from healthcare to finance.

Technical Requirements That Drive Resource Consumption

AI data centers differ fundamentally from traditional computing facilities. Standard data centers rely on Central Processing Units (CPUs) to handle tasks like web hosting and email services. AI facilities require Graphics Processing Units (GPUs) and Tensor Processing Units (TPUs), specialized chips designed to process thousands of calculations simultaneously.

These processors generate substantially more heat than conventional hardware. A single GPU rack can produce thermal output equivalent to multiple traditional server racks, necessitating advanced cooling systems. Many facilities now employ liquid cooling rather than air-based systems to manage temperatures.
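
A rough comparison makes the scale of that gap concrete. Because nearly all electrical power drawn by servers is dissipated as heat, rack power draw serves as a proxy for thermal output; the wattages below are ballpark industry figures assumed for illustration, not measurements from any particular facility.

```python
# Illustrative rack heat-load comparison. Nearly all electrical power drawn
# by servers is dissipated as heat, so power draw is a reasonable proxy for
# thermal output. Both wattages are ballpark assumptions, not measurements.

gpu_rack_kw = 40.0          # assumed draw for a dense GPU rack
traditional_rack_kw = 8.0   # assumed draw for a conventional server rack

ratio = gpu_rack_kw / traditional_rack_kw
print(f"One GPU rack ~ {ratio:.0f} traditional racks of heat output")
# ~5x: the gap that pushes operators from air toward liquid cooling
```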

Hassan Taher points to these technical specifications when discussing infrastructure planning. Organizations implementing AI systems must account for power delivery, cooling capacity, and network bandwidth that exceed conventional computing requirements. A hyperscale AI facility might deploy thousands of GPUs across hundreds of racks, each demanding continuous power and cooling.

Water Withdrawal Patterns Across Facility Types

Data centers serving AI applications withdraw between one and five million gallons of water daily for cooling operations. This volume rivals the consumption of a small city, with 80% to 90% of it drawn from municipal water systems rather than private sources.
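
A quick population-equivalent check makes the small-city comparison concrete. The sketch below assumes roughly 100 gallons of residential water use per person per day, a commonly cited U.S. rule of thumb; actual per-capita use varies by region.

```python
# Population-equivalent of the daily withdrawal range cited above, assuming
# ~100 gallons of residential use per person per day (a common U.S. rule of
# thumb; actual per-capita use varies by region).

GALLONS_PER_PERSON_PER_DAY = 100  # assumption, see lead-in

for facility_gallons in (1_000_000, 5_000_000):
    people = facility_gallons / GALLONS_PER_PERSON_PER_DAY
    print(f"{facility_gallons:,} gal/day ~ residential use of {people:,.0f} people")
# 1M gal/day ~ 10,000 people; 5M gal/day ~ 50,000 people, i.e. small-city scale
```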

Individual AI queries contribute to cumulative demand. Research estimates that generating 100 words through a language model uses roughly 519 milliliters of water, about one standard half-liter bottle. Image and video generation requires substantially higher volumes per task.
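
Scaled across a service's traffic, the per-response figure compounds quickly. The sketch below applies the 519-milliliter estimate to a hypothetical volume of ten million 100-word responses per day; the traffic figure is an assumption for illustration, not a reported number.

```python
# Scaling the per-response water estimate cited above. The 519 ml figure is
# the research estimate from the text; the daily response volume is a
# hypothetical assumption for illustration only.

ML_PER_100_WORD_RESPONSE = 519    # research estimate cited above
LITERS_PER_GALLON = 3.785

daily_responses = 10_000_000      # hypothetical traffic, not a reported figure

liters_per_day = daily_responses * ML_PER_100_WORD_RESPONSE / 1000
gallons_per_day = liters_per_day / LITERS_PER_GALLON
print(f"{liters_per_day:,.0f} L/day ~ {gallons_per_day:,.0f} gal/day")
# ~5.19M liters (~1.37M gallons) per day at this volume, the same order of
# magnitude as the facility-level withdrawals described above
```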

Much of the withdrawn water evaporates during cooling rather than returning to source watersheds. This one-directional flow distinguishes data center consumption from industrial uses that recirculate water. Taher notes that companies rarely disclose facility-specific water usage, making it difficult for municipalities in states with heavy concentrations of AI data centers to assess impacts on regional supplies.

Electricity Demand Growth Trajectories

AI-specific computing draws significantly more power than traditional search and processing tasks. A single AI-enhanced search query uses approximately 30 times the electricity of a conventional search. According to International Energy Agency projections, global electricity consumption by data centers supporting AI applications could increase eleven-fold between 2023 and 2030.
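
The eleven-fold projection implies a steep compound growth rate. The arithmetic below works only from the figures in the paragraph above.

```python
# Implied annual growth behind the eleven-fold, 2023-2030 projection cited
# above. Pure arithmetic on the figures in the text, not additional data.

fold_increase = 11
years = 2030 - 2023   # 7-year window

annual_growth = fold_increase ** (1 / years) - 1
print(f"Implied compound growth: {annual_growth:.1%} per year")
# ~40.9% per year, i.e. demand roughly doubling every two years
```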

Regional grids face mounting pressure. Dublin’s data centers now account for nearly 80% of the city’s total electricity consumption, constraining power available for residential and commercial uses. Utility companies in Ireland have implemented connection delays for new data center projects while infrastructure catches up to demand.

Hassan Taher observes that meeting projected AI electricity needs sometimes requires expanding fossil fuel generation. Meta's planned Louisiana data center complex coincides with three new methane gas plants being constructed specifically to supply its power. Such projects contradict stated corporate climate commitments while locking in decades of emissions-intensive infrastructure.

Geographic Distribution and Local Effects

Technology companies have concentrated recent data center expansion in southern U.S. states, seeking lower land costs and electricity rates. This geographic pattern often places facilities in counties with limited regulatory oversight and historically marginalized populations.

Municipal utility costs rise when data centers join local grids. Facilities negotiate favorable rate structures that can shift infrastructure upgrade costs to residential customers. Several states offer tax exemptions worth hundreds of millions annually to attract data center investment, reducing funds available for schools and environmental programs.

Taher emphasizes that communities hosting these facilities rarely receive proportional economic benefits. Data centers employ relatively few workers compared to their physical footprint and resource consumption. Tax arrangements often exempt facilities from property assessments that would otherwise fund local services.

Current Regulatory Gaps

Federal and state frameworks have not adapted to AI infrastructure’s resource intensity. No federal regulations currently limit data center water withdrawal volumes. Texas law prevents local authorities from even tracking facility water use, eliminating transparency mechanisms that might inform policy.

Energy efficiency standards remain voluntary rather than mandatory. Some facilities report Power Usage Effectiveness (PUE) ratios, which measure total facility energy against the energy consumed by computing equipment alone, but companies choose whether to disclose these figures. No verification mechanisms exist for self-reported data.
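
The metric itself is a simple ratio. A minimal sketch of the calculation, using hypothetical figures:

```python
# Power Usage Effectiveness as described above: total facility energy
# divided by the energy consumed by computing equipment alone. A PUE of
# 1.0 would mean zero overhead; real facilities run higher. The sample
# figures are hypothetical.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Ratio of total facility energy to IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

print(pue(total_facility_kwh=1500.0, it_equipment_kwh=1000.0))  # 1.5
# Here a third of the facility's energy goes to cooling and other overhead
```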

Hassan Taher argues that regulatory architecture must address several priorities simultaneously. Mandatory reporting requirements would establish baseline data on resource consumption patterns. Binding efficiency targets could drive technological improvements while limiting growth in aggregate demand. Renewable energy requirements might prevent new fossil fuel infrastructure built specifically to serve AI computing.

Questions About Necessity and Design

Beyond regulatory approaches, Taher suggests the AI sector needs to confront foundational questions about when and whether these systems serve genuine needs. Some applications currently handled by AI could accomplish similar outcomes through less resource-intensive methods. Recommendation algorithms, for instance, functioned adequately using traditional computing before AI enhancement.

Companies should evaluate whether AI implementation offers sufficient advantages over existing approaches to justify its material costs. This assessment differs from asking whether AI can handle a task; it focuses instead on whether it should, given the available alternatives.

Sustainability considerations might inform system design from initial stages rather than as add-ons to existing architecture. Facilities could prioritize renewable energy sources during site selection. Cooling systems might incorporate closed-loop designs that reduce water withdrawal. Companies could establish maximum resource budgets for applications, forcing efficiency improvements rather than accepting unlimited consumption growth.
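
The resource-budget idea can be made concrete with a small sketch. The class below is a hypothetical illustration of capping an application's cumulative energy use, not an existing tool or Taher's specific proposal; all names and numbers are assumptions.

```python
# Minimal sketch of the resource-budget idea described above: meter an
# application's cumulative energy use against a hard cap and fail loudly
# instead of allowing unbounded growth. The class, names, and numbers are
# hypothetical illustrations, not an existing tool or a specific proposal.

class EnergyBudget:
    def __init__(self, kwh_cap: float) -> None:
        self.kwh_cap = kwh_cap
        self.kwh_used = 0.0

    def charge(self, kwh: float) -> None:
        """Record a workload's energy use; refuse it once the cap is hit."""
        if self.kwh_used + kwh > self.kwh_cap:
            raise RuntimeError("energy budget exhausted: optimize before scaling")
        self.kwh_used += kwh

budget = EnergyBudget(kwh_cap=1_000.0)
budget.charge(250.0)  # a hypothetically metered batch of inference jobs
```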

Creating sustainable infrastructure to support AI development presents challenges extending beyond technical optimization. Water supplies, electricity grids, and community health all intersect with decisions about how and where to build facilities serving AI applications. Hassan Taher’s analysis emphasizes that addressing these intersections requires both policy changes and industry practices that prioritize sustainability from design stages forward.
