by Julian Cooper*
Oct 22, 2025
Silicon Valley is using legal loopholes and NDAs to keep the public in the dark about the water and energy being consumed by generative AI.
A new Microsoft data center in Mount Pleasant, Wisconsin, will require more than eight million gallons of water each year to operate, according to records that Microsoft did everything in its power to keep hidden from the public.
Last month, the company argued to Wisconsin officials that documents related to its use of Lake Michigan’s water for artificial intelligence (AI) data centers should be treated as trade secrets—a status typically used to protect intellectual property, such as the Coca-Cola recipe.
In the southern Wisconsin city of Racine, which has a water usage agreement with Mount Pleasant pertaining to the data center, Microsoft argued to environmental authorities that its use of water from Lake Michigan should be exempt from public accounting processes, which it claimed violated the company’s confidentiality. But the Great Lakes, which contain 20 percent of the Earth’s surface freshwater, are protected by the Great Lakes Compact of 2008, which requires strict and detailed documentation of any water diverted outside of the Great Lakes Basin—including water diverted for Microsoft’s data centers. The trade-secret request ultimately failed after the Midwest Environmental Advocates group sued the city of Racine for failing to hand over water usage documents.
The events in Mount Pleasant and Racine belong to a nationwide wave of data center development that has been particularly prevalent in Midwestern states such as Wisconsin, Illinois, and Ohio, where tech companies hope to source the millions of gallons of water that can be needed to cool thousands of computer servers directly from the Great Lakes. And while Microsoft’s argument for withholding its water usage data failed in this case, it and other tech giants like Amazon and Elon Musk’s xAI are fighting desperately and creatively across the country to cover up the environmental footprint of their generative AI data centers.
In Virginia, which hosts more data centers than any other state in the country, non-disclosure agreements, or NDAs, have become tech companies’ tool of choice to cover up their water and energy use. These NDAs, which companies push local governments to sign when they open data centers, are broadly written to prohibit officials from publicly sharing any business information about the centers. Earlier this year, The Virginia Mercury identified at least twenty-five municipal governments that had entered NDAs with data center operators, out of thirty-one localities with data centers.
Julie Bolthouse, land use director for the Piedmont Environmental Council, a Virginia advocacy group formed in 1972, tells The Progressive these agreements deprive the public of transparency in how these data centers affect residents’ water and electricity costs.
“NDAs have been signed at all levels: the water utilities, the energy providers, the localities, the state level,” Bolthouse says. “The NDAs are very broad. They basically say you can’t share any information about energy usage without talking to [the data center operator] first. They’re mostly a scare tactic.”
The debate over data centers has brought controversy—and money—to Virginia state politics. Dominion Energy, an investor-owned private electric utility and the largest electricity provider in Virginia, stands to profit when new electric infrastructure is built to meet data center demand. The company is currently lobbying the state to reform electricity rate structures for large power users to account for the rapid spread of data centers, with the hope of increasing electricity rates to maximize profits. And it’s backing up its lobbying effort with money: Dominion Energy has spent more than $67 million on elections in Virginia since 2020, making it one of the largest political donors in the state.
Dominion Energy’s lobbying money, the NDAs across Virginia, and even Microsoft’s trade-secret argument in Wisconsin are all part of the same racket to coerce small towns into relinquishing control of public resources to tech companies and their generative AI operations. Tech giants like Microsoft, Amazon, and xAI are wealthier than some entire nations, and can offer small municipalities hundreds of millions of dollars in tax revenue. In turn, cities build transmission lines and water infrastructure to power data centers, even if it raises the prices of these goods for residents.
What’s more, utilities don’t actively disclose how much water and energy data centers use: because water and energy utilities are often under municipal or private control, they have no legal obligation to disclose the usage of individual buyers, even if a buyer requires millions of gallons of water per year or fifty megawatts of power.
In many cases, a data center operator would need to obtain state and federal water permits in order to obtain this much water themselves, such as by tapping water from Lake Michigan directly. But if they instead choose to purchase water through a utility, the burden of acquiring permits falls on the municipality. For the most part, state environmental agencies can’t intervene in this process, because neither utilities nor zoning ordinances are under the state’s purview. Unless municipal governments disclose information—which tech companies are working to prevent them from doing—state regulators have few opportunities to get involved.
However, there may be a few levers of power for state agencies to audit data centers. States like California and Wisconsin have environmental impact assessment laws modeled after the National Environmental Policy Act. While these laws do not necessarily require data center operators or municipalities to disclose their own usage data, they do allow the state to conduct a public investigation of a data center’s energy and water needs before it is built.
One of the only state-level environmental impact assessments of a data center to date was completed in 2021, when California’s energy commission used the California Environmental Quality Act to audit the environmental impact of Microsoft’s San Jose data center. The resulting 471-page report contains only two paragraphs on water usage—the commission did not investigate how much water would be used, how that water would be returned to the watershed, whether the center would increase drought risk, or whether it would increase water or energy prices for local residents. The state instead deferred these queries to the local San Jose water utility. If the utility were to investigate these environmental impacts, it would be investigating its own paying customer.
Wisconsin’s Department of Natural Resources will be conducting an environmental impact assessment for an $8 billion data center this year under the Wisconsin Environmental Policy Act. But, like in San Jose, the agency may defer to the utility on the issue of industrial water use. If state environmental agencies continue to abdicate their responsibility to audit data centers, then utilities will continue to be caught in the same conflicts of interest with their data center customers.
Bolthouse says the new wave of AI data centers, which comes with unprecedented water and energy demands, is mainly used to power generative AI systems such as ChatGPT. The generative AI market is growing rapidly: Meta just announced Vibes, a new platform that allows users to scroll endlessly through AI-generated videos, while Elon Musk is looking to break into this space with his Grok chatbot on X.
This new generation of data center infrastructure supporting generative AI is markedly different in terms of use as well as environmental impact compared to server farms built in the past several decades, which support everyday Internet functions such as search engines and Apple’s iCloud storage.
“We’ve had data centers being developed [in Virginia] since the 1990s,” Bolthouse says. “They do a lot of things that are very beneficial for society. I think it’s really important to differentiate between that and the explosive speculative market that we’re seeing drive the development of generative AI data centers. Generative AI is based on near endless scaling.”
*Julian Cooper is a writer from Madison, Wisconsin. His work has been published in The Progressive, The Michigan Advance, and Tone Madison.