
Yet we also see that hyperscale cloud emissions targets have been reversed due to AI investment, that datacenter growth is hitting grid capacity limits in many regions, and that peaker plants and other non-renewable resources on the grid are being dispatched more to handle this specific growth from AI. By qualifying the claims to "ChatGPT", the author can perhaps make them stick, but I don't believe the larger argument holds for AI as a whole, or once you convert the electricity use to emissions.

I'm personally on the side that the ROI will probably work out in the long run, but not by minimizing the potential impact; we should keep the focus on how we can make this technology (currently in its infancy) more efficient. [edit wording]


Voluntary conservation was only working by accident and guilt tripping never works. The grid needs to become clean so that we can have new industries.

Yep, this is the real answer. It's also the only answer. The big fiction was everyone getting hopped up on the idea that "karma" was going to be real, and that people's virtue would be correctly identified by overt environmentalism rather than action.

Fossil fuel companies won, and they won in the early 2000s when BP paid an advertising firm to come up with the "personal carbon footprint" as a meaningful metric. It basically destroyed environmentalism since... well, I'll let you know when it stops.


It's a false dichotomy to say "either systemic change or individual change" - the two have always gone, and will always go, hand in hand, influencing each other in the process. To say only systemic change is required leaves out the individual responsibility of those who have the means to choose. To say only individual change is required leaves out the fact that people can only choose within the reality of their situation, which is clearly defined by the outcome of the system they are in.

> for those who have the means to choose

interesting qualification.

maybe conservation should start with the masters of the universe who fly private jets all over the world etc. and emit more than the rest of us do all year in a matter of hours.


I made a point in the post to say that it's better to mostly ignore your personal carbon footprint and focus on systemic change, but that I was writing the post for people who still wanted to reduce their consumption anyway.

Emissions are a collective action problem. Guilt tripping works poorly directly on behaviour, but it works on awareness and public discourse -> voting -> policy.

Cf how we addressed the ozone hole, acid rain, slavery, etc.


The grid being clean means not having any fossil power. We can only get there by shutting down all fossil fuel power plants.

We cannot get there by adding new power generation.


Well you need the latter to replace the former. So you need to add new power generation to allow you to shut down fossil fuel plants.

And to be honest, what we need to do is replace them with nuclear power stations to manage the base load of a nation's power requirements. Either that, or much better power storage is required.


> We can only get there by shutting down all fossil fuel power plants. We cannot get there by adding new power generation.

https://knowyourmeme.com/photos/1433498-no-take-only-throw

"No add new power plants, only transform our grid to greener".


Even if the grid was 100% renewable, that does not mean there's no environmental cost to producing electricity. As a society, we need to decide what is important and try to minimize energy consumption for things that are not important.

And shoving LLMs into every nook and cranny of every application, so just tech giants who run the data centers can make more money and some middle managers get automatic summaries of their unnecessary video calls and emails is, I would argue, not important.

But once again, the fundamental issue is late-stage capitalism.


What's the upside of moralizing energy consumption, especially once it's 100% renewable. Why not just let the market decide? If I'm paying for it, why does anyone else get a say in how I use it?

Isn't that kind of a non-sequitur? The claim made was that renewable energy would still be a finite resource to some degree. It's possible that the available energy surplus will be too big for any decisions about usage to matter, but that's a strong claim and you're doing nothing to make it here.

A lot of people believe in a higher power. If trusting in this supposed "market" brings you comfort and clarity in a complicated world, I do not begrudge you it. But invoking it doesn't address the claim it's answering.

It's also clear that "the market" does not care enough about environmental impact to even do stuff like remove the current significant fossil fuel subsidies present in most government budgets, nor stop individuals or organizations from consuming or selling said fuels, natural gas, or plastic products at massive scales, so it's unclear why it would allocate energy in a way that didn't deprive crucial priorities.

Like the theodicy on the invisible hand's problem of environmental collapse ain't lookin' good is all I'm saying


> What's the upside of moralizing energy consumption, especially once it's 100% renewable.

Because of the reason I just explained above. 100% renewable energy does not mean that producing that energy does not have an environmental cost.

> Why not just let the market decide? If I'm paying for it, why does anyone else get a say in how I use it?

Because the "free market" does not internalize externalities (costs to shared resources like the environment, air quality, public health, etc.).


Focusing on pricing those externalities (tiny as they'll be in a 100% renewable world), through laws and policy, is a better strategy than trying to convince people not to use their electricity for <thing I personally don't value>.

I kind of agree with this sentiment, though we haven't reached 100% renewable...

Having LLMs everywhere hasn't helped me much; it just gets in the way.

Why do you believe this? Datacenters use just 1-1.3 percent of grid electricity, and even if you suppose AI increased that usage by 2x (which I really doubt), the number would still be tiny.

Also, AI training is the easiest workload to regulate, as you can train only when you have cheaper green energy.
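The scheduling idea here can be sketched in a few lines. Everything below is invented for illustration - the forecast numbers and the `best_window` helper are assumptions, not from any real provider (in practice hourly carbon-intensity forecasts would come from a service such as Electricity Maps):

```python
def best_window(intensity, hours_needed):
    """Return (start_hour, avg) minimizing average gCO2/kWh over the window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(intensity) - hours_needed + 1):
        avg = sum(intensity[start:start + hours_needed]) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Made-up 24-hour forecast, gCO2/kWh: low midday values due to solar
forecast = [450, 440, 430, 420, 410, 400, 350, 280, 200, 150,
            120, 110, 115, 130, 180, 260, 340, 420, 470, 480,
            475, 470, 465, 460]

start, avg = best_window(forecast, 6)
print(f"Schedule the 6-hour training run at hour {start} (avg {avg:.0f} gCO2/kWh)")
```

A real scheduler would also weigh spot electricity prices and checkpoint/resume costs, but the shape of the problem is just this windowed minimum.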


I also had doubts, but asked chat and it confirms it’s an issue - including sources.

https://chatgpt.com/share/678b6b3e-9708-8009-bcad-8ba84a5145...

The issue is that they are often localised, so even if it’s just 1% of power, it can cause issues.

Still, by themselves, grid issues don't mean climate issues. And any argument about a CO2 cost should also consider the alternative cost to be reliable. Even if AI was causing 1%, 2%, or 10% of energy use, the real question is how much it saves by making society more efficient. And even if it saved nothing, it's again more a question of energy companies polluting with CO2.

Microsoft, which hosts OpenAI, is famously good in terms of its CO2 emissions - so far they have gone well beyond what other companies were doing.


ChatGPT didn't "confirm" anything there. It is not a meaningful reference.

What do you mean by confirms the issue? What's the issue exactly?

The issue is that when you have high local usage, your grid loses the ability to respond to peaks, since that capacity is now always in use. Essentially it raises the baseline use, which means your elasticity is pretty much gone.

A grid isn't a magic battery that is always there; it is constantly fluctuating, regardless of the intent of producers and consumers. You need enough elasticity to deal with that fact. Changing that is hard (and expensive), but it is the only way (such is the technical reality).

The solution is not to build, say, 1,000 extra coal-fired generating facilities, since you can't really turn them on or off at will. Same goes for gas, nuclear, etc. You'd need a few of them for your baseline load (combined with other sources like solar, wind, hydro, whatever), then make sure your non-renewable sources have margin and redundancy, and use storage for the rest. This was always the case, and it will always be the case.

But now, with information technology, the degree to which you can permanently raise demand on the grid is where the problem becomes much more apparent. And because it's not manufacturing (an extreme consumer of energy), you don't really get the "run on lower output" option. You can't have an LLM do "just a little bit of inferencing", just like you can't have Netflix send only half a movie to "save power".

In the past we had the luxury of lower nighttime demand, which meant industry could raise its usage then, but datacenters don't sleep at night. And they can't wait for batch processing during the day either.
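The headroom point above can be made concrete with a toy calculation - all of these numbers are invented, chosen only to show how an always-on load erodes peak-response capacity:

```python
# Toy grid: dispatchable capacity vs. steady demand (all MW figures invented)
capacity = 10_000     # dispatchable generation available
baseline = 7_000      # pre-existing always-on demand
peak_swing = 2_500    # worst-case demand spike the grid must absorb

headroom = capacity - baseline          # 3,000 MW of slack
print("peaks covered before:", headroom >= peak_swing)

# Add a large 24/7 datacenter load: baseline rises, slack shrinks
datacenter = 800
headroom_after = capacity - baseline - datacenter   # 2,200 MW of slack
print("peaks covered after:", headroom_after >= peak_swing)
```

Total annual energy barely moves (800 MW on a 10 GW grid), yet the same peak that was comfortably absorbed before now exceeds the remaining slack - which is why percentage-of-total-usage arguments miss the elasticity problem.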


Except neither ChatGPT nor its sources say this. The first source says:

> Gas-fired generation could meet data centers’ immediate power needs and transition to backup generation over time, panelists told the Northwest Power and Conservation Council.

What you are describing has nothing to do with local effects; it has to do with large, abrupt changes in electricity usage, and datacenter electricity usage is generally more predictable and smooth than that of most other industries.


I'm not talking about fluctuations (i.e. a datacenter with fluctuating usage). I'm talking about adding a datacenter to an existing grid. That significantly changes the baseline load on the grid, and that is a local problem because transmission is not uniformly distributed across an entire grid.

If your transmission line is saturated, it doesn't matter how much more generation you add on the source end, it's not gonna deliver 'more' over the transmission lines.

And that is just a simplistic local example, because it's not a single-producer, single-consumer, single-transmission-line scenario. ChatGPT and the article aren't diving into that. The closest they get is congestion, and even then you already have to know about the issue to be able to ask about it.

As far as the article itself is involved here, this thread mostly goes into the reasons why global usage percentages don't mean there are no problems. It's like saying gerrymandering has no impact because of some percentages elsewhere.


Is that true, though? Data centers can be placed anywhere in the USA; they could be placed near a bunch of hydro or wind resources in the western grid, which has little coal anyway outside of one line from Utah to SoCal. The AI doesn't have to be located anywhere near where it is used, since fiber is probably easier to run than a high-voltage power line.

That was already done years ago and people are predicting that the grid will be maxed out soon.

Build new data centers near sources of power, and grid capacity isn't going to be a problem. Heck, American industry used to follow that pattern (building garment factories on fast-moving rivers before electricity was much of a thing; Boeing grew up in the northwest thanks to cheap aluminum helped along by hydro). Why is AI somehow different from an airplane?

They'll have to build new power generation and build the data centers next to it.

You are massively conflating what is possible with what is done.

I know data centers are built next to wind farms for these reasons already. We have an abundance of those projects out here in the PNW.

There are a large number of reasons the AI datacenters are geographically distributed - just to list a few off the top of my head that come up as top drivers: latency, data sovereignty, resilience, grid capacity, renewable energy availability.

Why does latency matter for a model that responds in tens of seconds? Latency to a datacenter is measured in tens or hundreds of milliseconds, which is 2-3 orders of magnitude less.

Two reasons that I understand: 1. not all of these AIs are LLMs, and many have much lower latency SLAs than chat; and 2. these are just one part of a service architecture, and when you have multiple latencies across the stack they tend to have multiplicative effects.

If you look at a model with a diverse, competitive provider set like Llama 3, the latency is a quarter second, and it will improve at least incrementally if quality is held constant: https://artificialanalysis.ai/models/llama-3-3-instruct-70b/... Remember that as long as you experience the response linearly (very much the case for audio output, for example), the first-chunk latency is your actual latency, not the time to stream the entire response.
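The first-chunk point can be sketched numerically. The function and the figures below are illustrative assumptions (a 0.25 s time-to-first-token and 100 tokens/s are plausible but made up), not measurements of any particular provider:

```python
def perceived_latency(ttft_s, total_tokens, tokens_per_s, linear=True):
    """Latency as experienced by the user.

    ttft_s: time to first token; the rest streams at tokens_per_s.
    linear=True models output consumed as it arrives (audio, read-along
    text), so the user only waits for the first chunk.
    """
    full = ttft_s + total_tokens / tokens_per_s
    return ttft_s if linear else full

# 500-token answer, 0.25 s to first token, streaming at 100 tok/s
print(perceived_latency(0.25, 500, 100, linear=True))   # streamed: 0.25 s
print(perceived_latency(0.25, 500, 100, linear=False))  # wait-for-all: 5.25 s
```

Under these assumptions, network round-trip time (tens of ms) is a meaningful fraction of the 0.25 s perceived latency, even though it is negligible against the 5+ s full generation time.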

The root problem there is that fossil energy is very cheap and the state sponsors production of fossil fuels. Consequently the energy incentives are weak and other concerns take priority.

This is coupled with low public awareness; many people don't understand the moral problem with using fossil fuels, so the PR penalty for a fossil-powered data center is low.


Thanks, yes, I'm a long-time Electricity Maps customer. I also agree that the EU market's data transparency is further along in helping incentivize opportunities here.

Also thanks for the other links!


It looks like they have data for the United States now.

We have public utilities that must raise bonds to grow to scale with demand, and some competitive electrical energy markets with competitive margin and efficiency incentives in the US.

FWIU, datacenters are unable to sell their waste heat, boiled sterilized steam and water, unused diesel, and potentially excess energy storage.

To sell energy back to the grid, helping solve the duck curve and alligator curve problems, requires a smart grid 2.0 or a green field without much existing infrastructure to cross over or under.

To sell datacenter waste heat through a pipe under the street to the building next door, you must add some heat.

Nonprofit org opportunity: Screenplay wherein {{Superhero}} explains this and other efficiency and sustainability opportunities to the market at large, perhaps with comically excessive product placement for which charity or charities and physics with 3D CG

"Solar thermal trapping at 1,000°C and above" (2024) should be enough added heat to move waste datacenter heat to a compatible adjacent facility.

Sand batteries hold more heat than water, in certain thermal conditions.

Algae farms can eat CO2 and Heat, for example.

Cooling towers waste heat and water as steam.

Nonprofit org opportunity: Directory with for-charity sponsored lists of renewable energy products and services, with regions of operation; though there's a way to list schema:LocalBusiness and their schema:Offer (s) for products and services with rdfs:Class and rdfs:Property for search engines to someday also index

Business opportunity: managed renewable energy service to quote solar/wind/hydro/geo/fauna/insulation site plans [for nonprofit organizations], with support contracts and safety inspection and also crew management


The Landauer limit is presumed to be a real limit on electronic computation: irreversibly erasing a bit (e.g. resetting a 1 to a 0) must dissipate at least k_B * T * ln(2) of energy as heat.

Photonic computing, graphene computing, counterfactual computing, superconductors, and probably edge chiral currents do or will eventually do more computation per unit of energy.

Ops-per-watt-hour metrics quantify the difference.
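For scale, the Landauer bound is easy to compute from standard constants; the only assumption below is room temperature at 300 K:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # assumed room temperature, K

# Minimum energy to irreversibly erase one bit
e_bit = k_B * T * math.log(2)                 # ~2.87e-21 J

# Upper bound on bit erasures per watt-hour at this temperature
joules_per_wh = 3600.0
max_erasures_per_wh = joules_per_wh / e_bit   # ~1.25e24

print(f"{e_bit:.3e} J per bit erased")
print(f"{max_erasures_per_wh:.3e} erasures per Wh at most")
```

Current silicon sits many orders of magnitude above this floor, which is why the alternative substrates listed above still have so much theoretical room to improve ops per watt-hour.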


Yes, we have an early PoC demonstrating significant gains. There are certainly levels of investment where consumption is managed well, but that tends to be up to a point and at the DCIM level, while we are working more at the compute level.


I wonder how this compares to the Zarr format, which seems to have similar goals and design and is already well integrated into several libraries (particularly xarray)?


Hi, Sid here from Terrafloww. Essentially, this blog and library's approach is based on the fact that COG has emerged as the standard format used by NASA, ESA, and others over the past 5 years. There is far more data in COG, and it is still continually being produced as COGs, so we wanted to make sure we use the most prevalent format as efficiently and quickly as possible. We didn't try to create a new format; I believe that's best left to the large and open community to decide.

Thanks for the question! Happy to answer more.


Chopin's posthumously published Nocturne in C# minor is a very popular piece to play and listen to. I also think there are several Debussy pieces he didn't want published that are very popular now (maybe Reverie is one, if my memory serves).


Ok that's fair, I play it myself. Though it's also pretty weird in its middle section.


Are you planning on open sourcing your code and/or model weights? Aurora code and weights were recently open sourced.


Not immediately, but we will consider open sourcing some of our future work. At least, we definitely plan to be very open with our metrics and how well (or bad) our models are doing.


Couple of questions: 1. You link to fair.io, but it seems there isn't a latest license there, only a legacy version. Do you know the current status of what seems like a pending version update?

2. Your project repo actually links to the Elastic License. Is there some relation between the two? Do you prefer the Fair Source license over the Elastic License?


1. You can read more about the Fair Source initiative here: https://twitter.com/chadwhitacre_/status/1790101820364267902 (also [0] and [1] for previous discussions). It's not official yet, but it's an attempt to define a new term for Open Source vs Source Available. I figured I'd start using the term early and see what happens.

2. Yes, the ELv2 will be considered a Fair Source license, because it allows contributions, redistribution, forking, etc. You can visit https://faircode.io for an idea of what Fair Source will ultimately entail (Fair Source and Fair Code will be merged eventually AFAICT).

[0]: https://github.com/getsentry/fsl.software/issues/4

[1]: https://github.com/getsentry/fsl.software/issues/2


What was your calculus on ELv2 vs SSPL? The latter feels like a better fit for the continuation of copyleft - or was it too close to the fears you mentioned around AGPL?


I looked at SSPL, but it was really between BUSL and ELv2 for me. I ultimately chose ELv2 because it allowed me to keep my community edition and enterprise edition in a monorepo, where the latter edition is protected by license keys, i.e. ELv2 has a clause to where you can't remove or circumvent the license key functionality.

Since Keygen was previously closed source, I didn't want to split the repo up into the ee/ directory structure that you commonly see in other open core and fair source projects. I deemed doing so as 'too risky' -- too much code churn for little benefit -- as those changes would have a direct effect on the SaaS.

Instead, I opted for ELv2 because it offered me the ability to say, "hey, I'm making this fair source, do what you want with it, just don't compete with me or remove the license gates for the paid features."

Here's some of my internal dialog i.r.t. the license choice: https://github.com/keygen-sh/keygen-api/pull/668


Fair Source seems to be pretty bleeding edge:

https://github.com/fairsource/fair.io/issues/14


Microsoft's officially reported carbon numbers don't include the impact of their software on devices sold by other OEMs.

This is a gray area of the GHG Protocol on how to account for software carbon emissions; some companies count it, some don't.

[https://query.prod.cms.rt.microsoft.com/cms/api/am/binary/RW... pp 11.]


Maybe of some interest is my project https://openavalancheproject.org which is an attempt to improve/automate some of the forecasting of this with DL.

My goal isn't really a money making one though.


I worked in that area on an SCA trail crew in 1996, in the Bechler ranger district. It's nothing we discussed then, but looking at the ref, it seems this loophole was only noted in 2008.

The area is very remote and, while in the park, not often visited. It's very beautiful, with meandering rivers through large grasslands as well as some canyons, with the Tetons in the distance.


> It's very beautiful with meandering rivers through large grasslands as well as some canyons with the Tetons in the distance.

Nice try, murderer.

