bayindirh 10 days ago

Actually, there are two main problems with AI:

    1. How it's going to be used, and how it will be a detriment to quality and knowledge.
    2. How AI models are trained, with great disregard for consent, ethics, and licenses.

The technology itself, the idea, what it can do, is not the problem. How it's made and how it's going to be used will be a great problem going forward, and none of the suppliers say that it should be used in moderation and will be harmful in the long run. Plus, the same producers are ready to crush/distort anything to get their way.

... smells very similar to the tobacco/soda industries. Both created faux research institutes to further their causes.

EFreethought 10 days ago

I would say the huge environmental cost is a third problem.

Aeolun 10 days ago

Data centers account for like 2% of global energy demand now. I’m not sure if we can really say that AI, which represents a fraction of that, constitutes a huge environmental problem.

bayindirh 10 days ago

An NVIDIA H200 draws around 2.3x more power (700 W) than a Xeon 6748P (300 W). You generally put 8 of these cards into a single server, which adds up to 5.6 kW just for the GPUs. With losses and other support equipment, that server uses ~6.1 kW at full load, which is around 8.5x more than a CPU-only server (assuming 700 W or so at full load).

Considering HPC is half CPU and half GPU (more like 66% CPU and 33% GPU, but I'm being charitable here), I expect an average power draw of 3.6 kW per server in a cluster. Moreover, most of these clusters run targeted jobs; prototyping/trial runs use much more limited resources.
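A back-of-the-envelope check of those figures (the TDP numbers and the ~6.1 kW / ~700 W full-server estimates are taken from the comment above; real draw depends on workload, and the naive half-and-half average lands near 3.4 kW, in the same ballpark as the ~3.6 kW estimate):

```python
# Sanity-check the power comparison using the quoted figures.
GPU_TDP_W = 700          # NVIDIA H200 (quoted above)
CPU_TDP_W = 300          # Xeon 6748P (quoted above)
GPUS_PER_SERVER = 8

gpus_only_w = GPU_TDP_W * GPUS_PER_SERVER   # 5600 W for the GPUs alone
gpu_server_w = 6100                         # whole server incl. losses/support
cpu_server_w = 700                          # assumed CPU-only server, full load

print(f"per chip:    {GPU_TDP_W / CPU_TDP_W:.1f}x")       # ~2.3x
print(f"per server:  {gpu_server_w / cpu_server_w:.1f}x")  # ~8.7x
# Naive half-CPU / half-GPU cluster average:
print(f"cluster avg: {(gpu_server_w + cpu_server_w) / 2 / 1000:.1f} kW")
```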

On the other hand, AI farms run all these GPUs at full power almost 24/7, both for training new models and for inference. Before you ask: if you have a GPU farm used for training, buying inference-focused cards doesn't make sense, because you can partition NVIDIA cards with MIG. You can set aside some training cards, divide each into 6-7 instances, and run inference on them, yielding ~45 virtual cards for inference per server, again at ~6.1 kW load.
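A sketch of that MIG arithmetic (how many cards stay whole for training is my assumption, not stated above; NVIDIA's MIG allows up to 7 instances per card, which puts the result near the ~45 quoted):

```python
# MIG partitioning sketch: slice most of a server's GPUs for inference.
TOTAL_GPUS = 8
TRAINING_GPUS = 1      # kept whole for training (illustrative assumption)
SLICES_PER_GPU = 7     # MIG supports up to 7 instances per card

inference_slices = (TOTAL_GPUS - TRAINING_GPUS) * SLICES_PER_GPU
print(inference_slices)  # 49, in the ballpark of the ~45 quoted
```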

So, yes, AI's power load profile is different.

defrost 10 days ago

Data centres in general are an issue that contributes to climbing emissions; two percent globally is not trivial, and it's "additional" demand on top of what existed a decade or more ago, another sign that we are globally increasing demand.

Emissions aside, many data centres (and the associated bitcoin mining and AI clusters) are a significant local issue due to their demand on local water and energy supplies.

bayindirh 10 days ago

Yeah, that's true.

clown_strike 10 days ago

> How AI models are trained with a great disregard to consent, ethics, and licenses.

You must be joking. Consumer models' primary source of training data seems to be the legal preambles from BDSM manuals.