epolanski 6 days ago

So, in essence, all AMD has to do to launch a successful GPU in the inference space is load it up with RAM?

TrueDuality 6 days ago

AMD's limitation is more of a software problem than a hardware problem at this point.

AuryGlenz 6 days ago

But it’s still surprising they haven’t. People would be motivated as hell if they launched GPUs with twice the amount of VRAM. It’s not as simple as just soldering some more in, but still.

wruza 6 days ago

AMD “just” has to write something like CUDA overnight. Imagine you’re in 1995 and have to ship Kubuntu 24.04 LTS this summer running on your S3 Virge.

mirekrusin 5 days ago

They don't need to do anything software-wise; inference is a solved problem on AMD.

thomastjeffery 5 days ago

They sort of have. I'm using a 7900 XTX, which has 24GB of VRAM. The nearest competitor would be a 4090, which today costs more than double; granted, it would also be much faster.

Technically there is also the 3090, which is more comparable price-wise. I don't know about its performance, though.

VRAM is supply-limited enough that going bigger isn't as easy as it sounds. AMD can probably sell as much of it as they can get their hands on, so they may as well sell more GPUs, too.
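For context on why 24GB is the pinch point, here's a rough back-of-envelope sketch of how much VRAM the model weights alone require (my own illustrative numbers, ignoring KV cache and activation overhead):

```python
# Back-of-envelope VRAM sizing for LLM inference: weights only.
# Assumption: decimal GB (1e9 bytes), no KV-cache or runtime overhead.
def weights_vram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate VRAM needed just to hold the model weights."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

# A 70B model at 16-bit needs ~140 GB; at 4-bit, ~35 GB -- which is why
# a 24GB card tops out around 30B-class models unless you quantize hard.
for params, bits in [(70, 16), (70, 4), (30, 4), (7, 16)]:
    print(f"{params}B @ {bits}-bit: ~{weights_vram_gb(params, bits):.0f} GB")
```

Real-world usage adds the KV cache on top, which grows with context length, so these are floor estimates.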

regularfry 5 days ago

Funnily enough, you can buy GPUs where someone has done exactly that: soldered extra VRAM onto a stock model.

yencabulator 5 days ago

Or let go of the traditional definition of a GPU, and go integrated. AMD Ryzen AI Max+ 395 with 128GB RAM is a promising start.