Welcome to WarBulletin - your new best friend in the world of gaming. We're all about bringing you the hottest updates and juicy insights from across the gaming universe. Are you into epic RPG adventures or fast-paced eSports? We've got you covered with the latest scoop on everything from next-level PC gaming rigs to the coolest game releases. But hey, we're more than just news! Ever wondered what goes on behind the scenes of your favorite games? We're talking exclusive interviews with the brains behind the games, fresh off-the-press photos and videos straight from gaming conventions, and, of course, breaking news that you just can't miss. We know you love gaming 24/7, and that's why we're here round the clock, updating you on all things gaming. Whether it's the lowdown on a new patch or the buzz about the next big gaming celeb, we're on it.

Contacts

  • Owner: SNOWLAND s.r.o.
  • Registration certificate 06691200
  • Na okraji 381/41, Veleslavín, 162 00 Praha 6
  • Czech Republic

Nvidia's CEO reckons that millions of AI GPUs will reduce power consumption, not increase it

The benefits of AI can be debated. But one thing we're all sure about is that massive server farms packed full of hundreds or thousands of high-end AI GPUs, each consuming hundreds of watts, soak up a lot of power. Right?

Not according to Jensen Huang, the CEO of Nvidia, whose catchphrase has become «the more you buy, the more you save», a reference to his company's stratospherically expensive AI chips. Perhaps inevitably, Huang has a similar take on the power consumption associated with the latest AI models, which mostly run on Nvidia hardware.

Speaking in a Q&A following his Computex keynote, Huang argued first that Nvidia's GPUs do computations much faster and more efficiently than any alternative. As he puts it, you want to «accelerate everything». That saves you money, but it also saves you time and power.

Next, he distinguished between training AI models and running inference on them, arguing that the latter can offer dramatically more efficient ways of getting certain computational tasks done.

«Generative AI is not about training,» he says, «it's about inference. The goal is not the training, the goal is to inference. When you inference, the amount of energy used versus the alternative way of doing computing is much, much lower. For example, I showed you the climate simulation in Taiwan: 3,000 times less power. Not 30% less, 3,000 times less. This happens in one application after another application.»

Huang also pointed out that AI training can be done anywhere; it doesn't need to be geographically close to users. To put it his way, «AI doesn't care where it goes to school.»

«The world doesn't have enough power near population. But the world has a lot of excess energy. Just the amount of energy coming from the sun is incredible. But it's in the wrong place. We should set up power plants and data centres for training where we don't have population, train the model somewhere else. Move the model for inference closer to the people—in their pocket, on phones, PCs…»

Read more on pcgamer.com