
Apple’s Secret AI Sauce Apparently Skips NVIDIA GPUs & Uses Google’s Chips For Training

This is not investment advice. The author has no position in any of the stocks mentioned. Wccftech.com has a disclosure and ethics policy.

In a fresh research paper detailing how it trained the artificial intelligence features announced this year for the iPhone and other products, Cupertino tech giant Apple appears to have relied on Google's chips instead of market leader NVIDIA's. NVIDIA's rise to the top of the market capitalization food chain is built on strong demand for its GPUs, which has pushed revenue and earnings higher by triple-digit percentages.

However, in its paper, Apple shares that its 2.73 billion parameter Apple Foundation Model (AFM) relies on v4 and v5p tensor processing unit (TPU) cloud clusters provided by Alphabet Inc.'s Google.


Research Paper Shows Apple's AI Approach Relies On TPUs Instead Of GPUs

Apple's research paper, released earlier today, covers the training infrastructure and other details for the AI models that will power the features announced at WWDC earlier this year. Apple announced both on-device and cloud AI processing, and at the heart of these features is the Apple Foundation Model, dubbed AFM.

For AFM-server, the model that will power the cloud AI features called Apple Cloud Compute, Apple shared that it trained a 6.3 trillion token AI model "from scratch" on "8192 TPUv4 chips." Google's TPUv4 chips are available in pods of 4,096 chips each.
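As a back-of-envelope illustration (not a figure from Apple's paper), the quoted numbers can be combined with the common approximation that training compute is roughly 6 × parameters × tokens. This sketch assumes the article's 2.73 billion parameter count (which the paper attributes to the on-device model) alongside the 6.3 trillion token corpus reported for the server model, so it is only indicative:

```python
# Rough training-compute sketch using the figures quoted in the article.
# Assumption: FLOPs ~= 6 * N * D, a standard estimate for dense transformers.
params = 2.73e9    # AFM parameter count cited in the paper
tokens = 6.3e12    # training tokens reported for AFM-server
chips = 8192       # TPUv4 chips used, per the paper
pod_size = 4096    # chips per TPUv4 pod, per Google

total_flops = 6 * params * tokens
pods = chips / pod_size

print(f"~{total_flops:.2e} FLOPs of training compute")
print(f"{pods:.0f} TPUv4 pods worth of chips")
```

On these assumptions the run works out to roughly 1e23 FLOPs spread across the equivalent of two full TPUv4 pods.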

Apple added that the AFM models (both on-device and cloud) are trained on TPUv4 and v5p Cloud TPU clusters. v5p, part of Google's Cloud AI 'Hypercomputer,' was announced in December last year.

Each v5p pod is made up of 8,960 chips and, according to Google, offers twice the floating point operations per second (FLOPS) and three times the memory of TPU v4, enabling models to be trained nearly three times faster.
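The generational gap can be sketched from the figures quoted above alone. Note that the pod-level ratio below is our own arithmetic combining Google's per-chip claim with the pod sizes, not a number Google or Apple published:

```python
# Hedged comparison of TPU pod generations using only the quoted figures:
# pod sizes (4,096 vs 8,960 chips) and Google's 2x per-chip FLOPS claim.
v4_pod_chips = 4096
v5p_pod_chips = 8960
flops_per_chip_ratio = 2.0   # v5p vs v4 per-chip FLOPS, per Google

pod_flops_ratio = (v5p_pod_chips / v4_pod_chips) * flops_per_chip_ratio
print(f"A full v5p pod offers roughly {pod_flops_ratio:.2f}x the FLOPS of a v4 pod")
```

In other words, a maxed-out v5p pod delivers on the order of 4.4x the raw FLOPS of a full v4 pod before accounting for interconnect or memory differences.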

Read more on wccftech.com