Welcome to WarBulletin, your new best friend in the world of gaming. We're all about bringing you the hottest updates and juicy insights from across the gaming universe. Are you into epic RPG adventures or fast-paced eSports? We've got you covered with the latest scoop on everything from next-level PC gaming rigs to the coolest game releases. But hey, we're more than just news! Ever wondered what goes on behind the scenes of your favorite games? We're talking exclusive interviews with the brains behind the games, hot-off-the-press photos and videos straight from gaming conventions, and, of course, breaking news that you just can't miss. We know you love gaming 24/7, and that's why we're here round the clock, updating you on all things gaming. Whether it's the lowdown on a new patch or the buzz about the next big gaming celeb, we're on it.

Contacts

  • Owner: SNOWLAND s.r.o.
  • Registration certificate 06691200
  • Na okraji 381/41, Veleslavín, 162 00 Praha 6
  • Czech Republic

iPhone 16 Lineup To Feature 8GB RAM, But Analyst Believes Apple’s Development Of On-Device LLMs Will Be Limited Due To Insufficient Memory

Apple has various plans for generative AI, but accessing them will require at least an iPhone 15 Pro or an iPhone 15 Pro Max, as both flagships feature 8GB of RAM. The iPhone 16 series is said to retain the same amount of memory, but according to one analyst, that is still insufficient, leaving Apple facing severe limitations in the development of its on-device large language models, or LLMs.

A mix of on-device and cloud-based AI functionality is necessary due to RAM limitations on the iPhone 16 models

TF International Securities analyst Ming-Chi Kuo has shared predictions in his latest Medium post, discussing what Apple intends to unveil at WWDC 2024. While Kuo believes Apple is working on both cloud-based and on-device LLMs, he does not expect the Cupertino firm to make an announcement at the event. He also notes that cloud-based LLMs are difficult to train, so Apple requires considerable development time, while progress on on-device AI is minimal because the iPhone 16 is limited to 8GB of RAM.
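For a sense of what that kind of on-device/cloud split could look like in practice, here is a purely illustrative Python sketch that keeps short, lightweight requests on the phone and hands longer or heavier ones to a server-side model. The class, token limit, and routing rule are our own assumptions for illustration and do not reflect Apple's actual implementation.

    # Illustrative only: a toy policy for splitting AI requests between a
    # small on-device model and a larger cloud-based one. Thresholds and
    # names are hypothetical, not Apple's design.
    from dataclasses import dataclass

    @dataclass
    class AIRequest:
        prompt: str
        needs_long_context: bool = False  # e.g. summarizing a large document

    ON_DEVICE_TOKEN_LIMIT = 512  # assumed budget a small local model can handle

    def route(request: AIRequest) -> str:
        """Decide where a request runs under this toy policy."""
        approx_tokens = len(request.prompt.split())
        if request.needs_long_context or approx_tokens > ON_DEVICE_TOKEN_LIMIT:
            return "cloud"       # larger server-side LLM
        return "on-device"       # small local model: lower latency, works offline

    print(route(AIRequest("Rewrite this text message to sound more formal")))  # on-device
    print(route(AIRequest("word " * 2000, needs_long_context=True)))           # cloud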


It is no secret that phone makers cannot deliver complete on-device LLM solutions to the masses because the memory requirements are through the roof. One estimate suggests that Android smartphones with 20GB of RAM will become a common sight, since such devices would have enough memory to run on-device LLMs. Apple is also said to be researching how to store large language models in flash memory, which would make it much easier to bring the technology to a wider range of devices.
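To put those RAM figures into perspective, here is a rough back-of-the-envelope Python calculation of how much memory LLM weights alone occupy at a few common model sizes and quantization levels. The parameter counts, precisions, and overhead factor are illustrative assumptions, not numbers from Kuo or Apple.

    # Back-of-the-envelope estimate of on-device LLM memory needs.
    # All figures below are illustrative assumptions.

    def model_ram_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.2) -> float:
        """Approximate resident memory for the weights, in GB.

        'overhead' loosely accounts for the KV cache, activations, and runtime
        buffers; real figures vary widely with context length and framework.
        """
        bytes_per_weight = bits_per_weight / 8
        return params_billions * 1e9 * bytes_per_weight * overhead / 1e9

    for params in (3, 7, 13):
        for bits in (16, 8, 4):
            print(f"{params}B model @ {bits}-bit: ~{model_ram_gb(params, bits):.1f} GB")

    # Even a 7B model at 4-bit quantization wants roughly 4 GB on top of the
    # OS and foreground apps, which is why an 8GB phone is considered tight
    # and 20GB devices are floated as comfortable for on-device LLMs.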

Unfortunately, based on Kuo’s latest prediction, a breakthrough on Apple’s side is still some way off, and more progress is required. However, the company is reported to be running more AI features on-device than in the cloud, resulting in faster operation. This may explain why the iPhone 15 Pro and iPhone 15 Pro Max…

Read more on wccftech.com