Windows Central
Kevin Okemwa

Apple's new LLM, which reportedly outperforms GPT-4, will run on-device like Copilot on future AI PCs, but OpenAI's Sam Altman admitted it's virtually 'impossible' to train ChatGPT-like tools without copyrighted content


What you need to know

  • Apple is reportedly developing a new LLM that is expected to run on-device.
  • This means it should sport great performance and maintain user privacy, unlike rivals such as Copilot and ChatGPT, which rely heavily on an internet connection and the cloud.
  • The iPhone maker will likely license Google's Gemini AI software to handle complex tasks like text generation that require an internet connection, ultimately sidestepping copyright infringement issues.

Apple's annual developer conference, WWDC 2024, is just a few months away. And while we expect the iPhone maker to announce its new lineup of phones alongside the usual announcements, the company is rumored to make its generative AI debut with an on-device LLM.

Microsoft's multi-billion-dollar investment in AI and extended partnership with OpenAI have placed the company in an enviable position as the world's most valuable company, with over $3 trillion in market capitalization. Market analysts affirm Microsoft will likely hold this position ahead of Apple for the next five years and beyond, with some indicating it's on the brink of reaching its 'iPhone moment' with AI.

As it happens, we may have an early look at what Apple's AI-centered announcements might entail, specifically regarding the company's soon-to-launch LLM, which is purported to outperform OpenAI's GPT-4. According to Bloomberg's Mark Gurman (usually a reliable source for all matters Apple), Apple is developing its own LLM that will reportedly run on-device to preserve users' privacy while simultaneously promising great performance.

An on-device LLM relies on the iPhone's processor rather than the cloud, which should translate to better performance. In the past, we've witnessed the frustration avid users of OpenAI's ChatGPT or Microsoft's Copilot AI have gone through during peak times. For instance, when Microsoft shipped DALL-E technology to Image Creator from Designer (formerly Bing Image Creator), image generation speeds were painfully slow. Some users indicated they had to wait up to an hour to generate a single image. Even Microsoft Copilot Pro subscribers encounter similar issues, despite the promised expedited access to services during peak times.

While Apple's LLM should deliver great performance, it may not match the knowledge scope of its rivals. However, Gurman indicated that reaching out to Google and other AI firms would help fill those knowledge gaps. This aligns with an earlier report suggesting Google and Apple are in the middle of a megadeal that could potentially lead to Gemini being used as the default AI assistant on the iPhone.

Still, Apple's LLM addresses two of the biggest concerns users have expressed about generative AI: privacy and response times. It'll also be interesting to see how the model competes against ChatGPT, which currently holds a huge share of the mobile market, despite Copilot shipping with free access to DALL-E image generation technology and GPT-4.

Apple plays it safe with copyrighted material

The world is waiting to see what Apple does when it joins the growing AI race. (Image credit: Windows Central)

An AI model that runs on-device means users can leverage its capabilities without an internet connection. However, those capabilities will likely be limited to basic tasks such as text analysis. Apple may license Google's Gemini AI software to handle complex tasks like text generation, which require an internet connection.
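To make that split clearer, here's a minimal, hypothetical Swift sketch of the routing such a hybrid setup implies: a compact on-device model handles quick, private tasks, while heavier generation falls back to a licensed cloud model only when a connection is available. None of the type or function names below are real Apple or Google APIs; they're stand-ins for illustration only.

    import Foundation

    // Hypothetical sketch of the hybrid setup described above: a small
    // on-device model covers basic text analysis, while complex generation
    // is routed to a cloud model when the network is reachable.
    enum AITask {
        case summarize(String)      // basic, can run on-device
        case generateText(String)   // complex, needs the cloud model
    }

    protocol TextModel {
        func respond(to prompt: String) -> String
    }

    // Stand-in for a compact on-device LLM: fast, private, limited scope.
    struct OnDeviceModel: TextModel {
        func respond(to prompt: String) -> String {
            "on-device summary of: \(prompt.prefix(40))"
        }
    }

    // Stand-in for a licensed cloud model reached over the network.
    struct CloudModel: TextModel {
        func respond(to prompt: String) -> String {
            "cloud-generated text for: \(prompt.prefix(40))"
        }
    }

    struct AssistantRouter {
        let local: TextModel = OnDeviceModel()
        let remote: TextModel = CloudModel()
        let hasNetwork: () -> Bool

        func handle(_ task: AITask) -> String {
            switch task {
            case .summarize(let text):
                // Privacy- and latency-sensitive work stays on the device.
                return local.respond(to: text)
            case .generateText(let prompt):
                // Heavier generation goes to the cloud only when reachable.
                return hasNetwork() ? remote.respond(to: prompt)
                                    : "Offline: connect to the internet for this task."
            }
        }
    }

    let router = AssistantRouter(hasNetwork: { true })
    print(router.handle(.summarize("Quarterly notes to condense")))
    print(router.handle(.generateText("Draft a short product announcement")))

The point of the pattern is simply that the private, latency-sensitive path never leaves the device, while the broader-knowledge path is gated on connectivity.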

With the PC market shifting toward AI PCs, Intel, which is working closely with Microsoft, indicated that Copilot AI is expected to start running locally on Windows soon, potentially addressing some of the issues plaguing the service.

According to AppleInsider's source with close knowledge of Apple's AI test environments, there are "no restrictions to prevent someone from using copyrighted material in the input for on-device test environments."

It's unclear what measures Apple has in place to prevent its LLM from running into copyright infringement issues; Microsoft and OpenAI are battling lawsuits in court over that very problem. While reports indicate Apple's LLM will outperform OpenAI's GPT-4 model, Sam Altman has already said the latter "kind of sucks" amid rumors OpenAI is getting ready to unveil a new model that's "really good, like materially better." Additionally, Altman admitted it's 'impossible' to create ChatGPT-like tools without using copyrighted material.
