Apple execs explain why its AI is different from competitors

Apple CEO Tim Cook (L); John Giannandrea (C), senior vice president of machine learning and AI strategy; and Craig Federighi (R), senior vice president of software engineering, speak during Apple’s annual Worldwide Developers Conference in Cupertino, California, on June 10, 2024.
Nic Coury | AFP | Getty Images

Apple fully embraced artificial intelligence on Monday, as company executives explained the features and reasoning behind Apple Intelligence, the company’s new AI software suite.

But Apple’s Worldwide Developers Conference launch event, along with a panel discussion Monday afternoon, was carefully crafted to distinguish the iPhone maker from current AI leaders such as Microsoft and Google.

Software chief Craig Federighi and AI chief John Giannandrea said during the panel that Apple has a different approach to the technology than its Silicon Valley rivals. Unlike companies that are building AI for a broad range of products, Apple is instead focused only on the devices it sells and the personal data that AI could use.

Apple revealed a more limited approach that eschews future-focused thinking about the potential of the technology in favor of small tasks that can be done now without burning up battery life.

“We think AI’s role is not to replace our users but to empower them,” Federighi said.

Apple’s AI may be the first artificial intelligence that many of its more than 2 billion users interact with. If its AI features are favored over cloud-based competition from Microsoft or Google, it could change how billions of dollars’ worth of AI infrastructure is built each year and shift the direction of products that use the technology.

Much of the AI development that has captured investor and technological interest has focused on building or securing powerful supercomputers equipped with Nvidia chips to develop even more power-hungry AI models. In this scenario, users access the AI software by communicating with equally powerful servers over the web.

Apple’s AI is mostly on your device

Apple Intelligence was unveiled during Apple’s Worldwide Developers Conference in Cupertino, California, on June 10, 2024.
Source: Apple Inc.

Apple’s vision for AI isn’t about one big model — it’s a slew of smaller models that don’t require the same amount of computing power and memory, running on Apple’s devices and chips themselves. If the AI on the phone can’t do it, then Apple, or an app using Apple’s tools, reaches out to the cloud to access a larger AI model. Apple partnered with OpenAI, for example, to give users access to ChatGPT if Siri can’t provide an answer. These features come into play only if users allow it.

Apple executives don’t refer to this strategy as using one or multiple models. Instead, they package it as just “Apple Intelligence.”

“We think that the right approach to this is to have a series of different models and different sizes for different use cases,” Giannandrea said.
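As a rough illustration of how that tiered setup could work in practice, the sketch below routes a request to the smallest capable model first, escalates to a larger cloud model when needed, and only involves ChatGPT when the user has opted in. Every function name and threshold here is a hypothetical stand-in for Apple’s unpublished logic, not a real Apple API.

```python
from typing import Optional

# Illustrative sketch only: each function is a hypothetical stand-in for the
# tiers Apple described (a small on-device model, a larger model on Apple's
# cloud, and ChatGPT as an opt-in fallback). None of these are real Apple APIs.

def on_device_model(prompt: str) -> Optional[str]:
    # Pretend the small on-device model only handles short, simple requests.
    return f"[on-device] handled: {prompt}" if len(prompt) < 200 else None

def apple_cloud_model(prompt: str) -> Optional[str]:
    # Pretend the larger Apple-hosted model declines open-ended world knowledge.
    return None if "world knowledge" in prompt else f"[apple cloud] handled: {prompt}"

def chatgpt(prompt: str) -> str:
    # Stand-in for the opt-in ChatGPT integration.
    return f"[ChatGPT] handled: {prompt}"

def route_request(prompt: str, chatgpt_allowed: bool) -> str:
    """Try the smallest model first; escalate only when needed and permitted."""
    answer = on_device_model(prompt)
    if answer is None:
        answer = apple_cloud_model(prompt)
    if answer is None and chatgpt_allowed:
        # The request leaves the Apple stack only with explicit user consent.
        answer = chatgpt(prompt)
    return answer or "This request can't be handled without ChatGPT access."

print(route_request("Summarize my 9 a.m. meeting notes", chatgpt_allowed=False))
```

The point of the structure is that a request escalates off the device, and out of Apple’s own stack, only when a smaller model genuinely can’t handle it and the user has allowed it.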

Giannandrea said the company worked to create a 3-billion-parameter model as part of Apple Intelligence. OpenAI’s GPT-3 model from 2020, in comparison, is much larger, at 175 billion parameters. The more parameters a model has, the more memory and computing power it needs to run.
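For a rough sense of what that gap means in memory, here is a back-of-the-envelope estimate that simply multiplies parameter count by bytes per weight. It assumes 2 bytes per parameter (16-bit weights); shipped models are often quantized more aggressively, so these are order-of-magnitude figures, not Apple’s or OpenAI’s actual numbers.

```python
# Back-of-the-envelope memory estimate: parameter count x bytes per parameter.
# Assumes 2 bytes per weight (16-bit precision); production models are often
# quantized to fewer bits, so treat these as order-of-magnitude figures only.

BYTES_PER_PARAM = 2

for name, params in [("~3B on-device model", 3e9), ("GPT-3 (175B)", 175e9)]:
    gigabytes = params * BYTES_PER_PARAM / 1e9
    print(f"{name}: roughly {gigabytes:,.0f} GB of weights")

# Output:
#   ~3B on-device model: roughly 6 GB of weights
#   GPT-3 (175B): roughly 350 GB of weights
```

Even with heavy quantization, a model in the 175-billion-parameter class is far beyond what a phone’s memory can hold, which is the practical reason for the on-device-first, cloud-fallback split described above.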

Apple’s approach is faster than the cloud-based options and has privacy benefits. However, there can be issues when the models are too small to get anything done. Apple is betting that through a user’s iPhone, its AI can tap into personal data about appointments, location, and what the user is doing. One example provided by Federighi is that his phone knows who his daughter is.

Apple also says it’s making sure its small models work only on tasks they can excel at, rather than giving users an open-ended chatbot interface.

“There’s a critical extra step, which is we’re not taking this teenager and telling him to go fly an airplane,” Federighi said.

Many of the AI features Apple announced on Monday are similar to products competitors have already announced this year. Apple’s AI can summarize and rewrite documents, generate small images, and translate conversations in real time. One notable feature will let users generate new emojis using AI without connecting to the internet. The new features will be released this fall in a beta version.

Apple’s approach to privacy

Private Cloud Compute unveiled during Apple’s Worldwide Developers Conference in Cupertino, California, on June 10, 2024.
Source: Apple Inc.

Privacy will be a challenge for Apple as it embraces AI. The company has used privacy as one of its primary marketing tools for years, highlighting that its business model doesn’t require ad targeting and that, unlike data brokers and spammers, it has its users’ best interests in mind.

Other AI companies collect user data and store it to make their software better, a practice that doesn’t fit with Apple’s current privacy policies. Much of Apple’s presentation on Monday was about the steps it has taken to avoid the impression that it is hoovering up user data to improve its AI.

“We’re not going to take that data and go send it to some cloud somewhere,” Giannandrea said. “Because we want everything to be very private, whether it’s running locally or on a cloud computing service, and that’s the way we want it so we can use your most personal data.”

Apple didn’t detail what data was used to train its AI models, beyond saying that it uses files scraped from the public web in addition to licensed data such as news archives and stock photography.

For example, Apple said it developed its own servers running on its Apple chips, a system called Private Cloud Compute, to prevent user data sent to an AI server from being stored or reused. It will allow third parties to inspect the software, a notable move for a secrecy-focused company that usually doesn’t provide information about its infrastructure.

“Even if a company maybe makes a promise and says, ‘Well, hey, look, we’re not going to do anything with the data.’ You have no way to verify that,” Federighi said, explaining why Apple will allow inspection of its AI server software.

More AI to come

ChatGPT integration with Apple iOS 18, announced for later this year during Apple’s WWDC 2024 in Cupertino, California, on June 10, 2024.
Source: Apple Inc.

At times, Apple officials seemed to downplay how big a shift this is in the company’s AI strategy, saying that it’s a continuation of the machine learning work the company has already done to edit photos or transcribe text, or to put AI-specific blocks on the company’s chips.

“It’s only recently that others are starting to suddenly claim like there’s some new category there,” Federighi said. “But those are things we’ve been shipping for a long time.”

Still, Apple didn’t bet it all on a single approach. It will offer ChatGPT built into its operating systems, letting users prompt OpenAI’s model for free and giving them access to a larger, more powerful AI model. ChatGPT will be clearly marked in Apple’s software, telling users that their data will be sent to OpenAI’s servers (which run on Microsoft’s cloud), and answers will be labeled as coming from ChatGPT, just in case they go off the rails.

Apple said it could offer different models in the future, signaling that Apple Intelligence is not the only AI system it expects its customers to use. Federighi said that one day some of its customers might want a medical AI system or legal AI model built into Apple products, for example. Or maybe one of Google’s models.

“We’re going to look forward to doing integrations with models like Google Gemini, for instance, in the future. I mean, nothing to announce right now,” Federighi said. “But that’s our direction.”
