Business Standard


Apple denies using YouTube videos for training Apple Intelligence: Report

Many big technology companies, including Apple, Nvidia and Amazon-backed Anthropic, have reportedly been using subtitles of YouTube videos to train their AI models

Apple Intelligence

Harsh Shivam | New Delhi


Apple has clarified that the artificial intelligence features it collectively calls Apple Intelligence are not powered by the company’s OpenELM AI model. According to a report by 9To5Mac, the Cupertino-based technology giant told the outlet in a statement that “OpenELM doesn’t power any of its AI or machine learning features – including Apple Intelligence.”

This comes after a Wired report stated that several technology giants, including Apple, Nvidia and Amazon-backed Anthropic, used material from thousands of YouTube videos, including video subtitles, to train their AI models. The report says Apple used the plain text of video subtitles, along with their translations into different languages, to train its OpenELM model.
 

Google prohibits the use of videos posted on YouTube for applications that are “independent” of the video platform.

In its statement to 9To5Mac, Apple said that it created OpenELM to contribute to the research community and to advance open-source large language model (LLM) development. According to the company, OpenELM was created only for research purposes, not to power AI features on its products and devices.

Apple Intelligence training

Previously, in a research paper published on June 10, Apple said that it does not use its “users' private personal data or user interactions” for training its AI models. However, the tech giant did say that it uses “publicly available data” from the web, collected by its web crawler Applebot. The company said that web publishers have to opt out if they do not want Apple to use their web content for Apple Intelligence training.

Apple OpenELM: What is it?

In April, Apple released its OpenELM AI models on the Hugging Face model library. OpenELM, which stands for “Open-source Efficient Language Models”, is a series of four small language models that are capable of running on devices such as phones and PCs.

The four models within OpenELM come in sizes of 270 million, 450 million, 1.1 billion and 3 billion parameters. Parameters refer to the number of variables an AI model learns from its training data and draws on for decision-making.

For comparison, Microsoft’s smallest Phi-3 model has 3.8 billion parameters, while Google’s open model Gemma, which was launched earlier this year, comes in a version with 2 billion parameters.


First Published: Jul 18 2024 | 11:58 AM IST
