UPDATED 18:19 EDT / MAY 27 2024

Apple will reportedly introduce AI-generated emojis, LLM-powered Siri with iOS 18

Apple Inc. is reportedly preparing to embed new artificial intelligence models into several components of iOS, including the built-in emoji library and Siri.

Bloomberg reported the upcoming enhancements on Sunday, citing sources familiar with the matter. Apple is expected to unveil the features at its WWDC 2024 developer event next month. The upgrades are expected to make their debut alongside several other new AI capabilities that were leaked over the past few weeks.

Apple historically has expanded the emoji catalog in iOS with several dozen additions every year. At WWDC, the company is expected to introduce an AI tool that will enable consumers to create their own custom emojis using natural language instructions. It’s unclear which apps will support the feature.

Down the road, consumers might also gain the ability to animate the emojis they create with the AI tool. In February, a group of Apple researchers detailed a machine learning application called Keyframer that can animate static images based on user prompts. Under the hood, it uses large language models to turn user instructions into motion design.
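
Apple has not said how any shipping feature would work, but the general pattern in research of this kind is to pair a static graphic with a natural-language instruction and ask an LLM to emit animation code. The sketch below illustrates that pattern only; the model name, prompt wording and use of the OpenAI Python client are assumptions for illustration, not details from Apple's paper or the Bloomberg report.

```python
# Illustrative only: a generic "prompt an LLM for animation code" pattern,
# loosely in the spirit of LLM-driven motion-design research. The model name,
# prompt wording and choice of the OpenAI Python client are assumptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

svg_markup = '<svg viewBox="0 0 100 100"><circle id="sun" cx="50" cy="50" r="20"/></svg>'
instruction = "Make the sun slowly pulse."

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model choice
    messages=[
        {"role": "system",
         "content": "You write CSS @keyframes animations for the supplied SVG. "
                    "Return only CSS."},
        {"role": "user",
         "content": f"SVG:\n{svg_markup}\n\nInstruction: {instruction}"},
    ],
)

print(response.choices[0].message.content)  # e.g. an @keyframes rule targeting #sun
```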

Alongside the emoji library upgrade, Apple is expected to enhance the way iOS displays app icons. Users will reportedly gain the ability to change the color of icons and arrange them more freely on their home screens. Currently, the iOS home screen displays icons only in a fixed grid layout.

Siri is reportedly another major focus of the operating system refresh. Apple is expected to make the AI assistant more useful for Apple Watch users, as well as equip it with internally developed LLMs that will generate more natural-sounding responses. Amazon.com Inc. is reportedly preparing a similar update for its rival Alexa assistant that will add custom LLMs to improve the user experience.

Only limited information is available about the neural networks that Apple is using to power the AI features in iOS 18. According to Bloomberg, the most hardware-intensive AI tasks will be offloaded to models deployed in the cloud. Less demanding computations will be performed on the user’s device.

In April, Apple open-sourced OpenELM, a collection of small language models designed to run on devices with limited computing capacity. Alongside the models, it released a tool for turning language models into a form that can run on iPhones. The technologies developed as part of that project may lend themselves to powering some of the AI enhancements in iOS 18.
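
The report does not name the tooling Apple plans to ship. As a rough illustration of how a small, locally stored language model can already be run on Apple silicon, the sketch below uses the open-source mlx-lm package; the model repository name and prompt are placeholders, and this is not necessarily the stack iOS 18 will use.

```python
# Illustrative only: running a small, locally converted language model with the
# open-source mlx-lm package on Apple silicon. The repository name and prompt
# are placeholders, not details from Apple's project or the Bloomberg report.
from mlx_lm import load, generate

# Load a small instruction-tuned model that has already been converted to MLX
# format (conversion is typically a one-time step, e.g. via mlx_lm.convert).
model, tokenizer = load("mlx-community/OpenELM-270M-Instruct")  # placeholder repo name

prompt = "Suggest a short, friendly reply to: 'Are we still on for lunch?'"
print(generate(model, tokenizer, prompt=prompt, max_tokens=60))
```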

Some of the enhancements in the operating system update will reportedly be powered by AI models from OpenAI. According to Bloomberg, Apple will officially announce its long-reported partnership with the LLM developer at WWDC. A separate report published last month suggested that the companies could work together to develop a chatbot service and new search tools for iOS.

Photo: Pixabay
