UPDATED 17:17 EST / JUNE 09 2025


Apple debuts elegant glass-like interface design and powerful new OS capabilities

Apple Inc. previewed a sleek new software design and powerful updates today during its Worldwide Developers Conference, including new features coming to its next-generation operating systems, which will share a unified version number, 26, across devices.

The design is built around a new material called Liquid Glass, which creates a translucent, water-like effect that sits atop the display, refracting content below it and allowing colors to flow through. The company says this will bring greater focus to content, deliver a new level of quality to controls and keep users more attuned to what’s happening on screen.

“This is our broadest software design update ever — meticulously crafted by rethinking the fundamental elements that make up our software,” said Alan Dye, Apple’s vice president of human interface design. “It combines the optical qualities of glass with a fluidity only Apple can achieve, as it transforms depending on your content or context.”

The new design extends across Apple’s entire device ecosystem, including iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, tvOS 26 and visionOS 26. The company said the idea was to “harmonize” the user experience across devices so users can expect every device to look and feel consistent.

Liquid Glass will affect buttons, switches, sliders, text and media in the user interface, shifting dynamically according to user needs. Controls, toolbars and navigation within apps have been redesigned with rounded corners and “float above” content so that they stay out of the way without interrupting it. They also shift into thoughtful groupings, making it easier for users to find the controls they need.

Tab bars and sidebars have been redesigned with the same approach. In iOS 26, when users scroll, tab bars shrink to bring focus to content while keeping navigation accessible; when users scroll back up, they expand again.
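Apple didn’t detail the developer-facing Liquid Glass APIs in this preview, but the floating, translucent control treatment it describes can be approximated today with SwiftUI’s existing material styles. The sketch below is illustrative only, assuming a generic SwiftUI app; it does not use any new Liquid Glass modifier.

```swift
import SwiftUI

// Illustrative sketch: a rounded, translucent control bar that "floats above"
// scrolling content. SwiftUI's existing ultraThinMaterial stands in for the
// new Liquid Glass material, which Apple has not detailed here.
struct FloatingControlsView: View {
    var body: some View {
        ZStack(alignment: .bottom) {
            // Content keeps scrolling beneath the controls.
            ScrollView {
                LazyVStack(spacing: 12) {
                    ForEach(0..<40, id: \.self) { index in
                        RoundedRectangle(cornerRadius: 16)
                            .fill(Color.blue.opacity(0.25))
                            .frame(height: 80)
                            .overlay(Text("Item \(index)"))
                    }
                }
                .padding()
            }

            // Rounded control group; colors from the content show through it.
            HStack(spacing: 28) {
                Button {} label: { Image(systemName: "house") }
                Button {} label: { Image(systemName: "magnifyingglass") }
                Button {} label: { Image(systemName: "person") }
            }
            .padding(.vertical, 12)
            .padding(.horizontal, 28)
            .background(.ultraThinMaterial, in: Capsule())
            .padding(.bottom, 16)
        }
    }
}
```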

Major updates with iOS 26

The new iOS 26 for iPhones benefits from Liquid Glass, especially on the Lock Screen, which now has a specialized clock that can expand to fill space in the background image. If a user’s wallpaper shows a subject with a lot of sky above their head, the clock will stretch to take up more of the screen, and it will even tuck itself behind the subject rather than sitting in front of them.

As notifications appear on the screen, the clock dynamically recedes as well, reducing the amount of space it takes up.

During last year’s WWDC, Apple introduced Apple Intelligence, a generative artificial intelligence-powered capability that arrived on Apple devices in late 2024. It’s deeply integrated into the operating system and runs on Apple silicon, allowing users to rewrite text, create images and take action within their apps.

Today, Apple previewed Live Translate, a capability coming to iPhones that will help users understand one another in Messages, FaceTime and Phone. It is enabled by Apple-built models that run entirely on-device, the company said, so no personal conversations will be sent to the cloud.

When translation happens in Messages, the original message and the translated message appear one above the other. In FaceTime, live captions appear on screen as the other person speaks. In phone conversations, the translation is spoken aloud for the recipient, and as they respond, the user hears a translated version of their voice. It works even for people who don’t have an iPhone.
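Apple didn’t show code for Live Translate, which is a system-level feature of Messages, FaceTime and Phone rather than a third-party API. As a rough illustration of the underlying idea of on-device translation, the sketch below uses Apple’s existing Translation framework, which predates this release; the hard-coded strings and language choices are placeholders.

```swift
import SwiftUI
import Translation

// Rough sketch of on-device translation with the existing Translation framework.
// Live Translate itself is a system feature; this only illustrates the concept.
struct TranslateDemoView: View {
    @State private var original = "Where should we meet for dinner?"
    @State private var translated = ""
    @State private var configuration: TranslationSession.Configuration?

    var body: some View {
        VStack(spacing: 12) {
            // Original and translated text stacked, mirroring the Messages layout.
            Text(original)
            Text(translated).foregroundStyle(.secondary)
            Button("Translate to Spanish") {
                configuration = TranslationSession.Configuration(
                    source: Locale.Language(identifier: "en"),
                    target: Locale.Language(identifier: "es")
                )
            }
        }
        .translationTask(configuration) { session in
            do {
                // Runs on-device once the language models are downloaded.
                let response = try await session.translate(original)
                translated = response.targetText
            } catch {
                translated = "Translation unavailable"
            }
        }
        .padding()
    }
}
```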

Building on Apple Intelligence, visual intelligence, a capability that can answer questions and surface search results from camera images, can now work with screenshots and iPhone apps. Users can ask ChatGPT questions about what they’re looking at onscreen to learn more, or search Google, Etsy or other apps to find similar images and products.

For example, if users see a pair of shoes they want, they can quickly press the Action button and search for them on Etsy or Amazon.com. Or if they see an event poster on X for an upcoming show they want to attend, they can trigger visual intelligence to add a calendar event by capturing the time, date and location.
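The calendar capture is an automatic system feature, and Apple hasn’t published details on how visual intelligence performs it internally. For illustration only, here is a minimal EventKit sketch of the kind of event entry that would result, with every detail hard-coded as a hypothetical stand-in for what the system would extract from a poster.

```swift
import EventKit

// Illustrative sketch: creating the kind of calendar entry visual intelligence
// could add from an event poster. All details below are hypothetical; the
// system extracts them from the image automatically.
func addConcertEvent() async throws {
    let store = EKEventStore()

    // Requires the calendar usage description in Info.plist (iOS 17+ API).
    guard try await store.requestFullAccessToEvents() else { return }

    let event = EKEvent(eventStore: store)
    event.title = "Indie Rock Night"                // captured from the poster
    event.location = "The Fillmore, San Francisco"  // captured from the poster
    event.startDate = DateComponents(
        calendar: .current, year: 2025, month: 7, day: 12, hour: 20
    ).date
    event.endDate = event.startDate?.addingTimeInterval(3 * 60 * 60)
    event.calendar = store.defaultCalendarForNewEvents

    try store.save(event, span: .thisEvent)
}
```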

Apple also added features that address common phone annoyances. Call Screening answers calls from unknown numbers to reduce interruptions, gathering information from the caller and presenting the details on screen so users can decide whether to pick up. For users stuck on hold, Hold Assist will hold their place in the queue and notify them when a live agent is available. And in a perfect turnabout, it will even ask the agent to wait a moment.

Messages gains custom backgrounds and the ability to create polls within conversations. Apple Intelligence can detect when users might benefit from a poll and suggest one, allowing users to add their own options. Additionally, in group chats, users can now see typing indicators that show who is about to chime in. Users will also be able to request, send and receive Apple Cash, which will be useful during outings where money needs to change hands.

“iOS 26 shines with the gorgeous new design and meaningful improvements to the features users rely on every day, making iPhone even more helpful,” said Craig Federighi, Apple’s senior vice president of software engineering. “Experiences are more expressive and personal, from the Lock Screen and Home Screen to new capabilities across Phone and Messages that help users focus on the connections that matter most.”

iPadOS 26 brings multitasking options to the forefront

Apple previewed a new windowing system coming to iPad with iPadOS 26, which will allow users to control, arrange and switch between apps with ease.

Familiar window controls will allow users to close, minimize, resize or tile their windows. Window tiling also takes into account the iPad’s unique form factor, allowing users to arrange windows by flicking them to the edge of the screen. If users previously resized an app’s window, it will reopen at those dimensions. With Exposé, a feature that brings all open windows into view, users can quickly spread them out and pick the one they want to bring to the forefront.

Apple said the new windowing system works well with Stage Manager, a feature that allows organizing open apps and windows into a single focused view. This allows users to stay focused on a current task while easily switching between other apps and managing multiple windows.

The Preview app, which originally comes from macOS, is coming to iPadOS 26. Preview is a dedicated app for creating a quick sketch, as well as viewing, editing and marking up PDFs and images with Apple Pencil or by touch.

macOS Tahoe 26 enhances versatility and intelligence

The next major release of macOS, dubbed Tahoe 26, introduces the new design and powerful capabilities, including new ways for users to express themselves with color options, deeper device continuity and a major update to Spotlight.

App icons in the new macOS are more visible in both light and dark appearances thanks to the new Liquid Glass design. Users can also change the colors of folders and add a symbol or emoji to give them a unique identity. Apple said this will bring a new, vibrant level of personalization to the Mac that hasn’t been present in the OS before, especially when combined with custom wallpapers and theme colors.

New device continuity features will bring the Phone app to the Mac, with familiar capabilities including Recents, Favorites and Voicemails, as well as Call Screening and Hold Assist. Users will be able to make and answer calls from their desktops.

With Live Activities, users will be able to see activities from a nearby iPhone on their Mac. They appear in the menu bar so users can stay on top of what’s happening while they’re gaming or working, for example if they’re waiting on an Uber Eats order, a flight or a live sports score. Clicking a Live Activity opens the app in iPhone Mirroring to show more information, so the user can take action right on the desktop.
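The Mac surfaces Live Activities that an iPhone app is already publishing, and Apple didn’t announce a new Mac-side API for them in this preview. The sketch below shows the iPhone side using the existing ActivityKit framework; the delivery-tracking attribute names are hypothetical, and the widget UI that renders the activity would live in a separate widget extension.

```swift
import ActivityKit

// Illustrative sketch of the iPhone-side Live Activity that macOS Tahoe would
// mirror in the menu bar. DeliveryAttributes and its fields are hypothetical.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var status: String
        var estimatedArrival: Date
    }
    var orderNumber: String
}

// Requires the "Supports Live Activities" key in the app's Info.plist.
func startDeliveryActivity() throws {
    let attributes = DeliveryAttributes(orderNumber: "A-1042")
    let state = DeliveryAttributes.ContentState(
        status: "Courier is on the way",
        estimatedArrival: Date().addingTimeInterval(25 * 60)
    )

    _ = try Activity<DeliveryAttributes>.request(
        attributes: attributes,
        content: ActivityContent(state: state, staleDate: nil)
    )
}
```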

Spotlight received the biggest updates. It’s the central place to search for things on a Mac, essentially a search bar for anything and everything from apps to documents. During a search, all results, including files, folders, events, apps, messages and more, are now listed together and ranked by relevance to the user. It even has direct access to the clipboard and works in context with the currently open app.

Users can take numerous actions directly from Spotlight, such as sending emails, creating notes or playing podcasts, all without jumping between apps. Developers can also build hooks directly into their own apps. Users can create shortcuts for Spotlight actions, for example, “sm” for “send message” or “am” for “add reminder.”
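Apple didn’t spell out the mechanism for those hooks in this preview. Its existing route for exposing in-app actions to the system is the App Intents framework, so, assuming Spotlight’s new actions build on it, a developer-defined action might look like the sketch below; the AddNoteIntent name and NoteKeeper model are hypothetical.

```swift
import AppIntents

// Illustrative sketch of an app-exposed action built with the existing
// App Intents framework, a plausible hook for Spotlight actions.
// AddNoteIntent and NoteKeeper are hypothetical stand-ins.
struct AddNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Note"

    @Parameter(title: "Note Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Hand the text to the (hypothetical) app model without opening the app.
        NoteKeeper.shared.add(text)
        return .result(dialog: "Added your note.")
    }
}

// Hypothetical app model used by the intent above.
final class NoteKeeper {
    static let shared = NoteKeeper()
    private(set) var notes: [String] = []
    func add(_ text: String) { notes.append(text) }
}
```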

Spotlight learns from users’ routines and surfaces personalized actions, such as sending a message to a colleague the user talks to regularly or opening an app the user typically reaches for first thing in the day.

New spatial experiences through visionOS 26

In its preview of visionOS 26, the spatial operating system for the Apple Vision Pro mixed-reality headset, Apple revealed how users will be able to enjoy more immersive experiences, such as more lifelike 3D photos and spatial widgets.

Widgets offer personalized, useful information at a glance. On a 2D screen, they are small applets that display information. In a 3D environment, they become “spatial,” meaning they can integrate into a three-dimensional space such as a person’s home. They reappear exactly where they were placed every time users put on the Apple Vision Pro headset. For example, users could place a clock widget on a wall, and the next time they put on the headset, the clock would appear on that wall again.

Apple said several new widgets will be available at launch and fully customizable, including Clock, Weather, Music and Photos. The Photos widget in particular will allow users to create what is essentially a “window” to another place or time, potentially a portal to anywhere in their view.
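Apple hasn’t published visionOS-specific widget code in this preview, but widgets across its platforms are built with the existing WidgetKit framework, and the spatial placement described here is handled by the system rather than by app code. The sketch below is a plain WidgetKit clock widget, assuming standard timeline APIs and no visionOS-only calls.

```swift
import WidgetKit
import SwiftUI

// Illustrative sketch: a simple clock widget built with existing WidgetKit
// APIs. On visionOS 26 the system handles spatial placement, such as pinning
// the widget to a wall; nothing here is a new visionOS-specific API.
struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry {
        ClockEntry(date: .now)
    }

    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: .now))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // One entry per minute for the next hour, then ask the system to refresh.
        let entries = (0..<60).map { minute in
            ClockEntry(date: Date().addingTimeInterval(Double(minute) * 60))
        }
        completion(Timeline(entries: entries, policy: .atEnd))
    }
}

struct WallClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "WallClock", provider: ClockProvider()) { entry in
            Text(entry.date, style: .time)
                .font(.largeTitle)
        }
        .configurationDisplayName("Wall Clock")
        .description("A simple clock you can pin in your space.")
    }
}
```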

With visionOS 26, Apple has updated Personas, the lifelike virtual avatars that closely resemble the user, for use during video calls and collaboration sessions. The avatars take advantage of industry-leading volumetric rendering and machine learning technology to reproduce striking expressiveness, full side-profile views, and remarkably accurate hair, lashes and complexion.

Apple said Personas are still generated on-device in a matter of seconds, and improvements to the setup process will let users adjust and preview how their Persona looks spatially. Users can even choose glasses, with more than 1,000 frame variations.

Spatial browsing in Safari now allows users to hide distractions and reveal spatial scenes while scrolling. Web developers now have the ability to embed 3D models directly into web pages, which will permit web shoppers to pull them out, manipulate them directly in Safari and even place them in the room.

For gamers, visionOS 26 introduces support for Sony’s PlayStation VR2 Sense controllers. Developers can now build more engaging gameplay experiences for Apple Vision Pro by using the controllers’ six-degrees-of-freedom motion tracking, finger touch detection and vibration support.

Images: Apple
