
On June 9, Apple staged its Platforms State of the Union as part of its 2025 Worldwide Developers Conference. This secondary keynote, aimed at app developers, delves into some of the main keynote's announcements in greater depth.
SEE: Apple’s WWDC 2025 Keynote Roundup
Apply ‘Liquid Glass’ to app design
One of the biggest announcements was the unveiling of “Liquid Glass,” the new interface design coming in the next operating system updates. Developers can now adopt the glassy, translucent look in their apps, including their app icons, using SwiftUI, UIKit, or AppKit.
Billy Sorrentino, Apple’s senior director of human interface, said at WWDC that the new design aims to bring better hierarchy, harmony, and consistency to devices.
Key user interface (UI) elements now float above content, clarifying the visual hierarchy and keeping focus on what matters. Harmony comes from UI shapes and interactions that better align with device geometry and natural touch patterns. And because the new design is universal across Apple platforms, apps feel consistent and familiar wherever they run.
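As a rough illustration, adopting the new material in SwiftUI is meant to take only a modifier or two. The sketch below assumes the `glassEffect` modifier Apple showed for the new SDKs; exact names and defaults may differ in the shipping betas.

```swift
import SwiftUI

// A minimal sketch of a Liquid Glass control group in SwiftUI.
// `glassEffect` is assumed from Apple's WWDC 2025 material and
// may change before release.
struct FloatingToolbar: View {
    var body: some View {
        HStack(spacing: 16) {
            Button("Back", systemImage: "chevron.left") {}
            Button("Share", systemImage: "square.and.arrow.up") {}
        }
        .padding()
        // Renders the controls as a floating glass surface
        // above the content behind them.
        .glassEffect(.regular, in: .capsule)
    }
}
```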
Create AI features with Apple Intelligence
The keynote also delivered some news about Apple Intelligence, including that developers can now access Apple’s on-device foundation models to integrate into their apps.
For example, an education app could use them to generate a personalized quiz from a student’s notes. There are no associated cloud API or server costs, making this a much cheaper alternative to paying for a third-party service. It also works when the user is offline, and data never leaves the device, preserving privacy.
To make this integration seamless, Apple introduced the Foundation Models framework, which is natively integrated with Swift. Developers can unlock generative capabilities such as text summarization and custom tool calling with just a few lines of code.
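In practice, a call to the on-device model looks roughly like the sketch below, based on the API names Apple presented at WWDC 2025 (`SystemLanguageModel`, `LanguageModelSession`); details are subject to change in the betas.

```swift
import FoundationModels

// Sketch: summarize a student's notes with the on-device model.
// API names follow Apple's WWDC 2025 session and may change.
func summarize(_ notes: String) async throws -> String {
    // Check that the on-device model is available first.
    guard SystemLanguageModel.default.availability == .available else {
        return "On-device model unavailable."
    }
    let session = LanguageModelSession(
        instructions: "Summarize study notes into three bullet points."
    )
    let response = try await session.respond(to: notes)
    return response.content
}
```

Because the model runs locally, this function needs no API key, network access, or per-request billing, which is the cost advantage the keynote highlighted.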
Building on this, Apple has also enhanced its App Intents platform. Previously a tool for integrating app content with features like Siri, Spotlight, or widgets, it now supports Visual Intelligence. This advancement enables developers to incorporate visual search capabilities into their apps, allowing users to identify and interact with content directly from images using AI, such as recognizing an object in a photo or scanning handwritten notes.
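For context, an App Intent is a small Swift type that exposes an app action to the system. The hypothetical intent below uses the established App Intents API; the new Visual Intelligence hookup layers additional APIs on top of intents like this one and is not shown.

```swift
import AppIntents

// Hypothetical example: expose a flashcard search to the system
// via App Intents. The intent structure is standard API; names
// like SearchFlashcardsIntent are invented for illustration.
struct SearchFlashcardsIntent: AppIntent {
    static var title: LocalizedStringResource = "Search Flashcards"

    @Parameter(title: "Query")
    var query: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Look up matching cards in the app's own store (stubbed here).
        let matches = ["Photosynthesis", "Mitosis"]
            .filter { $0.localizedCaseInsensitiveContains(query) }
        return .result(dialog: "Found \(matches.count) matching cards.")
    }
}
```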
Xcode 26 has ChatGPT
Apple’s development toolset, Xcode, is gaining built-in support for ChatGPT in its next iteration.
When developers build their apps in Xcode 26, they can ask the AI assistant for bug fixes, documentation, and any other help they need while coding. It responds to natural language prompts and suggests actions based on developer activity. Developers can also use API keys to integrate third-party AI chatbots beyond ChatGPT, and use the timeline feature to undo any AI-generated changes that don’t meet their requirements.
Matthew Firlik, Apple’s senior director of developer relations, also announced that macOS Tahoe, the next operating system release, will be the last that supports Intel Macs. He encouraged developers to rebuild their apps with Xcode 26 so they can make the most of the performance features of M-Series chips found in newer Macs.
All of these developer options are now available for testing, with a public beta scheduled for July.
visionOS 26 helps spatial app development
visionOS 26, the upcoming operating system for the Vision Pro headset, will come with several features that enable spatial app development, including:
- New volumetric APIs: These allow the creation of fully 3D UI layouts using SwiftUI, for example, through the 3D anchoring of widgets.
- Nearby Window Sharing: Developers can build shared spatial experiences for users wearing separate Vision Pro headsets in the same room.
- RealityKit’s Image Presentation Component: Turns 2D images into immersive 3D scenes within an app.
- Support for immersive media formats: Including 180°, 360°, and wide-field-of-view video via Apple Projected Media Profile.
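The volumetric APIs extend the kind of 3D layout visionOS already supports. As a baseline sketch, a bounded 3D volume in SwiftUI looks like this (existing visionOS API; the new visionOS 26 widget-anchoring additions are not shown):

```swift
import SwiftUI
import RealityKit

// Minimal visionOS sketch: a volumetric window showing a 3D model.
// "Globe" is a hypothetical USDZ asset bundled with the app.
@main
struct GlobeApp: App {
    var body: some Scene {
        WindowGroup {
            Model3D(named: "Globe")
        }
        // Renders the window as a bounded 3D volume rather
        // than a flat 2D plane.
        .windowStyle(.volumetric)
        .defaultSize(width: 0.4, height: 0.4, depth: 0.4, in: .meters)
    }
}
```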
Swift 6.2 includes support for WebAssembly and more
Swift has reached version 6.2. The new version offers:
- New APIs, such as Span and InlineArray, for working efficiently with memory.
- Support for WebAssembly.
- Improved interoperability with other languages, such as C++ and Java.
- A containerization tool that allows Linux container images to run on Mac.
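The new memory types are the most code-facing of these changes. A brief sketch, assuming the Swift 6.2 standard-library names (`InlineArray`, `Span`, and the `span` accessor on containers):

```swift
// Swift 6.2 sketch: InlineArray stores its elements inline with no
// heap allocation, and Span provides safe, bounds-checked access to
// contiguous memory without copying.
var samples: InlineArray<4, Int> = [10, 20, 30, 40]

func total(_ span: Span<Int>) -> Int {
    var sum = 0
    for i in span.indices {
        sum += span[i]
    }
    return sum
}

// The `span` property exposes the array's storage as a Span.
print(total(samples.span))
```

The design choice here is zero-copy safety: a `Span` lets a function read a container's storage directly, with the compiler enforcing that the span does not outlive the memory it views.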
Swift now bridges AI, design, and performance, giving developers the tools to create cohesive cross-platform apps.