Apple has reportedly scrapped its long-rumored AR video glasses project, codenamed N107, marking a major pivot in the company’s vision for lightweight augmented reality devices. Once seen as a potential mass-market companion to the premium Vision Pro headset, the AR glasses were intended to bring immersive features to everyday wearables—like turn-by-turn directions, notifications, and app extensions—all within a sleek, eyeglasses-style frame.
But despite years of internal development, the project hit insurmountable roadblocks. According to sources familiar with Apple’s plans, the glasses suffered from critical hardware limitations, particularly battery life. Early prototypes were designed to connect to an iPhone for processing, but power consumption proved too high. Engineers then tried offloading compute tasks to a Mac, but the solution created a disjointed user experience that failed to impress executives during testing.
These setbacks, combined with escalating costs and supply chain challenges, led Apple to shelve the AR glasses altogether—at least in their current form.
Apple Isn’t Done With Smart Glasses Yet
While the cancellation of N107 might seem like Apple stepping back from AR, the reality is more nuanced. Apple is already developing a custom chip designed specifically for smart glasses, drawing from the low-power, high-efficiency architecture of Apple Watch processors. Internally dubbed a “wearable-grade” SoC, this chip is designed to support features such as multi-camera input, spatial awareness, and always-on display functionality.
Industry insiders suggest this chip could enter mass production as early as 2026, which aligns with reports that Apple may aim for a 2028 launch window for its next-gen glasses—potentially in both AR and non-AR (camera-enabled) variants.
This strategic delay gives Apple more time to solve problems that have plagued not just its own AR efforts, but the entire industry: battery miniaturization, thermal management, and consumer pricing.
Rivals Push Ahead
Apple’s retreat comes as competitors double down on their own smart glasses platforms. Meta, for example, has found surprising success with its second-generation Ray-Ban Meta smart glasses, developed in partnership with EssilorLuxottica. These stylish frames now support Meta AI voice commands, real-time photo and video capture, and livestreaming capabilities. Meta is also expected to debut Oakley-branded smart glasses aimed at fitness and action sports users later this year, alongside a premium model that includes a built-in heads-up display.
Meanwhile, Google and Samsung are working on an Android XR platform, with early smart glasses prototypes expected by 2026. Qualcomm’s Snapdragon XR2 Gen 2 chip is powering much of this momentum, offering a standardized solution for lightweight AR wearables.
Why Apple’s Exit Matters
Apple’s decision to cancel its first AR glasses project doesn’t mean the company is abandoning the space—it means it’s not willing to release a product that isn’t ready. This mirrors the cautious, methodical approach Apple took with the Apple Watch and, more recently, the Vision Pro. Instead of racing Meta or Samsung to market, Apple appears focused on building an ecosystem of technologies—chips, sensors, operating systems, and developer tools—that will power its next breakthrough wearable.
visionOS, Apple’s operating system for spatial computing, continues to evolve with tools that could one day scale down to glasses. And with Apple’s track record of transforming mature ideas into category-defining products, a delayed launch could simply mean it’s preparing to leapfrog the competition when the time is right.
For now, Apple may be taking a step back—but the smart glasses war is just beginning, and Cupertino still plans to be a major player in its next chapter.
Breaking Down VisionOS
Apple has long been a pioneer in shaping how we interact with technology, and its latest leap into spatial computing is no exception. Initially rumored as “realityOS,” the operating system designed for Apple’s AR and VR ambitions has officially emerged as visionOS. This new OS powers the Apple Vision Pro, marking Apple’s first foray into the world of mixed reality with a dedicated spatial interface that redefines how we see, hear, and engage with digital content.
Early clues to the existence of this platform came through trademark filings and developer logs referencing “realityOS.” These hinted at Apple’s behind-the-scenes development of a new ecosystem tailored for immersive experiences. Over time, internal codenames like “rOS” and “xrOS” surfaced, reflecting Apple’s experimental naming conventions. Eventually, at WWDC 2023, the company publicly introduced the finalized name: visionOS.
Key Takeaways
- Apple’s “realityOS” was the early codename for what is now officially called visionOS.
- visionOS powers the Apple Vision Pro headset, released February 2, 2024.
- The OS delivers immersive AR/VR experiences through a 3D interface, custom gestures, and deep ecosystem integration.
System Overview
visionOS is built on the foundation of Apple’s proven platforms like iPadOS and integrates technologies such as SwiftUI, ARKit, and RealityKit. The system introduces a groundbreaking 3D interface where digital content blends seamlessly with the real world. Users interact using eye tracking, hand gestures, and voice, providing intuitive control without physical input devices.
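To make that concrete, here is a minimal sketch of what a visionOS app built on those frameworks looks like. It assumes the visionOS SDK in Xcode; the app name, window contents, and the `"demoSpace"` identifier are hypothetical, not taken from any Apple sample.

```swift
import SwiftUI
import RealityKit

// Hypothetical minimal visionOS app: one SwiftUI window plus one
// RealityKit immersive space, the two building blocks of a spatial app.
@main
struct SpatialDemoApp: App {
    var body: some Scene {
        // A standard SwiftUI window, shown as a floating panel in the room.
        WindowGroup {
            Text("Hello, spatial computing")
                .font(.largeTitle)
                .padding()
        }

        // An immersive space that blends 3D content into the user's surroundings.
        ImmersiveSpace(id: "demoSpace") {
            RealityView { content in
                // Place a small sphere roughly at eye height, half a meter away.
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
                )
                sphere.position = [0, 1.5, -0.5]
                content.add(sphere)
            }
        }
    }
}
```

Note how eye tracking and hand gestures need no code here at all: the system maps gaze-plus-pinch onto standard SwiftUI controls automatically, which is why existing UI patterns carry over so cleanly.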
Development and Reveal
visionOS evolved from a multi-year development effort starting under the codename “realityOS.” Early signs were visible through trademark shell companies like Realityo Systems LLC and Yosemite Research LLC. Apple officially unveiled the operating system and Apple Vision Pro at WWDC 2023, calling it the start of the spatial computing era.
Technical Architecture
visionOS borrows heavily from Apple’s iOS and macOS frameworks but introduces spatial awareness, advanced rendering, and context-sensitive UI components. It supports rendering optimizations such as foveated rendering, which concentrates GPU detail where the user is actually looking, and runs on powerful Apple silicon, enabling high-fidelity visuals and the low-latency interaction critical for AR and VR applications.
Hardware Integration
visionOS is deeply tied to the Apple Vision Pro headset, but its design suggests long-term compatibility with future AR glasses and mixed-reality wearables. The system supports seamless handoff and syncing with Mac, iPad, and iPhone, and even allows you to use the Vision Pro as a massive virtual display for your Mac.
Platform Ecosystem
Apple has ensured that visionOS includes access to the App Store, with support for existing iPad and iPhone apps out of the box. Developers are encouraged to create spatial experiences using new APIs and toolkits provided in Xcode, and early adopters have already begun building productivity, fitness, and entertainment apps tailored for the 3D space.
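As a sketch of what those spatial APIs look like in practice, the snippet below shows a 3D view that responds to a spatial tap (look at the entity, then pinch). It assumes the visionOS SDK; the view name and the bundled `"Globe"` model are hypothetical placeholders.

```swift
import SwiftUI
import RealityKit

// Hypothetical spatial view: loads a 3D model and rotates it on tap.
struct GlobeView: View {
    var body: some View {
        RealityView { content in
            // Hypothetical asset: a USDZ model named "Globe" in the app bundle.
            if let globe = try? await Entity(named: "Globe") {
                // Opt the entity in to receiving input and give it a hit shape.
                globe.components.set(InputTargetComponent())
                globe.generateCollisionShapes(recursive: true)
                content.add(globe)
            }
        }
        // A spatial tap fires when the user looks at the entity and pinches.
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Nudge the tapped entity around its vertical axis.
                    value.entity.transform.rotation *= simd_quatf(
                        angle: .pi / 8, axis: [0, 1, 0]
                    )
                }
        )
    }
}
```

The same `RealityView` and gesture modifiers compose with ordinary SwiftUI, which is how existing iPad and iPhone apps can coexist with fully spatial experiences in one project.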
User Experience Innovations
visionOS introduces a completely new way of interacting with digital environments. Floating app windows, spatial FaceTime avatars (called “Personas”), eye-driven cursor control, and AI-enhanced media tools like “Memory Movie” and “Clean Up” in Photos showcase the operating system’s unique advantages. It’s the first Apple OS where the environment itself becomes the interface.
Corporate Strategy and Launch
visionOS and the Vision Pro headset were launched strategically after years of secrecy, consistent with Apple’s tightly controlled release cycles. While early leaks came from code strings and shell company filings, Apple maintained a low profile until it was ready to showcase the entire ecosystem in a polished form during WWDC 2023.
Challenges and Competition
Creating an operating system for spatial computing involves overcoming unique challenges like motion sickness reduction, battery efficiency, and realistic rendering at high frame rates. Competitors like Meta (with Quest and Horizon OS) and Microsoft (with HoloLens) are also pursuing AR/VR markets, but visionOS sets a high bar in hardware-software synergy and ecosystem support.
What’s Next for visionOS
visionOS is just getting started. Apple is expected to expand its capabilities over time, possibly integrating generative AI, deeper health monitoring, and new wearable platforms beyond the Vision Pro. Developers and consumers alike will play a key role in shaping the future of Apple’s spatial computing vision.