Apple shifts from Vision Pro to smart glasses
Bloomberg reports that Apple may be reorienting its wearables strategy, moving away from further refining the Vision Pro in favor of a new line of smart glasses. The iPhone maker, led by Tim Cook, would reportedly redirect staff from the Vision Pro project toward what could become Apple's most ambitious foray into augmented-reality eyewear yet. The plan, if confirmed, involves at least two distinct models, each aimed at different user needs and price points, and designed to intensify competition with Meta's growing glasses ecosystem.
The N50: a lightweight, iPhone-connected option
According to the reports, the first model, internally codenamed N50, would connect to an iPhone but would not have a built-in display. This approach mirrors the way some existing smart glasses extend a phone's capabilities rather than replace a dedicated screen. The N50 is described as a lighter, more affordable entry into Apple's AR eyewear lineup, potentially serving as a gateway device that broadens the company's presence in everyday wearables. If timelines hold, Apple could unveil this model as early as next year, with a commercial launch anticipated around 2027. The strategic merit lies in lowering the barrier to entry for mainstream users while testing new interaction paradigms that rely on the iPhone as the hub for computation and content.
The second model: a built-in display to rival Ray-Ban Meta Display
The second device would include its own display, positioning it to compete directly with Meta's newer Ray-Ban Meta Display and Oakley Meta offerings. It was initially expected to arrive in 2028, but Apple's push to accelerate development could shorten the time to market if the project gains internal momentum and external signals align. A built-in display would allow Apple to offer a standalone experience, enabling richer apps, higher-contrast visuals, and more immersive AR interactions. This model could leverage Apple's ecosystem—supported by visionOS, M-series silicon, and tight integration with the iPhone and other devices—to create a differentiated experience even among premium competitors.
Meta’s current smart-glasses ecosystem: what Apple is up against
Meta has been aggressively deploying its smart-glasses strategy with multiple lines designed for different audiences. The Ray-Ban Meta and Oakley Meta series, followed by the Ray-Ban Meta Display, showcase a broadened hardware portfolio that blends AI-powered features with practical, real-world utility. A hallmark across these devices is Meta AI, the voice assistant that responds to quick prompts with contextual information. Saying "Hey Meta" triggers a flow of data and actions, letting users glance at information while keeping their hands free—an appealing proposition for athletes, travelers, and on-the-go professionals alike.
Live video transmission is another standout feature: users can broadcast feeds directly from their glasses, turning workouts, trips, or everyday moments into shareable content in real time. The Oakley variants target a more active, outdoor-oriented demographic, boasting rugged design, enhanced camera specs, longer battery life, and UV/impact protection through specialized lenses. Together, Meta’s lineup illustrates a clear strategy: blend augmented reality with social and content-sharing capabilities, anchored by AI-driven voice and gesture controls.
Vision Pro: Apple’s current flagship and what it means for the future
Apple's Vision Pro remains the company's flagship mixed-reality device and its first major entry into the space. It uses dual micro-OLED displays delivering a combined resolution of roughly 23 million pixels and offers adaptive refresh rates suited to different media types. An M2 chip handles general computing, while a dedicated R1 chip processes sensor data with minimal latency. Vision Pro is available in 256 GB, 512 GB, and 1 TB storage configurations and features a suite of cameras, LiDAR, and eye-tracking sensors. The headset runs visionOS and presents apps as floating windows within the user's spatial environment, controlled by gaze, hand gestures, and voice commands. Practical limitations, such as roughly two hours of battery life without external power, have pushed Apple to emphasize short sessions or on-the-go charging. The ongoing evolution of Vision Pro's software and ecosystem remains a critical piece of Apple's broader AR strategy, even as the company explores complementary form factors.
What this potential pivot could mean for the market
If Apple proceeds with two distinct smart-glasses models, the company would be reinforcing its longer-term ambition to own the wearables space alongside the iPhone, iPad, and Mac. A lighter, iPhone-connected model could appeal to mass-market users seeking subtle AR capabilities without a built-in display, while a premium, standalone device would target enthusiasts and professionals seeking immersive AR experiences and richer content-creation tools. Competition with Meta would intensify on several fronts: hardware performance, software polish, privacy and security measures, and the quality of AI assistance. Pricing strategies will be pivotal, as Apple typically competes on premium hardware, privacy, and ecosystem integration rather than on price.
Conclusion: timing and trends to watch
The proposed shift toward smart glasses signals Apple’s intent to broaden its wearable footprint beyond the Vision Pro, potentially reshaping how consumers interact with augmented reality on a daily basis. While Bloomberg’s reporting frames two parallel tracks—one lightweight and iPhone-centric, the other fully equipped with a built-in display—the ultimate realization will hinge on engineering challenges, developer momentum, and consumer adoption. If the timeline holds, Apple’s two-model strategy could begin to roll out in the second half of the decade, challenging Meta to continue innovating at pace and prompting broader conversations about privacy, display technologies, and the future of wearable computing.