
Samsung's AI Smart Glasses Are Coming: What It Means for the Race to Put AI on Your Face

Samsung confirmed AI smart glasses with an eye-level camera and agentic AI. Here's what we know and why 2026 is the year smart glasses go mainstream.
March 7, 2026 · 11 min read

Meta has owned the smart glasses market with 82% global share. That dominance is about to get tested. At Mobile World Congress in Barcelona this week, Samsung's executive vice president Jay Kim confirmed what the industry had been expecting: Samsung is shipping AI-powered smart glasses later this year, complete with an eye-level camera and deep smartphone integration.

TL;DR:
  • Samsung confirmed AI smart glasses launching later in 2026, codenamed Project HAEAN
  • Features include a 12MP eye-level camera, gesture controls, and "agentic" AI processing
  • The glasses will rely on your Galaxy phone for heavy computing, keeping the hardware light
  • Smart glasses shipments grew 139% in H2 2025, with 2026 revenue projected at $5.6 billion

This is not a concept video or a keynote tease. Samsung's partner Qualcomm confirmed a 2026 launch. The smart glasses market grew 139% in the second half of 2025, and every major tech company, from Samsung to Google to Alibaba, is now racing to put AI on your face.

Here is what Samsung revealed, who they are competing against, and why this moment matters more than any other in the short history of wearable AI.

  • 139% — smart glasses shipment growth, H2 2025
  • $5.6B — projected 2026 smart glasses revenue
  • 82% — Meta's current market share

What Samsung Actually Revealed

Kim's interview with CNBC was careful but specific. Samsung's glasses will feature a built-in camera at eye level, designed not just for photos or video, but as an input channel for AI. The camera sees what you see. The AI interprets it.

The processing model is deliberate: the glasses themselves stay lightweight by offloading computation to your Galaxy smartphone. The phone handles the heavy AI work. The glasses handle the capture and display layer.

When asked whether the glasses would include a built-in display, Kim declined to answer directly, pointing to Samsung's existing ecosystem: "We have other products like the smartwatch or phone if a user needs a display." Leaked specs suggest a 12MP autofocus camera, gesture-based controls, Wi-Fi and Bluetooth connectivity, and a compact 155mAh battery optimized for all-day wear rather than feature overload.

Samsung's strategy is clear: make the glasses a lightweight sensor hub for AI, not a standalone computer strapped to your skull. The phone does the thinking. The glasses do the seeing.
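That division of labor can be sketched in a few lines of Python. Everything here is illustrative: `Glasses`, `Phone`, and `assist_once` are hypothetical names, not Samsung APIs, and the sensor read and model call are stubbed to show the shape of the pipeline, nothing more.

```python
# Hypothetical sketch of the split-compute model: the glasses only capture
# and deliver output; the paired phone runs the heavy AI. No real SDK used.
from dataclasses import dataclass


@dataclass
class Frame:
    """A single capture from the eye-level camera."""
    pixels: bytes
    timestamp_ms: int


class Glasses:
    """Lightweight sensor hub: capture on-device, compute off-device."""

    def capture(self) -> Frame:
        return Frame(pixels=b"\x00" * 64, timestamp_ms=0)  # stubbed sensor read

    def speak(self, text: str) -> None:
        print(f"[glasses audio] {text}")  # output layer: audio, not display


class Phone:
    """Paired Galaxy-style handset: runs the multimodal model."""

    def interpret(self, frame: Frame) -> str:
        # A real system would invoke an on-device or cloud model here.
        return "menu detected: translating to English"


def assist_once(glasses: Glasses, phone: Phone) -> str:
    frame = glasses.capture()        # glasses do the seeing
    result = phone.interpret(frame)  # phone does the thinking
    glasses.speak(result)            # glasses deliver the answer
    return result
```

The design choice the sketch highlights: because the model runs on the phone, the glasses can stay light enough for all-day wear, at the cost of requiring a paired handset nearby.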

The internal codename for the project is HAEAN, and Samsung has been developing this product in partnership with Qualcomm and Google since 2023. Their first collaboration produced the Galaxy XR headset last year. Smart glasses were always the intended destination.

Qualcomm CEO Cristiano Amon backed up the timeline at MWC, telling CNBC that the glasses would ship this year. He also described the product philosophy in terms that would matter to anyone paying attention to AI trends: "Close to our eyes, close to our ears, close to our mouth. We're going to have those agentic experiences and workloads."

The Word That Changes Everything: Agentic

Both Samsung and Qualcomm repeatedly used the word "agentic" to describe the smart glasses experience. That word carries significant weight.

Agentic AI does not just respond to your prompts. It acts on your behalf. It books the restaurant. It identifies the plant you are looking at and pulls up care instructions. It reads the menu at a foreign cafe and translates it in real time. It recognizes the person walking toward you at a conference and whispers context about your last conversation with them.

This is a different category of product from the Ray-Ban Meta glasses, which are primarily a camera and speaker system with AI bolted on. Samsung is positioning its glasses as a sensor layer for autonomous AI agents that can take action in the physical world.

What this means for you: If agentic AI on smart glasses works as described, you would not need to pull out your phone to search, translate, navigate, or identify objects. The glasses observe your environment and the AI acts. That is a fundamental shift in how humans interact with information.

For those tracking how AI agents work in practice, the smart glasses form factor adds something no other device can: persistent visual context. Your phone's AI only sees what you point the camera at. Your laptop's AI only sees your screen. Smart glasses AI sees what you see, continuously, all day long.
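The difference between prompt-driven AI and an agentic loop with persistent context can be shown in a small, hypothetical sketch. The function names and the toy "translate the menu" trigger are illustrative assumptions, not any vendor's API: the point is only that context accumulates across frames and the agent acts without being asked.

```python
# Minimal, hypothetical agentic loop: each frame the wearer sees is
# interpreted and appended to a running context, and the agent acts
# only when that context warrants it. No real glasses SDK is used.
from typing import Callable, Optional


def agent_loop(
    frames: list[str],
    observe: Callable[[str], str],
    decide: Callable[[list[str]], Optional[str]],
    act: Callable[[str], str],
) -> list[str]:
    context: list[str] = []   # persistent visual context, grows all day
    results: list[str] = []
    for frame in frames:
        context.append(observe(frame))  # interpret what the wearer sees
        action = decide(context)        # agent decides, with no prompt
        if action is not None:
            results.append(act(action))  # act on the wearer's behalf
    return results


# Toy run: the agent translates only once a menu actually appears.
seen = ["street", "cafe sign", "menu in French"]
out = agent_loop(
    seen,
    observe=lambda f: f,
    decide=lambda ctx: "translate" if "menu" in ctx[-1] else None,
    act=lambda a: f"performed: {a}",
)
# out == ["performed: translate"]
```

Compare this with a phone assistant: there, the loop only runs when you explicitly point the camera and ask; here, `observe` runs continuously and `decide` fires on its own.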

Amon compared the current state of smart glasses to the early smartphone era. "You go to 200 apps, 1,000 apps, and that's how we're going to see those glasses getting better over time as new agents get developed." The implication is that agentic AI on glasses will follow the same app ecosystem trajectory that turned smartphones from novelties into necessities.

Who Samsung Is Actually Fighting

Samsung is entering a market that Meta currently dominates, but the competitive field is wider than a simple two-player match.

Meta (Ray-Ban Smart Glasses)

The incumbent. Meta shipped the Ray-Ban Gen 2 glasses with a 12MP camera, improved audio, Meta AI integration, and livestreaming to Facebook and Instagram. No built-in display. At roughly $300, they are the affordable standard. Counterpoint Research puts Meta at 82% global market share, and shipments grew 139% year-over-year in H2 2025. Meta's advantage is simple: they shipped first, iterated fast, and priced aggressively.

Snap (Specs Inc.)

Snap spun off its smart glasses division into a standalone company called Specs Inc. in January 2026. Unlike Meta and Samsung, Snap's approach leans harder into augmented reality with a built-in display. Consumer launch is expected later this year. Snap's bet is that lightweight AR, not just audio and camera, will be what consumers actually want.

Google and Warby Parker

Google is back in the smart glasses game, this time partnering with Warby Parker on AI-powered glasses expected in 2026. Google's Android XR operating system already powers Samsung's Galaxy XR headset, and it is likely to be the foundation for their own glasses as well. The Warby Parker partnership suggests a focus on making smart glasses look like actual glasses people want to wear.

Alibaba and Chinese manufacturers

Alibaba, Xreal, and several Chinese manufacturers are shipping smart glasses in Asia, often at lower price points. The Chinese market for AI wearables is growing fast, and these players may define the mass-market tier while Western brands compete at the premium level.

Meta Ray-Ban
  • Strategy: Camera + AI assistant. No display.
  • Price: ~$300
  • Edge: Market leader, ecosystem, price

Samsung (Project HAEAN)
  • Strategy: AI sensor hub + Galaxy phone processing
  • Price: TBD
  • Edge: Deep phone integration, agentic AI focus

Snap Specs
  • Strategy: AR display + social camera
  • Price: TBD
  • Edge: Built-in display, AR-first approach

Why 2026 Is the Tipping Point

Smart glasses have been "about to happen" for over a decade. Google Glass launched in 2013 and became a cautionary tale about privacy, social acceptability, and premature technology. So what makes 2026 different?

Three things converged simultaneously.

AI models got good enough. The large language models and multimodal vision models available today can actually interpret visual scenes, understand spoken context, and take useful action. In 2013, the AI behind Google Glass could do basic search and display notifications. In 2026, the AI can identify objects, read text in any language, summarize documents, recognize faces (where legal), and execute multi-step tasks. The software finally matches the hardware ambition.

The form factor shrank. Modern smart glasses look like actual glasses. The Ray-Ban Metas are genuinely stylish. Samsung's leaked designs suggest a similar approach. People stopped asking "will I look ridiculous?" and started asking "what can these do for me?" That psychological shift matters as much as any technical specification.

Smartphones hit a plateau. The annual phone upgrade cycle has lost its excitement. Screens got bigger, cameras got better, but the fundamental interaction model has not changed since 2007. AI tools are already replacing traditional software subscriptions. Smart glasses represent the first genuinely new interaction model in nearly two decades: ambient computing that does not require you to pick up a device.

$1.44B → $5.6B — smart glasses market growth from 2025 to projected 2026, a nearly 4x increase in one year

The market broadly agrees with this assessment, even where the analysts disagree on the slope. SNS Insider valued the AI smart glasses market at $1.44 billion in 2025 and projects $4.59 billion by 2035; Seoul Economic Daily is far more aggressive, projecting $5.6 billion in revenue for 2026 alone and 75 million annual shipments by 2030. The firms are measuring the category differently, but every major research house points in the same direction: rapid, multi-year growth, not speculative numbers from optimistic startups.

The Privacy Question Nobody Wants to Answer

Every smart glasses launch runs headlong into the same wall: other people do not want your face-computer recording them.

Meta dealt with this by adding a small LED indicator light that turns on when the camera is active. Samsung will almost certainly adopt something similar. But indicator lights do not solve the fundamental tension: smart glasses with AI that "sees what you see" are, by definition, surveillance devices that happen to also help you translate menus.

Restaurants, gyms, and private venues have already started posting policies about smart glasses. Some ban them outright. Others require the camera to be visibly covered. As these devices go mainstream, expect privacy legislation to follow, especially in the EU, which already has the strictest AI and data regulations globally.

Worth watching: Samsung has not yet disclosed how it will handle the visual data that flows from the glasses to the phone to the AI. Will conversations be processed locally? Will images be sent to the cloud? These details will determine whether the glasses feel like a helpful assistant or a walking surveillance rig. Buyers should pay close attention to the privacy policy when Samsung publishes it.

The social acceptability problem is also real but fading. When one person at a dinner table wears smart glasses, it feels intrusive. When half the table does, it feels normal. If smart glasses shipments hit the projected volumes, social norms will shift faster than legislation can keep up.

What This Means for You

If you are already embedded in the Galaxy ecosystem, Samsung's glasses will likely integrate more deeply with your phone than any competitor. The tight coupling between glasses and Galaxy phone, using your phone's processor and AI stack, means the experience will be purpose-built for Samsung users.

If you do not care about ecosystems and just want the most mature product, Meta's Ray-Ban glasses remain the safest choice. They are shipping, they work, and the AI is competent if not yet truly agentic.

If you want AR with a real display, wait for Snap's consumer Specs. They are the only company betting on putting a display in the frame, which could make them the best option for navigation, real-time information overlays, and anything that benefits from visual output.

For those building with AI, the smart glasses wave opens an entirely new input modality. Computer control by AI agents currently works through screen interaction. Smart glasses add visual-world interaction to the toolkit. Developers who figure out how to build agentic experiences for glasses, not just porting phone apps to a smaller screen, will define the next platform cycle.

The Bigger Picture

Samsung entering the smart glasses race is not just another product launch. It is a signal that the biggest consumer electronics company in the world believes the post-smartphone era starts now.

The smart glasses market is following a pattern we have seen before: one pioneer proves the concept (Meta), a wave of competitors arrives (Samsung, Google, Snap, Alibaba), prices drop, quality improves, and within 3 to 5 years the product category becomes as normal as smartwatches.

The difference this time is AI. Previous attempts at smart glasses failed because the software could not justify the hardware. You strapped a computer to your face and got... notifications you could have checked on your phone. In 2026, the AI behind these glasses can actually do things your phone cannot: persistent visual awareness, real-time environmental understanding, and autonomous action without you lifting a finger.

Samsung's bet is that you will want an AI agent riding on your face, watching the world through your eyes, and quietly making your day easier. Whether that sounds exciting or unsettling probably depends on how you feel about the broader trajectory of AI agents in daily life.

Either way, the race is on. And it is moving faster than most people realize.

Future Humanism editorial team
