There’s a growing sentiment that gadgets have gotten boring. And while I don’t fully agree, I understand why people might feel that way. Just think about some of the novel device types that companies have tried to push since the original iPhone came out.
3D TVs were a massive flop and tablets still feel like extra-large smartphones despite Apple’s efforts to prop them up as laptop replacements. Meanwhile, even with huge technological advancements over the last decade, VR headsets remain relatively niche due to factors like high prices and a lack of compelling content. And although big names like Google, Microsoft, Meta and others continue to dump billions into AI development, the first wave of dedicated AI devices was an abject failure.
When you think about it, the only new(ish) class of gadget that has made major inroads into the mainstream market is smartwatches. That said, because they’ve evolved into wearable health and fitness sensors instead of the wrist-based computers that many once thought they would be, they haven’t really disrupted our lives like the personal computer and smartphone did. But that seems poised to change because the tech giants have decided that smart glasses are going to be the next big thing.
Headsets versus smart glasses: what’s the difference?
Google is planning to support both smart glasses and headsets with Android XR, though the increased size and weight of devices like the Galaxy XR mean they’re not a great choice for all-day use. (Sam Rutherford for Engadget)
At this point, you might be saying, “Wait, hold on. Aren’t VR headsets and smart glasses kind of the same thing?” Well, yes and no. Both types of gadgets require similar software and hardware, but they utilize them in very different ways. Not only are VR goggles typically much bigger and heavier, they also provide a more isolated experience that can make it feel like you’ve been transported to another world.
Sure, most modern headsets have exterior cameras that support some level of mixed reality (blending virtual graphics with physical objects) or let you peek quickly into meatspace (passthrough view) for when you need to get a drink or acknowledge other humans in the room. But in many respects, that closed-off feeling is the goal because it creates the ideal environment for playing games, taking virtual meetings or modeling 3D objects without real-world distractions. Furthermore, while many headsets like the Vision Pro and the Meta Quest 3 can function as standalone systems and support accessories like controllers or other motion trackers, they can also be tethered to a nearby PC for enhanced functionality.
On the other hand, the default use case for smart glasses is a mixed reality environment where the spectacles can overlay helpful info or messages while you stay active and aware of your surroundings. Notably, while smart glasses might come with lenses or clip-on attachments that allow them to get darker or serve as sunglasses when you’re outside, there typically isn’t a way to completely block out the world like you can with a headset, mostly because that’s simply not the point. And even though most smart spectacles can be paired with a phone to get access to mobile data or notifications, they’re generally not meant to be tethered to a PC full-time (though there are some exceptions). The goal for smart glasses is more to provide a mobile-first heads-up display that augments what you see with your eyes instead of replacing things entirely with a digital environment.
OK, but what makes you so sure that smart glasses are “it”?
Now that we’ve discussed what separates smart glasses from headsets, what makes it so obvious that they are going to be the next big thing? This one is a bit easier to answer because we can simply look at the sheer number of companies that have released smart glasses or are planning to do so in the future. If we set aside the Google Glass from 2013 as forward-thinking specs that were ahead of their time, the most well-known example of modern smart glasses is the Meta Ray-Ban (or the even earlier Ray-Ban Stories from back when Facebook was still Facebook).

While they are a bit chunky, the Meta Ray-Ban Display are some of the most sophisticated smart glasses on the market right now due in large part to their single full-color screen. (Karissa Bell for Engadget)
Even though they don’t have built-in displays, their ability to capture photos and videos and play audio via built-in speakers brought the idea of smart glasses into the mainstream without making the concept look or feel completely ridiculous. Those earlier models then paved the way for even more sophisticated iterations like the Meta Ray-Ban Display from earlier this fall, which features a stunning RGB HUD (though only in the right lens) that has gotten us tantalizingly close to a true wearable display that doesn’t make you look like a cyborg. Of course, Meta isn’t the only game in town: there’s a rapidly growing number of competitors from companies like Even Realities, Rokid, TCL, Xreal, Viture and more.
But for an even clearer sign of where the tech giants are heading, we can just look at Meta’s two biggest competitors: Apple and Google. While Apple hasn’t publicly announced plans to make its own smart glasses, Bloomberg’s Mark Gurman, one of the most reliable reporters covering the company, reported earlier this fall that Tim Cook and Co. are planning to pivot away from a proper follow-up to the Vision Pro in favor of more lightweight spectacles with greater mass appeal.
This shouldn’t really come as a major surprise, as sales of Apple’s $3,500 headset have been lackluster. But more importantly, for a company that’s extremely cautious about entering new product categories (foldable iPhone anyone?), it feels very telling to hear that Apple is shifting to smart glasses instead of abandoning the idea of wearable displays entirely. This is a company that doesn’t swing and miss very often, so the idea of two flops in a row seems preposterous. If this pivot is real, there must be some Apple execs who are big believers that glasses and not goggles are the right choice for future development.

Here are two of Google’s reference design smart glasses. The one in the front features dual RGB waveguide displays while the one in the back relies on a single monocular screen. (Sam Rutherford for Engadget)
Meanwhile, Google is taking a two-pronged approach. In addition to releasing a new mixed reality OS — Android XR — on Samsung’s Galaxy XR headset in October, the company has also teased upcoming smart glasses along with a handful of partners including glasses makers Gentle Monster and Warby Parker. Just this week, the company added a number of new features to Android XR designed to support a wide range of upcoming devices while simultaneously making it easier for developers to port existing apps over to smart glasses and headsets. And if you still need additional evidence of Google’s desire to get into smart glasses, consider that on top of that ongoing collaboration, the company spent $100 million to acquire a 4 percent stake in Gentle Monster.
Regardless of who is making them though, the big draw for these companies is the idea that smart glasses will become a core piece of personal computing, similar to how people rely on smartphones and laptops today (or, to a lesser extent, wireless headphones and smartwatches). If true, that could become a trillion-dollar market in the next 10 to 15 years (or sooner, who knows), which makes it not only a natural avenue for expansion but possibly a future existential crisis for certain companies. After all, none of these organizations want to be the next Microsoft after it failed to develop a successful smartphone or mobile OS.
Fine, the smart glasses trend is real, but why would we even want them?
At this point, I hope it’s clear that the push for smart glasses is very real and very serious. But so far, we’ve only addressed why companies are betting big on them. So what’s in it for us, the people who might actually buy and use them? Well, to answer that, we need to separate the current models into three main categories.

A great use case for smart glasses would be to provide heads-up mapping without the need to constantly look down at your phone as seen in this demo clip of Android XR. (Google)
First, there are the most basic smart glasses that don’t come with built-in displays and typically rely on cameras and built-in speakers for enhanced functionality. The best example of this class of devices is the Meta Ray-Ban smart glasses (or the original Ray-Ban Stories) along with rivals like the Bose Frames, which, believe it or not, have been on the market since 2019.
However, before anyone gets attached to these early models, the simplest smart glasses already kind of feel like dinosaurs and will probably, in the not-too-distant future, go extinct. They were an interesting attempt to add things like music playback or photo and video capture to regular-looking sunglasses, but their limited feature set puts a clear ceiling on what they can do. Plus, if this were what people really wanted, these glasses would have taken off already.

Waveguides like the ones built into the Even Realities G2 project images directly onto the lenses, allowing for super sleek glasses with a heads-up display. (Sam Rutherford for Engadget)
This brings us to more recent offerings like the Meta Ray-Ban Display, the Even Realities G2, the Halliday glasses and others, which add some type of built-in display to the mix. Most often, these models rely on waveguide displays, as they enable thinner and lighter designs while projecting images onto the glasses’ lenses. Currently, most of these smart glasses feature single-color optics (usually green) to reduce complexity and power draw, but there are others like the Meta Ray-Ban Display and both the TCL RayNeo X2 and X3 that support full color.
In this day and age when everyone is surrounded by screens, the idea of yet another display mounted inches away from your eyeballs might sound like the last thing you want. However, because modern smart glasses are much more discreet and less awkward-looking, I find that they can actually help cut down on distractions. That’s because instead of having to peek down at your phone or smartwatch to check notifications, reply to messages or look up directions, you can do many or all of these things using smart glasses — all in the middle of a conversation without anyone noticing.
Not only does this keep your focus where it should be — on people instead of gadgets — the glasses are also just as easy to wear as a smartwatch and far more comfortable than bulky VR headsets. Then, when you consider some other features of modern smart glasses like on-the-fly translation, the ability to function as a teleprompter hidden in plain sight or additional support from AI, suddenly you have a wearable that allows you to keep all of your other devices neatly stashed away. In many respects, smart glasses could be the portable displays that people might not even know they want.

Compared to rivals with waveguides, glasses featuring “birdbath” optics are often significantly thicker and bulkier. (Sam Rutherford for Engadget)
Speaking of portable displays: If you recall, I mentioned above that most smart glasses generally don’t need to be tethered to other devices. The exception comes from a subclass of specs that are primarily designed to function as wearable monitors, capable of supporting one or more virtual screens that appear to be well over 100 inches across.
The most well-known smart glasses in this category come from Xreal and Viture, with both companies offering a range of models with varying levels of performance. One interesting thing to note is that instead of waveguides, some of these smart glasses rely on birdbath optics. This means that instead of projecting an image into the lens itself, they use a beamsplitter and a curved mirror to reflect images into your eye. The benefit is that you get good image quality from components that cost less than an equivalent waveguide setup, with the downsides being increased light loss, potentially lower brightness and a much thicker design. The result is chunky frames that often look like they’re sitting too far away from your face, which might not be immediately apparent if you see someone using them from afar. But up close, they don’t look quite right. Or at least they don’t look like a pair of “normal” glasses.
Another issue is that due to the extra light loss, birdbath smart glasses require darker lenses (similar to sunglasses), which means they aren’t great for wearing all day in a variety of environments. And because we still don’t really have a great protocol for wireless displays (though it looks like Valve may be cooking up something with the Steam Frame), most of these need to be connected by wire to a nearby PC. So you plug them in, put them on, get your work done and then you take them off.

Project Aura is Xreal’s next-gen pair of smart glasses, featuring a large 70-degree field of view and fancy electrochromic lenses. (Sam Rutherford for Engadget)
That said, for those who need a ton of screen real estate, this type of smart glasses can be a very attractive alternative to traditional portable monitors. On top of being smaller and more portable, they provide additional privacy when working in public spaces like a cafe or plane, which is what prompted a doctor friend of mine to get a pair instead of going with a portable display. And for the gamers out there, because they can be connected to a phone or even a portable PC or Switch 2 (with the proper dock, of course), they’re great for people who might not have room for or access to a big screen TV.
So where do we go from here?
Ultimately, I think all three types of smart glasses will merge into one as engineers perfect the tech and steal ideas from one another, though there will surely be plenty of room for more niche designs. But more importantly, if we consider the types of gadgets most people carry around today, it boils down to just a handful of devices: a smartphone, some type of wireless audio (either earbuds or headphones) and maybe a health and fitness tracker of some kind (typically a smartwatch or smart ring).

Even though they didn’t have a built-in display, the Meta Ray-Ban smart glasses from 2023 raised a ton of awareness for the category. (Sam Rutherford for Engadget)
Smart glasses have the potential to really round out that kit by allowing us to keep most of those devices in our pocket while the wearables serve up helpful info when we need it, but without being overly intrusive or distracting. In the short term, you’ll still need a laptop for work, but smart glasses may have a role to play there too, as they can provide way more screen space than a traditional physical display (even the new-fangled flexible ones). It might never happen, but I wouldn’t rule out a future scenario where your next employer gives you a company-issued phone and a pair of smart glasses and that’s it.
Before that happens though, there are still a bunch of other things that need to be figured out. Without help from a mouse or keyboard, navigating a virtual display is a bit of a challenge. AI combined with hand and eye tracking can help, but no one has really nailed that combo yet. Not even Apple could do so on the much bulkier Vision Pro. To address this, Meta created a bracelet (it calls it a neural band) that pairs with the Ray-Ban Display and detects subtle muscle movements so you can type or navigate menus practically anywhere. Even Realities opted for a ring accessory that does some basic health monitoring and comes with a tiny touchpad. In the more distant future, this hurdle may be solved by BCIs (brain-computer interfaces), but even the most optimistic view suggests those aren’t going to be mainstream for a long time.

Even though we’re still a long ways away, one day everyone might be able to have something like Tony Stark’s E.D.I.T.H. smart glasses from the Marvel Universe. (Marvel)
The issue for Meta is that it’s pretty obvious that its wristband really ought to be incorporated into a smartwatch. The idea of a single-purpose bracelet that doesn’t track your health or do anything else sort of feels like a step backwards. And there’s the problem of Meta’s glasses being largely tied down to its own platforms (i.e. Instagram, WhatsApp and Facebook), which may end up being a major hindrance after rivals like Google and Apple catch up.
And then there’s the cost. Right now, a pair of Meta Ray-Ban Displays (which thankfully come with the wristband) costs $800. That’s a lot for what is basically a publicly available beta test. But when you consider that an Even Realities G2 and an R2 ring cost even more at $850, it’s clear that wearing smart glasses is going to be a very expensive hobby for at least the next few years. And while more single-purpose smart glasses from Xreal and Viture are a bit more affordable, with models ranging from around $400 to $600, they still aren’t cheap. On top of that, getting prescription lenses for smart glasses can often be a major pain in the ass and may not even be an option for people with more limited eyesight.
But those are problems for another day. And just because the tech giants are pouring billions into the development of smart glasses doesn’t guarantee they’ll be a hit. Still, if you care about tech, pay attention to advancements in smart glasses alongside AI and possibly eVTOL aircraft (aka flying taxis). Otherwise, you could miss out on what might be the next major wave of sci-fi gadgetry made real.