More than a decade has passed since the Google Glass smart glasses were announced in 2013, then quickly withdrawn due to low adoption. A later (and lesser-known) second version, aimed at the workplace, was released in 2017 and phased out in 2023.
In December 2025, Google renewed its commitment to smart glasses, announcing two new products for release in 2026. But why has Google struggled with smart glasses while other companies are succeeding? And can it succeed on a third attempt?
These are accessories that have been around for centuries and are now widely accepted in society.
Some recent academic research takes this approach, incorporating sensors into jewelry that people actually want to wear. Researchers have developed the WEAR Scale (Wearable Acceptability Range) to measure the social acceptability of wearable technology. The scale includes statements such as "I think it would be acceptable for my co-workers to wear this device."
Norene Kelly and colleagues at Iowa State University showed that the scale measures two things at its core: whether the device helps people achieve their goals (so it is worth wearing), and whether it avoids raising social concerns about privacy or appearing rude.
The latter issue was most prominently captured by the term "Glasshole", which emerged for Google Glass users. A number of studies have explored the potential benefits of smart glasses, from mental health support to surgical use, but privacy and related concerns persist with new devices.
That said, the most common consideration for potential buyers is look and feel. The most successful products are designed first to be desirable as accessories, with the smart technology incorporated second. Collaborations with designer brands are increasingly common.
A splendid sight
Following Google Glass, Snapchat released smart glasses called Spectacles, which have a built-in camera, emphasize fashion and have been more readily accepted by society. Today's most popular smart glasses are made by Meta (Facebook's parent company) in collaboration with eyewear brands such as Ray-Ban and Oakley. Most of these products include a front-facing camera and support conversational voice interaction with Meta AI.
So what can we expect from Google's smart glasses in 2026? Google is promising two products: one audio-only, the other with a display on the lens (like Google Glass).
The biggest change suggested by the promotional video is a significant shift in form factor, from the futuristic, somewhat alien design of Google Glass to something that looks more like ordinary glasses.
Google's announcement also emphasized the addition of AI (the products were actually announced as "AI glasses" rather than smart glasses). However, neither type of product (audio-only AI glasses or AI glasses that project visuals) is particularly new, even when combined with AI.
Meta's Ray-Ban products come in both forms and include voice interaction with Meta's own AI. They have been more successful than, for example, the recent Humane AI Pin, a wearable that included a front-facing camera, other sensors and a voice-driven AI agent, and was the closest thing yet to a Star Trek communicator badge.
Direction of travel
Perhaps the main direction of innovation here is simply reducing the thickness of smart glasses. Smart glasses are necessarily bulky because of the electronics they incorporate, which makes them look out of proportion compared with ordinary glasses.
"Making glasses you want to wear" is how Google describes its aim, so we might see innovation from the company that simply improves the aesthetics of smart glasses, alongside collaborations with popular brand partners. Google also touted the release of wired XR (extended reality) glasses, which have a significantly reduced form factor compared with commercially available virtual reality headsets.
Second, we can expect more integration with other Google products and services. Google has many more widely used products than Meta, including Google Search, Google Maps and Gmail. The company's promotional materials show AI glasses displaying Google Maps information while the wearer walks around town.
Finally, and perhaps the biggest area of opportunity, is innovation around additional sensors, and perhaps integration with Google's other wearable health products, an area where the company is already active, such as with the introduction of its own smart ring.
Many studies have explored what can be sensed from common contact points on the head, such as heart rate, body temperature, galvanic skin response (changes in skin moisture due to stress and other factors) and even brain activity via EEG and other methods. Given current advances in consumer neurotechnology, we could plausibly see smart glasses that track brain activity within the next few years.
This edited article is republished from The Conversation under a Creative Commons license. Read the original article.
