For years, weekend bike rides have been a sacred escape for me. Every pedal stroke helps melt away the stressors stacked up during the week. Along the way, I've collected gadgets to make those rides even better, but I learned the hard way that bringing too much gear pulls you away from the ride itself: instead of just riding the bike, you end up managing a network of pings and battery levels.
Enter the Ray-Ban Meta smart glasses, which have made my weekend rides simpler and a little more fun.
Instead of juggling sunglasses, a pair of headphones, and a phone I have to fish out to take photos mid-ride, there's one device that handles it all.

The Ray-Ban Meta smart glasses have been a surprise hit with far more people than just me. Meta says it has sold millions of the devices, and CEO Mark Zuckerberg recently said sales tripled in the last year.
Several Reddit threads and YouTube videos suggest plenty of people are wearing Ray-Ban Meta glasses on their bikes, and Meta seems to have caught on: the company is reportedly building its next generation of AI smart glasses with Oakley, aimed specifically at athletes.
I never expected to use my Ray-Ban Meta glasses on my bike, but a few months ago I decided to give them a try.
Now I wear them on my bike more than anywhere else. Meta has gotten enough right with these smart glasses to convince me there's something here. They're almost a joy to use, and with a few upgrades, they could get all the way there.
An important selling point of the Ray-Ban Meta glasses is that they're simply a solid pair of Ray-Ban sunglasses. Mine are the Wayfarer style with transition lenses and a clear plastic frame.
They work well for riding, protecting my eyes from sun, dirt, and pollen, and they sit comfortably under a bike helmet, though not perfectly. (More on that later.)
The killer feature of Meta's smart glasses is the camera built into the frame. Instead of fumbling with my phone, I can press a button on the top right corner of the frame to grab a photo or video of whatever I'm seeing on my ride.


Last weekend, while riding through San Francisco's Golden Gate Park, I used the Ray-Ban Meta glasses to photograph the beautiful Blue Heron Lake, the shrub-covered dunes where the park meets the Pacific Ocean, and the tree-lined trails at the park's entrance.
Is the camera great? No. But it's pretty good, and it captures moments I never would have gotten if I weren't already wearing the glasses. So while I don't consider it a replacement for your phone's camera, it's a way to capture more photos and videos with less effort.
The feature I use most, though, is the open-ear speakers in the arms of the glasses, which let me listen to podcasts and music without blocking out the sounds of the people, bikers, and cars around me. Meta was far from the first company to put speakers in glasses; Bose made a solid pair for years. But Meta's take on open-ear speakers is surprisingly good. I've been impressed by the audio quality and by how little I miss traditional headphones on these rides.
I've also found myself chatting a bit with Meta's AI assistant on weekend rides. I've recently asked it about the nature I see throughout the park ("Hey Meta, look and tell me what kind of tree this is") as well as the origins of historic buildings I pass.
I usually use bike rides as a way to unplug from the world, so talking to an AI chatbot mid-ride seemed counterintuitive. But I've found that these short queries satisfy my curiosity about the world around me without sucking me into a rabbit hole of content or notifications.
Again, the best thing about these features is that they're all in one device.
That means there's less to charge, less clutter on the bike, and fewer devices to manage during a ride.
Potholes
The Ray-Ban Meta glasses are great for roaming around town, but they weren't designed with bikes in mind.
The glasses often slide down my nose on bumpy rides. And when I lean over the handlebars and look up at the road ahead, the thick frame blocks my view. (Most cycling sunglasses have thin frames and nose pads to solve these problems.)
The Ray-Ban Meta glasses are also limited in how they work with other apps, and that's a problem. I love snapping photos and pausing music with the glasses, but for almost anything else, I have to pull my phone out of my pocket.
For example, the Ray-Ban Meta has a Spotify integration, but I struggled to get the AI assistant to play specific playlists. Sometimes when I asked for a playlist, the glasses played the wrong one entirely or didn't play anything at all.
I'd love to see these integrations improve and expand to include more bike-specific apps like Strava and Garmin.
The Ray-Ban Meta glasses also don't work well with the rest of my iPhone, largely thanks to Apple's restrictive policies.
I'd like to be able to fire off texts and navigate Apple Maps easily with the Ray-Ban Meta glasses, but features like that may not arrive until Apple releases its own smart glasses.
That leaves Meta's AI assistant. The AI features are often advertised as the main selling point of these glasses, but I've found them to be frequently lacking.
Meta's voice AI isn't as impressive as the voice AI products from OpenAI, Perplexity, or Google. The assistant's voice feels more robotic, and its answers are often unreliable.
I tested the Ray-Ban Meta's recently launched live video AI sessions, first announced at last year's Meta Connect conference. The feature streams live video and audio from the glasses to AI models in the cloud, creating a more seamless way to interact with the assistant and "show" it what you're seeing. In practice, it was a hot mess of hallucinations.
I asked the Ray-Ban Meta to identify some of the interesting cars I passed while biking near my apartment. The glasses described a modern Ford Bronco as a vintage Volkswagen Beetle. They then confidently told me that a 1980s BMW was a Honda Civic. Closer, but still very different cars.
During another live AI session, I asked the AI to help identify some plants and trees. It told me a eucalyptus tree was an oak tree. When I said, "No, I think that's a eucalyptus tree," the AI replied, "Yes, that's right." Experiences like that make me wonder why I'm talking to the AI at all.
Google DeepMind and OpenAI are also working on multimodal AI sessions like the ones Meta offers with its smart glasses. But for now, the experience feels far from finished.
I'd really like to see an improved version of these AI smart glasses designed with bike rides in mind. The Ray-Ban Meta glasses are one of the most convincing AI devices I've seen yet, and with a few key upgrades, I could see them being a joy to wear on every ride.