Glasses of Perception

How Smart Glasses Could Transform Driving Safety

The smart glasses market stands at an inflection point. Following Meta's successful Ray-Ban collaboration that proved consumer appetite for wearable tech, industry giants including Meta, Snap, Google, and Apple continue significant investment in AR glasses prototypes and developer editions. This momentum has triggered a wave of innovation, with hundreds of Chinese companies, from established tech brands to OEMs and chip manufacturers, now actively developing smart glasses solutions.

The evolution of smart glasses mirrors the smartphone's transformative journey. Just as mobile phones evolved from simple communication devices into powerful ecosystems that revolutionised how we live and work, smart glasses are positioned for a similar trajectory. As these products mature and integrate more deeply with systems in our homes, workplaces, and vehicles, they'll do more than display information: they'll reveal hidden patterns that help us understand our world more profoundly and take meaningful action.

Matthew Cockerill and Carsten Eriksen bring decades of experience helping global brands transform advanced technologies into compelling use cases and everyday products. This background gives them unique insights into this rapidly evolving product category and its untapped potential.

Moving beyond the obvious applications of navigation prompts and smartphone notifications, Matthew and Carsten have explored use cases that demonstrate the true transformative potential of smart glasses. Their automotive integration concept serves as an example of how these devices could create entirely new value when thoughtfully integrated with existing systems.

 
 

Beyond Smartphone Extensions

The automotive context provides an ideal proving ground for this exploration because it combines high-value safety outcomes with existing sensor infrastructure. This creates immediate practical applications while demonstrating how smart glasses can move beyond passive information display to become active perception enhancers in complex environments.

Matthew and Carsten have explored what happens when AI reveals the hidden patterns of the world while driving with smart AR glasses. Their concept augments our view to enhance visibility in difficult driving conditions and anticipate the actions of other road users.

Their smart glasses solution would tap into vehicle systems, leveraging existing sensing capabilities and combining them with edge-AI processing to generate augmented reality overlays that align perfectly with the driver's field of view.

The Current Market

First-generation offerings from the Meta and Ray-Ban collaboration have introduced glasses with integrated cameras and speakers for hands-free video capture and AI-assistant interactions. These glasses showcase AI capabilities by listening to our questions while understanding the world we see. For example, Meta's glasses can tell the wearer more about a house they're looking at in San Francisco.

 
 

While Meta has initially focused on enhancing existing glasses with digital capabilities, both Meta and Google are developing the next step: glasses with integrated displays.

Google's Android XR, an AI-powered operating system specifically designed for headsets and glasses, is nearing public release. Google's stated vision is to "bring together digital information and the real world to help you watch, work, and explore," creating a seamless blend between digital and physical realities. The company recently showcased its advanced AR glasses prototype at the TED 2025 conference, demonstrating the capability to project contextual information directly into the user's field of view through sophisticated display technology.

 

Google VP Shahram Izadi wearing Google's prototype AR glasses. © Jason Redmond / TED

 

However, even these promising developments represent only the early generation of smart glasses technology. With companies like Apple rumoured to be developing their own AR glasses, the market is poised for rapid expansion. Yet all current offerings primarily function as smartphone companions, essentially overlaying digital information onto our physical environment without deeper integration with surrounding systems.

Improved Vision in Challenging Conditions

 

Driving in bright sunlight, at night, or in fog presents significant perceptual challenges. Vital elements such as road edges, markings, and other road users can become nearly invisible, creating disorientation and increasing accident risk.

The smart driver glasses concept leverages the vehicle's existing sensor suite to detect these elements and enhance them through the glasses' in-lens additive light field display. The system colourises important features (road edges, other vehicles, pedestrians, and the horizon) in real time, extending beyond the windshield view to include peripheral awareness inside the cabin and driver blind spots. This maintains driver orientation even in the most challenging visibility conditions, significantly reducing the dangers of blind spots during lane changes and turns.
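To make the idea concrete, here is a minimal sketch of how detections from a vehicle's sensor suite might be mapped to overlay styles for an additive display. The `Detection` structure, the palette, and the field-of-view threshold are illustrative assumptions, not part of any real vehicle or glasses API.

```python
# Hypothetical sketch: mapping vehicle sensor detections to overlay styles.
# All names and values here are illustrative assumptions.
from dataclasses import dataclass

# An additive light field display can only add light, so bright,
# high-contrast colours work best for highlighting dark features.
PALETTE = {
    "road_edge":  (0, 255, 180),   # cyan-green for road boundaries
    "vehicle":    (255, 200, 0),   # amber for other vehicles
    "pedestrian": (255, 60, 60),   # red for vulnerable road users
    "horizon":    (120, 120, 255), # soft blue for orientation
}

@dataclass
class Detection:
    kind: str            # e.g. "vehicle", "pedestrian"
    azimuth_deg: float   # bearing relative to the driver's forward view

def overlay_style(det: Detection, fov_deg: float = 90.0) -> dict:
    """Choose a colour and emphasis for one detected feature.

    Detections outside the driver's central field of view (e.g. in a
    blind spot) are flagged for peripheral rendering in the lens."""
    colour = PALETTE.get(det.kind, (255, 255, 255))
    peripheral = abs(det.azimuth_deg) > fov_deg / 2
    return {"colour": colour, "peripheral": peripheral}
```

A real pipeline would also track detections frame to frame and align overlays with head pose, but the core mapping from "what the sensors see" to "what the lens draws" could start this simply.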

 

Anticipating Other Road Users’ Actions

One of the most stressful aspects of driving is the unpredictability of other road users. While turn signals and brake lights help us anticipate changes in direction or speed, they're often used inconsistently or too late to provide adequate warning.

Advanced AI processing calculates relative motion patterns and predicts movement changes seconds before they become apparent to the driver, displaying subtle "shadow" projections of predicted vehicle positions. The system also outlines road users such as cyclists and motorcyclists, who can be hard to spot, just as they begin to make a move.

This provides drivers with crucial additional seconds to determine smoother, less stressful responses, creating a safer and more comfortable driving experience.
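The simplest version of the "shadow" projection described above is a constant-velocity extrapolation: take a road user's current position and velocity and project where they will be a couple of seconds from now. The function below is a hedged sketch of that idea; a production system would use richer, learned behaviour models, and the coordinates and horizon are assumptions for illustration.

```python
# Hypothetical sketch: projecting a road user's position a few seconds
# ahead with a constant-velocity model, the simplest form of the motion
# prediction described above. Units are metres and metres per second.

def predict_shadow(position, velocity, horizon_s=2.0):
    """Extrapolate an (x, y) position horizon_s seconds into the future."""
    x, y = position
    vx, vy = velocity
    return (x + vx * horizon_s, y + vy * horizon_s)

# A cyclist 5 m ahead and 2 m to the right, moving forward at 4 m/s
# while drifting left at 0.5 m/s:
shadow = predict_shadow((5.0, 2.0), (4.0, -0.5))
# shadow -> (13.0, 1.0): in two seconds the cyclist will be noticeably
# closer to the car's path, which is exactly when a subtle outline helps.
```

Even this crude model buys the driver the "crucial additional seconds" the concept aims for; the hard part is filtering predictions so only genuinely useful ones are rendered.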

How It Works

 

Existing driver-assist systems in advanced cars (adaptive cruise control, lane keeping, emergency braking, and park assist) rely on cameras and LiDAR sensors to detect objects and estimate distances. This information is often shown to the driver as explicit warnings or ongoing street visualisations on the dashboard, as in Tesla vehicles.

The proposed system utilises both inward-facing sensors (monitoring brakes, gears, steering, and acceleration) and outward-facing sensors (radar, LiDAR, and cameras) to understand the world around the car, the driver's behaviour, and that of surrounding vehicles.

 

In this concept, the same information helps the system understand what both the car and the driver can see. In real time, it calculates what would be most helpful to the driver, projecting graphic elements that enhance their view of the road ahead and augment their natural perception of the surrounding environment.
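The selection step, deciding which augmentations are actually worth projecting, can be sketched as a small ranking function over fused sensor inputs. Everything here (field names, the time-to-contact heuristic, the braking rule, the item limit) is an illustrative assumption about how such a filter might work, not a description of any shipping system.

```python
# Hypothetical sketch of the selection step: given fused outward-facing
# detections and inward-facing driver state, decide which few overlays
# to project. All thresholds and field names are assumptions.

def select_overlays(detections, driver, max_items=3):
    """Rank detections by urgency and keep only the most useful.

    detections: list of dicts with 'kind', 'distance_m', 'closing_speed_mps'
    driver: dict of inward-facing sensor state, e.g. {'braking': True}
    """
    def urgency(d):
        # Rough time-to-contact in seconds; smaller means more urgent.
        closing = max(d["closing_speed_mps"], 0.01)
        return d["distance_m"] / closing

    ranked = sorted(detections, key=urgency)
    # If the driver is already braking, suppress redundant vehicle
    # warnings and keep attention on vulnerable road users.
    if driver.get("braking"):
        ranked = [d for d in ranked if d["kind"] != "vehicle"]
    return ranked[:max_items]
```

Capping the number of rendered items matters as much as ranking them: an additive display that highlights everything would recreate the clutter the concept is trying to remove.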

 

Not Ready Yet, but Useful Today

The technical challenges to bringing this vision to life are substantial, involving sensor fusion, real-time processing, and overcoming the inherent limitations of current light field displays such as low-luminance colours and restricted viewing angles.

However, this work isn't just speculative futurism; it serves as a strategic tool with immediate practical applications for business and product teams:

  • Accelerate R&D Focus: By identifying specific high-value use cases like enhanced driving perception, companies can focus their technical research efforts on solving the most impactful problems rather than pursuing generic capabilities.

  • Clarify Go-to-Market Positioning: Understanding how smart glasses might integrate with existing ecosystem products provides clear positioning opportunities in the marketplace.

  • Identify Partnership Opportunities: This exploration highlights potential collaboration points between technology companies and automotive manufacturers, creating pathways for strategic alliances.

  • Enable Strategic Roadmapping: By projecting the evolution of smart glasses from standalone devices to integrated system components, product leaders can develop staged roadmaps that grow capabilities over time while delivering value at each step.

  • Guide Investment Priorities: For technology investors and corporate development teams, understanding these potential integration points helps identify promising acquisition targets and technological capabilities worth investing in today, even before the full vision materialises.

 

Looking Forward

By exploring these speculative but plausible futures, Matthew and Carsten are able to move beyond just iterating on existing products. As smart glasses continue their evolution from novelty to necessity, those companies that anticipate integration opportunities with existing systems will find themselves best positioned to compete. The future of smart glasses isn't simply about what we see through them, but how they help us perceive and understand the complex world around us.

About Us

If you’ve made it this far, you must be curious about us.

Matthew is a design innovation consultant specialising in uncovering the human and business value of new technology paradigms, helping brands inspire and accelerate their R&D, productisation, and go-to-market strategies.

Carsten is founder and CEO of Swift Creatives, a design studio based in Aarhus, Denmark. His team excels at thinking, shaping, and crafting physical products and experiences that help startups and global brands stay ahead in highly competitive and disruptive markets.

Together, we provide a unique combination of strategic vision and hands-on implementation expertise to help forward-thinking brands build their future. Our work spans from speculative concept development to practical roadmapping that connects long-term vision with immediate action.