
LAS VEGAS — As the curtain rises on CES 2026, the mixed reality landscape is undergoing a seismic shift. The entry of tech titans like Apple, Samsung, and Google into the high-end headset market has catalyzed a massive wave of innovation, moving spatial computing from a niche enthusiast hobby into a mainstream productivity tool. Amidst this high-stakes arms race, Voy has emerged as a critical infrastructure player, providing the optical precision necessary for users to actually work and live within these digital worlds.
By leveraging its proprietary tunable lens technology, Voy is solving the "clarity gap" that has long plagued VR and AR devices. At this year's show, the company is demonstrating how its expertise in adaptive optics is moving beyond simple vision correction and into a future of fully autonomous, intelligent eyewear.
The last 12 months have been a whirlwind for the VR and AR industries. The primary catalyst was the market entry of the Apple Vision Pro, which forced other major players to accelerate their own development cycles.
"One of the interesting news is about Samsung and Google enter this market after they saw the Vision Pro from Apple," a Voy representative explained at their booth. This competition has led to a flood of new products, particularly on the "colossus" or enterprise side of the industry. As these devices become more "extra productive," the demand for perfect visual clarity has never been higher. For a user to spend eight hours a day in a virtual workspace, the optics must be flawless.

Voy's strategy for 2026 is one of broad ecosystem support. Recognizing that no single headset will own the entire market, the company has spent the last year building a production infrastructure that caters to every major player.
While the current generation of Voy lenses requires manual adjustment, the company is using CES 2026 to preview a revolutionary leap in optical automation.
The "one key application" currently in development is an automatic mechanism for the tunable lens. By integrating the lenses with the eye-tracking sensors that are becoming standard in high-end headsets, Voy aims to create a system where the user never has to touch a dial again.
"User doesn't have to be manually adjusted, but using the eye tracking to read the user's distance view," the representative shared. As your eyes move from a virtual spreadsheet at "arm's length" to a distant 3D model, the lenses will automatically reshape themselves in real-time to maintain perfect focus. This technology relies on sensors that are "widely available these days," and Voy is currently building the foundational software and hardware to make this seamless transition a reality.
Voy's vision for the future is technically brilliant, but its success will depend on its ability to overcome the traditional hurdles of adaptive optics.
As the conversation at the booth concluded, the message from Voy was clear: 2026 is about building the foundation for a future where vision is no longer a static experience.
By CES 2027, Voy expects to have moved past the "interesting project" stage and into full-scale deployment of its automated lenses. The company hopes to achieve a level of integration where its technology is the "de facto" standard for anyone using high-productivity VR or AR devices.
"We are building this working, and those are the foundations," the representative noted. As they continue to refine their production lines for Apple and Quest, Voy is positioning itself as the literal lens through which we will view the digital future.
Voy is proving that in the world of high-tech headsets, the most important component isn't the screen or the processor—it's the interface between the machine and the human eye. By bringing tunable, and soon autonomous, optics to the masses, they are ensuring that the VR revolution is not just immersive, but crystal clear. Whether it's helping a professional finish a workday in the Apple Vision Pro or creating a new category of intelligent eyewear, Voy is making sure that we never lose focus on what matters most.
