
Exactly one year ago, Google unveiled a pair of augmented reality (AR) glasses at its I/O developer conference. But unlike Google Glass, this new concept, which didn't have a name at the time (and still doesn't), showed the practicality of digital overlays, promoting the idea of real-time language translation while you're talking to another person.
It's not about casting magic spells or watching dancing cartoons; it's about making something we all do every day more accessible: communication.
Also: How to join the Google Search Labs waitlist to access its new AI search engine
The concept looks like a regular pair of glasses, showing that you don't need to look like a cyborg to reap the benefits of today's technology. But, again, it was just a concept, and Google hasn't really talked about the product since.
Twelve months later, the buzz around AR has been replaced by another acronym: AI. Much of Google and the broader tech industry has shifted its focus to artificial intelligence and machine learning, and away from metaverses and, it seems, glasses that transcribe language in real time. Google literally said the word "AI" 143 times during yesterday's I/O event, as counted by CNET.
But something else caught my eye during the event. No, not Sundar Pichai's declaration that hotdogs are actually tacos but, instead, a feature that Google quickly demoed on the new Pixel Fold. (Tacos on smartphones? No-brainer.)
The company calls it Dual Screen Interpreter Mode, a transcription feature that uses the front and rear screens of the foldable, along with the processing power of the Tensor G2, to simultaneously show what someone is saying and how it translates into another language. At a glance, you can understand what others are saying, even if they don't speak the same language as you. Sound familiar?
I'm not saying a foldable phone is a direct replacement for AR glasses; I still believe there's a future where the latter exists and may replace all the devices we carry. But the Pixel Fold's Dual Screen Interpreter Mode is the closest callback we've gotten to Google's concept this year, and I'm excited to try the feature when it arrives.
Also: All the hardware Google announced at I/O 2023 (and yes, there’s a foldable one)
The Pixel Fold is available for pre-order now, and Google says it will start shipping next month. Even so, you'll have to wait until the fall for the translation feature's official release, so stay tuned.