Google announced it will begin testing new augmented reality (AR) experiences in public with a limited number of Googlers and trusted testers. The prototypes include in-lens displays, microphones, and cameras, and Google will begin testing them in the real world next month.
Google explained these can be “used to enable experiences like translating the menu in front of you or showing you directions to a nearby coffee shop,” adding that use cases include navigation, translation, transcription, and visual search.
Google has a help document that goes into a bit more detail on these devices. It says Google is “testing new experiences such as translation, transcription, and navigation on AR prototypes.” The “research prototypes look like normal glasses, feature an in-lens display, and have audio and visual sensors, such as a microphone and camera.”
So “normal glasses” is one possibility, maybe something like the Facebook glasses. I hope it is not like the old Google Glass.
Google added that it “will be researching different use cases that use audio sensing, such as speech transcription and translation, and visual sensing, which uses image data for use cases such as translating text or positioning during navigation.” Google also said, “we will test experiences that include navigation, translation, transcription, and visual search.”
Don’t like this? Google said an LED indicator will turn on if image data is being saved for analysis and debugging. If a bystander objects, they can ask the tester to delete the image data and it will be removed from all logs.
Now, I need to work on getting one of these. 🙂
Forum discussion at Twitter.