Highlights:

  • An LED indicator lets people nearby know when data is being recorded, and they can ask to have any image data saved for analysis and debugging deleted.
  • Google seems to be trying to prevent a repeat of the Google Glass disaster by disclosing these tests before they actually take place.

Google recently announced that it intends to test augmented reality (AR) prototypes in public this autumn. The company has been exploring ideas such as AR glasses that display translations in real time, and it now wants to evaluate those ideas outside the lab. According to our prior reports, Google plans to start shipping its “Project Iris” AR headset in 2024.

“This will allow us to better understand how these devices can help people in their everyday lives,” Google AR Product Manager Juston Payne wrote in a blog post. “And as we develop experiences like AR navigation, it will help us take factors such as weather and busy intersections into account — which can be difficult, sometimes impossible, to fully recreate indoors.”

According to a Google support page, there are tight constraints on where testers can operate and what activities they can engage in, and the company will test only a limited number of prototypes in select US cities. Testers must complete device, protocol, privacy, and safety training.

Additionally, the company cautions that the prototypes will resemble regular glasses but contain an in-lens display along with visual and audio sensors such as a camera and microphone. An LED indicator lets people nearby know when data is being recorded, and they can ask to have any image data saved for analysis and debugging deleted.

Google intends to explore use cases for visual sensing, such as text translation and navigation assistance, as well as transcription and translation of spoken language. Although the company says the prototypes do not support photography or videography, any image data captured during testing will be deleted unless it is needed for further analysis or debugging. In that case, Google says, the image data is first scrubbed of sensitive content such as faces and license plates, then stored on a secure server with access limited to a small group of Google employees for analysis and debugging, and erased 30 days later.

In its testing, Google lists translation, transcription, and navigation, and at Google I/O earlier this year it showed off glasses that could display language translations right in front of the user’s eyes. But by disclosing these tests before they actually take place in the real world and detailing what they will entail, Google seems to be trying to prevent a recurrence of the 2014 disaster that dogged the company’s iconic Google Glass headset.