Google begins testing ‘smart glasses’ in public: What they look like, what they are, and all the other details


Almost 10 years after Google Glass, Google is bringing smart glasses back into public view. The company announced a pair of Augmented Reality (AR)-based prototype glasses at this year’s Google I/O conference in May 2022, and it has now announced plans to test these AR ‘smart glasses’ prototypes in public starting early next month. The search giant wants to take its ‘smart glasses’ from the lab into the real world.
“…starting next month, we plan to test AR prototypes in the real world. This will allow us to better understand how these devices can help people in their everyday lives. And as we develop experiences like AR navigation, it helps us…” writes Juston Payne, Group Product Manager at Google. Here are all the details:
What is the ‘limited public testing’ of smart glasses
The limited AR prototype testing will allow selected participants to try new AR prototypes and services that are not available to the public.
Who is eligible to test these AR glasses prototypes
At this time, eligibility is limited to Googlers (Google employees) and select ‘trusted testers’, according to the company.
What Google is testing in AR glasses right now
Google is testing new experiences such as translation, transcription, and navigation on the AR prototypes. The company will study use cases involving audio sensing, such as voice transcription and translation, and image perception, which uses image data for use cases such as text translation or positioning during navigation.
What Google’s AR glasses prototype looks like
These Google AR glasses prototypes look like normal glasses, feature an in-lens display, and carry audio and visual sensors, such as microphones and cameras.
Does the AR glasses prototype support photography or video recording?
No. Google’s AR prototypes don’t support photography or video, although image data will be used to enable use cases such as navigation, translation, and image search. For example, the camera may be used to translate a menu in front of the user or to show directions to a nearby coffee shop as an AR overlay in the user’s view. Once the experience is complete, the image data is deleted, except when it is needed for analysis and debugging. In that case, the image data is first filtered for sensitive content, including faces and license plates, and then stored on a secure server, with access limited to a small number of Googlers for analysis and debugging. It is deleted after 30 days.
How do other people know if they’re near someone testing Google’s ‘smart glasses’
Google says an LED indicator will turn on when image data is being saved for analysis and debugging. If bystanders wish, they can ask the tester to delete the image data, and it will be removed from all records. Google says that all testers must undergo rigorous training on the device, protocols, privacy, and safety before they can use the prototype. They learn about the limitations placed on testing and best practices for operating appropriately, safely, and responsibly in real-world environments.





