Google Glass sensors list now available, AR apps made possible

Publish date: 2022-08-22

The Google Glass sensors list has been unearthed by developer Lance Nanek and, judging from it, real AR apps look like a distinct possibility somewhere down the line.

Nanek says he managed to get hold of the Google Glass sensors list (which isn’t mentioned in the official specs) by pushing an Android app to the device in debug mode and then listing the sensors it reports. While the Mirror API (the one developers can currently use to build apps) only supports fetching the wearer’s location once every 10 minutes, these sensors would let developers create “real” AR apps.
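For the curious, enumerating this hardware doesn’t require anything exotic. The snippet below is a minimal sketch of that step using the standard Android SensorManager and LocationManager APIs; it isn’t Nanek’s actual app, and the SensorDumpActivity name is made up for illustration.

import java.util.List;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.location.LocationManager;
import android.os.Bundle;
import android.util.Log;

// Hypothetical activity name; any debug-mode app pushed over adb
// could run this same enumeration on Glass.
public class SensorDumpActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // List every sensor the device reports, enabled or not.
        SensorManager sensorManager =
                (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        for (Sensor sensor : sensorManager.getSensorList(Sensor.TYPE_ALL)) {
            Log.i("SensorDump", "Sensor: " + sensor.getName());
        }

        // List every registered location provider.
        LocationManager locationManager =
                (LocationManager) getSystemService(Context.LOCATION_SERVICE);
        for (String provider : locationManager.getAllProviders()) {
            Log.i("SensorDump", "Provider: " + provider);
        }
    }
}

The output shows up in logcat, which is presumably how a list like the one below would be collected from the device.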

Here’s the list of sensors, without further ado:

- MPL Gyroscope
- MPL Accelerometer
- MPL Magnetic Field
- MPL Orientation
- MPL Rotation Vector
- MPL Linear Acceleration
- MPL Gravity
- LTR-506ALS Light sensor
- Rotation Vector Sensor
- Gravity Sensor
- Linear Acceleration Sensor
- Corrected Gyroscope Sensor

Location providers are as follows:

- gps
- network
- passive
- fused
- remote_gps
- remote_network

Lance Nanek has also posted a video showing the sensors at work; if you look closely, you can see the readings change depending on the direction the user is looking.
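For developers wondering what those changing readings would look like in code, the sketch below shows one standard Android way to track head direction: listening to the rotation vector sensor and converting it to a compass azimuth. The HeadingListener class name is ours, and this assumes the ordinary SensorManager API rather than anything Glass-specific.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.util.Log;

// Logs the azimuth (compass heading), which changes as the
// wearer turns their head from side to side.
public class HeadingListener implements SensorEventListener {
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) {
            return;
        }
        // Convert the rotation vector into azimuth/pitch/roll (radians).
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        float azimuthDegrees = (float) Math.toDegrees(orientation[0]);
        Log.i("Heading", "Azimuth: " + azimuthDegrees);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // No-op for this sketch.
    }
}

Registered via sensorManager.registerListener(new HeadingListener(), sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR), SensorManager.SENSOR_DELAY_UI), the logged azimuth would track the wearer’s head turns, much like what the video appears to show.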

What could the future bring?

While these sensors are not enabled for third-party apps yet, their presence suggests they will be at some point, letting developers build real AR apps and experiences that simply weren’t possible before. Just imagine a game like Ingress played on Google Glass, and think of the possibilities this opens up for augmented reality gaming.

GPS navigation apps could also offer, for example, lane guidance right in front of your eyes, so you’d never need to look away from the road, or accurate, up-to-date information about the landmarks around you when you’re on vacation.

These are just a few examples, but the possibilities are almost endless once Google opens these sensors up to third-party developers.

What kind of apps do you think will be developed for Google Glass using these sensors?
