Google Glass: Are Humans Ready for Ubiquitous Computing?
Are you ready to walk down the street wearing a computer screen over your eyes that will feed you information or coupons as you pass shops and restaurants, as well as share details about the weather, traffic and daily news? To issue voice commands to record hands-free, point-of-view video, take pictures, search the Internet or get translations?
For some people, the idea is a thrilling advance toward convenience and a permanent connection to the Internet via a ubiquitous computer integrated into daily life. For others it may seem creepy, like the display Arnold Schwarzenegger's character used in "The Terminator" to find targets and come up with rejoinders in conversations with humans, and a step toward the merger of humans and machines that sci-fi flicks warn us about.
But whatever we may think of it, the augmented reality of Google Glass is here, already in use and running the Mountain View, Calif.-based technology giant's Android operating system. It has been available to selected "explorers" since last year, and is expected to be commercially available later this year.
For now, the price tag is a whopping $1,500 for the developer version, but the consumer version is likely to be considerably less.
Developers are working with Google's Mirror application programming interface (API), which Google released on April 15, to create third-party applications for Glass.
Google Glass made its debut at the company's I/O (input/output) developer conference last year, when skydivers landed on the roof of the building wearing the headsets.
Together with the anticipated iWatch, reportedly under development at Apple, Google Glass signals that the trend toward wearable technology is well underway.
"Google Glass is a good first attempt to build a whole new ecosystem of devices with newer form factors," analyst Neil Shah of Strategy Analytics told us.
"However, the reported user experience is obviously subpar, being a very first generation device. Wearable devices in near- to mid-term will remain more of an 'appcessory,' still dependent on the smartphone as a primary device and mostly a secondary 'notification' device."
Shah sees the introduction of Apple's iPhone in the summer of 2007 as the "inflection point" of the mobile industry, changing the way we interact with phones and building a new ecosystem with touchscreen interaction and the availability of utility-building applications.
The Next Frontier
"Similarly, Google Glass is a good attempt but it is still miles away to completely change how we will interact next, not only with the digital but physical world as well. That's the next frontier."
To really advance the technology, Shah said, Google, Microsoft and others must create a "natural user interface" that brings together technologies such as the Xbox's motion-sensing Kinect, augmented reality, location tracking and Google Glass to make a compelling case for users, as Apple did with the iPhone's slate form factor.
Posted: 2013-05-09 @ 6:54am PT
I think the author is just trying to introduce a new term, "appcessory", which is not going to catch on because it's not about the apps, it's about the hardware and being wearable. He is correct that we are entering a wearable computer era and new ecosystems are being built (or more reasons to use an existing private one), but he neglects to consider the fast pace technology takes now, especially when you factor in competing companies jumping into the mix. There are already two other Glass-like but inferior products on the market. A watch is not going to catch on unless it completely replaces carrying a smartphone.