The age of wearable technology will be ushered in this year with the release into the consumer market of Google Glass, the head-worn gadget that allows you to check recipes while you cook, share what you see as you see it, speak to send a message and ask whatever's on your mind.
Wearable computers are, essentially, computers worn on the body that are always on, always accessible, and always aware of the user and their surroundings.
But the concept isn't new: it dates back to the 1960s, with early efforts to develop hidden computers that could help people beat the casinos.
In the 50 years since, it has evolved from bulky backpack computers to devices that can be worn on the head.
Until now, however, most wearables have been confined to research labs or niche applications such as military use.
Scheduled for introduction this year, Google Glass is the first truly wearable computer that could be sold in large numbers for general-purpose use, and with its competitors, represents a fourth generation of computing technology.
Over the past 40 years computers have followed a trend from desktop to laptop to phone and now to wearables, an evolution from machines you walk up to, to devices worn on the body.
Google Glass has hardware similar to a mobile phone's, but it cannot make phone calls and is worn on the head rather than held in the hand.
Information is shown to the user on a small micro-projector display, and input is provided via camera, touch and speech.
The display is see-through and worn above the eye, so the user is able to see the real world and information superimposed over it.
Over three years in development, Glass was first released to selected developers in the middle of 2013 through the Google Glass Explorer Programme.
There are now more than 10,000 devices in the hands of programmers who are eagerly exploring potential uses for the technology.
The Human Interface Technology Laboratory New Zealand (HIT Lab NZ) at the University of Canterbury is one of the few organisations in New Zealand with access to Google Glass.
Its director, Professor Mark Billinghurst, spent five months at Google last year on sabbatical as part of the Glass team, and returned to New Zealand with the skills to develop Glass applications and several devices to continue the research.
The HIT Lab NZ team is exploring a number of application areas for Glass, the first being the use of the technology for Augmented Reality, or AR.
This is technology that allows virtual images to be overlaid on the real world, and until now has mostly been experienced on handheld devices such as mobile phones and tablets.
The HIT Lab NZ has a long history of conducting ground-breaking research in AR, such as running the first mobile AR advertising campaign in the world in 2005.
"Glass enables a new type of AR experience, where people can see virtual content in their field of view all the time without having to constantly hold a device in front of their face, as with a mobile phone," Professor Billinghurst told the Weekend Herald.
Researchers at the HIT Lab NZ have been exploring what type of AR experiences would be suitable for Glass, and one of the first they have developed is a version of their existing CityViewAR application.
"This software allows people to walk through the streets of Christchurch and see virtual buildings appearing in front of them, showing what the city used to look like before the earthquakes," he said.
"CityViewAR on Glass also shows panorama images taken after the earthquake, allowing people to look around them and use the head-tracking capability of Glass to see a full 360-degree photo of the city damage.
"This shows how wearable devices like Glass might be able to be used in city planning, architecture or tourism."
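The head-tracked panorama viewing Professor Billinghurst describes can be illustrated with a minimal sketch (this is hypothetical code, not the HIT Lab NZ's CityViewAR software): the head's yaw angle, read from the device's orientation sensors, selects which horizontal slice of a 360-degree equirectangular panorama image to draw on the display.

```python
def panorama_offset(yaw_degrees, pano_width_px):
    """Map a head yaw angle to a horizontal pixel offset into an
    equirectangular 360-degree panorama image."""
    yaw = yaw_degrees % 360.0              # wrap full head rotations
    return int(yaw / 360.0 * pano_width_px)

def visible_slice(yaw_degrees, pano_width_px, view_width_px):
    """Return the (start, end) pixel columns visible for this head pose.
    The end column wraps around the panorama's seam when necessary."""
    start = panorama_offset(yaw_degrees, pano_width_px)
    end = (start + view_width_px) % pano_width_px
    return start, end
```

Turning the head a full circle thus scrolls the viewer once through the whole image, which is what lets the wearer "look around" the photographed city damage.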
A second area of research is how Glass can be used to create new types of shared experiences.
Today, many people use Skype or similar video conferencing tools to connect with one another, but in this case, said Professor Billinghurst, "Glass puts a camera on the user's head and so when this is used in conferencing applications it provides a view of the user's workspace and not their face.
"This change in perspective can be used to create new types of collaborative experiences."
At the HIT Lab NZ, researchers are working on tools that would allow people to capture their surroundings and send the result to a remote person, so that person can share in some of the same experience.
"For example, a person may be in a beautiful location and want to share it with their friend far away."
With the HIT Lab NZ tools they will be able to quickly capture an immersive photo on Glass and share it with their friend while talking so both people can enjoy the scene together.
"This is a very different experience than what is currently available on mobile devices," he said.
Finally, the researchers are exploring new interaction methods for wearable devices.
"The usual touch-screen methods for handheld devices don't work on Glass, so there is a need for new input methods.
"Glass has a touchpad that provides very intuitive simple touch input, but for some applications there is a need for other techniques."
The researchers have been exploring gesture input, developing technology that allows natural 3D hand interaction with Glass.
In this way, a person could see a 3D virtual object in the world in front of them, and then reach out with their hand to pick it up and rotate it.
This type of input cannot be easily done with existing Glass input methods, and so may enable a wider range of applications.
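The gesture interaction described above can be sketched in outline (a hypothetical illustration, not the HIT Lab NZ's implementation): a sideways hand movement detected by the camera is converted into a yaw angle, which is then applied as a rotation to the virtual object's 3D vertices.

```python
import math

def hand_drag_to_yaw(drag_px, px_per_degree=10.0):
    """Convert a horizontal hand displacement (in camera pixels)
    into a yaw rotation angle in degrees."""
    return drag_px / px_per_degree

def rotate_y(point, degrees):
    """Rotate a 3D point (x, y, z) about the vertical (y) axis."""
    x, y, z = point
    a = math.radians(degrees)
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))
```

Applying `rotate_y` to every vertex of the virtual object as the hand moves gives the effect of reaching out, grabbing the object and turning it.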
Glass, then, changes how people access digital content, and at the same time creates new opportunities for researchers and developers.
"The team at the HIT Lab NZ is conducting ground-breaking research to provide new Glass user experiences," Professor Billinghurst said. "One day, technologies developed in New Zealand may be shown on the faces of millions of people worldwide."
The full series
• Part 1: Tackling the obesity epidemic
• Part 2: Solving the human jigsaw puzzle
• Part 3: The Kiwi-made biotech wonder
• Part 4: Learning mental time travel
• Part 5: The birth of the artificial muscle
• Part 6 (today): The age of wearable computing