Astronauts Turn to Augmented Reality on the Space Station

Ever tried reading the instruction manual for a space station's sewage system while in zero gravity? According to the crew of the International Space Station, it's hard. Damn hard. So hard, in fact, that astronauts turned to the European Space Agency for augmented reality helmets that display repair instructions and data right on the machinery.

Read on for a full breakdown of augmented reality helmets, iPhone temperature sensors, and how ESA made a space helmet out of garage scraps.

From the ESA's story:

The ESA-designed Wearable Augmented Reality (WEAR) is a wearable
computer system that incorporates a head-mounted display over one eye
to superimpose 3D graphics and data onto its wearer’s field of view.

Controlled by voice for hands-free operation, WEAR includes onboard
location and object identification to show astronauts precise
information about what they are looking at, as well as providing
step-by-step instructions to guide them through difficult, lengthy
procedures.


     The WEAR Helmet. Image courtesy ESA

WEAR helmets will replace the volumes of paper instruction manuals written in a variety of languages currently floating (what an awful pun) around in the space station.

This is promising for future civilian use of similar or scaled-down augmented reality devices: Traditionally, space programs have trickled down their technology to the civilian consumer.

What's especially interesting is just how easy the WEAR helmet was to make and how much power it packs. According to the ESA, the version going up to the space station is made from mostly off-the-shelf pieces with a bit of Steve Jobs technology:

WEAR has been assembled from
largely off-the-shelf components. Key hardware elements include a
mobile computer connected to a headset with a head-mounted display, a
pair of video cameras – for barcode reading and object recognition –
and an Inertial Measurement Unit (IMU). 

The image-recognition system allows WEAR to know where its user is
located, by checking his viewpoint against stored 3D information about
the module, derived from a Columbus Computer Aided Design (CAD) model.
The IMU works like the orientation tracking technology on an Apple
iPhone, letting WEAR assess in which direction its wearer is moving and
update his position and the image displayed accordingly.
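The IMU tracking the ESA describes is essentially dead reckoning: integrating angular-rate and acceleration readings over time to carry a pose estimate forward between camera fixes. Here's a toy Python sketch of that idea, reduced to two dimensions; the sensor values, sample rate, and function names are all invented for illustration, and a real system like WEAR would fuse this with the camera-based recognition against the Columbus CAD model to correct the drift that pure integration accumulates.

```python
# Minimal 2D dead-reckoning sketch: integrate gyro and accelerometer
# samples to track heading and position. All values are made up.
import math

heading = 0.0          # radians, 0 = facing along the world x-axis
position = [0.0, 0.0]  # metres, world frame
velocity = [0.0, 0.0]  # metres per second, world frame

def imu_update(gyro_z, accel_forward, dt):
    """Advance the pose estimate by one IMU sample.

    gyro_z:        angular rate about the vertical axis (rad/s)
    accel_forward: acceleration along the current heading (m/s^2)
    dt:            time since the previous sample (s)
    """
    global heading
    heading += gyro_z * dt                  # integrate rotation rate
    ax = accel_forward * math.cos(heading)  # resolve into world axes
    ay = accel_forward * math.sin(heading)
    velocity[0] += ax * dt                  # integrate acceleration
    velocity[1] += ay * dt
    position[0] += velocity[0] * dt         # integrate velocity
    position[1] += velocity[1] * dt

# Feed in one second of a gentle turn at 100 Hz
for _ in range(100):
    imu_update(gyro_z=0.1, accel_forward=0.5, dt=0.01)

print(heading, position)
```

Every one of those integration steps compounds a little sensor error, which is exactly why WEAR cross-checks the estimate against stored 3D information rather than trusting the IMU alone.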

So how long until we see augmented reality technology like this helping bumbling home carpenters build a treehouse for their kids? It seems like the only real hurdle in bringing the WEAR helmet to a wider market is a large-scale test run to work out the last kinks and modify the technology for features more common to the space heater than the space station.