Augmented reality

Augmented reality (AR) is a field of computer research which deals with the combination of real-world and computer-generated data, in which computer-generated graphics are blended into real footage in real time.

At present, most AR research is concerned with the use of live video imagery which is digitally processed and "augmented" by the addition of computer-generated graphics. Advanced research includes the use of motion-tracking data, fiducial marker recognition using machine vision, and the construction of controlled environments containing any number of sensors and actuators.

Synopsis
The most widely cited characterization is Azuma's. Although it covers only a subset of AR's original goal, it has come to be understood as representing the whole domain: augmented reality is an environment that includes both virtual-reality and real-world elements. For instance, an AR user might wear translucent goggles; through these, the user could see the real world as well as computer-generated images projected on top of that world. Azuma defines an augmented reality system as one that
 * combines real and virtual,
 * is interactive in real-time,
 * is registered in three dimensions.

History

 * 1849: Richard Wagner introduces the idea of immersive experiences, using a darkened theatre and surrounding the audience with imagery and sound.
 * 1938: Konrad Zuse invents the first digital computer, known as the Z1.
 * 1948: Norbert Wiener creates the science of cybernetics: transmitting messages between man and machine.
 * 1962: Morton Heilig, a cinematographer, creates a motorcycle simulator called Sensorama with visuals, sound, vibration, and smell.
 * 1966: Ivan Sutherland invents the head-mounted display, suggesting it could serve as a window into a virtual world.
 * 1975: Myron Krueger creates Videoplace that allows users to interact with virtual objects for the first time.
 * 1989: Jaron Lanier coins the phrase Virtual Reality and creates the first commercial business around virtual worlds.
 * 1990: Tom Caudell coins the phrase Augmented Reality while at Boeing helping workers assemble cables into aircraft.
 * 2008: The first end user applications featuring Augmented Reality enter the market: Wikitude AR Travel Guide launches on Oct. 20, 2008 with the G1 Android phone and is downloaded about 50,000 times by year end.

Fields of augmented reality
For many of those interested in AR, one of its most important characteristics is the way in which it makes possible a transformation of the focus of interaction. The interactive system is no longer a precise location, but the whole environment; interaction is no longer simply a face-to-screen exchange, but dissolves into the surrounding space and objects. Using an information system is no longer exclusively a conscious and intentional act.

Outdoor Augmented Reality
A new and major area of current research is into the use of AR outdoors. GPS and orientation sensors enable backpack computing systems to take AR outdoors.
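The core geometric step behind such systems can be sketched briefly: given the user's GPS fix and a landmark's coordinates, compute the landmark's offset in a local east-north frame and its compass bearing, so the overlay can be drawn in the right direction. This is an illustrative flat-earth approximation with made-up coordinates, not the method of any particular system above.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def enu_offset(user_lat, user_lon, target_lat, target_lon):
    """Approximate east/north offset in metres from user to target.

    Local flat-earth (equirectangular) approximation -- adequate for
    the short ranges typical of outdoor AR overlays.
    """
    d_lat = math.radians(target_lat - user_lat)
    d_lon = math.radians(target_lon - user_lon)
    east = d_lon * math.cos(math.radians(user_lat)) * EARTH_RADIUS_M
    north = d_lat * EARTH_RADIUS_M
    return east, north

def bearing_deg(east, north):
    """Compass bearing of the target: 0 = due north, 90 = due east."""
    return math.degrees(math.atan2(east, north)) % 360

# Hypothetical fix: user in a city square, landmark roughly 1 km away
# to the north-east; the overlay should be drawn at that bearing.
east, north = enu_offset(52.5200, 13.4050, 52.5265, 13.4155)
heading_to_draw = bearing_deg(east, north)
```

An orientation sensor then reports which bearing the camera is actually facing; the difference between the two determines where on screen the label appears.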

Early systems were developed by Steven Feiner at Columbia University (MARS system) and by Bruce H. Thomas and Wayne Piekarski in the Wearable Computer Lab at the University of South Australia (Tinmith and ARQuake systems).

Trimble Navigation, a provider of positioning solutions, has been researching Outdoor AR in collaboration with the Human Interface Technology Laboratory at its New Zealand R&D site in Christchurch. Local network news has reviewed its progress.

Mobile Augmented Reality
Mobile Augmented Reality, or "mobile AR", combines AR with mobile computing technology on mobile phones. Mobile phone applications can use both fiducial-marker and markerless video tracking for image registration, inserting 3D or 2D virtual objects into the camera frame. The phone's online connection, in concert with a GPS unit, accelerometer and/or compass, can also be used in combination with the camera for image registration.
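Whichever tracking method supplies the camera pose, the final step of inserting a virtual object into the camera frame is projecting its 3D points to pixel coordinates. A minimal pinhole-camera sketch, with hypothetical intrinsics not tied to any particular phone platform:

```python
def project_point(point_cam, focal_px, cx, cy):
    """Project a 3D point given in camera coordinates (x right, y down,
    z forward, metres) to pixel coordinates with a pinhole model."""
    x, y, z = point_cam
    if z <= 0:
        return None  # behind the camera, nothing to draw
    u = cx + focal_px * x / z
    v = cy + focal_px * y / z
    return u, v

# Hypothetical 640x480 phone camera with a 500 px focal length.
# A virtual object 2 m ahead and 0.5 m to the right of the lens:
pixel = project_point((0.5, 0.0, 2.0), 500.0, 320.0, 240.0)  # -> (445.0, 240.0)
```

Real mobile AR pipelines obtain the camera-space point from marker- or sensor-based pose estimation; the projection step itself is the same.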

Some of the earliest applications emerging in this field are projects such as Enkin and Wikitude on the Google Android platform, Tonchidot's Sekai Camera on the iPhone platform, augmented reality marker-based games on the Nokia S60 platform, and Kweekies by int13.

Mobile Projective Augmented Reality
Mobile Projective Augmented Reality (MPAR) involves the use of a small handheld projector in combination with a camera and/or motion tracker. The virtual scene is projected onto the real world. The recent trend towards miniaturization of projection technology suggests that mobile phones with embedded nano-projectors will soon be available on the consumer market. This will enable new AR-based user interfaces that are currently not possible. A demo of possible applications of MPAR was presented by MIT's Fluid Interfaces group.

Ubiquitous computing
AR has clear connections with the ubiquitous computing (UC) and wearable computing domains. Mark Weiser stated that "embodied virtuality", the original term he used before coining "ubiquitous computing", was intended to express the exact opposite of the concept of virtual reality (Mark Weiser's personal communication, Boston, March 1993). The most salient distinction between AR and UC is that UC does not focus on the disappearance of conscious and intentional interaction with an information system as much as AR does: UC systems such as pervasive computing devices usually maintain the notion of explicit and intentional interaction, which often blurs in typical AR work such as Ronald Azuma's. The theory of Humanistic Intelligence (HI), however, also challenges this distinction. HI is intelligence that arises from the human being in the feedback loop of a computational process in which the human is inextricably intertwined, and it does not typically require conscious thought or effort. In this way HI, which arises from wearable computer-mediated reality, has much in common with AR.

Augmented stereoscopic reality
Stereoscopy can also be utilized in Augmented Reality in order to give the illusion of depth to digital 3D images that are projected alongside real-world objects.
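The depth illusion comes from presenting each eye a horizontally shifted view. As a rough sketch, using a simple similar-triangles model and hypothetical viewing geometry, the required on-screen parallax for a point rendered at a given depth is:

```python
def screen_parallax_mm(eye_sep_mm, screen_dist_mm, object_dist_mm):
    """Horizontal on-screen separation between the left- and right-eye
    images of a point, for a viewer at screen_dist_mm from the display.

    Positive = uncrossed parallax (object appears behind the screen),
    negative = crossed parallax (object appears in front of it).
    """
    return eye_sep_mm * (object_dist_mm - screen_dist_mm) / object_dist_mm

# 65 mm interocular distance, display 600 mm from the viewer:
behind = screen_parallax_mm(65, 600, 1200)  # -> 32.5 (rendered 1.2 m deep)
in_front = screen_parallax_mm(65, 600, 400)  # -> -32.5 (pops out of screen)
```

Shifting each eye's rendering of the virtual object by half this amount in opposite directions makes it appear at the intended depth alongside the real scene.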

Notable researchers

 * Steven Feiner is a leading pioneer of augmented reality and author of the first paper on the subject.
 * Bruce H. Thomas is the current Director of the Wearable Computer Laboratory at the University of South Australia. He is currently a NICTA fellow, CTO A-Rage Pty Ltd, Member of HxI team, and visiting Scholar with the Human Interaction Technology Laboratory, University of Washington. He is the inventor of the first outdoor augmented reality game ARQuake. His current research interests include: wearable computers, user interfaces, augmented reality, virtual reality, computer supported cooperative work (CSCW), and tabletop display interfaces.
 * Wayne Piekarski is the inventor of the Tinmith System.
 * Oliver Bimber and Ramesh Raskar are the leading researchers in the field of spatial augmented reality (SAR).

Main Computer Vision topics of Augmented Reality

 * 3D reconstruction
 * Bundle adjustment
 * Exponential map
 * Fiducial markers
 * Image registration
 * Structure from motion
 * Video tracking
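Several of these topics meet in the simplest form of video tracking: locating a known patch in each new frame. A toy brute-force sketch in pure Python on synthetic data; real trackers use far faster pyramid-, feature-, or prediction-based search, but the registration idea is the same:

```python
def ssd_track(frame, template):
    """Locate `template` in `frame` by exhaustive sum-of-squared-
    differences search; both are 2D lists of grey values.
    Returns (row, col) of the best match's top-left corner."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best, best_pos = None, None
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            ssd = sum(
                (frame[r + i][c + j] - template[i][j]) ** 2
                for i in range(th) for j in range(tw)
            )
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Tiny synthetic frame with a bright 2x2 patch at row 1, col 2:
frame = [
    [0, 0, 0, 0, 0],
    [0, 0, 9, 9, 0],
    [0, 0, 9, 9, 0],
    [0, 0, 0, 0, 0],
]
patch = [[9, 9], [9, 9]]
found = ssd_track(frame, patch)  # -> (1, 2)
```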

Examples
Commonly known examples of AR are the yellow "first down" line seen in television broadcasts of American football games, and the colored trail showing location and direction of the puck in TV broadcasts of hockey games. The real-world elements are the football field and players, and the virtual element is the yellow line, which is drawn over the image by computers in real time. Similarly, rugby fields and cricket pitches are branded by their sponsors using Augmented Reality; giant logos are inserted onto the fields when viewed on television.
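The first-down line illustrates the basic compositing step: the virtual graphic is blended only into pixels that match the field colour, so players naturally occlude it. A toy sketch of that keyed blend, using a crude green test and made-up colours; broadcast systems calibrate the key per stadium and lighting condition:

```python
def overlay_line(frame, line_cols, line_color=(255, 255, 0), alpha=0.6):
    """Blend a vertical virtual line into an RGB frame (nested lists of
    (r, g, b) tuples), skipping pixels that are not field-coloured so
    players appear to stand on top of the graphic (simple chroma key)."""
    def is_grass(px):
        r, g, b = px
        return g > r and g > b  # crude green test; real keyers calibrate this
    out = [row[:] for row in frame]
    for r, row in enumerate(frame):
        for c in line_cols:
            if is_grass(row[c]):
                out[r][c] = tuple(
                    round(alpha * lc + (1 - alpha) * pc)
                    for lc, pc in zip(line_color, row[c])
                )
    return out

green, white = (30, 120, 40), (200, 200, 200)
frame = [[green, white],
         [green, green]]
result = overlay_line(frame, line_cols=[1])
# The grass pixel in column 1 is tinted yellow; the white (player)
# pixel in the same column is left untouched.
```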

Another type of AR application uses projectors and screens to insert objects into the real environment, enhancing museum exhibitions for example. The difference from a simple TV screen is that these objects are related to the environment of the screen or display, and that they are often interactive as well.

Many first-person shooter video games simulate the viewpoint of someone using AR systems. In these games the AR can be used to give visual directions to a location, mark the direction and distance of another person who is not in line of sight, give information about equipment such as remaining bullets in a gun, and display a myriad of other images based on whatever the game designers intend.

Most of the possible applications of AR will, however, require personal display glasses. In some current applications, such as in cars or airplanes, the display is instead a head-up display integrated into the windshield.

Current applications

 * Advertising:
   * promoting a new product by providing an impressive and interactive AR application on the Internet
 * Support with complex tasks, in assembly, maintenance, surgery, etc.:
   * by inserting additional information into the field of view (for example, a mechanic getting labels displayed at parts of a system along with operating instructions)
   * by visualization of hidden objects (during medical diagnostics or surgery as a virtual X-ray view, based on prior tomography or on real-time images from ultrasound or open NMR devices; e.g., a doctor could "see" the fetus inside the mother's womb). See also Mixed Reality
 * Navigation devices:
   * in buildings, e.g. maintenance of industrial plants
   * outdoors, e.g. military operations or disaster management
   * in cars (head-up displays or personal display glasses showing navigation hints and traffic information)
   * in airplanes (head-up displays in fighter jets were among the first AR applications; these are now fully interactive as well, with eye pointing)
 * Military and emergency services (wearable systems showing instructions, maps, enemy locations, fire cells, etc.)
 * Prospecting in hydrology, ecology, and geology (display and interactive analysis of terrain characteristics; interactive three-dimensional maps that can be collaboratively modified and analyzed)
 * Visualization of architecture (virtual resurrection of destroyed historic buildings as well as simulation of planned construction projects)
 * Enhanced sightseeing: labels or any text related to the objects/places seen; rebuilt ruins, buildings, or even landscapes as seen in the past. Combined with a wireless network, the amount of data displayed is limitless (encyclopedic articles, news, etc.).
 * Simulation, e.g. flight and driving simulators
 * Collaboration of distributed teams:
   * conferences with real and virtual participants. See also Mixed Reality
   * joint work on simulated 3D models
 * Entertainment and education:
   * virtual objects in museums and exhibitions. See also Mixed Reality
   * theme park attractions (such as Cadbury World)
   * games (e.g. ARQuake or The Eye of Judgment). See also Mixed Reality

Future applications

 * Expanding a PC screen into the real environment: program windows and icons appear as virtual devices in real space and are operated by eye or gesture, i.e. by gazing or pointing. A single personal display (glasses) could concurrently simulate a hundred conventional PC screens or application windows all around a user.
 * Virtual devices of all kinds, e.g. replacement of traditional screens, control panels, and entirely new applications impossible in "real" hardware, like 3D objects interactively changing their shape and appearance based on the current task or need.
 * Enhanced media applications, like pseudo holographic virtual screens, virtual surround cinema, virtual 'holodecks' (allowing computer-generated imagery to interact with live entertainers and audience)
 * Virtual conferences in "holodeck" style
 * Replacement of cellphone and car navigator screens: eye-dialing, insertion of information directly into the environment, e.g. guiding lines directly on the road, as well as enhancements like "X-ray"-views
 * Virtual plants, wallpapers, panoramic views, artwork, decorations, illumination etc., enhancing everyday life. For example, a virtual window could be displayed on a regular wall showing a live feed of a camera placed on the exterior of the building, thus allowing the user to effectively toggle a wall's transparency.
 * With AR systems getting into mass market, we may see virtual window dressings, posters, traffic signs, Christmas decorations, advertisement towers and more. These may be fully interactive even at a distance, by eye pointing for example.
 * Virtual gadgetry becomes possible. Any physical device currently produced to assist in data-oriented tasks (such as the clock, radio, PC, arrival/departure board at an airport, stock ticker, PDA, PMP, informational posters/fliers/billboards, in-car navigation systems, etc.) could be replaced by virtual devices that cost nothing to produce aside from the cost of writing the software. Examples might be a virtual wall clock, a to-do list for the day docked by your bed for you to look at first thing in the morning, etc.
 * Subscribable group-specific AR feeds. For example, a manager on a construction site could create and dock instructions including diagrams in specific locations on the site.  The workers could refer to this feed of AR items as they work.  Another example could be patrons at a public event subscribing to a feed of direction and information oriented AR items.

Specific applications

 * LifeClipper, a wearable AR system
 * Characteroke, a portable AR display costume, whereby the head and neck are concealed behind an active flat panel display.
 * BBC's Merlin MagicSymbol, a free download (U.K. only) from BBC's Merlin site giving access to exclusive Merlin content
 * MARISIL, a media phone user interface based on AR
 * CyberCode, a visual tagging system where real-world objects are recognizable by a computer.
 * Wikitude, an application for the Android phone which turns Wikipedia into a location-based service: the live camera view is mixed with information from Wikipedia. ING offers an app with almost the same interface for locating the nearest ATM; see Springwise for more info.
 * Imgaugasse, a tractor engine's hydraulics block assembly project by VTT research center.

Advertising
At the 2008 LA Auto Show, Nissan unveiled the concept vehicle Cube and presented visitors with a brochure which, when held against a webcam, showed several versions of the vehicle interacting with the brochure. The brochure is also available from the company website http://www.nissanusa.com/cube/.

On 16 Dec 2008, at a Volvo Ocean Race 2008-2009 event, Volvo Car Malaysia demonstrated the same technology to its media partners with a 3D Volvo Open 70 racing yacht. This virtual 3D Volvo Open 70 racing yacht can now be built on their teaser website at: http://www.vcc.com.my/oceanrace/.

In January 2009, Toyota used Augmented Reality to provide an interactive demo of the new Toyota iQ. The program was created by Inition using their MagicSymbol system and can be downloaded from Toyota's website: http://www.toyota.co.uk/cgi-bin/toyota/bv/frame_start.jsp?id=iQ_reality

Conferences

 * 1st International Workshop on Augmented Reality (IWAR'98), San Francisco, Nov. 1998.
 * 2nd International Workshop on Augmented Reality (IWAR'99), San Francisco, Oct. 1999.
 * 1st International Symposium on Mixed Reality (ISMR'99), Yokohama, Japan, March 1999.
 * 2nd International Symposium on Mixed Reality (ISMR'01), Yokohama, Japan, March 2001.
 * 1st International Symposium on Augmented Reality (ISAR 2000), Munich, Oct. 2000.
 * 2nd International Symposium on Augmented Reality (ISAR 2001), New York, Oct. 2001.
 * 1st International Symposium on Mixed and Augmented Reality (ISMAR 2002), Darmstadt, Oct. 2002.
 * 2nd International Symposium on Mixed and Augmented Reality (ISMAR 2003), Tokyo, Oct. 2003.
 * 3rd International Symposium on Mixed and Augmented Reality (ISMAR 2004), Arlington, VA, Nov. 2004.
 * 4th International Symposium on Mixed and Augmented Reality (ISMAR 2005), Vienna, Oct. 2005.
 * 5th International Symposium on Mixed and Augmented Reality (ISMAR 2006) Santa Barbara, Oct. 2006.
 * 6th International Symposium on Mixed and Augmented Reality (ISMAR 2007) Nara, Japan, Nov. 2007.
 * 7th International Symposium on Mixed and Augmented Reality (ISMAR 2008) Cambridge, United Kingdom, Sep. 2008.