Augmented reality system useful for construction and beyond

1/23/2013

New technology delivers customized information just by taking a photo on a mobile device.

Written by

Yeh Center

Above, the MARS system in use on a smartphone.

What if you could get information about something immediately, just by taking its picture with your cell phone or iPad? Whether you were house-hunting, trying to jump-start your car, or managing a construction project, a wealth of customized information would be at your fingertips. That’s the promise of new technology developed by CEE Assistant Professor Mani Golparvar-Fard and collaborators Professor Jules White and graduate research assistant Hyojoon Bae of Virginia Tech. 

Mani Golparvar-Fard
The team’s Mobile Augmented Reality System (MARS) builds on core technology called HD4AR, an extension of work Golparvar-Fard developed as a Ph.D. student at Illinois under his advisers Feniosky Peña-Mora, now on the engineering faculty at Columbia University, and Silvio Savarese, an engineering professor at the University of Michigan. Using advanced computer vision and image-recognition techniques, together with a store of customized data on a server, the system gives the user access to detailed information presented as overlays on the photo. The stored information can be as elaborate as desired. A person looking at real estate could touch various features of a property and get information, additional photos or even video. Someone trying to jump-start a car could see step-by-step instructions referencing a photo of the user’s own engine. In the case of construction project monitoring, the user could access 3-D project models, project specifications or schedule information. 

The system can be customized for a variety of uses, but a proposed construction example might work like this: an initial physical 3-D model of the construction site is created from photographs and aligned to a cyber 3-D model on the MARS server. When a user uploads a new photo, MARS determines the user’s location by detecting features within the new image and comparing them to the physical model. With a high degree of accuracy, the system can pinpoint the user’s location without GPS or Wi-Fi and display all the relevant information about objects within the person’s view. Overlaid information can be targeted with millimeter-level precision. Mobile devices are plentiful on job sites, making it easy for construction personnel to use MARS for a variety of field activities, Golparvar-Fard said, such as looking up specifications of project elements, daily documentation on the job site, quality inspection and progress monitoring. 
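The matching step described above — comparing features in a freshly uploaded photo against those in the stored model — can be sketched in miniature. The toy example below is purely illustrative and is not HD4AR’s actual implementation: it assumes features are represented as packed 256-bit binary descriptors, simulates a "new photo" by copying a subset of model descriptors and flipping a few bits (standing in for viewpoint and lighting changes), and then matches each photo descriptor back to the model by smallest Hamming distance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stored model: 1,000 features, each a 256-bit binary
# descriptor packed into 32 bytes.
model_desc = rng.integers(0, 256, size=(1000, 32), dtype=np.uint8)

# Simulated "new photo": 50 features corresponding to model features 0..49,
# with roughly 1% of bytes getting one bit flipped to mimic image noise.
photo_desc = model_desc[:50].copy()
noise = (rng.random(photo_desc.shape) < 0.01).astype(np.uint8)
photo_desc ^= noise

def match(query, model):
    """Return the index of the model descriptor nearest to `query`
    by Hamming distance (number of differing bits)."""
    xor = np.bitwise_xor(query[None, :], model)     # (1000, 32) differing bytes
    dists = np.unpackbits(xor, axis=1).sum(axis=1)  # Hamming distance per row
    return int(np.argmin(dists))

# Each noisy photo descriptor should match back to its model counterpart.
matches = [match(d, model_desc) for d in photo_desc]
```

Because a noisy copy differs from its original by only a handful of bits while unrelated random descriptors differ by roughly half of their 256 bits, nearest-neighbor matching recovers the correct correspondences; a real localization pipeline would then use such 2-D-to-3-D correspondences to estimate the camera’s pose.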

Golparvar-Fard and his colleagues launched a start-up, PAR Works Inc., in August with a million-dollar investment from Allied Minds in Boston. The company was recognized at the 2013 Consumer Electronics Show in Las Vegas in January with an Innovations Design and Engineering Award in the Software and Mobile Apps category. A research paper by Golparvar-Fard, White and Bae, published in December in the proceedings of the 12th International Conference on Construction Applications of Virtual Reality 2012 in Taipei, Taiwan, won the conference’s Best Paper award.

The team is currently discussing pilot studies of MARS with Zachary Construction for a job in San Antonio, Texas, where more than 100 iPads are in use by project personnel. Similar discussions are under way with Turner Construction about testing the technology on high-profile projects such as the San Francisco 49ers stadium, the World Trade Center Transportation Hub in New York City, and the Istanbul Development Center in Turkey. 

The company is running a contest with a $15,000 prize for developers to dream up new applications or interesting concepts for the use of MARS, offering an opportunity for others to explore the truth behind PAR Works’ slogan: “Life is better on MARS.”

The fundamental research aspects of this technology are supported by the National Science Foundation.

This story was published January 23, 2013.