For my Master's thesis, I looked at different ways in which we can use a handheld device (like a smartphone) together with a head-mounted display for Augmented Reality.

Why?

Mobile phones offer significant advantages over head-mounted displays (HMDs) for AR: they have a higher pixel density, text is easier to read on a phone than on an HMD, text input is easier on a smartphone, and moving a cursor on the phone screen is easier than using a 1-D touch panel on HMDs like Google Glass.

Research Areas

1) Situation Adaptive Visualization Management

When using both HMD and HHD (Hand held device) as displays, the view can be cluttered with too much information. To avoid this, we investigated situation adaptive visualization where the information shown on the HMD and HHD depends on the relationship between the devices and device orientation. The overall goal is to provide different viewing functionalities in a single application while preserving the fluidity of the experience, and not requiring the user to explicitly change viewing modes.

To achieve this, we track the motion of both the HMD and the HHD to recognize their relative pose.
When the user looks down at the HHD, the HMD turns transparent to give the user a clearer view of the HHD. One example use of this is switching from the AR view shown on the HMD to a 2D map shown on the HHD. When the user looks away from the HHD, the HMD shows the AR view and the smartphone can be used as a gestural input device.
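A minimal sketch of how this switching logic could look, assuming both device poses are already being tracked. The `is_looking_at_hhd` helper, the angle threshold, and the `hmd`/`hhd` display objects are illustrative placeholders, not the thesis implementation:

```python
# Sketch of the situation-adaptive switching logic, assuming we already
# receive a tracked pose (position + forward vector) for both devices.
# Names and the threshold value are assumptions for illustration.
import numpy as np

LOOK_AT_ANGLE_DEG = 25.0  # assumed threshold for "looking down at the HHD"

def is_looking_at_hhd(hmd_pos, hmd_forward, hhd_pos, threshold_deg=LOOK_AT_ANGLE_DEG):
    """Return True if the HMD's view direction points at the handheld device."""
    to_hhd = hhd_pos - hmd_pos
    to_hhd = to_hhd / np.linalg.norm(to_hhd)
    forward = hmd_forward / np.linalg.norm(hmd_forward)
    angle = np.degrees(np.arccos(np.clip(np.dot(forward, to_hhd), -1.0, 1.0)))
    return angle < threshold_deg

def update_displays(hmd, hhd, hmd_pos, hmd_forward, hhd_pos):
    """Switch visualization roles without an explicit mode-change command."""
    if is_looking_at_hhd(hmd_pos, hmd_forward, hhd_pos):
        hmd.set_transparent(True)      # clear the HMD so the phone is visible
        hhd.show("2d_map")             # e.g. the 2D map view on the phone
    else:
        hmd.set_transparent(False)
        hmd.show("ar_view")            # AR overlay on the HMD
        hhd.enable_gesture_input()     # phone acts as a gestural controller
```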

The video below shows the different configurations:

2) Using Gestures on the Smartphone to Interact with Virtual Elements Displayed on the HMD

When the user is looking at an AR scene and wants to move the 3D model around, they can use the phone as a controller to move or scale the augmentations. The following video shows the four gestures that I implemented in my thesis.

The last gesture in the video is the cross-dimensional gesture, which shows additional information on the HHD about the objects in the HMD view. Swiping is similar to dragging an object from one display onto another and feels natural with a monocular HMD and an HHD.
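As a rough illustration of how touch gestures could map onto manipulations of the selected augmentation, here is a sketch in Python. The gesture object, the `selected_object` interface, and the sensitivity constant are assumptions for the example, not the exact parameters used in the thesis:

```python
# Illustrative mapping from phone touch events to manipulations of the
# object currently selected in the HMD view.

DRAG_TO_METERS = 0.002   # assumed sensitivity: screen pixels -> world meters

def handle_gesture(gesture, selected_object, hhd):
    if gesture.kind == "drag":
        # Translate the augmentation in the plane facing the user.
        dx, dy = gesture.delta_px
        selected_object.translate(dx * DRAG_TO_METERS, -dy * DRAG_TO_METERS, 0.0)
    elif gesture.kind == "pinch":
        # Scale the augmentation by the pinch ratio.
        selected_object.scale(gesture.scale_factor)
    elif gesture.kind == "rotate":
        # Two-finger twist rotates the object around its vertical axis.
        selected_object.rotate_y(gesture.angle_deg)
    elif gesture.kind == "swipe_down":
        # Cross-dimensional gesture: "pull" the object's details onto the phone.
        hhd.show_details(selected_object)
```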

3) Multi-layered Visualization

When the user holds the HHD in an upright position, it can be used to show additional details of the virtual object on the HMD, or for multi-layered visualization.

In our scenario, the user can hold the HHD in front of their face, visible through the HMD, while the handheld display shows another style of visualization (e.g. x-ray vision) of the current AR scene. This is useful in situations where you want to look at different versions of the virtual objects in the AR scene, for example the interior of a building or an older version of an augmented building. The HHD thus works like a portable magic lens [6] that, when brought in front of the user's view, "augments the augmented" scene.
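A sketch of the magic-lens rendering idea, assuming the four corners of the HHD are already tracked in the HMD camera's coordinate frame; the `scene`, `hmd_camera`, and stencil-mask calls stand in for whatever rendering engine is used:

```python
# Rough sketch of a two-pass "magic lens" render, not the actual prototype code.
def render_magic_lens(scene, hmd_camera, hhd_corners_3d):
    # Project the physical HHD corners into HMD screen space.
    lens_polygon = [hmd_camera.project(c) for c in hhd_corners_3d]

    # First pass: the normal AR visualization everywhere.
    scene.render(style="default", camera=hmd_camera)

    # Second pass: re-render only inside the lens region with the alternate
    # style (e.g. x-ray of the building interior, or an older model version).
    with hmd_camera.stencil_mask(lens_polygon):
        scene.render(style="xray", camera=hmd_camera)
```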

The images below show the idea for this hybrid system.

The image below shows a view from a prototype system which we built to demonstrate the multi-layered visualization. The colored papers attached to the corners of the HHD are used for tracking it, while the large display in the background shows the view on the HMD.
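For illustration, this kind of color-marker tracking could be approximated along the following lines using OpenCV: threshold each corner color in HSV, take the blob centroids as image points, and recover the phone's pose with solvePnP. The HSV ranges, phone dimensions, and camera intrinsics here are placeholder values, not those of the prototype:

```python
# A minimal sketch of marker-based HHD tracking similar in spirit to the prototype.
import cv2
import numpy as np

PHONE_W, PHONE_H = 0.07, 0.14  # assumed phone dimensions in meters
OBJECT_POINTS = np.array([      # corner positions in the phone's own frame
    [0, 0, 0], [PHONE_W, 0, 0], [PHONE_W, PHONE_H, 0], [0, PHONE_H, 0]
], dtype=np.float32)

# One HSV range per corner color (placeholder values).
CORNER_HSV_RANGES = [
    ((0, 120, 120), (10, 255, 255)),    # red
    ((50, 120, 120), (70, 255, 255)),   # green
    ((100, 120, 120), (130, 255, 255)), # blue
    ((20, 120, 120), (35, 255, 255)),   # yellow
]

def find_corner(hsv, lo, hi):
    """Return the centroid of the largest blob within the given HSV range."""
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def track_hhd(frame_bgr, camera_matrix, dist_coeffs):
    """Estimate the HHD pose from one camera frame, or return None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    corners = [find_corner(hsv, lo, hi) for lo, hi in CORNER_HSV_RANGES]
    if any(c is None for c in corners):
        return None
    image_points = np.array(corners, dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, image_points,
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```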
