Tuesday, February 22, 2011

Paper Reading #10: Enabling beyond-surface interactions for interactive surface with an invisible projection

Reference Information:

Title: Enabling beyond-surface interactions for interactive surface with an invisible projection

Authors: Li-Wei Chan, Hsiang-Tao Wu, Hui-Shan Kao, Ju-Chun Ko, Home-Ru Lin, Mike Y. Chen, Jane Hsu, Yi-Ping Hung, all from National Taiwan University, Taipei, Taiwan (ROC)

Presentation Venue: UIST '10, Proceedings of the 23rd annual ACM symposium on User Interface Software and Technology

Summary:

The researchers present a programmable infrared (IR) technique that lets mobile devices interact with a multi-touch tabletop and with tangible objects on its surface. The system uses both an infrared projector and a visible-light projector to display visible content alongside invisible markers. Infrared cameras attached to the mobile devices perceive these markers and compute the devices' positions in 3-space. This is supplemented by three tools, which the researchers call i-m-View, i-m-Lamp, and i-m-Flashlight, that users can also interact with.
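The tracking idea above, where an IR camera spots invisible markers and localizes the device over the table, can be sketched in miniature. This is my own illustrative Python, not the authors' code: the homography `H` and its values are made-up stand-ins for a real calibration, and a real system would first estimate `H` from several marker correspondences and then recover a full 6-DoF pose.

```python
# Illustrative sketch only -- not the paper's implementation.
# Simplified 2D version of marker-based localization: map a detected
# marker's image coordinates to table coordinates through a known
# 3x3 homography H.

def apply_homography(H, x, y):
    """Map image point (x, y) to table coordinates using homography H."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Hypothetical calibration: table coords = 2 * image coords + an offset.
H = [[2.0, 0.0, 10.0],
     [0.0, 2.0, 20.0],
     [0.0, 0.0, 1.0]]

print(apply_homography(H, 5.0, 5.0))  # -> (20.0, 30.0)
```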


Early results suggest that users enjoy working with the interface. Users reported that the larger displays made navigating the tools easier. They also intuitively wanted to drag the map around during use, a feature not included in the current trial; the researchers thought it was a good idea and plan to implement it in the future.


The researchers also revealed some problems the users reportedly experienced with the interface. One was a sense of disconnection from other users while focused on the tabletop system. The researchers plan to explore ways to minimize this phenomenon; their goal is to better facilitate group interaction with the system.

Discussion:


I think this goes a long way toward making certain science fiction interfaces we have seen a reality, and it could make for some very interesting user interface designs. It could potentially free users from cumbersome equipment and allow more freedom of use. It could also let natural gestures and body language be incorporated into the set of interface tools the system recognizes, making the user's experience more organic.


This type of interface could lead to many distinct applications. With 3D projection, I can see how a geneticist could manipulate DNA strands floating in the air in front of them. An architect could have a 3D projection of their latest building that they could move, scale, and rotate to view from different angles, with different environmental conditions applied in real time. For game development, modelers could sculpt some of their models with their fingers as if they were working in clay.
