Chih-Pin Hsiao, Nicholas Davis, Jeremy Duvall, Megha Sandesh, Ravi Karkar
3D GeM explores the potential practices of an immersive work environment afforded by the convergence of 3D modes of interaction and vision-based user interfaces. Our system applies these technologies in a novel combination to support a redesigned, efficient 3D workspace for daily use. Traditional desktop user interfaces restrict the manipulation of objects to the 2D paradigm. There have been several attempts at 3D gesture input using technologies such as data gloves and pen-based markers, but these are cumbersome to use and often unintuitive for the user.
Our design scenario for the Workspace assumes a single user interacting with the system at a desk, with one depth camera capturing user input and two projectors forming the unified display of the workspace. In this paper we present our design rationale and describe how the user can interact with the workspace in 3D with bare hands. Emerging research shows that this style of interaction gives users an interactive environment in which UbiComp systems can be deployed [lightspace, omni touch, 6th sense]. Many projects also adopt projected images to assist users in this paradigm. In this project, we both explore another possibility for this style of interaction and discuss ways in which projected images can help users.
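To make the sensing pipeline concrete, the bare-hand input over a desk can be approximated as a depth band between the camera's near range and the desk surface. The following is a minimal sketch of that idea, assuming a depth frame in millimetres and hypothetical threshold values; it is illustrative only and not the system's actual implementation.

```python
import numpy as np

def segment_hand(depth_mm, near_mm=400, far_mm=900):
    """Boolean mask of pixels inside the assumed interaction volume.

    Pixels closer than the desk surface (far_mm) but beyond the
    camera's minimum range (near_mm) are treated as candidate
    hand pixels. Thresholds are placeholder values.
    """
    return (depth_mm > near_mm) & (depth_mm < far_mm)

def hand_centroid(mask):
    """Centroid (row, col) of the masked pixels, or None if empty."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Synthetic 480x640 depth frame: desk at 1000 mm, a "hand" blob at 600 mm.
frame = np.full((480, 640), 1000, dtype=np.uint16)
frame[200:240, 300:360] = 600

mask = segment_hand(frame)
print(hand_centroid(mask))  # prints the blob's approximate centre
```

A real system would refine this mask with connected-component filtering and track the hand across frames, but the depth-band test captures the basic idea of segmenting user input above the desk.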