September 26th, 2013 | Published in Google Research
Previously, we presented Deep Shot, a system that allows a user to “capture” an application (such as Google Maps) running on a remote computer monitor via a smartphone camera and take the application on the go. Today, we’d like to discuss how we support the opposite process, i.e., transferring mobile content to a remote display, again using the smartphone camera.
Although the computing power of today’s mobile devices grows at an accelerating rate, the form factor of these devices remains small, which constrains both the input and output bandwidth for mobile interaction. To address this issue, we investigated how to enable users to leverage nearby input/output resources to operate their mobile devices. As part of this effort, we developed Open Project, an end-to-end framework that allows a user to “project” a native mobile application onto an arbitrary display using a smartphone camera, leveraging the interaction space and input modalities of the display. The display can range from a PC or laptop monitor to a home Internet TV to a public wall-sized display. Via an intuitive, projection-based metaphor, a user can easily share a mobile application by projecting it onto a target display.
Open Project is an open, scalable, web-based framework for enabling mobile sharing and collaboration. It can make any computer display projectable instantly, without requiring any prior deployment. Developers can add Open Project support to native mobile apps by simply linking a library, with no additional hardware or sensors required. Participants in our user studies responded very positively to Open Project-enabled applications for mobile sharing and collaboration.
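To make the “web-based, no deployment” idea concrete, here is a minimal illustrative sketch of a pairing relay, under assumed names and a simplified flow that is not the actual Open Project implementation: a display (any browser that opens a projection page) registers with a relay service and receives a session ID it can show on screen; the phone, after capturing that ID with its camera, streams projected UI frames into the same session.

```python
# Hypothetical sketch of a projection relay (all names are illustrative;
# this is not the real Open Project protocol). A "display" is any browser
# that registers and shows its session ID on screen; the phone reads the
# ID via its camera and routes projected UI frames through the relay.
import secrets


class ProjectionRelay:
    def __init__(self):
        self.sessions = {}  # session_id -> queued projected frames

    def register_display(self):
        """Called when a display opens the projection web page."""
        session_id = secrets.token_hex(4)
        self.sessions[session_id] = []
        return session_id  # shown on the display, e.g., as a visual code

    def project_frame(self, session_id, frame):
        """Called by the phone after camera-capturing the display's ID."""
        if session_id not in self.sessions:
            raise KeyError("unknown display session")
        self.sessions[session_id].append(frame)

    def next_frame(self, session_id):
        """Polled by the display page to fetch the next frame to render."""
        frames = self.sessions[session_id]
        return frames.pop(0) if frames else None


relay = ProjectionRelay()
sid = relay.register_display()       # display shows this ID on screen
relay.project_frame(sid, "frame-0")  # phone streams a UI frame to it
relay.next_frame(sid)                # display dequeues "frame-0" to render
```

Because the display side is only a web page talking to a relay, no software needs to be installed on the target machine, which is what lets an arbitrary monitor become projectable instantly.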