Zooming and panning are the most basic forms of interaction for navigating and exploring user interfaces and information spaces. Mobile devices typically rely on touch gestures such as "drag" or "pinch" to accomplish these tasks. Several approaches have shown that the spatial movement of a mobile device can serve as an alternative to touch input. In this paper, Project Tango is used as the basis for implementing a software module for the recognition of spatial gestures. The resulting prototypes handle a basic gesture vocabulary without relying on additional stationary tracking systems, demonstrating that mobile devices can act as tangible views. This provides information visualizations with an additional, natural means of interaction. Even though the integration of spatial gesture recognition into mobile devices is still in its infancy, it opens up many promising fields of research.