Tuesday, October 7, 2014

Microsoft Research develops FlexSense, RichReview and In-Air Gestures, which could radically improve how we use our mobile devices

More than a thousand scientists at Microsoft Research work on technological breakthroughs across many areas worldwide. Numerous interesting concepts are developed; some become applications or products, while others never leave the lab. Microsoft Research has recently uploaded three videos about new research that could change the way we use mobile devices in the future.


1. FlexSense: A Transparent Self-Sensing Deformable Surface


FlexSense is a thin, transparent, flexible film with piezoelectric sensors on its surface that can sense and accurately reconstruct complex, continuous deformations without cameras or other external sensing devices, enabling paper-like interactions on mobile devices. Sixteen transparent piezoelectric sensors placed around the periphery of the sheet provide sixteen measurements, from which the detailed shape of the sheet's deformation is reconstructed, so every bend and twist of the sheet is mirrored accurately on the screen.
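To make the idea concrete, here is a minimal, purely illustrative Python sketch of how sixteen sparse sensor readings might be mapped to a coarse deformation grid. The linear mapping, grid size and random weights are assumptions for illustration only; they are not FlexSense's actual reconstruction algorithm, which learns its model from training data.

# Illustrative sketch (not the FlexSense implementation): reconstructing a
# coarse deformation grid from 16 piezoelectric sensor readings via a
# learned linear map. The weight matrix W and grid size are placeholders.
import numpy as np

NUM_SENSORS = 16          # sensors on the periphery of the film
GRID_H, GRID_W = 8, 12    # hypothetical resolution of the reconstructed surface

# In practice such a mapping would be learned from training data captured
# with an external ground-truth system; random weights stand in for it here.
W = np.random.randn(GRID_H * GRID_W, NUM_SENSORS) * 0.01

def reconstruct_surface(sensor_readings):
    """Map 16 raw sensor values to a height map describing the sheet's bend."""
    readings = np.asarray(sensor_readings, dtype=float)
    heights = W @ readings                     # linear reconstruction
    return heights.reshape(GRID_H, GRID_W)     # 2.5D deformation grid

# Example: a single frame of simulated sensor data
frame = np.random.uniform(-1.0, 1.0, NUM_SENSORS)
surface = reconstruct_surface(frame)
print(surface.shape)  # (8, 12)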


FlexSense can be used for several 2.5D interactions, such as a transparent cover for tablets where bending and touch create magic-lens effects, layered input and mode switching. It can also serve as a high degree-of-freedom input controller for gaming and other activities.



 


2. RichReview: Blending Ink, Speech, and Gesture to Support Collaborative Document Review


RichReview is a document annotation system designed to support the kind of rich communication that happens in face-to-face meetings. Users create notes on digital documents using three modalities: freeform inking with a lightweight pen, voice, and deictic gestures, with the ink and gestures synchronized to the voice recording. This makes it useful for annotating documents while reading or reviewing them, and users can mix multiple modalities at the same time. Annotations can also be edited or removed later if needed.
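As an illustration of the synchronization idea, the sketch below timestamps each ink stroke or gesture against the voice recording so it can be replayed at the right moment. The class and method names are hypothetical and are not RichReview's actual data model.

# Illustrative sketch only: one way to keep ink and gesture annotations
# synchronized with a voice note, as RichReview-style review tools do.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Annotation:
    kind: str                      # "ink", "gesture", or "voice"
    start_time: float              # seconds into the voice recording
    points: List[Tuple[float, float]] = field(default_factory=list)

@dataclass
class ReviewNote:
    audio_file: str                # path to the recorded speech
    annotations: List[Annotation] = field(default_factory=list)

    def add_ink(self, t, stroke):
        self.annotations.append(Annotation("ink", t, stroke))

    def replay_at(self, t):
        """Return annotations that should be visible at playback time t."""
        return [a for a in self.annotations if a.start_time <= t]

note = ReviewNote("comment_on_page3.wav")
note.add_ink(2.5, [(10, 20), (15, 25)])      # a freeform stroke 2.5 s in
print(len(note.replay_at(3.0)))              # 1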




3. In-Air Gestures Around Unmodified Mobile Devices


In-Air Gestures is a machine-learning technique for extending the interaction space around mobile devices. It uses the device's ordinary RGB camera, and the algorithm recognizes a wide variety of in-air gestures across different users and lighting conditions. It runs in real time on unmodified mobile devices, including smartphones and smartwatches. The gestures augment the touchscreen rather than replace it, and many tasks become easier to perform in the air, including mode switches, menu selection, application and task management, and certain navigation actions. By expanding the input area into the 3D space around the device, the technique sidesteps the limitations of a small touchscreen. Various hand poses can be used as in-air gestures to control activities such as maps, games or drawing.
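The sketch below shows the general shape of such a system: a per-frame loop that grabs RGB camera frames and hands them to a gesture classifier. The classifier is a stub and the OpenCV capture loop is an assumption for illustration; this is not Microsoft's algorithm, which relies on a trained model robust to user variation and lighting.

# Rough sketch, not Microsoft's algorithm: a per-frame loop that classifies
# in-air hand gestures from an unmodified device's RGB camera.
import cv2  # OpenCV, assumed available on the device

GESTURES = ["none", "swipe_left", "swipe_right", "pinch", "open_palm"]

def classify_frame(frame):
    """Placeholder: a trained model would map the frame's pixels to a gesture."""
    # e.g. model.predict(preprocess(frame)); returning "none" keeps this runnable
    return "none"

def gesture_loop(camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    try:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            gesture = classify_frame(frame)
            if gesture != "none":
                # Hand the gesture to the UI: mode switch, menu selection, etc.
                print("detected:", gesture)
    finally:
        cap.release()

if __name__ == "__main__":
    gesture_loop()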



