Microsoft opened the doors of its Beijing research center to celebrate the division's 15-year history in the region. In this video report, we'll look at a few of the projects Microsoft showcased during its open house.
Among the projects we saw was one that employed a Kinect sensor to translate sign language. In another project, 3D graphics on a screen were rendered in full detail wherever the user was looking, while other areas were rendered in less detail to make better use of the GPU. A third project tracked facial expressions using an ordinary webcam and mimicked them with an on-screen character. And finally, using a smartphone, researchers extended traditional haptic feedback so that certain areas of the screen produce a sensation of friction; the technique could eventually help blind people use touchscreen devices.