Project Soli and Multimodal Interaction Futures
June 2015 focused on analyzing Google's Project Soli announcement and exploring how radar-based gesture recognition could integrate with visual touch concepts to create more robust multimodal...
May 2015 brought significant progress in combining eye gaze tracking with visual touch interaction, exploring how these complementary input modalities could work together to create more natural and...
March 2015 focused on integrating eye tracking technology with visual touch systems, exploring whether gaze data could enhance touch interaction accuracy and user experience.
February 2015 focused on evaluating high-resolution cameras for visual touch typing applications while finalizing patent drawings and completing Intel RealSense SDK migration work.
January 2015 brought experimentation with ultra-wide-angle fisheye lenses and alternative camera approaches, exploring whether extreme fields of view could solve the proximity sensing challenges...
December 2014 centered on migrating the Visual Touchscreens codebase from Intel's Perceptual Computing SDK to the new RealSense 2014 platform—a technically necessary but time-consuming transition.