June 2014 marked a transformation from technical experimentation to product presentation. The month involved refining prototypes, creating professional demonstration videos, establishing the Visual Touchscreens brand, and beginning outreach to potential partners.

Calibration Refinement

Early June focused on improving the accuracy of touch point correlation. The depth camera sensors were remounted in the correct orientation relative to the touch panels after misalignment issues were discovered. Even with proper positioning, the visual hand representations did not always align perfectly with touch points across the entire screen surface. This was acknowledged as a limitation of the current sensor technology rather than a solvable software problem.

A significant improvement came from addressing glass reflections. Initial attempts to filter reflections by setting minimum distance thresholds proved impractical, since distances varied too much across the surface. The solution was a reflection mask: capturing the reflection pattern while no hands were present and storing it as part of the calibration settings. Because XAML serialization handles multi-dimensional arrays poorly, the mask was stored as a single-dimension array.
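
A minimal sketch of that masking approach, assuming depth frames arrive as flat arrays of millimetre values; the class and member names below are illustrative rather than the project's actual code:

```csharp
using System;

// Illustrative sketch only: names and tolerances are assumptions, not the original code.
public class CalibrationSettings
{
    public int MaskWidth { get; set; }
    public int MaskHeight { get; set; }

    // XAML serialization copes poorly with multi-dimensional arrays, so the
    // 2D mask is flattened into a single-dimension array (row-major order).
    public ushort[] ReflectionMask { get; set; }

    public ushort MaskAt(int x, int y) => ReflectionMask[y * MaskWidth + x];
}

public static class ReflectionMasking
{
    // Capture the reflection pattern from a frame taken while no hands are present.
    public static CalibrationSettings BuildMask(ushort[] emptyFrame, int width, int height)
    {
        return new CalibrationSettings
        {
            MaskWidth = width,
            MaskHeight = height,
            ReflectionMask = (ushort[])emptyFrame.Clone()
        };
    }

    // Suppress pixels whose depth matches the stored reflection within a small tolerance.
    public static void Apply(ushort[] liveFrame, CalibrationSettings settings, int toleranceMm = 15)
    {
        for (int i = 0; i < liveFrame.Length; i++)
        {
            int masked = settings.ReflectionMask[i];
            if (masked != 0 && Math.Abs(liveFrame[i] - masked) <= toleranceMm)
                liveFrame[i] = 0;   // treated as background rather than a hand
        }
    }
}
```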

The calibration process was streamlined to use keyboard and mouse shortcuts rather than touch gestures (a minimal input-handler sketch follows the list):

  • Left mouse click for top-left mapping rectangle
  • Right mouse click for bottom-right mapping rectangle
  • ‘C’ key to clear reflection mask
  • ‘M’ key held down to define reflection mask from multiple scans
  • ‘B’ key to set bottom/minimum distance after holding a business card against the panel
  • ‘T’ key to set top/maximum distance with hand at cutoff level
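
A minimal sketch of how those shortcuts might be wired into the WPF window's input handlers; the window class and the stub methods are hypothetical stand-ins for the real calibration logic:

```csharp
using System.Windows;
using System.Windows.Input;

// Sketch only: the stub methods stand in for calibration logic not shown here.
public class CalibrationWindow : Window
{
    private Point mappingTopLeft;
    private Point mappingBottomRight;

    public CalibrationWindow()
    {
        MouseDown += OnMouseDown;
        KeyDown += OnKeyDown;
    }

    private void OnMouseDown(object sender, MouseButtonEventArgs e)
    {
        if (e.ChangedButton == MouseButton.Left)
            mappingTopLeft = e.GetPosition(this);        // top-left of the mapping rectangle
        else if (e.ChangedButton == MouseButton.Right)
            mappingBottomRight = e.GetPosition(this);    // bottom-right of the mapping rectangle
    }

    private void OnKeyDown(object sender, KeyEventArgs e)
    {
        switch (e.Key)
        {
            case Key.C: ClearReflectionMask(); break;
            case Key.M: AccumulateReflectionScan(); break;  // KeyDown repeats while 'M' is held
            case Key.B: SetMinimumDistance(); break;        // business card held against the panel
            case Key.T: SetMaximumDistance(); break;        // hand at the cutoff level
        }
    }

    // Stubs standing in for the real calibration operations.
    private void ClearReflectionMask() { }
    private void AccumulateReflectionScan() { }
    private void SetMinimumDistance() { }
    private void SetMaximumDistance() { }
}
```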

Visual feedback was enhanced through a color gradient: purple for far distances, shifting toward blue as the hand approached the surface. A later refinement added transparency, allowing users to see the content beneath their hands, which proved a more effective interaction model.
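
The mapping itself can be expressed as a small depth-to-colour function. The sketch below assumes the calibrated minimum and maximum distances described earlier; the exact gradient endpoints and alpha value are illustrative choices:

```csharp
using System;
using System.Windows.Media;

public static class DepthColoring
{
    // Sketch of the distance-to-colour mapping; gradient endpoints and alpha are assumptions.
    public static Color DepthToColor(double depthMm, double minMm, double maxMm)
    {
        // 0.0 = hand at the panel surface, 1.0 = at the far cutoff distance.
        double t = (depthMm - minMm) / (maxMm - minMm);
        t = Math.Max(0.0, Math.Min(1.0, t));

        // Far distances render purple, blending toward blue as the hand approaches.
        byte red  = (byte)(128 * t);
        byte blue = 255;

        // Semi-transparent so the content beneath the hand remains visible.
        const byte alpha = 150;

        return Color.FromArgb(alpha, red, 0, blue);
    }
}
```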

Calibration user interface with keyboard shortcuts

Color gradient visualization from purple to blue

Transparency mode showing hand overlay on content

Leap Motion Platform Testing

Testing the next Leap Motion V2 beta release on the acrylic prototype stands showed reduced reliability with the touch panel in place. Removing the touch panel dramatically improved performance, confirming that glass reflections adversely affected both the depth scanner and Leap Motion approaches. This insight fed into subsequent prototype design decisions.

Professional Video Production

Demonstration video creation became a major focus mid-month. A mini PC running Windows 8.1 was configured specifically for demos. A TV trolley and coffee table stand were purchased to enable professional-looking video recording with clean backgrounds. The Epson Moverio BT-200 AR glasses were set up successfully with iDisplay for screen mirroring, enabling augmented reality demonstrations.

Camtasia was upgraded to version 8.4 across multiple machines for video editing. Recording sessions involved careful audio setup with a Samson microphone, multiple takes to get levels right, and extensive post-production editing. The goal: create compelling visual demonstrations that could communicate the technology’s potential to non-technical audiences.

Professional demonstration setup for video recording


Video recording session in progress

Multiple videos were produced showcasing different use cases: TV tablet interaction, AR glasses integration, and various touch interface scenarios. These videos would become the primary communication tool for outreach efforts.

Brand Identity and Web Presence

The name “Surface Symbolics” proved problematic—too abstract and difficult for average people to understand. Extensive domain research led to the acquisition of multiple Visual Touch-related domains:

  • visualtouchtv.com
  • visualtouchglasses.com
  • visualtouchdisplay.com
  • visualtouchscreens.com
  • visualtouchcomputing.com
  • visualtouchscreen.com (negotiated purchase)

The business name “Visual Touchscreens” was registered and established as the primary brand. A new website was developed on Squarespace featuring the demonstration videos hosted on Wistia with analytics tracking. The site was designed to quickly show real-world prototype demonstrations rather than leading with abstract concepts.

Competition Entry

The Intel Make It Wearable competition presented an opportunity to gain visibility. The application required articulating the business case, target users, technical differentiation, and hardware requirements, including the mandated use of an Intel system-on-a-chip from the Quark or Atom families. Preparing this submission forced clarity around product positioning and market opportunity. The application deadline of June 24th drove focused effort on both written materials and video submissions.

Deployment Challenges

Making the depth camera tablet application deployable proved surprisingly difficult. Initial attempts with the WiX installer toolset stalled against its steep learning curve. Advanced Installer eventually proved more successful, though resolving runtime DLL dependencies required careful attention. Getting the installer working on test machines was a prerequisite for demonstrations on different hardware.

Platform Expansion Ideas

Research into the Amazon Fire Phone and its development by Lab126 suggested potential platform opportunities. The device’s additional cameras and sensors aligned well with visual touch concepts, representing an interesting direction for future integration.

The month closed with professional demonstration materials ready and brand identity established. The technical work of months prior had been packaged into polished demonstration videos and a functional website showcasing the Visual Touchscreens technology.