Crowd-Funded Startup Develops Touchless Controller

Flow, a programmable wireless controller with motion detection, proves a great funding success on Indiegogo.

Flow is expected to offer out-of-the-box control functions for AutoCAD and Rhino, among others.

On Indiegogo, the campaign to fund Flow, described as “the world’s most magical controller,” has gone extremely well. The team, a group with a mix of electrical engineering and industrial design backgrounds, set out to raise U.S. $50,000. Instead, it raised more than $252,000. The project has also attracted mainstream press coverage from tech-spotting sites like TechCrunch and Engadget.

The so-called magic may be in Flow’s touch-free approach. It’s a programmable wireless controller that detects and responds to motions and gestures. Philip Michaelides, the team’s technical lead, has “worked on quantum computers in Berkeley, worked for companies like Audi, and has built microchips from scratch,” according to the team bio. He explains, “The gesture recognition happens at a low level on the IC (integrated circuit) directly. We implemented this in the firmware so the recognition runs really fast. For us it is important that the user does not feel a time delay or need high processing power (like the Leap Motion) to use Flow. Furthermore, energy consumption and BLE (Bluetooth low energy) timings demand low-level integration of gestures. The sensor works like a beam that gets reflected by objects like your hand. Gestures like variation of distance and swipes in different directions are possible.”
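
To make the quoted approach more concrete, here is a minimal, purely illustrative Python sketch of how swipe direction and distance changes could be inferred from reflected-IR readings. Flow’s actual firmware runs on the IC and has not been published; the two-zone sensor layout, thresholds, and gesture names below are assumptions for illustration only.

    # Illustrative only: not Flow's firmware. Assumes a hypothetical two-zone
    # IR sensor that reports (left, right, total) reflection intensities.

    def classify_gesture(samples):
        """Classify a gesture from a time series of (left, right, total) readings."""
        if len(samples) < 2:
            return None
        totals = [s[2] for s in samples]
        # A sustained rise or fall in total reflection reads as a change in distance.
        if totals[-1] > totals[0] * 1.5:
            return "approach"
        if totals[-1] < totals[0] * 0.67:
            return "withdraw"
        # For swipes, compare when each zone saw its peak reflection:
        # the zone that peaks first is the side the hand entered from.
        left_peak = max(range(len(samples)), key=lambda i: samples[i][0])
        right_peak = max(range(len(samples)), key=lambda i: samples[i][1])
        if left_peak < right_peak:
            return "swipe_left_to_right"
        if right_peak < left_peak:
            return "swipe_right_to_left"
        return None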

The 360-degree, dome-like sensor uses infrared to pick up hand gestures, so natural hand movements can be programmed to trigger a variety of functions. For photo-editing programs like Photoshop or smart music players, Flow could offer new ways to adjust color contrast or browse playlists simply by waving a hand in the air. Flow’s designers are promising out-of-the-box controls for, among others, AutoCAD and Rhino.
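
As a rough illustration of that programmability, the sketch below maps recognized gestures to per-application commands. Flow ships with its own companion software; the application names, gesture names, and commands here are hypothetical and only show the general idea of user-assignable bindings.

    # Hypothetical host-side sketch of user-assignable gesture bindings.
    # None of these names come from Flow's actual software.

    GESTURE_BINDINGS = {
        "photoshop": {
            "swipe_left_to_right": "increase_contrast",
            "swipe_right_to_left": "decrease_contrast",
            "approach": "zoom_in",
            "withdraw": "zoom_out",
        },
        "music_player": {
            "swipe_left_to_right": "next_track",
            "swipe_right_to_left": "previous_track",
        },
    }

    def dispatch(active_app, gesture):
        """Look up and report the command bound to a gesture for the app in focus."""
        command = GESTURE_BINDINGS.get(active_app, {}).get(gesture)
        if command:
            print(f"{active_app}: {command}")
        return command

    dispatch("music_player", "swipe_left_to_right")  # prints "music_player: next_track"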

In gaming and entertainment, the Wii’s motion-triggered games make it possible for players to interact with or control virtual objects without a joystick or a mouse. Can motion-triggered interaction be used to, say, perform 3D modeling tasks? Many engineers and designers remain skeptical of the possibility, but Michaelides says, “Our industrial designer uses Flow a lot for Grasshopper, which is a plugin for [surface-modeling program] Rhino. Flow helps a lot when sliders need to be adjusted, geometries baked, or options switched. We were doing a lot of user testing over the last couple of months and tried a huge variety of shapes, buttons, and sliders. We learned a lot from the feedback of our users and improved our design and sensor layout. This all took place during the last half year.”

The emergence of relatively affordable motion-sensing gadgets like Leap Motion and Flow could prompt the design software industry to rethink its menu-dependent interfaces, developed for the mouse-and-keyboard combo. But the touch-free approach may not take off unless CAD developers are willing to make radical changes at the code level. Michaelides says, “I would encourage software vendors to work together with us so we can introduce new, inspiring, and intuitive interfaces that are open for development. That way, established software products can be combined with new hardware and a new way of precise and natural interaction.”

Visit Flow’s Indiegogo campaign page here.

About Kenneth

Kenneth Wong has been a regular contributor to the CAD industry press since 2000, first as an editor, later as a columnist and freelance writer for various publications. During his nine-year tenure, he has closely followed the migration from 2D to 3D, the growth of PLM (product lifecycle management), and the impact of globalization on manufacturing. His writings have appeared in Cadalyst, Computer Graphics World, and Manufacturing Business Technology, among others.

2 comments

  1. So essentially, from what I saw in the video and the site, you assign some common shortcuts or macros to this device and you hold your hand above it making gestures. Not sure this is better than using any 3D Logitech mouse. I also heard the clicking of a mouse in the video, so I am assuming the person still uses the mouse with the other hand. I would call this a parallel technology, and it is not interesting to me. I am going with two multi-touch monitors (one part of a 2-in-1 laptop). I recently purchased a 2-in-1 and find it feels like using crude/coarse paper; i.e., the stylus (at a fraction of the cost) is not precise like a Wacom device, but I am hoping that in the future companies will provide something flexible for coarse and fine control at an inexpensive price point.

  2. Jim, thanks for the comment! For tasks that require precision, there obviously is still a need for keyboard input (like specifying the height of an extrusion). But I think the assembly rotation, inspection, and spinning that feel extremely awkward and unnatural with mouse and keyboard could become much more intuitive with gesture-driven input. One would simply move or rotate the virtual object just as in real life. The ergonomic advantages are undeniable; I expect it would reduce carpal tunnel injuries from repeated clicks and drags.
