NuiLogix™ Perceptual Computing for Network-Based Device Control

NuiLogix is a middleware app that adapts emerging perceptual computing technologies to device and machine control applications. It transports natural-interaction sensor data (body coordinates, gestures, and speech recognition) over an Ethernet network for remote monitoring and control. NuiLogix accomplishes this using industry-standard, Ethernet-based communication protocols such as Modbus/TCP, an open data communications standard that is ubiquitous in industrial automation. Hundreds of off-the-shelf data acquisition and control devices on the market support a Modbus/TCP server interface, and many thousands of deployed devices worldwide communicate on Modbus networks. This is our five-minute introductory video for the Intel Perceptual Computing Challenge:

The goal of NuiLogix is to introduce emerging natural-interaction sensor technologies to fields and applications that currently rely on conventional "hands-on" user interface techniques. One potential use case for NuiLogix-enabled systems is machine control in potentially explosive work environments (refineries, chemical processing plants, etc.).

In many industrial settings, field service engineers and technicians have limited opportunity to interact with electronic equipment deployed in hazardous environments. Depending on the area classification, interface electronics must use intrinsically safe (IS) barriers and other safeguards to prevent the ignition of a hazardous atmosphere.

With the advent of perceptual computing, it is now feasible to mount a computer and gesture recognition device behind an ATEX-rated window in a purged enclosure and allow hands-free operation. This would be particularly useful when a worker is required to wear gloves and has limited dexterity for interacting with conventional controls.

Furthermore, we see countless other applications that would benefit from this technological approach:

  • Training, rehabilitation, behavior modification, assistive technology, and alternative communication applications that allow a person to be monitored for performance, fatigue, distress and other bodily indicators using gesture and voice recognition. This data can be used as part of a feedback loop for training and rehabilitation equipment, or to help patients with mobility constraints control lighting, temperature and other physical entities in their environment.
  • Immersive entertainment applications that blend natural interaction data with the control of external lighting, fans, temperature, floor tilt and other physical parameters to enhance the user experience.
  • Augmented and virtual reality applications that have the ability to effect change in the physical environment in response to body position, speech and gestures.
  • Enhanced telepresence - the ability to interact physically with a person fitted with Nitinol ("muscle wire") woven clothing and gloves or similar wearable technologies. Imagine transmitting an embrace to a loved one in another state, or experiencing the firmness of a handshake from an associate in another country, all through natural-interaction data transmitted over the Internet.

NuiLogix™ - Leap Motion for Machine Control

This video demonstrates a proof-of-concept experiment for NuiLogix, an app for merging natural user interface (NUI) technologies with commercial off-the-shelf control hardware for machine-control applications. In this demo we are using a Leap Motion Dev Board V.05 (this is a pre-release developer model; commercial models will be available in Spring 2013).

The video demonstrates three modes of operation: 1) local control of physical switches and a potentiometer for turning devices on and off and adjusting voltage levels; 2) remote control through a touch screen based human-machine interface (HMI); and 3) hands-free remote control using the Leap hand and finger gesture recognition device.

One potential use case for this is industrial machine control when operating in a hazardous area, where, as described above, interface electronics must use intrinsically safe (IS) barriers and other safeguards to prevent the ignition of a hazardous atmosphere. In our proposed scenario, the computer screen and gesture recognition device could be mounted behind an ATEX-rated window in a purged enclosure to allow hands-free operation.

In this demonstration, the user's x-y-z finger coordinates are measured in real-world millimeters. An imaginary window frame is created in software; when the user's finger is outside of the window frame area (x and y axes) or is penetrating too deeply through the window frame (z axis), the software will not respond to hands-free control. When the user's finger is within the imaginary window frame, the on-screen control that is being pointed to will be highlighted with a yellow border. If the user holds that position and then moves the finger in the z-axis, the control will respond and transmit data over the wireless network connection.
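The window-frame logic described above can be sketched as a simple bounds check. This is an illustrative sketch only, with made-up frame dimensions and threshold values rather than the ones used in the actual demo:

```python
# Sketch of the "imaginary window frame" hit test. Coordinates are in
# real-world millimeters, as reported by the gesture sensor. All bounds
# and thresholds below are illustrative assumptions, not the demo's values.

FRAME_X = (-100.0, 100.0)    # left/right bounds of the window frame (mm)
FRAME_Y = (50.0, 250.0)      # bottom/top bounds of the window frame (mm)
Z_NEAR, Z_FAR = -50.0, 80.0  # allowed depth range; deeper than Z_NEAR is ignored
Z_PUSH = 0.0                 # crossing this depth actuates the pointed-to control

def classify_finger(x, y, z):
    """Return 'ignore', 'highlight', or 'actuate' for a finger position."""
    in_frame = FRAME_X[0] <= x <= FRAME_X[1] and FRAME_Y[0] <= y <= FRAME_Y[1]
    in_depth = Z_NEAR <= z <= Z_FAR
    if not (in_frame and in_depth):
        return "ignore"      # outside the frame, or penetrating too deeply
    if z <= Z_PUSH:
        return "actuate"     # finger pushed "through" the frame: send the command
    return "highlight"       # hovering in the frame: draw the yellow border
```

The three return values correspond to the three behaviors in the demo: no response, the yellow highlight border, and transmitting the control action over the network.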

The target data acquisition and control hardware platform used in this experiment is an Opto 22 programmable automation controller (PAC) and I/O rack. Opto 22 is a leading manufacturer of advanced industrial automation technologies. We have used their donated SNAP PAC Learning Center (shown in the video) in several previous proof-of-concept experiments because it provides a convenient hardware platform for demonstrating physical user interface interactions.

In this experiment the PAC is connected to a wireless access point, and all data transfer to and from the PAC is done over Wi-Fi. We developed a custom control strategy that runs on the PAC, which implements a Modbus/TCP server as well as the interactive logic to allow both physical (local) interactions and remote data acquisition and control via the Modbus communications protocol.

Modbus is widely regarded as a de facto standard communication protocol and is currently among the most commonly available means of connecting industrial electronic devices. NuiLogix will support multiple Ethernet-based protocol standards like Modbus/TCP (used in this demo) and OPC Unified Architecture (UA).
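To give a flavor of how simple the protocol is, the following sketch builds raw Modbus/TCP request frames per the published specification: a 7-byte MBAP header (transaction ID, protocol ID 0, length, unit ID) followed by a function code and its data. This is illustrative only; it is not NuiLogix's implementation, and a production client would normally use an off-the-shelf Modbus library:

```python
import struct

def mbap(transaction_id, unit_id, pdu):
    """Prepend the Modbus/TCP MBAP header to a protocol data unit (PDU).
    The length field counts the unit-ID byte plus the PDU bytes."""
    return struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id) + pdu

def write_single_coil(transaction_id, unit_id, address, on):
    """Function code 0x05: force one coil ON (0xFF00) or OFF (0x0000),
    e.g. switching a discrete output on the PAC's I/O rack."""
    pdu = struct.pack(">BHH", 0x05, address, 0xFF00 if on else 0x0000)
    return mbap(transaction_id, unit_id, pdu)

def read_holding_registers(transaction_id, unit_id, address, count):
    """Function code 0x03: read `count` 16-bit holding registers,
    e.g. polling an analog value such as the potentiometer voltage."""
    pdu = struct.pack(">BHH", 0x03, address, count)
    return mbap(transaction_id, unit_id, pdu)
```

Each returned byte string is what a client would send over a TCP connection to port 502 on a Modbus/TCP server; the server replies with a matching transaction ID, which is how requests and responses are paired on the wire.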

The remote HMI client is a C#/WPF application that works in conjunction with NuiLogix, our middleware prototype product currently under development for introducing emerging natural user interface (NUI) and brain-computer interface (BCI) sensors to machine and device control applications. The HMI client is a Windows 8 desktop app running on an Ultrabook computer with touch screen capability.