We are in the midst of a revolution in the way humans and computers interact. Two of the key factors in this revolution are new sensors and intelligent software that interprets the sensor data. A prime example of this is the Microsoft Kinect sensor, which has enabled a new generation of gestural and voice interaction and led to a groundswell of community innovation.
Joshua Blake of the OpenKinect community led the discussion. Joshua is one of the thought leaders in this space. He founded the OpenKinect community, which created the first open source drivers for the Kinect sensor. He is the author of “Natural User Interfaces in .NET” and believes that computers should learn how humans work, rather than humans having to learn how computers work. He is the Technical Director of the InfoStrat Advanced Technology Group, where he leads a team of consultants dedicated to designing and developing Natural User Interfaces (NUIs) using devices like the Kinect sensor.
Joshua shared his experiences designing NUIs and gestural interfaces, and demonstrated how Kinect can be used to create a variety of experiences, from touchless interaction to 3D scanning. He showed the original Kinect and the new Kinect v2 sensor side by side, and discussed what the near future of natural interaction may look like.