Build an application, or an entire API for other developers, that uses the Leap Motion controller to operate ArcGIS with hand gestures in the air instead of a mouse or keyboard. This would be useful for field workers who might be wearing gloves or whose hands are caked with mud or grease. The Git repo is here:
Create a simple interface that lets a user perform the basic pan, zoom, and identify tasks using gestures.
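The basic step above could be sketched as a dispatcher that maps recognized gestures to map commands. The sketch below is hardware-free and illustrative only: the gesture names (swipe, pinch, point) and the `MapView` class are hypothetical stand-ins for the Leap Motion SDK's frame data and a real ArcGIS map control.

```python
# Sketch: dispatch recognized gestures to basic map actions.
# All class and gesture names are hypothetical stand-ins for the
# Leap Motion SDK's frame data and an ArcGIS map control.

class MapView:
    """Stand-in for an ArcGIS map control."""
    def __init__(self):
        self.center = [0.0, 0.0]
        self.scale = 1.0
        self.log = []

    def pan(self, dx, dy):
        self.center[0] += dx
        self.center[1] += dy
        self.log.append(("pan", dx, dy))

    def zoom(self, factor):
        self.scale *= factor
        self.log.append(("zoom", factor))

    def identify(self, x, y):
        self.log.append(("identify", x, y))

def dispatch(view, gesture):
    """Map a recognized gesture event to a basic map task."""
    kind = gesture["type"]
    if kind == "swipe":          # open-hand swipe -> pan
        view.pan(gesture["dx"], gesture["dy"])
    elif kind == "pinch":        # pinch in/out -> zoom
        view.zoom(gesture["factor"])
    elif kind == "point":        # fingertip tap -> identify feature
        view.identify(gesture["x"], gesture["y"])

view = MapView()
dispatch(view, {"type": "swipe", "dx": 10, "dy": -5})
dispatch(view, {"type": "pinch", "factor": 2.0})
dispatch(view, {"type": "point", "x": 3.5, "y": 7.25})
print(view.center, view.scale)  # [10.0, -5.0] 2.0
```

In a real implementation, the dispatcher would run in the Leap Motion frame callback and the `MapView` methods would call the actual map-control API.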
Add more options and gestures that give the user access to a menu of other tasks. These could include turning layers on and off, rectangle-selecting multiple features, collecting data, editing feature attributes, and even adding new points and data to the map. Entering attribute data might get clumsy, since it normally requires a keyboard. Alternatives include voice recognition for data entry or making every field a drop-down/multiple-choice selection; an on-screen keyboard driven by gestures could be too time-consuming and/or error-prone.
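The drop-down/multiple-choice idea above could look like the following sketch: a "circle" gesture cycles through the allowed values and a "tap" commits one, so no keyboard is needed. The gesture names and the `ChoiceField` class are hypothetical placeholders, not part of any real SDK.

```python
# Sketch: gesture-driven multiple-choice attribute entry, avoiding a
# keyboard. A "circle" gesture cycles choices; a "tap" commits one.
# Gesture names are hypothetical placeholders.

class ChoiceField:
    def __init__(self, name, choices):
        self.name = name
        self.choices = choices
        self.index = 0       # currently highlighted choice
        self.value = None    # committed value, if any

    def on_gesture(self, gesture):
        if gesture == "circle":   # cycle to the next choice
            self.index = (self.index + 1) % len(self.choices)
        elif gesture == "tap":    # commit the highlighted choice
            self.value = self.choices[self.index]

field = ChoiceField("surface_type", ["asphalt", "gravel", "dirt"])
for g in ["circle", "circle", "tap"]:
    field.on_gesture(g)
print(field.value)  # dirt
```

Constraining each attribute to a fixed choice list trades flexibility for speed and accuracy, which fits the gloved-hands field scenario.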
Turn the code into an API that lets users define their own gestures and bind them to actions implemented with the ArcObjects API.
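The extensibility API described above might expose a registration interface like this sketch: developers supply a recognizer (does this frame match my gesture?) and an action to run on a match. In the real system the action would call into ArcObjects; here everything, including the frame format, is a hypothetical stand-in.

```python
# Sketch of the proposed extensibility API: developers register their
# own gesture recognizers and bind them to actions (which, in a real
# system, would call into ArcObjects). All names are hypothetical.

class GestureAPI:
    def __init__(self):
        self._bindings = []

    def register(self, recognizer, action):
        """recognizer(frame) -> bool; action(frame) runs on a match."""
        self._bindings.append((recognizer, action))

    def process(self, frame):
        """Feed one frame of hand data through every binding."""
        for recognizer, action in self._bindings:
            if recognizer(frame):
                action(frame)

api = GestureAPI()
events = []

# A developer-defined "closed fist" gesture bound to a custom action.
api.register(
    lambda f: f.get("fingers_extended") == 0,
    lambda f: events.append("toggle_layer"),
)

api.process({"fingers_extended": 0})  # matches -> action fires
api.process({"fingers_extended": 5})  # no match -> nothing happens
print(events)  # ['toggle_layer']
```

Keeping recognizers and actions as plain callables means users can add gestures without touching the core library.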