Faculty Member: Rick Stone (IMSE; rstone@iastate.edu)

Mentors: Tom Schnieders (tms@iastate.edu) and Kevin Push (undergrad; kapush@iastate.edu)

REU Interns: Inshira Seshie, Mary Truong, and Stephan Terry

Advances in technology have enabled one person to control a small fleet of observation drones, e.g., for security surveillance, recording of important events, or exploration of new environments. To control a drone fleet well, the operator needs to be able to perceive 1) the state of the fleet and 2) the state of the environment as sensed by the fleet. The operator also needs to be able to command the fleet using contextualized cues, e.g., “Go scan behind that tree.” This project will explore a variety of user interfaces for sensing the environment via a drone fleet and commanding that fleet.
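
As an illustration only (this is not project code, and every name here is hypothetical), the two kinds of state and a contextualized command might be modeled roughly as follows in Python:

    from dataclasses import dataclass, field

    @dataclass
    class DroneState:
        # Item 1: per-drone fleet state the operator must perceive.
        drone_id: str
        position: tuple   # (x, y, altitude) in meters, world frame
        battery_pct: float

    @dataclass
    class FleetCommand:
        # A contextualized cue such as "Go scan behind that tree," after the
        # deictic reference ("that tree") has been resolved to coordinates.
        verb: str          # e.g., "scan"
        target: tuple      # (x, y, altitude) of the resolved reference
        assigned_to: list = field(default_factory=list)

    def assign_nearest(fleet, cmd):
        # Hand the command to the closest drone with usable battery.
        candidates = [d for d in fleet if d.battery_pct > 20.0]
        if candidates:
            nearest = min(candidates, key=lambda d: sum(
                (a - b) ** 2 for a, b in zip(d.position, cmd.target)))
            cmd.assigned_to = [nearest.drone_id]
        return cmd

    fleet = [DroneState("d1", (0.0, 0.0, 30.0), 85.0),
             DroneState("d2", (50.0, 10.0, 30.0), 15.0)]
    cmd = assign_nearest(fleet, FleetCommand("scan", (45.0, 12.0, 5.0)))
    print(cmd.assigned_to)  # ['d1'] -- d2 is closer but below the battery floor

The 20% battery floor and nearest-drone assignment rule are arbitrary placeholders; choosing and surfacing such policies to the operator is exactly the kind of interface question the project will explore.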

Most people are familiar with Google Street View, a montage of images captured from driving vehicles. A fleet of camera-equipped drones might capture similar imagery, but from a bird’s-eye view that is typically unfamiliar to humans. The montage will also likely have gaps due to incomplete camera coverage. How will these challenges affect human situational awareness? What interface techniques can mitigate them and extend the human operator’s perception in the field?
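
As one hedged sketch of the coverage-gap question (illustrative only; the grid resolution, surveyed area, and rectangular-footprint simplification are all assumptions), a simple coverage grid can quantify how much of the surveyed area a montage actually shows:

    import numpy as np

    GRID_M = 1.0            # assumed grid resolution, meters per cell
    AREA_M = (100.0, 80.0)  # assumed surveyed area: width x height, meters

    def coverage_map(footprints):
        # Mark every grid cell seen by at least one camera footprint.
        # Footprints are simplified to axis-aligned ground rectangles
        # (xmin, ymin, xmax, ymax); real ones depend on drone pose and lens.
        cols = int(AREA_M[0] / GRID_M)
        rows = int(AREA_M[1] / GRID_M)
        seen = np.zeros((rows, cols), dtype=bool)
        for xmin, ymin, xmax, ymax in footprints:
            c0 = max(int(xmin / GRID_M), 0)
            c1 = min(int(np.ceil(xmax / GRID_M)), cols)
            r0 = max(int(ymin / GRID_M), 0)
            r1 = min(int(np.ceil(ymax / GRID_M)), rows)
            seen[r0:r1, c0:c1] = True
        return seen

    seen = coverage_map([(0, 0, 60, 50), (40, 30, 100, 80)])  # two captures
    print(f"montage gap fraction: {1.0 - seen.mean():.1%}")   # prints 30.0%

A map like this could drive an interface overlay that flags unseen regions, one candidate technique for easing the gap problem described above.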

Participants will work closely with local law enforcement personnel as potential users. Note that being on this project does not guarantee that team members will be able to fly the drones themselves; that depends on local regulations, which are in flux.