Interactive & Emergent Autonomy Lab



Active Perception in Human-Swarm Collaboration

Human-Swarm Collaboration

Human-swarm collaboration in an urban environment

Quadcopters ergodically exploring an information distribution

We are interested in improving human-swarm team performance by including the human directly in the control loop. While fully autonomous robotic exploration requires less supervision, it neglects the operator's intuition. We have developed a shared control algorithm that leverages the complementary capabilities of the human and the swarm. This project focuses on determining how autonomy allocation affects exploration effort, and thus the task performance of the human-swarm collaboration. Ongoing efforts use a virtual reality (VR) environment built in Unity with an HTC VIVE headset before transferring to field tests. At the same time, we are collecting biometric data such as eye gaze (Pupil Labs), EEG (Emotiv), and EKG (SOMNOmedics) to estimate the operator's cognitive state.
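The swarm's exploration behavior is driven by ergodic coverage of an information distribution, as in the paper below. As a minimal illustration (not the lab's implementation), the standard spectral ergodic metric compares Fourier coefficients of a trajectory's time-averaged distribution against those of the target information density; the function name, 1D domain, and cosine basis here are simplifying assumptions for exposition.

```python
import numpy as np

def ergodic_metric(trajectory, target_density, n_coeffs=10, domain=1.0):
    """Spectral ergodic metric on the 1D domain [0, domain].

    Compares cosine-basis Fourier coefficients of the trajectory's
    time-averaged distribution against those of the target
    information density; a lower value means better coverage.
    """
    x = np.linspace(0.0, domain, target_density.size)
    dx = x[1] - x[0]
    # Normalize the target density so it integrates to 1.
    phi = target_density / (np.sum(target_density) * dx)

    traj = np.asarray(trajectory, dtype=float)
    metric = 0.0
    for k in range(n_coeffs):
        # Normalized cosine basis function f_k on [0, domain].
        h_k = np.sqrt(domain) if k == 0 else np.sqrt(domain / 2.0)
        f_k = np.cos(k * np.pi * x / domain) / h_k
        # Target coefficient phi_k (spatial integral) and trajectory
        # coefficient c_k (time average along the trajectory).
        phi_k = np.sum(phi * f_k) * dx
        c_k = np.mean(np.cos(k * np.pi * traj / domain)) / h_k
        # Sobolev-norm weight emphasizing low spatial frequencies.
        lam_k = 1.0 / (1.0 + k ** 2)
        metric += lam_k * (c_k - phi_k) ** 2
    return metric
```

A trajectory that spends time in proportion to the information density drives the metric toward zero, while a trajectory stuck in one region scores poorly; a receding-horizon controller can then steer each agent to descend this metric online.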


Tommy Berrueta (Ph.D. Student)
Joel Meyer (Ph.D. Student)
Katarina Popovic (Ph.D. Student)
Milli Schlafly (Ph.D. Student)
Annalisa Taylor (Ph.D. Student)
Allie Pinosky (Ph.D. Student)




Real-time area coverage and target localization using receding-horizon ergodic exploration
A. Mavrommati, E. Tzorakoleftherakis, I. Abraham, and T. D. Murphey
IEEE Transactions on Robotics, vol. 34, no. 1, pp. 62–80, 2018. PDF, Video 1, Video 2


This project is funded by DARPA: Interaction & Perception: Multi-Source Spectral Framework for Human-Swarm Collaboration.