...to help us navigate challenging situations and complex environments? This is the premise for SkyCall - an autonomous flying quadcopter and personal tour guide - operating in one of mankind’s most difficult and disorientating labyrinths: MIT campus. We tested this technology on someone you would typically expect to be lost within MIT...
Our Lab is exploring two distinct development paths of UAV technology: a quadcopter's capacity to autonomously sense and perceive its environment, and its ability to interface and interact with people. These parallel aims steered the development of SkyCall's tour-guide system, resulting in a platform that can efficiently locate, communicate with, and guide visitors around MIT campus, specifically along predetermined routes or towards user-determined destinations.
A custom SkyCall app was developed as the human-UAV interface, enabling the visitor to make specific requests and the UAV to locate and wirelessly communicate with them. When the user presses the ‘call’ button, SkyCall immediately accesses the GPS location of the visitor’s phone and relays its spatial coordinates to the nearest available UAV.
The quadcopter itself utilises onboard autopilot and GPS navigation systems, together with sonar sensors and WiFi connectivity (via a ground station), enabling it to fly autonomously and communicate with the user via the SkyCall app. The UAV also integrates an onboard camera that serves both as an information-gathering system (relaying images to a ‘base’ location upon encountering the user) and as a manually controlled camera, accessible to the visitor-turned-tourist via the SkyCall app.
SkyCall is Phase I of a larger development program that is currently underway at Senseable City Lab, with the broader aim of exploring novel, positive uses of UAV technology in the urban context. This project offers a case study within our ongoing research initiative, and suggests promising new infrastructure potentials.
User presses the 'CALL' button in the app. The user's GPS location from the phone is relayed via a server to the guide, which uses it to locate and fly to them.
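The 'call' relay described above can be sketched in a few lines. This is an illustrative sketch only, not the Lab's actual code: the function names (`dispatch_call`, `haversine_m`), the guide records, and the dispatch policy (nearest available guide by great-circle distance) are all assumptions for the sake of the example.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def dispatch_call(user_fix, guides):
    """Relay the phone's GPS fix to the nearest available guide.

    user_fix: (lat, lon) from the visitor's phone.
    guides:   list of dicts with 'id', 'status', 'lat', 'lon'.
    Returns the id of the dispatched guide, or None if none are free.
    """
    idle = [g for g in guides if g["status"] == "available"]
    if not idle:
        return None
    nearest = min(idle, key=lambda g: haversine_m(user_fix[0], user_fix[1],
                                                  g["lat"], g["lon"]))
    nearest["status"] = "en_route"
    nearest["target"] = user_fix  # waypoint handed to the UAV's autopilot
    return nearest["id"]
```

In a deployed system the dispatch would run on the ground-station server and the target waypoint would be pushed to the quadcopter's autopilot over the WiFi link.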
The SkyCall app was developed as the integral mode of communication between the user and the guide. Whilst the guide flies autonomously, the app gives the user a degree of control (such as calling, pausing and resuming the guide), and in turn allows the guide to communicate with the user where necessary (for example, asking them to catch up if they fall too far behind).
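The two-way exchange above can be sketched as a simple control loop. This is a hypothetical illustration, not the actual SkyCall protocol: the command names, the `guide_step` function, and the 15-metre follow threshold are assumptions introduced for the example.

```python
from enum import Enum

class UserCommand(Enum):
    """Commands the visitor can send from the app."""
    CALL = "call"
    PAUSE = "pause"
    RESUME = "resume"

CATCH_UP_THRESHOLD_M = 15.0  # assumed maximum follow distance, metres

def guide_step(command, distance_to_user_m, state="touring"):
    """One tick of the guide's loop: apply any user command, then check
    whether the visitor has fallen behind. Returns (new_state, message),
    where message is a prompt relayed to the user's app, or None."""
    if command is UserCommand.PAUSE:
        return "hovering", None
    if command is UserCommand.RESUME:
        return "touring", None
    if state == "touring" and distance_to_user_m > CATCH_UP_THRESHOLD_M:
        # Guide holds position and asks the visitor to close the gap
        return "waiting", "Please catch up - I'll wait for you here."
    return state, None
```

Keeping the user-to-guide commands and guide-to-user prompts in one loop means the guide can remain fully autonomous while still deferring to the visitor's pause and resume requests.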
The material on this web site can be used freely in any publication provided that
1. it is duly credited as a project by the MIT Senseable City Lab
2. a PDF copy of the publication is sent to email@example.com
Sep 14, 2013 – Download high resolution video (MP4, 243M)
Carlo Ratti, Director
Assaf Biderman, Associate Director
Yaniv Jacob Turgeman, R & D Lead
Chris Green, Project Lead
Mike Xia, Technical Lead
Chihao Yo, Visual Design
Shan He, Web Developer
Eugene Kino Lee, Design Assistant
UAV Development Team
Chris Green, Director
Marshall Wentworth, UAV Coordinator
Matthew Claudel, Harvard Student
Nazanin Nayini, UAV Voice
Sam Norland, Editor & Colourist
Alex Kelly, Graphics
Edo Van Breemen, Sound Design
Barry Pugatch, A/V Consultant
For more information, please contact: