How can we reimagine UAV technology...

...to help us navigate challenging situations and complex environments? This is the premise for SkyCall - an autonomous flying quadcopter and personal tour guide - operating in one of mankind’s most difficult and disorientating labyrinths: MIT campus. We tested this technology on someone you would typically expect to be lost within MIT...

Development


Our Lab is exploring two distinct development paths of UAV technology: a quadcopter's capacity to autonomously sense and perceive its environment, and its ability to interface and interact with people. These parallel aims steered the development of SkyCall's tour-guide system, resulting in a platform that can efficiently locate, communicate with, and guide visitors around MIT campus, specifically along predetermined routes or towards user-determined destinations.

A custom SkyCall app was developed as the interface between visitor and UAV, enabling the visitor to make specific requests and the UAV to locate and wirelessly communicate with them. When the user presses the ‘call’ button, SkyCall immediately reads the GPS location of the visitor’s phone and relays those coordinates to the nearest available UAV.
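The dispatch step might be sketched as follows. This is a minimal, hypothetical illustration of matching a caller to the nearest idle quadcopter, not the Lab's actual implementation; the fleet structure and function names are assumptions.

```python
# Hypothetical dispatch sketch: given the visitor's phone GPS fix,
# pick the nearest UAV in the fleet that is marked available.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 coordinates."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def dispatch_nearest_uav(user_lat, user_lon, fleet):
    """Return the id of the nearest available UAV, or None if none are free."""
    available = [(uid, s) for uid, s in fleet.items() if s["available"]]
    if not available:
        return None
    return min(
        available,
        key=lambda item: haversine_m(user_lat, user_lon, item[1]["lat"], item[1]["lon"]),
    )[0]

# Example: two idle UAVs parked around campus; a visitor calls from Killian Court.
fleet = {
    "uav-1": {"lat": 42.3601, "lon": -71.0942, "available": True},
    "uav-2": {"lat": 42.3584, "lon": -71.0912, "available": True},
}
print(dispatch_nearest_uav(42.3598, -71.0921, fleet))
```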


The quadcopter itself utilises an onboard autopilot and GPS navigation system together with sonar sensors and WiFi connectivity (via a ground station), enabling it to fly autonomously and communicate with the user through the SkyCall app. The UAV also carries an onboard camera that serves both as an information-gathering system (relaying images to a ‘base’ location upon encountering the user) and as a manually controlled camera, accessible to the visitor-turned-tourist via the SkyCall app.
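In simplified form, guiding towards a waypoint could look like the loop below: steer along the GPS bearing to the target while holding altitude against the sonar reading. This is a sketch under stated assumptions, not the autopilot code used on the quadcopter; the sensor and actuator hooks (read_gps, read_sonar_m, send_velocity_cmd) are hypothetical stand-ins, and it reuses the haversine_m helper from the dispatch sketch above.

```python
# Hypothetical fly-to-waypoint loop: GPS bearing for heading, sonar for altitude hold.
import math
import time

ARRIVAL_RADIUS_M = 3.0     # how close counts as "arrived"
CRUISE_SPEED_MPS = 2.0
TARGET_ALTITUDE_M = 4.0

def bearing_rad(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, in radians (clockwise from north)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.atan2(y, x)

def fly_to(target_lat, target_lon, read_gps, read_sonar_m, send_velocity_cmd):
    """Steer towards the target until within ARRIVAL_RADIUS_M, then hover."""
    while True:
        lat, lon = read_gps()
        if haversine_m(lat, lon, target_lat, target_lon) < ARRIVAL_RADIUS_M:
            send_velocity_cmd(0.0, 0.0, 0.0)  # hover on arrival
            return
        heading = bearing_rad(lat, lon, target_lat, target_lon)
        vx = CRUISE_SPEED_MPS * math.cos(heading)          # north component
        vy = CRUISE_SPEED_MPS * math.sin(heading)          # east component
        vz = 0.5 * (TARGET_ALTITUDE_M - read_sonar_m())    # simple altitude hold
        send_velocity_cmd(vx, vy, vz)
        time.sleep(0.1)
```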

Future

SkyCall is Phase I of a larger development program currently underway at Senseable City Lab, with the broader aim of exploring novel, positive uses of UAV technology in the urban context. This project offers a case study within our ongoing research initiative and points towards promising new forms of urban infrastructure.

1 : Calling

The user presses the 'CALL' button in the app. The phone's GPS location is relayed via a server to the guide, which uses it to locate and fly to the user.

[Diagram sequence: panels 1–8]
App

The SkyCall app was developed as the primary mode of communication between the user and the guide. Whilst the guide flies autonomously, the app gives the user a degree of control (such as calling, pausing and resuming the guide), and also lets the guide communicate with the user where necessary (such as asking them to catch up if they fall too far behind, as sketched below).
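The 'catch up' behaviour could be implemented along these lines: the guide periodically compares its own position with the user's phone fix and, if the gap grows too large, pauses the tour and pushes a message through the app. This is a hypothetical sketch; the thresholds and the pause/resume/notify hooks are assumptions, and it reuses the haversine_m helper from the dispatch sketch above.

```python
# Hypothetical follow-distance supervisor for the guide.
FALL_BEHIND_M = 15.0   # gap at which the guide pauses and waits
RESUME_M = 6.0         # gap at which the tour resumes

def supervise_gap(uav_pos, user_pos, paused, pause_tour, resume_tour, notify_user):
    """One tick of the follow-distance check; returns the new paused state."""
    gap = haversine_m(uav_pos[0], uav_pos[1], user_pos[0], user_pos[1])
    if not paused and gap > FALL_BEHIND_M:
        pause_tour()
        notify_user("You're falling behind. I'll wait here while you catch up.")
        return True
    if paused and gap < RESUME_M:
        resume_tour()
        notify_user("Great, let's keep going.")
        return False
    return paused
```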

[App interface screenshots]

Press

The material on this web site can be used freely in any publication provided that
1. it is duly credited as a project by the MIT Senseable City Lab
2. a PDF copy of the publication is sent to senseable-press@mit.edu

Sep 14, 2013 – Download high resolution video (MP4, 243M)

MIT Senseable City Lab

Carlo Ratti, Director
Assaf Biderman, Associate Director
Yaniv Jacob Turgeman, R & D Lead

Chris Green, Project Lead
Mike Xia, Technical Lead

Chihao Yo, Visual Design
Shan He, Web Developer
Eugene Kino Lee, Design Assistant


UAV Development Team

Harihar Subramanyam
Nikita Rodichenko
Blake Chambers
Marshall Wentworth
Sam Udotong
Alex Breton
Alex Bost
John Bowler

Production - MIT Senseable City Lab

Chris Green, Director
Marshall Wentworth, UAV Coordinator


Featuring

Matthew Claudel, Harvard Student
Nazanin Nayini, UAV Voice

Post Production - squint/opera

Sam Norland, Editor & Colourist
Alex Kelly, Graphics


Sound Design

Edo Van Breemen, Sound Design
Barry Pugatch, A/V Consultant

For more information, please contact:

senseable-contacts@mit.edu
