Autonomous collection of building data

Research project, February to June 2017

Data collection using the Segway Robot for optimising the design and use of buildings.

Troels Rasmussen, Ryan Hughes and Jens Pedersen

Aarhus School of Architecture


We plan to develop an app for autonomous collection of building data on the Segway Robot platform. Using the Segway locomotion system, the robot will be capable of:

a) autonomously roaming and

b) replaying a path taught by an operator,

and, while doing so, collect sensor data about a room using a microcontroller-operated sensor array. This data is used to inform the computational design of an architectural installation for the room.

Use Case

Architectural Practice ‘X’

The robot moves around a room at [anonymized school of architecture] at different times of day while collecting high-resolution data on humidity, temperature, light levels, number of people, etc. An intelligent system identifies patterns in the data from the robot and proposes a computational design for an architectural installation in the room.

By using the system in a selection of rooms, we can explore how the robot “perceives” different rooms, manifested as differing architectural installations.

Technical Implementation

We plan to use the locomotion system and IMU sensors of the Segway robot to record a path in an indoor environment by manually driving the robot along it. We record the wheel rotations and the state of the IMU sensors at different wheel rotations, so the motion can later be replayed autonomously, reproducing the movement of the robot along the path. This type of programming is often referred to as ‘teaching’, as the operator actively demonstrates the motion to the robot. We plan to use simple depth sensing and computer vision algorithms to detect whether any obstacles obscure the pre-recorded path (taking evasive action and continuing the task) and to make sure the robot returns to its starting position, essentially closing the path into a loop so the robot does not drift out of position over repeated runs.
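The teach-and-replay idea above can be sketched in code. This is a minimal illustration only: the names (`Waypoint`, `record_path`, `replay_step`) and the proportional control scheme are assumptions for the sketch, not the Segway SDK or the project's actual implementation.

```python
# Hypothetical teach-and-replay sketch. A path is stored as waypoints of
# cumulative wheel-encoder ticks plus IMU heading; replay steers toward
# each recorded waypoint in turn. All names and gains are illustrative.
from dataclasses import dataclass

@dataclass
class Waypoint:
    left_ticks: int    # cumulative left-wheel encoder ticks
    right_ticks: int   # cumulative right-wheel encoder ticks
    yaw: float         # IMU heading (radians) at this point

def record_path(samples):
    """Store waypoints while the operator drives the robot ('teaching')."""
    return [Waypoint(left, right, yaw) for left, right, yaw in samples]

def replay_step(current, target, k_lin=0.5, k_ang=1.0):
    """Simple proportional step toward the next recorded waypoint.

    Returns (linear, angular) command values from the tick and yaw errors.
    """
    lin_err = ((target.left_ticks - current.left_ticks)
               + (target.right_ticks - current.right_ticks)) / 2
    ang_err = target.yaw - current.yaw
    return k_lin * lin_err, k_ang * ang_err
```

During teaching, `record_path` would be fed samples at a fixed rate; during replay, `replay_step` would run in a loop until each waypoint's error falls below a threshold.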

The collection of data will take place on a microcontroller-operated sensor array, which will relay data to the app deployed on the Segway robot. This data is simultaneously made accessible to a system running on a standard PC, which identifies patterns in the data using machine learning and feeds the interpreted data to a digital, parametric model of an architectural installation in Grasshopper, a parametric design environment used by architects and engineers.
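A sensor sample relayed from the robot to the PC could be packaged as a timestamped, position-tagged record, for example as JSON. The field names and the choice of JSON here are illustrative assumptions, not a specification of the project's actual message format.

```python
# Hypothetical message format for relaying sensor-array samples from the
# robot app to the analysis PC. Field names are assumptions for the sketch.
import json
import time

def pack_reading(sensor_values, position):
    """Bundle one sensor sample with a timestamp and an odometry position."""
    return json.dumps({
        "timestamp": time.time(),     # seconds since the epoch
        "position": list(position),   # (x, y) estimate from wheel odometry
        "readings": sensor_values,    # e.g. {"temperature_c": 21.4, ...}
    })

def unpack_reading(message):
    """Decode a relayed sample back into a dictionary on the PC side."""
    return json.loads(message)
```

Tagging each reading with the robot's position is what lets the pattern-finding system relate measurements to locations in the room.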

Possible Metrics

Oxygen, Carbon Dioxide, Light Levels (lux), Acoustics, Infrared, UV, Smartphone Radiation, Moisture, Temperature, Obstacles, Unknown Objects, Persons, Moving Objects, Wi-Fi Signals
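The scalar metrics in this list could be registered in a simple schema mapping each metric to a unit, so that readings stay interpretable downstream. The unit choices below are assumptions for illustration, not measured specifications.

```python
# Illustrative schema for the scalar metrics; units are assumptions.
METRICS = {
    "oxygen": "%",
    "carbon_dioxide": "ppm",
    "light": "lux",
    "sound_level": "dB",
    "uv_index": "index",
    "humidity": "%RH",
    "temperature": "degC",
    "people_count": "count",
    "wifi_signal": "dBm",
}
```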

Technology Partner:

Industry Partner: Aarhus School of Architecture
