
Samuel Wilkinson is an EngD candidate at University College London, in the Virtual Environments, Imaging and Visualisation centre and the Bartlett School of Graduate Studies. The research project is co-sponsored by Bentley Systems (PA, USA) and PLP/Architecture (London), and focuses on prototyping machine learning tools to allow fast environmental performance evaluation for GenerativeComponents. He holds a BArch/MEng in Architecture and Environmental Design and an MRes in Adaptive Architecture and Computation, and has a background in environmental modelling and computational geometry.


"power of the air all worked out with counting…"

by Samuel Wilkinson, 16 April 2013

The inventor of the barometer, Evangelista Torricelli, once wrote that we live submerged at the bottom of an ocean of air. This ocean extends leagues above our heads, where it dissolves into space; it is dragged around by the spinning earth and pushed about by density differentials caused by the heat of the sun. In the lower atmosphere, surface friction from mountains, trees and cities causes largely unpredictable turbulence. Turbulence remains one of the great unsolved complexities in physics, as there is no known theory that fully describes the phenomenon (Clay Mathematics Institute, 2000). So how can we build in an environment we do not fully understand, and make predictions of the future in the face of theoretical, methodological and design uncertainties? And, more practically, how can these different types of uncertainty be reduced to improve performance today and in the future?


The numerical methods used for predicting wind flow, namely computational fluid dynamics (CFD), discretise space and time to solve the Navier-Stokes equations, typically with the addition of a turbulence model. The simulation, as in all scientific modelling, represents the best understanding of a phenomenon to provide a prediction given known boundary conditions. Without a complete theory of the fundamental physics, however, it can only approximate solutions, which are slowly improved over time by validation against reality. The uncertainties (synonymous with error in simulation) are both numerical, arising from discretisation and approximation, and theoretical, arising from the lack of a fundamental theory. The numerical errors can at least be reduced somewhat by running higher-fidelity simulations.
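The discretise-and-step idea at the heart of these methods can be illustrated in a few lines. Below is a minimal sketch in Python (NumPy assumed) of a finite-difference solver for the one-dimensional viscous Burgers' equation, a much-simplified relative of Navier-Stokes; it shows how derivatives become differences over a grid, not the turbulence-modelled solvers used in practice:

```python
# Sketch: solve du/dt + u * du/dx = nu * d2u/dx2 on a discrete grid.
import numpy as np

nx, nt = 101, 500          # number of spatial points and time steps
dx = 2.0 / (nx - 1)        # spatial resolution: finer dx, lower error, more work
nu = 0.07                  # viscosity
dt = 0.2 * dx**2 / nu      # time step kept small enough for a stable explicit scheme

x = np.linspace(0.0, 2.0, nx)
u = np.where((x >= 0.5) & (x <= 1.0), 2.0, 1.0)  # initial velocity: a step profile

for _ in range(nt):
    un = u.copy()
    # upwind difference for convection, central difference for diffusion
    u[1:-1] = (un[1:-1]
               - un[1:-1] * dt / dx * (un[1:-1] - un[:-2])
               + nu * dt / dx**2 * (un[2:] - 2.0 * un[1:-1] + un[:-2]))

print(u[::10])  # the discrete velocity field after nt steps
```

Halving dx roughly quadruples the work (more points, and a smaller stable dt) in exchange for a finer approximation, which is exactly the cost structure discussed below.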


Given that simulation accuracy can itself be varied (a coarse approximation is less accurate than a finer one), the current modelling paradigm involves tradeoffs between speed and accuracy: the least accurate solution will be the fastest, and vice versa. This is in fact generally true of most decision-making processes: we usually deliberate over important tasks and make snap guesses over trivial ones (Chittka et al., 2009). When it comes to design, time constraints necessitate fast decisions, yet accurate results are needed too. The design of a skyscraper exemplifies this: a number of contending options must be considered and evaluated, and since the aerodynamics can have significant implications for structural efficiency and cost, an accurate and near-instant idea of each option's performance is valuable. By creating and evaluating more options quickly there is more chance of finding better solutions; the uncertainty in the design stages can be reduced through exploration and testing.
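This resolution-cost relationship is easy to demonstrate with a self-contained toy example, far simpler than CFD: integrating sin(x) over [0, π] (exact answer 2) with the trapezoid rule at increasing resolutions, timing each run. The numbers are illustrative only; the point is the monotone trade between error and time:

```python
# Toy speed-accuracy tradeoff: finer discretisation, lower error, longer run.
import time
import numpy as np

for n in (10, 100, 1_000, 10_000, 100_000):
    start = time.perf_counter()
    x = np.linspace(0.0, np.pi, n)
    y = np.sin(x)
    approx = np.sum((y[:-1] + y[1:]) * np.diff(x)) / 2.0  # trapezoid rule
    elapsed = time.perf_counter() - start
    print(f"n={n:>6}  error={abs(approx - 2.0):.2e}  time={elapsed * 1e3:.3f} ms")
```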


Flow patterns in any real situation are characterised by their temporal nature and by chaotic, non-linear vortices, standing at the edge of current theory and computational power. Skyscrapers, more than other building typologies, are subject to this flow behaviour: they extend upwards into the atmosphere where wind speeds are greater, creating large transient structural moments anchored deep in the ground. The rate of skyscraper construction has not been slowed by the recent economic downturn; instead, construction has shifted from developed countries to emerging economies (CTBUH, 2013). Their height is also increasing, pushing the limits of technology, materials, vertical transportation and structural engineering. In addition, the trend towards parametric design and digital manufacturing tools in architecture has enabled higher levels of complexity to be realised.


Figure 1: Turbulent flow around a procedurally generated tall building, various views (Wilkinson, 2013).


Uncertainties, or errors, in fluid simulation can be mitigated by improving the discretisation resolution, at the cost of increased computation time. Uncertainties, or unknowns, in design can be reduced by exploring and evaluating more options at early stages, which again requires more time. Methods are therefore required that achieve the accuracy of the highest-fidelity simulations at the speed of low-fidelity ones. Returning to the notion of speed-accuracy tradeoffs, speed can be improved with minimal loss of accuracy by introducing experience, or skill. The question then becomes: how can experience, or learning, be incorporated computationally into fluid simulation to improve speed while maintaining accuracy?


Machine learning algorithms can be defined as those that learn from experience over time so as to improve their performance at a given task (Mitchell, 1997). Such inductive learning is a fundamental concept in artificial intelligence and shares many aspects with statistical regression. Existing learning algorithms, such as artificial neural networks, support-vector machines and decision trees, can all be trained to predict the outcome of a fluid simulation under certain constraints. The traditional hypothesis that a fundamental understanding of a system's behaviour is required to make predictions has been refuted; instead, it has been shown that this behaviour can be approximated, at similar levels of accuracy and greater speed, from observations alone (Hanna, 2011).
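As a minimal sketch of this idea (Python with scikit-learn assumed; expensive_simulation below is a hypothetical analytic stand-in, not a real CFD code), a support-vector machine can be trained on precomputed input-output pairs and then queried in place of the simulator:

```python
# Learning to mimic a simulator from observations alone.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def expensive_simulation(params):
    # Hypothetical stand-in for a CFD run: a smooth nonlinear response.
    return np.sin(params[:, 0]) * np.exp(-params[:, 1] ** 2)

X_train = rng.uniform(-2.0, 2.0, size=(500, 2))   # 500 precomputed "runs"
y_train = expensive_simulation(X_train)           # the slow part, done once

model = SVR(kernel="rbf", C=10.0).fit(X_train, y_train)

X_new = rng.uniform(-2.0, 2.0, size=(5, 2))
print("predicted:", model.predict(X_new))         # near-instant
print("simulated:", expensive_simulation(X_new))  # costly in the real case
```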


In our case, a procedural tall-building model is used to generate a range of potential designs. Generating a set of models that covers the entire potential design-space is complex, although some parameters are more easily identifiable than others to start with, such as height, taper, orientation and the number of facets. The procedural model is used to generate a large number of instances, each of which is evaluated and the wind-induced surface pressure extracted (metaphorically, this pre-computed data set corresponds to prior experience). For each instance, local shape features are extracted with their corresponding pressures (feature identification) and used as training data for the learning algorithm. The artificial neural network then seeks to generalise from these specific input examples in order to make new predictions: when a new model is presented, its shape features are analysed and a prediction of the surface pressure is returned. Once trained, the network provides results almost instantaneously, avoiding the need for costly simulations and enabling the real-time performance evaluation sought for iterative design exploration.
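A hedged sketch of this pipeline is given below (Python with scikit-learn assumed). The functions shape_features and simulate_pressure are hypothetical placeholders standing in for the project's actual feature identification and CFD runs; only the structure (generate, simulate, train, predict) reflects the description above:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

def shape_features(height, taper, orientation, facets):
    # Hypothetical local-shape descriptor; the real feature
    # identification is more involved.
    return np.array([height, taper, np.sin(orientation), facets])

def simulate_pressure(f):
    # Hypothetical stand-in for one CFD evaluation of an instance.
    return -1.2 * f[0] / 300.0 * f[1] + 0.5 * np.cos(f[2]) + 0.01 * f[3]

# 1. Generate many procedural instances and evaluate them: the slow step,
#    done once offline (the network's "prior experience").
n = 2000
params = np.column_stack([rng.uniform(50, 300, n),    # height
                          rng.uniform(0.5, 1.0, n),   # taper
                          rng.uniform(0, np.pi, n),   # orientation
                          rng.integers(3, 12, n)])    # number of facets
features = np.array([shape_features(*p) for p in params])
pressures = np.array([simulate_pressure(f) for f in features])

# 2. Train the network to generalise from these specific examples
#    (inputs are standardised first, since the features differ in scale).
net = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=1),
).fit(features, pressures)

# 3. A new design is now evaluated near-instantly, with no CFD run.
new_design = shape_features(180.0, 0.8, 0.6, 6)
print("predicted surface pressure:", net.predict(new_design.reshape(1, -1)))
```

The expensive step, running the simulations, happens once; every subsequent prediction is a single forward pass through the trained network, which is what makes real-time feedback possible.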


Using machine learning in this way allows higher-resolution, lower-error fluid simulation at the speeds needed for the fast-paced generative design exploration common in practice, so that more options can be tested and design uncertainty reduced. We see machine learning becoming increasingly common in our daily lives, albeit hidden from view: internet searches collect previous queries as training data to improve response accuracy; shops gather data on previous purchases to target new products. In design, similar tools can learn performative behaviours and optimal configurations while we sleep. They can be integrated into parametric tools such as GenerativeComponents, allowing the architect or engineer to tinker freely in a pre-optimised design-space and receive instantaneous feedback on how the digital model will behave in reality.


To conclude, and perhaps to explain the title, I believe it is in our nature to attempt to predict the future, to prepare for the unexpected and avoid shocks. The computational methods we use for this have become immensely complicated, often with results beyond our understanding. To solve this problem, we are now in the process of passing the understanding itself to the machine.


Figure 2: Machine learning prediction of wind-induced surface pressure on real cases: upper row, predicted pressure; lower row, prediction error (Wilkinson, 2013).
