Levels of Uncertainty in Design
by Sean Hanna, 3 April 2013
We live in a globally interconnected world in which unforeseen events have unprecedented impact, and uncertainty seems to be a defining trait of the age. While newsworthy events testify that economic instability, disease epidemics and natural disasters present a world of great risk, the case is equally relevant for architecture, which has to build in this world. Historically, the design of buildings (the design of anything, really) has been essentially conservative. Innovation has always existed, but in manageable quantity, and against a strong background of tradition that could be relied upon, and gradually modified, over generations. This is much less the case now. We are forced to deal with rapid changes in technologies of construction. Our design teams are bigger than ever before, and often coordinate work across continents and time zones. Longstanding traditions may be at odds with one another. The speed of growth is faster than ever.
How do we handle this? Big industry tends to rely on standardization: of manufactured building components, of file formats, of guidelines and regulations dictating minimum environmental, structural or efficiency requirements. These will suffice up to a point, but will never be fully sensitive to the changing needs of site, client or environment that make each building unique. The building industry is perhaps unique in that it produces en masse, but cannot mass produce. It is the original and largest case of what we've recently begun to call mass customisation. Four decades ago, Rittel and Webber described the essentially unique situation we face in each new design as the "wicked problem". Not even problems in the traditional sense, wicked problems resist a clear formulation. They begin with an ill-defined brief, require decisions that are untestable beforehand because their context is complex and unique, and have unforeseeable consequences in the future. The argument is just as clear today: flexible, adaptive methods of design and building are essential.
Today we routinely take advantage of immense computational resources to guide design by an idea of performance, employing simulation and then optimisation. The tools are ubiquitous in domains like structural analysis, and it is difficult to imagine practice without them, but as they become more pervasive we should also keep in mind that they too have uncertainties embedded within. We may compute results to several decimal places, but they are ultimately limited by the accuracy of our assumptions; the behaviour of structural concrete, for instance, is derived from statistically varying tests of real samples. Sensitivity analysis, better communication of potential errors, and an understanding of complexity can help. Whereas a loaded beam might be well understood, space syntax, for example, deals with modelling and predicting the notoriously unpredictable behaviour of people in space. The problem is tamed at the largest scale because few assumptions are made, and differences between individuals cancel out for crowds, leading to stability over time. In optimisation, we might similarly be best to aim for the most stable solution, not necessarily the peak performer.
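The point about stability can be sketched in code. The following is a minimal, entirely hypothetical illustration (the toy designs and performance models are my own, not from the article): two candidate designs are evaluated not at a single nominal material strength, but across strengths sampled with the kind of statistical variation real test samples show. The "peak" design wins under nominal conditions; the "stable" design wins once that uncertainty is accounted for.

```python
import random
import statistics

def performance(design, strength):
    """Toy performance model (assumed for illustration only).
    'peak' scores highest at the nominal strength of 30 but degrades
    sharply away from it; 'stable' scores lower at the nominal value
    but is far less sensitive to variation."""
    if design == "peak":
        return 100 - 40 * abs(strength - 30)  # sharp optimum
    return 85 - 2 * abs(strength - 30)        # flat, robust response

random.seed(0)
# Material strength sampled as in statistically varying tests:
# mean 30, standard deviation 2 (assumed figures).
samples = [random.gauss(30, 2) for _ in range(1000)]

for design in ("peak", "stable"):
    scores = sorted(performance(design, s) for s in samples)
    worst5 = scores[len(scores) // 20]  # 5th-percentile (robust) score
    print(design, "mean:", round(statistics.mean(scores), 1),
          "5th percentile:", round(worst5, 1))
```

Under these assumptions the "peak" design is the peak performer only at the nominal strength; averaged over the sampled variation, and especially at the robust 5th-percentile measure, the "stable" design comes out ahead, which is the sense in which an optimiser might prefer the most stable solution over the highest peak.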
At yet another level, as has been popularly pointed out with respect to the economy (e.g. by Nassim Nicholas Taleb) and perhaps painfully so with respect to foreign policy (e.g. by Donald Rumsfeld), uncertainty arises again in the “unknown unknowns” of design. Optimisation relies entirely on being able to quantify the goal we seek, but it then ignores all other considerations. Computational methods in general tend to lend themselves to easily quantifiable phenomena, but it may often be the case that these are not the most important. Brian Lawson terms this a ‘numerical measuring disease’, in which we might be blinded to what is really crucial by that which is simply easy to measure.
Perhaps the most robust approach is to acknowledge that we are building with change in mind. Here too, technology gives us advantages that could not have existed before, incorporating feedback and adaptive systems, and taking advantage of real-time scanning and large data sets. The quest to bring robots to the construction site is an illustrative case, as it contrasts the high precision of the machine with the chaos of human activity. No option exists here to operate within the kinds of highly controlled and isolated environments for which pre-programmed machine fabrication is intended. Gramazio and Kohler's R-O-B bricklaying robot, displayed at the Bartlett two years before Smartgeometry, is an experiment in robot vision and sensing, only beginning to incorporate the technology necessary to do this.
The methods by which we will tackle uncertainty in each of these situations lead in each case to open questions. It is precisely for this reason that research in the field is necessary, and ever more crucial. It is also why it is interesting. What is more certain is that progress in these areas would seem to demand at least two different ways of thinking. A scientific approach generally demands hard, quantifiable hypotheses, and addresses them by examining isolated and repeatable phenomena. A design approach usually requires open speculation, but is embedded in the complexity of specific cases. These questions of uncertainty sit in both worlds, and will require both kinds of thinking to answer. Smartgeometry, part academic conference and part open-ended workshop, is a rare combination of both, and so seems as good a place as any to start.
Image: Centrality road map of Eurasia, by Tasos Varoudis and Joan Serras.