2
Models

Computer programmers and architects both build models. While the two differ greatly in the media they use, their models share some features. We usually think of architectural models as miniature, scaled-down physical objects, built to show the overall geometry and sometimes the colors of a proposed building. The models that programmers build describe relationships of cause and effect within a software system. Both kinds of models are simplified or abstract representations of information and processes: simpler than the reality they mean to capture, focused on the essential aspects of the situation, and able to support control or prediction because they behave correctly within the limited slice of reality they represent. For instance, an architectural study model is often proportionally correct but materially inaccurate (few buildings are built of cardboard, balsa, or basswood), and while it may have the correct exterior geometry it often has no interior partitions.

Models dispose of unnecessary complexity, and in doing so they enable us to ask “what if” questions in economical, reversible ways. They offer us a substitute for full-sized, full-cost, and confusing experiences.

Models don’t have to be true in any objective sense, as long as they are useful. Models can be used to focus attention; architects build white pasteboard models as a way to study form, intentionally suppressing color considerations. Models can be conventional; consider the persistence of the words sunrise and sunset to describe a particular relationship of planet, sun, and observer, in spite of the fact that we know the sun is not moving as the words suggest. Models may be created simply because they are convenient. Such is the case with “Hem-Fir,” a fictional tree species for which the Western Wood Products Association publishes allowable stress values (WWPA 2016). A close reading of the description reveals that each individual board in a bundle of Hem-Fir delivered to a job site is either western hemlock or one of several true firs, such as grand fir or noble fir. These trees grow together and have similar appearance and strength values, so separating them during the harvesting and milling process seems an unnecessary expense. Since most of this lumber is used in stud walls and joists, repetitive-use situations, a statistical abstraction called “Hem-Fir” was created to describe it. The species is a fiction, a model, but a useful one.

Symbolic Models

Drawing on the laws of physics, and assuming uniform material properties, uniform dimensions, and temperatures that do not change over time (a condition called steady state), it is often possible to derive algebraic equations that describe aspects of architecture, such as the thermal performance of a wall or the shear and bending stresses in a beam. These are symbolic models. Their results can be computed directly when the equations are combined with empirically measured material properties.
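
As a concrete, hypothetical illustration, the steady-state heat loss through a layered wall can be written as Q = U · A · ΔT, where U is computed from the thermal resistances of the layers. A few lines of Python, with invented layer values rather than anything drawn from the text, show how directly such a symbolic model can be evaluated:

```python
# Minimal sketch of a symbolic (steady-state) model: heat loss through a wall,
# Q = U * A * (T_in - T_out). All resistances below are illustrative only.

def u_value(layer_resistances, r_surface_in=0.12, r_surface_out=0.06):
    """Overall heat-transfer coefficient, W/(m^2 K), from layer R-values in m^2 K/W."""
    return 1.0 / (r_surface_in + sum(layer_resistances) + r_surface_out)

layers = [0.10, 2.50, 0.08]    # hypothetical siding, insulation, gypsum board
U = u_value(layers)

area = 12.0                    # m^2 of wall
t_in, t_out = 20.0, -5.0       # indoor and outdoor temperatures, deg C
q = U * area * (t_in - t_out)  # steady-state heat loss, watts
print(f"U = {U:.2f} W/(m^2 K), heat loss = {q:.0f} W")
```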

Finite Element Models

Symbolic models break down when the steady-state assumptions underpinning them are violated, that is, when they are subjected to dynamic (time-varying) influences: wind, earthquake, and impact for structures; diurnal temperature swings or solar radiation for thermal analysis. Under such conditions the basic rules of physics still apply, but because it takes time for change to propagate through the structure, the simple steady-state equations no longer produce correct results. To work around this difficulty we slice time (and space) into pieces small enough that each is nearly steady, using finite element analysis (FEA) to create a finite element model. Computing results from the model requires a great deal of arithmetic for all those small slices of time and space, but computers are good at that.

In the case of thermal behavior, the usual approach is to divide the building into a network of individual elements, called nodes, connected by their ability to store and exchange heat. Included in this network are the interior air volume, the outside environment, the ground, the various wall materials, and so on. At any given moment the laws of physics apply, so if we divide time into short increments (perhaps an hour instead of a day or a year), we can compute how much heat flows from node to node. An hour later there may be different weather, a different thermostat setting, and a change in the number of occupants. Once those changes are taken into account, we can once again compute heat flows in the network of nodes. Repeating this cycle, we can project, or simulate, how much energy the building will use over a period of time.
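
The bookkeeping in each time slice is just an energy balance per node. In notation of my own choosing (not the book's), a node i with heat capacity C_i, connected to neighbors j through thermal resistances R_ij and receiving direct gains Q_i, is stepped forward one slice Δt according to

\[
C_i \, \frac{T_i^{t+\Delta t} - T_i^{t}}{\Delta t} \;\approx\; \sum_{j} \frac{T_j^{t} - T_i^{t}}{R_{ij}} + Q_i^{t}.
\]

A minimal sketch of that loop, with just two nodes and invented capacities, resistances, and temperatures (real tools use many more nodes, add internal gains and a thermostat, and choose the time step with care), might look like this:

```python
# Toy nodal thermal simulation: indoor air and wall mass exchange heat with
# each other and with a varying outdoor temperature, one hour per step.
# All numbers below are illustrative assumptions, not values from the text.

HOUR = 3600.0  # seconds per time slice

cap = {"air": 2.0e7, "wall": 5.0e7}   # node heat capacities, J/K
r_air_wall = 0.001                    # indoor air <-> wall mass, K/W
r_wall_out = 0.010                    # wall mass  <-> outdoors,  K/W

temps = {"air": 20.0, "wall": 18.0}             # starting temperatures, deg C
outdoor = [5, 4, 3, 3, 4, 6, 9, 12, 14, 15]     # hourly outdoor temps, deg C

for hour, t_out in enumerate(outdoor):
    # Heat flows (watts) for this hour, computed as if conditions were steady.
    q_air_to_wall = (temps["air"] - temps["wall"]) / r_air_wall
    q_wall_to_out = (temps["wall"] - t_out) / r_wall_out

    # Update each node: temperature change = net heat gained / heat capacity.
    temps["air"] -= q_air_to_wall * HOUR / cap["air"]
    temps["wall"] += (q_air_to_wall - q_wall_to_out) * HOUR / cap["wall"]

    print(f"hour {hour}: air {temps['air']:.1f} C, wall {temps['wall']:.1f} C")
```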

Models like this, consisting of a finite number of nodes and a finite number of time slices, are called finite element models. The technique is approximate, in both space and time. More nodes or more slices will produce a marginally better result at the expense of larger data files and more time spent computing the additional interactions. As with other models, making the right simplification is an important part of getting good results; the art of finite element modeling resides in knowing what the appropriate level of abstraction is for the problem at hand. Walls in shade and walls in sun are different, as are north- and east-facing walls, or walls around the garage versus walls around the occupied parts of the house. Distinguishing among those elements makes sense, but dividing the amount of east-facing wall that is sunlit into more parts just to have more nodes does not. Similarly, since simulations such as this usually rely on measured weather data from a nearby weather station, and such measurements rarely occur more often than hourly, it may not make sense to divide time more finely.

Statistical Models

Simulating energy use for a city using a finite element model of each building would be cumbersome and extremely slow due to the large number of variables. However, there is probably a supply of historical weather and consumption data, as well as property tax data about construction and size. By extracting data from similar buildings in similar circumstances, we can build a statistical model of the built environment under study. Both real-estate appraisers and tax appraisers use such techniques to set a theoretical sale price on your home, while contractors use them to estimate the cost of a kitchen remodel. Estimates of bus ridership, traffic density, pollution, and weather all involve statistical models of the phenomena in question.
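
A minimal sketch of such a statistical model, using invented numbers rather than real utility or tax records, might fit annual energy use to floor area with ordinary least squares and then predict a building not in the sample:

```python
# Toy statistical model: fit annual energy use to floor area for a handful of
# hypothetical buildings, then predict a new one. Data are invented; a real
# model would use many buildings and more variables (age, use, glazing, etc.).
import numpy as np

floor_area = np.array([90, 120, 150, 200, 260, 310])          # m^2
annual_kwh = np.array([9800, 12500, 15300, 19400, 25800, 30100])

# Ordinary least-squares line: kwh ~ slope * area + intercept.
slope, intercept = np.polyfit(floor_area, annual_kwh, 1)

new_area = 175.0
predicted = slope * new_area + intercept
print(f"predicted use for {new_area:.0f} m^2: {predicted:.0f} kWh/year")
```

A production model would also report a confidence range around the estimate, since it describes typical buildings rather than any particular one.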

Analogue Models

As it happens, the equations for material resistance to heat flow and storage turn out to be mathematically the same as those for electrical resistance and capacitance. Using common electrical components—capacitors and resistors—and replacing temperature differences with voltage differences, we can literally build an electrical circuit to simulate the thermal behavior of a building (Robertson and Gross 1958). This is an analogue model; it need bear no physical or visual resemblance to the real thing, but its behavior is used to predict, by analogy, how that system will perform.
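
The correspondence can be written out explicitly (in notation of my own choosing, not the book's): heat flow plays the role of current, temperature difference the role of voltage, and thermal resistance and heat capacity the roles of their electrical namesakes, so the governing equations have the same form:

\[
q = \frac{\Delta T}{R_{\mathrm{thermal}}} \;\leftrightarrow\; i = \frac{\Delta V}{R},
\qquad
C_{\mathrm{thermal}}\,\frac{dT}{dt} = q \;\leftrightarrow\; C\,\frac{dV}{dt} = i .
\]

Because the equations match, voltages measured in the circuit predict temperatures in the building.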

Some analogue models do look like their subjects. Because light behaves the same in a full-size environment as it does in a small one, we can build scale models of buildings using materials such as cardboard, colored paper, and fabric that mimic the visual properties of finish materials. When lit with light from an artificial or real sky-dome, the physical model functions as a powerful simulation of the design’s visual environment.

Of course, architects routinely build scale models as part of assessing the visual character of designs, but we don’t always consider why such models are actually meaningful. Not all physical models scale accurately. For example, because heat storage is proportional to volume while heat loss is proportional to surface area, a dimensionally scaled physical model built of real materials does not perform thermally the way the full-size building would. Thus, models are not all equivalent, and any model may produce inaccurate results if applied to the wrong situation.
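
The reason is a simple scaling argument (my notation, not the book's): shrinking every dimension by a factor s scales surface areas by s squared but volumes by s cubed, so

\[
\frac{A_{\mathrm{model}}}{V_{\mathrm{model}}} = \frac{s^{2} A}{s^{3} V} = \frac{1}{s}\,\frac{A}{V}.
\]

A 1:20 model therefore has twenty times the surface area per unit of volume of the real building; it sheds heat far faster, relative to what it can store, than the building it represents, even though its daylighting behavior scales faithfully.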

Sources of Error

Finite element simulation of lighting, heat transfer, structures, and airflow is now routine (Kensek 2014), but such models can produce erroneous results. They depend on the simplifications chosen, the duration of the time slice used in the simulations, the accuracy of the material properties assigned to elements, and the accuracy of the applied data (weather, traffic, and so on). Further, the model may exclude a phenomenon that is, in fact, linked to the one being studied. This is why heat and light are often simulated together: people need warm spaces and light to work, and lights produce waste heat, so lighting and thermal performance are tightly coupled with occupancy patterns in most buildings and are also linked to building operation strategies.

Selecting appropriate test data can be a challenge. For example, thermal (energy) simulations need weather data to work with, either typical (for average performance) or extreme (to test worst-case scenarios like a heat wave). Historical data is best, but no randomly selected interval is really typical or extreme, so getting, or synthesizing, appropriate weather data is something of a challenge (Degelman 2003). Further, the weather on a real building site may differ markedly from that at the nearest airport, where the local weather data is collected. Localizing weather data and accounting for site conditions such as trees or buildings on adjacent property become more important as simulations become more precise.

Overly simplified or misapplied models can be a problem for inexperienced users. Models are, by their nature, simplifications, often explicitly intended to make complex problems easier for non-experts to work with, making them “designer friendly.” At the same time, questions that motivate consultation of the model may be simple or complex. Overly complex tools don’t fit simple questions, and simple tools may not fit complex situations. The potential always exists for a mismatch or misapplication due to some unexpected characteristic of the building being studied. For example, one sophisticated thermal simulation of a Seattle-area home that included a glazed atrium showed very elevated mid-winter temperatures. The mechanical engineer running the simulation doubted the validity of these results. After increasing the frequency with which the convective heat-transfer coefficient in the atrium was re-computed (a time-consuming calculation that was minimized for normal conditions), results returned to the more typical Seattle range. Experiences such as this lead some researchers to question the whole idea of “designer friendly” simulation tools (Augenbroe 2003).

Even when appropriate tools are paired with informed questions, there is a risk that some of the information going into the simulation will be incorrect. This is true even in an increasingly connected and online work environment. Dennis Neeley, whose companies have been doing electronic data publishing for manufacturers since the 1990s, has stated that “the challenge is getting correct information…. On some projects getting the information right takes longer than any other part of the project” (Neeley, personal communication).

Even when the input data is correct and the algorithm is appropriate, the fact that digital models use approximate representations of some numbers can cause errors to creep into calculations that are repeated thousands or millions of times. Such “round-off” errors have receded in recent years as larger memories, faster processors, and more sophisticated software have been applied to the problem, but they still occur (Goldberg 1991).
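
A few lines of Python (not from the text) make the phenomenon visible: 0.1 has no exact binary representation, so adding it a million times drifts away from the exact answer, while a compensated technique such as Kahan summation recovers most of the lost accuracy.

```python
# Round-off error accumulating in a repeated calculation, and one remedy.

def naive_sum(values):
    total = 0.0
    for v in values:
        total += v
    return total

def kahan_sum(values):
    """Compensated (Kahan) summation: track and re-add the rounding error."""
    total, carry = 0.0, 0.0
    for v in values:
        y = v - carry             # carry holds the error from the last step
        t = total + y
        carry = (t - total) - y   # what was lost when y was added to total
        total = t
    return total

data = [0.1] * 1_000_000
print(naive_sum(data))   # drifts slightly away from the exact 100000.0
print(kahan_sum(data))   # much closer to the exact 100000.0
```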

Another source of error arises from the common, possibly necessary, duplication of files. What might be called “divergent evolution” happens when data is duplicated for any reason and then edited independently of the original, as when a consultant is given a “background” drawing of a design. Sometimes it is done to “test an idea” or “try something out.” At other times it happens because a backup is mistaken for the primary file, or someone needs to work at home over the weekend. It can even happen within a file, as when work is duplicated from one layer to another “temporarily.” In CAD systems, merging (or rejecting) the changes is usually visual, time-consuming, and error prone. BIM software tends to be better at alerting users to incompatible edits as long as they are working on the same file (which is rarely the case with consultants). In the software and document management world there are tools for finding and highlighting differences between versions; more development is needed in the world of geometry and design.
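
To illustrate the kind of comparison that is routine for text but still hard for geometry, a few lines using Python's standard difflib module will highlight lines that diverged after a file was duplicated and edited independently. The drawing-schedule entries below are invented for the example.

```python
# Version comparison as it exists for text: difflib flags divergent edits.
import difflib

original = [
    "Door D-101: 36 x 84, solid core",
    "Window W-2: 48 x 60, double glazed",
    "Wall type A: 2x6 studs at 16 in. o.c.",
]
consultant_copy = [
    "Door D-101: 36 x 84, solid core",
    "Window W-2: 60 x 60, double glazed",     # edited independently
    "Wall type A: 2x6 studs at 24 in. o.c.",  # edited independently
]

for line in difflib.unified_diff(original, consultant_copy,
                                 fromfile="architect", tofile="consultant",
                                 lineterm=""):
    print(line)
```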

Summary

Models are important, there are several types, and they need to fit the problem. Choosing the right model and matching it to the level of detail available in the design data and to the expectations of the designer is important, and there is much room for improvement. We need to know the limits of our models, the parts of reality they predict accurately and the parts they do not, but models implemented in computer code are often opaque, under-documented, or poorly understood. When we make an intercontinental phone call we need to remember that sunset in London does not happen at the same time as sunset in Seattle, and when we pick up a Hem-Fir board we should not be surprised that it is one species or the other. With digital models, opaque to casual observation, it is harder to discover the limits and harder to correct errors, an observation that has motivated a move to develop tools with more transparent interfaces that let users see into internal software processes (Tanimoto 2004). In the absence of such software, knowledgeable users remain the primary defense against errors.

References

Augenbroe, Godfried. 2003. Trends in building simulation, in Advanced building simulation. Edited by A. Malkawi and G. Augenbroe, 4–24. New York, NY: Spon.

Degelman, Larry. 2003. Simulation and uncertainty: Weather predictions, in Advanced building simulation. Edited by A. Malkawi and G. Augenbroe, 60–86. New York, NY: Spon.

Goldberg, David. 1991. What every computer scientist should know about floating-point arithmetic. ACM Computing Surveys 23 (1): 5–48.

Kensek, Karen. 2014. Analytical BIM: BIM fragments, domain gaps and other impediments, in Building information modeling: BIM in current and future practice. Edited by K. Kensek and D. Noble, 157–172. Hoboken, NJ: Wiley.

Robertson, A. F. and Daniel Gross. 1958. Electrical-analog method for transient heat-flow analysis. Journal of Research of the National Bureau of Standards 61 (2): 105–115.

Tanimoto, S. L. 2004. Transparent interfaces: Models and methods. Proceedings of the AVI 2004 Workshop on Invisible and Transparent Interfaces, Gallipoli, Italy. New York, NY: ACM.

WWPA. 2016. Framing lumber: Base values for western dimension lumber. Western Wood Products Association. www.wwpa.org/Portals/9/docs/pdf/dvalues.pdf.