Architects have historically had great respect for, and a deep attachment to, the process of drawing by hand, reflected in the many skeptical and concerned responses to office computerization offered by architects and educators such as Michael Graves (2012) and David Scheer (2014). In spite of these concerns, digital document production in the form of 2D CAD or 3D BIM has almost completely displaced hand-drawn documents, transforming the image of architecture and many processes of construction in the last two decades. The reason for this can be seen, in part, in the graphic shown in Figure 10.1, which is often attributed to HOK’s chairman and CEO Patrick MacLeamy, though a similar graphic can be found in several mid-1970s texts, including William Mitchell’s 1977 Computer-Aided Architectural Design (Davis 2013).
The graphic illustrates the idea that an initially malleable schematic design becomes increasingly inflexible and expensive to modify as more and more interlocking decisions are made (and recorded in the design documents). Traditional document production techniques required significant manpower during the construction documents phase. Much of this effort went to coordinating redundant representations, such as plans drawn at different scales and the same features shown in both plans and elevations, and propagating changes through those various alternative representations, all relatively expensive processes. The promise of computing, especially BIM, is that a single data repository will be internally consistent and software can be used to perform the coordination required to produce the different-but-related documents, allowing more effort to be focused on design, thereby producing a better design at a similar or reduced cost.
This chapter will consider some of the ways in which practice is changing under the influence of digital technologies, changes that go far beyond the disappearance of drafting tables in offices. Digital tools have been introduced to practices in pursuit of greater productivity, in response to rising client expectations, and in recognition of new opportunities. Design computing technology presents the opportunity to respond to more data during design, to apply more computation to design decisions, and to explore morphologies whose complexity presents fabrication or construction problems. At the same time, opportunity does not guarantee results. Culture and practice changes may depend on the emergence of additional tools.
It is no longer enough to satisfy client budgets or aesthetics alone, not when as much as 40 percent of global energy is used in the production and operation of buildings whose construction may consume irreplaceable resources. Designers are being held morally responsible for community well-being, and the architecture, engineering, and construction (AEC) industry is increasingly being challenged to demonstrate an ethical response to these facts through careful design and documentation, responsible material selection and reuse, appropriate detailing (e.g., for disassembly), and similar strategies. Designers are exploring ways of preventing or responding to natural disasters, even designing buildings that actively redistribute forces in the case of earthquake, tsunami, or flood, or that establish ad-hoc networks to supplement damaged information infrastructure. These concerns have placed additional demands on expertise, and increased the moral hazard attached to errors, whether the result is building collapse, long-term exposure to carcinogenic materials, destruction of far-off hardwood forests, or simply inefficient cooling.
While the MacLeamy Curve is largely a rhetorical construct, an analysis of labor productivity data over the last 50 years by Stanford University showed declines in AEC productivity compared to US manufacturing overall (Teicholz et al. 2001). Though challenged on the grounds that it does not take into account a shift from on-site to off-site construction (Eastman and Sacks 2008), or reflect the challenges of an increasingly complex and demanding product (Davis 2013), a recent reconsideration reaffirms the basic claim that productivity in the AEC industry has declined, while general manufacturing productivity has increased dramatically (Teicholz 2013), largely through the application of automation and information. The implication is that the AEC industry is not making the most effective use of these technologies.
Large clients such as the General Services Administration (GSA 2007) and the Construction Users Round Table (CURT 2010) in the United States have responded to this conclusion by pressuring the profession to go beyond 2D CAD and make greater use of BIM in the belief that this will improve results. Architects, schools, and new students have invested in developing the required knowledge and software expertise. At the same time the traditional design–bid–build model of construction, in which designers and builders are often cast in antagonistic roles, has been evolving to make better use of digital technology through new practice models emphasizing collaboration. The iconic architect’s drawing board and parallel rule have disappeared from most offices, completing the changes begun when 2D CAD was introduced. They have been replaced, at a much higher cost, by multiple large monitors connected to fast PCs licensed to run expensive CAD and/or BIM software.
While CAD and BIM have revolutionized the way documents are prepared, the role that paper drawings (or their digital equivalent, PDF files) retain as the dominant legal document in the building process is just beginning to change. The GSA now requires delivery of BIM data allowing automated checking of a schematic design against the design program. The emergence of new practice models such as integrated project delivery (IPD), which seeks to implement the MacLeamy shift by bringing construction expertise into the design process earlier, has begun to erode the traditional boundaries between design and construction. The result is that detail design and even fabrication now increasingly occur directly from design documents provided by the architect, eliminating a layer of shop drawings and data reconstruction.
Thus, while it is hard to see how information and communication technology (ICT) could, in itself, cause a decline in productivity, the evidence certainly invites investigation into the ways design and construction knowledge are represented, prepared, compared, refined, and applied in practice. In the fragmented US AEC marketplace, the uneven productivity boost may well benefit some players more than others. Clearly, as architects are responsible for design coordination, the increasing complexity of buildings has made coordination both more critical and more difficult. Compounding this, greater emphasis on performative architecture (Kolarevic 2005), the effort to forecast and control project costs and performance, has placed additional documentation expectations on designers. Where digital fabrication makes unique design responses possible, integrated digital fabrication workflows may well have greater financial benefit to the builder. Adjusting compensation schemes will take time.
Within offices the increasing emphasis on the quantifiable aspects of design may also be causing a drift in the role of architects (Scheer 2014). Licensed for their attention to life-safety issues, but usually selected for their form-making expertise as well, architects have only recently embraced technologies of production as indicators of professional prowess. Prior to the arrival of CAD in offices, hiring a new employee did not require significant capital, as drafting tools were relatively cheap. Further, when project schedules required, senior members of a firm could contribute directly to drawing productivity by “jumping on the boards.” Skill in drafting and freehand drawing established status within the profession, and skillfully drawn details were often ascribed higher reliability (Robbins 1997). Now that the production of legal documents has largely shifted to digital tools, there is stratification within architecture offices between those who can and those who cannot perform the work, as well as the emergence of new behaviors, such as pair-work, in which an entry-level technologically savvy user works alongside an experienced but technically less-savvy designer, thereby leveraging both digital skills and disciplinary experience (Hancock 2013).
Today’s building process is also changing. Where earlier surveying techniques required a time-consuming chain of positions linking a site to benchmarks, the total-station electronic theodolites in use today can establish position and measure distances and angles with great precision, often under automated control. Such speed and accuracy, coupled with data drawn from the design database, have contractors talking about large-scale site work being conducted automatically using earth-moving equipment without human drivers. Digital technology enables contractors to fabricate and position building elements with great precision. Using these technologies, M.A. Mortenson, the contractor on the Denver Art Museum’s Frederic C. Hamilton Building, designed by Daniel Libeskind, completed steel erection three months ahead of schedule and saved the city US$400,000 in the process (Mortenson 2005). The ubiquitous tower-crane operator follows a “lift schedule” that is carefully planned and optimized to deliver construction material to the appropriate part of the project at the ideal time, functioning much like a very large 3D printer.
Easy access to correct 3D models by designers and contractors is more than a cosmetic benefit. Consider the challenge of correctly placing steel in formwork where it will be embedded in the walls of a concrete building core, possibly weeks before erection of exterior walls or placement of floor beams. At that stage in construction there is little else on the site to provide guidance, and drawings can be somewhat abstract. Use of 3D renderings during daily work planning has been found to reduce placement and orientation errors for steel connectors embedded in poured-in-place concrete construction (Campbell 2004).
Design documents are necessarily somewhat approximate or vague. Traditional 2D drawings may not explicitly represent the true shape of curved or angled building elements. In addition, contract document drawing sets bring together the work of many specialized disciplines, work that is often done independently and presented on separate sheets within a set. Finally, in order to allow for sub-contractor choice and contract alternatives, design documents may include some intentional ambiguity. While architects are usually responsible for coordination of the overall work, the final site of coordination is the building itself, and it is not unusual for construction projects to encounter spatial conflicts or clashes between systems, such as reinforcing steel and embedded conduit in a concrete column, or HVAC ducts and beams or elevator shafts. Resolving such clashes in the field can be very expensive and time-consuming. Complete 3D models (BIM or otherwise) greatly facilitate clash detection. Further, some industry segments, such as HVAC contractors, can convert 3D data directly into fabrication data for automated tools. For this reason, contractors have often been keen to adopt BIM technology across the industry. Architects, in their coordination role, would seem to be the party that should be responsible for collecting or providing this data, but contractors may see the greatest benefit. Mechanisms are needed to motivate efficient work and data sharing beyond the simple exchange of contract documents. Profit and risk sharing through non-traditional contracting practices might be one way to address this evolving situation.
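The core of clash detection can be sketched in a few lines of code. The fragment below is a minimal illustration rather than a production algorithm: it reduces each element to an axis-aligned bounding box and reports every overlapping pair between two discipline models. The element names and coordinates are hypothetical; real tools test actual solid geometry and tolerance rules.

```python
from itertools import product

def boxes_clash(a_min, a_max, b_min, b_max):
    """True if two axis-aligned boxes overlap on every axis."""
    return all(a_min[i] < b_max[i] and b_min[i] < a_max[i] for i in range(3))

def find_clashes(model_a, model_b):
    """Report every clashing pair between two discipline models, where an
    element is (name, (xmin, ymin, zmin), (xmax, ymax, zmax))."""
    return [(a[0], b[0]) for a, b in product(model_a, model_b)
            if boxes_clash(a[1], a[2], b[1], b[2])]

# Hypothetical elements: one duct runs through a beam, the other clears it.
structure = [("beam-B12", (0.0, 0.0, 3.0), (10.0, 0.4, 3.6))]
mechanical = [("duct-D07", (4.0, 0.1, 3.2), (6.0, 0.3, 3.5)),
              ("duct-D08", (4.0, 0.1, 4.0), (6.0, 0.3, 4.3))]

print(find_clashes(structure, mechanical))  # [('beam-B12', 'duct-D07')]
```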
When contemplating the productivity puzzle in light of the above, it is perhaps not surprising that the most visible group of early adopters of BIM and related technology has been contractors. Tools and data that reduce the work that must be done on-site pay dividends, and 3D digital models offer many opportunities to benefit. Many contractors will build their own BIM model as a means of testing their construction process, looking for clashes, establishing construction schedules, and subsequently managing sub-contractors, billing, and progress. The 3D model can be used in bending reinforcing steel, cutting structural steel, and fabricating sheet-metal elements for HVAC systems. In some more recent cases, contractors have relied on data to enable them to fabricate wall panels or entire units of the completed building off-site in climate-controlled environments, with greater productivity, reduced risk to workers, and improved quality.
Implicit in the pursuit of productivity gains is the reuse of data—often someone else’s data. In a fragmented industry such as the US AEC industry, this requires trust and amounts to a risk and reward transfer, as noted above, often with the architect coordinating or providing information that others benefit from. Sorting out acceptable contracts and ground rules remains a work in progress, one that design computing might well influence or be influenced by.
Being able to access and use the design team’s BIM for planning and testing would be a great benefit to the contractor, but this does not happen often. It is a huge risk transfer, involving issues of trust and organizational compatibility as well as accuracy and content, and it requires data interoperability between systems that may not be present. Industry foundation classes (IFCs) and similar projects are improving interoperability, but it remains a challenge. Both of these concerns are reduced for design–build firms, where the risk is shared across the entire firm and there is central control of ICT platforms. Incompatible representations tied to the world-view of the user remain an issue—where a designer sees a ten-story curtain wall as a single element, the contractor may see floor-by-floor panelized delivery, installation, and alignment items in their scheduling. Somewhere between design and construction the single conceptual “curtain wall” element must be turned into parts in a “construction process.”
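The representational gap can be made concrete with a small sketch: a single design-side “curtain wall” record expanded into the floor-by-floor installation items a contractor might schedule. The data structure and naming scheme here are hypothetical, a toy model of the translation rather than any particular firm’s practice.

```python
from dataclasses import dataclass

@dataclass
class CurtainWall:
    """A design-side element: one wall, however tall."""
    name: str
    floors: int
    panels_per_floor: int

def to_work_items(wall: CurtainWall) -> list[str]:
    """Expand the single element into schedulable installation items,
    one per panel, grouped floor by floor as a contractor might plan."""
    return [f"{wall.name}/F{floor:02d}/P{panel:02d}"
            for floor in range(1, wall.floors + 1)
            for panel in range(1, wall.panels_per_floor + 1)]

wall = CurtainWall("CW-North", floors=10, panels_per_floor=6)
items = to_work_items(wall)
print(len(items), items[:2])  # 60 ['CW-North/F01/P01', 'CW-North/F01/P02']
```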
This situation may be encouraging the emergence of vertically integrated design–build firms that can control and leverage the data at every stage (Sznewajs and Moore 2016). However, firms in the United States have responded by creating short-term project-centered corporations that share risk and reward among the design and construction team. Continued evolution of new legal and business models seems likely.
From industrial designers to architects and engineers, much of what goes into a design consists of selection or discovery rather than invention: screws, cabinets, hinges, façade systems, electrical cover-plates and switches, mechanical systems, and window glazing are among the items selected from a palette of alternatives. Knowing what is available, what it costs, and what will fit the situation best is a significant challenge in the internet era, with possible suppliers and products scattered worldwide; the choice is often left to general wording in a specification or to selection by a consultant. Tracking, coordinating, and recording the many decisions that go into the finished building is one of the architect’s main jobs. The principal mechanism for doing this remains the traditional “drawings and specifications” of standard contracts, but identifying appropriate products remains a challenge. There are a large number of products available, but there is variation in the quantity and quality of information about them; descriptions differ in format and content; and there are no integrated software tools to facilitate, coordinate, and document selection (even in this age of the World Wide Web!). These are all factors that limit productivity, forcing designers to repeatedly climb steep product-learning curves or steering them to familiar safe solutions or consultant-selected components. Figure 10.2 illustrates diagrammatically the loss of knowledge that is generally conceded to accompany the handoff from one phase of design and construction to the next—knowledge that might not have to be reconstructed if it were suitably recorded during the design and construction process (Eastman et al. 2011). Standardized digital product descriptions would allow products to be easily imported into differing CAD/BIM systems, tested for connection to products by different vendors, incorporated correctly into engineering analyses, etc. While this has been recognized for many years, especially in Europe, it has proven difficult to achieve the desired level of standardization. Current efforts are focused on IFC representations and their integration into widely used productivity systems.
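As a sketch of what standardized, computable product descriptions make possible, the fragment below uses the open-source ifcopenshell library to inventory the doors in an IFC model along with whatever property sets their descriptions carry. The file name is hypothetical, and in practice property-set content varies widely from vendor to vendor, which is precisely the problem.

```python
# Requires the open-source ifcopenshell library: pip install ifcopenshell
import ifcopenshell
import ifcopenshell.util.element

model = ifcopenshell.open("project.ifc")  # hypothetical model file

# Inventory the door products and whatever property data they carry.
for door in model.by_type("IfcDoor"):
    print(door.GlobalId, door.Name)
    # Property sets hold vendor-supplied data; content varies by vendor.
    for pset_name, props in ifcopenshell.util.element.get_psets(door).items():
        for prop, value in props.items():
            print(f"  {pset_name}.{prop} = {value}")
```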
The problem is not trivial. While good data about products has become more important and the internet has provided a means of access, vendor-supplied product data is not always of a high quality. One consultant noted: “Most manufacturers have data spread out in multiple locations and when we try to enter the information into databases we continually discover errors or conflicts in their information” (Neeley, private communication with the author 2013). A push by large owners and managers, analogous to the push to get designers to employ BIM technologies, may be needed to establish standardized computable representations of components, but design computing research is most likely to identify appropriate strategies.
Design is not just about acquiring and recording data. Designers must provide data to their clients too, in a series of progressively more detailed views of the project. These are not just data dumps; they are often carefully crafted presentations in which specific questions are addressed and others avoided, not in an effort to deceive the client or sell the project, but in order to guide the project development.
Design is not a linear process. It is a process of successive approximation wherein a candidate solution is refined, revised, and possibly replaced with another. While necessarily cyclical, the goal in each cycle is to move the design forward, refining and advancing the design rather than rehashing it. While the design team’s attention revisits aspects of the design repeatedly over time, at each point in the cycle of refinement the designer needs to bring decision-making focus to a particular set of issues and—equally importantly—ignore others.
Orchestrating and directing this process, avoiding distraction and indecision, is an important skill in the designer’s tool-kit. The availability of visualization tools such as photo-real renderings helps tremendously in this process, but the availability and ease of use attached to this technology also raises focus-management issues, especially when clients are given access to a design model that might not be 100 percent ready for prime time, or consultants spend time on analyses using an old model because all data looks the same. Everyone looking at a document set infers which parts are immutable and which are open to change. Strategies for using the technology to minimize these problems by explicitly identifying preliminary ideas or suggestions “under review” within the design are needed. In the 1980s there was a product called Squiggle that added irregularity to straight-line drawings to make them look more tentative and mutable. Something similar might be done for photo-real renderings to make them more “conceptual,” as in non-photo-realistic rendering (Winkenbach and Salesin 1994).
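A toy version of the Squiggle idea takes only a few lines: jitter the endpoints of each straight segment and bend it slightly at its midpoint, so the drawing reads as a sketch still open to change. Everything below is a hypothetical illustration, not the original product’s algorithm.

```python
import random

def squiggle(segments, wobble=0.03, seed=None):
    """Replace each straight segment with a slightly irregular polyline:
    jittered endpoints plus a displaced midpoint."""
    rng = random.Random(seed)
    j = lambda: rng.uniform(-wobble, wobble)
    return [[(x1 + j(), y1 + j()),
             ((x1 + x2) / 2 + j(), (y1 + y2) / 2 + j()),
             (x2 + j(), y2 + j())]
            for (x1, y1), (x2, y2) in segments]

# A unit square rendered as four tentative, hand-wavering polylines.
square = [((0, 0), (1, 0)), ((1, 0), (1, 1)),
          ((1, 1), (0, 1)), ((0, 1), (0, 0))]
for polyline in squiggle(square, seed=42):
    print(polyline)
```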
Design not only develops in cycles of attention; work is often layered-in-time as well, with multiple overlapping jobs going on in an office over a period of time, possibly disappearing into a file drawer for months at a time as financing, governmental permission, or client planning processes play out. Maintaining appropriate project status information so that a design organization can return to (or pick up) a project weeks or months later is not as easy as preserving the design documents. Formal meeting minutes, request-for-information (RFI), and request-for-proposal (RFP) mechanisms are intended to clarify inter-organizational communications, but offices also utilize personal email archives and individual memories to maintain project state. Whether such data can be efficiently captured on the fly, or retrieved when needed, represents another research opportunity.
The US government’s GSA is one of the largest landlords and construction clients in the United States. To help ensure that proposed buildings meet the requirements of the government agencies funding them, the GSA began requiring “spatial program BIMs” of preliminary building proposals in 2007, with exploratory programs to deepen the use of BIM over time (Hagen et al. 2009). Since 2011, the UK government has made an even stronger commitment to use of BIM in the construction industry “with the key objective of: reducing capital cost and the carbon burden from the construction and operation of the built environment by 20%” (UK BIM Task Group 2013).
These efforts are built on the expectation that traditionally graphic information about designs, such as floor plan layouts, has computable aspects (adjacency, area, etc.) that can be extracted automatically and compared to goals. Developing systems to utilize the resulting data, and understanding how these government-sponsored technology investments shift the AEC industry and the ways in which ICT more broadly might further or hinder these initiatives, are important to the future of design computing.
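A minimal sketch of such a check, assuming room boundaries are available as polygons and the spatial program as a table of required areas (both hypothetical here): the shoelace formula yields each room’s modeled area, which is then compared against the program target.

```python
def polygon_area(pts):
    """Shoelace formula: area of a simple polygon given as (x, y) vertices."""
    return abs(sum(x1 * y2 - x2 * y1
                   for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]))) / 2

# Hypothetical spatial program: required room areas in square meters.
program = {"office-101": 14.0, "conference-1": 30.0}

# Room boundaries as modeled (also hypothetical).
rooms = {"office-101": [(0, 0), (4, 0), (4, 3.5), (0, 3.5)],
         "conference-1": [(4, 0), (10, 0), (10, 4.5), (4, 4.5)]}

for name, required in program.items():
    drawn = polygon_area(rooms[name])
    status = "OK" if drawn >= required else "SHORT"
    print(f"{name}: program {required} m2, model {drawn:.1f} m2 -> {status}")
```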
The value of data does not end when a building is finally occupied. Large, complex buildings such as hospitals seem to be remodeled continuously, often commencing before the building is fully occupied. Given the complexity of services contained in such facilities, it is not surprising that estimates suggest that maintenance and operations costs of buildings over their lifetime may be triple the initial construction cost (Public Technology, Inc. 1996, citing Romm 1994). The information needed for facilities management (FM) includes not only information about the location of ducts and pipes inside walls (which might be found in as-built drawings), but also information about materials used in construction, equipment manuals, and post-occupancy replacement and maintenance records. This information is intimately tied to the design, and is increasingly part of the package delivered to the building owner during building commissioning. But what form should it take? Including a BIM workstation and project database as an FM deliverable might seem like a good idea, but where do you put it? Who trains the facility operators? Is a BIM program really the right tool? Building operators have different data needs from either designers or contractors. At a GSA workshop attended by the author in 2008, the consensus of the experts attending was that differing information needs mean most projects require three different BIMs: design, construction, and facilities management. Understanding the information lifecycle and information access needs of building operators, emergency responders, and other infrequent building users represents another opportunity or challenge within design computing.
The pressure to manage data in connection with design and construction, and the need for that data, are creating new opportunities and challenges for practitioners. Chapter 13 will examine some of the ways in which buildings are contributing to and benefiting from emerging data streams. Fine-grained, site-specific information is guiding the form of buildings in new ways. For those drawn to digital concepts and tools there is an opportunity to develop data consulting services for both architects and clients. In an article for Architect Magazine, Daniel Davis touches on ways in which clients are demanding data from their designers (Davis 2015).
Anecdotal evidence exists that greater use of 3D visualization in construction-management meetings and drawing sets reduces errors. Digital imagery, from both fixed cameras and mobile devices, is increasingly useful in site-inspection, progress monitoring, and problem resolution.
A lurking problem for any modern office is file-format obsolescence, in which perfectly good data becomes useless simply because there is no software to read it (or no operating system to run the software, or hardware to run the operating system). New releases of software may employ new internal representations, and thus new file formats, in order to deliver new functionality. The new software will read old files as well as new, but after a time that old code will be retired, and old files that have not been opened and saved in the new format will become unreadable. This process might take a decade or more to unfold, but most design firms keep archives that span longer periods of time. As the life spans of digital technologies in the workplace grow, data-preservation issues are likely to become more common, not just in design practices, but across the world (Sample 2015).
Traditional designs and design documents recorded decisions about materials, geometry, and the relationships between them, often in the form of typical plans, sections, or details. Designers sought to establish a limited “vocabulary” of conditions within the geometry and appearance of the building. Uniform beam spans, column spacing, and material conditions meant efficiency in drawing, efficiency in fabrication, and uniformity in appearance—all features of the “typical” modern building. Buildings that violate the uniformity can become very expensive. The 1991 Seattle Art Museum went roughly 20 percent over budget, at least in part because “not a single steel column runs straight from the basement to the top of the fifth floor [with the result] that hardly any of the individual pieces that make up the building are repeated” (McDermott and Wilson 1991).
In construction, uniformity in formwork, jigs, molds, and dimensions simplifies fabrication. It gives the contractor a chance to learn on one part of the project and use that knowledge repeatedly. While Jørn Utzon’s original 1957 design for the Sydney Opera House was relatively simple visually, the iconic shells were initially defined using parabolic curves, which meant varying curvature (and thus varying formwork) across the many rib and panel sections, making them prohibitively expensive to fabricate and forcing that scheme to be abandoned. They were ultimately executed as spherical forms precisely because uniform curvature simplified construction.
Today, design and fabrication systems have become robust enough to support execution of complex designs, such as Gehry Partners’ Guggenheim Museum Bilbao, consisting of many non-uniform parts. Though such a design may be conceived without the computational and data-management efficiencies afforded by computing, information technology makes it economically feasible to design, detail, produce, track, deliver, and assemble buildings with such an explosion of atypical parts. The widespread adoption of such technological tools has allowed more and more architects to experiment with gentle undulations in their façades, variation in their glazing or cladding patterns, or complex geometry in their structural steel. Accurate digital models can feed data to precise digital fabrication machinery that can cut and label individual members and stack them on the truck in the right order, delivered just in time to be placed, while digital survey equipment can ensure correct placement. The result may be atypical, but remains efficient.
Not only are buildings increasingly atypical; modern design practice has become more varied in terms of project location and team composition. In the 1990s academics began exploring distributed or virtual design studios, using digital networks for data exchange and interpersonal communication (Wojtowicz 1995; Hirschberg et al. 1999). Modern high-speed communication enables routine design collaboration across wide areas of the globe, and creates the need to mesh information practices in order to work efficiently. While such global practices are possible, they are found to require a careful selection of collaboration partners (Franken 2005). Systematic study of such collaborative partnerships and their technological foundations would provide another design computing opportunity going forward.
Architects and designers are now able to access or generate significant amounts of data in the normal course of a day. Detailed weather, demographic, circulation, and other kinds of data are available for many sites, and both primary tools (CAD or BIM) and secondary tools (analyses, etc.) can be used to produce detailed reports. Paradoxically, two responses are notable: some designers combine detailed site data and analysis algorithms, often in genetic algorithm or optimization systems, to perform form-finding exercises on the site (Davis 2015), while at the same time an online survey of designers and their use of green building analysis tools found that many who know about such tools opt not to use them in early design (Sterner 2011).
Those who embrace data often seem to want to transduce some quantifiable quality of the site into building form. Bernhard Franken describes a computational process by which project-specific fields of physical and notional (“not strictly physical”) forces act on geometric forms to produce a “master form” from which the design documents and detailing are derived (Franken 2005).
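A toy version of this kind of transduction might displace a regular grid of points within a notional field emanating from a point of interest on the site. This is an illustration of the general idea only, not Franken’s actual method, and all values are hypothetical.

```python
import math

def transduce(grid, attractor, strength=0.8):
    """Displace each point away from a site 'attractor', with influence
    decaying by distance; a toy stand-in for a notional force field."""
    warped = []
    for x, y in grid:
        dx, dy = x - attractor[0], y - attractor[1]
        d = math.hypot(dx, dy) or 1e-9          # avoid division by zero
        push = strength / (1.0 + d)             # influence fades with distance
        warped.append((x + push * dx / d, y + push * dy / d))
    return warped

# A regular 5 x 3 grid of facade points, warped into a "master form".
grid = [(i, j) for i in range(5) for j in range(3)]
master_form = transduce(grid, attractor=(2.0, 1.0))
print(master_form[:3])
```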
In light of broad awareness regarding the impact of early decisions on design performance, it is surprising that greater use is not made of energy simulation during schematic design. The online survey mentioned above revealed that while roughly half of respondents indicated familiarity with green building analysis tools and frequent participation in green building projects, actual use of such tools during design was slight, even among those who considered themselves experts in green building design. The report identified dissatisfaction with “ease of use” and “time required” as major challenges, leading the report’s author to conclude, “most practitioners are presumably relying on rules of thumb, prescriptive checklists, and/or consultants.” Similar results have been reported by others, who found that “Architects today rarely use whole building energy analysis to inform their early design process” (Smith et al. 2011). They further identified “understanding the results and using them in presentations” as inhibitors. For these users, data, in the form of analysis results, seems to be more problem than opportunity.
Why might this be so? The truth is that building a good simulation model is something of an art and not simply a translation of building geometry—“There [are] too many opportunities to fail with an energy model that [is] derived from a detailed building model” (Smith et al. 2011). Simulation is not routine. It is not common. It is a realm of professional practice that is being ceded to an evolving collection of expert consultants, which locates simulation adjacent to design practice, not central to it.
Various authors have noted that the domain of design has grown narrower over the years as architects “spin off” decisions to related specialists—structural engineers, materials consultants, etc. Of this “retreat to the protected core” John Archea wrote:
it seems that whenever a body of knowledge develop[s] to the point that the effects of a specific building attribute or system could be fully understood prior to and independently of the initiation of the design process, architects have redefined their profession to exclude responsibilities for achieving those effects.
(Archea 1987)
Whether this behavior is a response to increasing complexity, a response to risk, or a means of preserving identity, it raises sociological and psychological questions about how best to combine design and computing, and how best to inform an integrative design process.
In light of the above, one of the interesting changes in the last few years has been the re-emergence of in-house research groups within architecture firms. In the 1960s and 1970s a number of firms had active software development efforts, including Skidmore, Owings & Merrill (SOM) and Hellmuth, Obata & Kassabaum (now HOK) in the United States, and Gollins Melvin Ward (now GMW Architects) in the United Kingdom. These projects were largely abandoned or sold as the rising tide of micro-computer CAD washed over the industry in the 1980s, but offices in the early years of this century have witnessed a rebirth of support for in-house research, development, and training in the face of rapid technological change in the industry, including the growth in data exchange options and the advance of end-user programming or scripting functionality in commercial software systems.
This re-awakening coincides with an increased emphasis on STEM (science, technology, engineering, and math) studies and “code” across the broader culture, and increased access to and awareness of open-source software-sharing communities, to the point where—concerning technology use—“many students and practitioners have reached a certain fluency with the existing tool sets and strive to distinguish themselves through customized design processes” (Kilian 2006).
The addition of continuing-education requirements to architectural licensing also energizes the professional and business case that can be built for in-house training and sharing regarding design technology, reinforcing and broadening the tradition of sharing and reflection on design topics that has been practiced in many offices over the years. The net result is a proliferation of both external and internal blogs, knowledge communities, corporate universities, and conference opportunities. Developing and supporting the knowledge and culture necessary to identify and harness these emerging best practices provides another avenue for design computing practitioners.
In this chapter we have seen opportunities in the routine praxis of design and construction for those who study design computing not only to contribute to the efficient and knowledgeable operation of today’s technology, but also to engage in activities as disparate as creating new software, re-forming the legal foundations of the industry, and shaping the internal social networks and external knowledge-sharing communities of firms. At a deeper level are long-term challenges regarding the best way to use data in design practice when each analysis is just one source of information in a complex multi-faceted juggling act. Design is about integration and coordination, where the details are often left to consultants; yet those details have significant impact on the overall project and may need to be revisited at a later date. Capturing, archiving, curating, searching, and accessing information in the design process is increasingly valuable. Modern practice relies heavily on digital technology, but we are still searching for the best fit between creative design and the devilish details.
Archea, John. 1987. Puzzle-making: What architects do when no one is looking, in Computability of Design. Edited by Y. Kalay, 37–52. New York, NY: John Wiley.
Campbell, Dace. 2004. Building information modeling in design–build. Talk presented at the University of Washington, December 2.
CURT. 2010. UP-1203 BIM Implementation: An owner’s guide to getting started. Construction Users’ Round Table. April 20.
Davis, Daniel. 2013. Modelled on software engineering: Flexible parametric models in the practice of architecture. Unpublished PhD dissertation, RMIT University.
Davis, Daniel. 2015. How big data is transforming architecture. Architect Magazine (April 23) www.architectmagazine.com/technology/how-big-data-is-transforming-architecture_o.
Eastman, Charles M. and Rafael Sacks. 2008. Relative productivity in the AEC industries in the United States for on-site and off-site activities. Journal of Construction Engineering Management 134 (7): 517–526.
Eastman, Chuck, Paul Teicholz, Rafael Sacks, and Kathleen Liston. 2011. BIM handbook: A guide to building information modelling for owners, managers, designers, engineers and contractors. Hoboken, NJ: John Wiley.
Franken, Bernhard. 2005. Real as data, in Architecture in the digital age: Design and manufacturing. Edited by Branko Kolarevic, 123–138. New York, NY: Taylor & Francis.
Graves, Michael. 2012. Architecture and the lost art of drawing. New York Times, September 1.
GSA. 2007. GSA BIM Guide Series 01 (ver 0.6). US General Services Administration. www.gsa.gov/bim.
Hagen, Stephen, Peggy Ho, and Charles Matta. 2009. BIM: The GSA story. Journal of Building Information Modeling (Spring): 28–29.
Hancock, Lillian. 2013. Visualizing identity: Perspectives on the influences of digital representation in architectural practice and education. Unpublished Master’s thesis, University of Washington.
Hirschberg, U., G. Schmitt, D. Kurmann, B. Kolarevic, B. R. Johnson, and D. Donath. 1999. The 24 hour design cycle: An experiment in design collaboration over the Internet. Proceedings of CAADRIA ’99, the fourth conference on computer aided architectural design research in Asia, 181–190.
Kilian, Axel. 2006. Design innovation through constraint modeling. International Journal of Architectural Computing 4 (1): 88–105.
Kolarevic, Branko and Ali Malkawi (eds.). 2005. Performative architecture: Beyond instrumentality. New York, NY: Spon Press.
M.A. Mortenson. 2005. Case study: Denver Art Museum. www.mortenson.com/approach/virtual-design-construction/~/media/files/pdfs/denver-art-museum.ashx.
McDermott, Terry, and Duff Wilson. 1991. Museum officials feared, discussed big cost overruns. Seattle Times, July 14.
Public Technology, Inc. 1996. Sustainable building technical manual: Green building design, construction, and operation. Washington, DC: Public Technology, Inc.
Robbins, Edward. 1997. Why architects draw. Cambridge, MA: MIT Press.
Romm, Joseph J. 1994. Lean and clean management. New York, NY: Kodansha International.
Sample, Ian. 2015. Google boss warns of “forgotten century” with email and photos at risk. Guardian, February 13. www.theguardian.com/technology/2015/feb/13/google-boss-warns-forgotten-century-email-photos-vint-cerf.
Scheer, David Ross. 2014. The death of drawing: Architecture in the age of simulation. New York: Routledge.
Smith, Lillian, Kyle Bernhardt, and Matthew Jezyk. 2011. Automated energy model creation for conceptual design, in Proceedings of SimAUD ’11 symposium on simulation for architecture and urban design. Edited by Ramtin Attar, 13–20. Society for Modeling and Simulation International.
Sterner, Carl S. 2011. Architecture software survey: Results & analysis. Sterner Design. www.carlsterner.com/research/2011_architecture_software_survey.shtml.
Sznewajs, Timothy and Brian Moore. 2016. Construction industry mega trends emerging from the recession: What every CFM needs to know. Construction Financial Management Association. www.cfma.org/content.cfm?ItemNumber=2533.
Teicholz, Paul. 2013. Labor-productivity declines in the construction industry: Causes and remedies (another look). AECbytes Viewpoint 67 (March 14).
Teicholz, P., P. Goodrum, and C. Haas. 2001. U.S. construction labor productivity trends, 1970–1998. Journal of Construction Engineering Management 127 (5): 427–429.
UK BIM Task Group. 2013. Home page. www.bimtaskgroup.org.
Winkenbach, Georges, and David Salesin. 1994. Computer-generated pen-and-ink illustration, in Proceedings of Siggraph’94, 91–100. New York, NY: ACM.
Wojtowicz, Jerzy. 1995. Virtual design studio. Hong Kong: Hong Kong University.