In 2011, I was fortunate enough to attend the Smart Geometry conference in Copenhagen (http://smartgeometry.org/). After another successful conference this year in London, I revisited the report I wrote for the 2011 event entitled ‘Building the Invisible’.
As digital technology continues to advance, architects and engineers are offered the opportunity to investigate more complex forms and structures, ultimately producing designs with improved functionality. The SmartGeometry group and its partners aim to take this digital design and apply it to the physical world, bridging the gap between working practice, research and academia.
This year, established and emerging practitioners convene to tackle ‘Building the Invisible: Informing Digital Design with Real World Data’: the idea of using site data to inform an optimised design solution, from inception through to production. We investigate how this concept can be incorporated into working practice, and ask whether new technologies will eventually become a necessity, or whether we are creating complexity for complexity’s sake.
A challenging and thought-provoking first day provides an opportunity to share perspectives and open debates on this year’s brief. Split into four round-table discussions, the Talkshop questions the types and reliability of data we use throughout the design process. Can we present our findings impartially to truly use data to design, or is all data doctored to reinforce a pre-defined outcome?
Session 1: Data by Design
Bruno Moser (Foster + Partners) suggested that architects need to be responsible when working with gathered data. Interpreting the multiple streams of data collected for any one site turns them into knowledge. In the same way, misinterpretation of that data is also converted into knowledge, albeit false knowledge. And so we should be aware of the source and reliability of ‘dumb’ data; it is all too easy to draw seemingly accurate conclusions from inaccurate information. Designers should be critical of raw data, using the filters of ‘logic’ and ‘reasoning’ to produce a usable and informative conclusion.
Masdar Institute, Abu Dhabi. Foster + Partners.
The discussion developed into ‘how much information do we need’, and ‘how much can we actually use’? Professor Ole Sigmund (Technical University of Denmark) argued that an optimised structure is one of optimum beauty. If we only need to satisfy a single variable (or single data stream), we soon find an optimised solution. But applying this to a construction project creates a far more complex problem: we do not have a single variable to understand, so we must widen the ‘range of optimum’ for each variable to allow an equilibrium within which a design solution can lie.
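Sigmund’s point can be sketched in a few lines of code. The objectives below (a structural cost and a daylight score driven by a span parameter) are entirely hypothetical, invented for illustration; the idea is only that a single objective yields one answer, while several competing criteria define an overlapping ‘range of optimum’ in which viable designs lie.

```python
# Hypothetical objectives for illustration only: optimising a single variable
# yields one answer, but a building must satisfy several competing criteria
# at once, each within an acceptable band.

def cost(span):
    """Hypothetical structural cost: grows with span."""
    return 100 + 12 * span

def daylight(span):
    """Hypothetical daylight score: improves with span."""
    return 2 * span

def acceptable(span):
    # The 'range of optimum': each criterion only needs to fall inside a
    # band, and a viable design lies where all the bands overlap.
    return cost(span) <= 400 and daylight(span) >= 30

# Single objective: the cheapest span is trivially the smallest on offer.
cheapest = min(range(10, 31), key=cost)   # 10

# Multiple objectives: we search for the equilibrium region instead.
viable = [s for s in range(10, 31) if acceptable(s)]
print(cheapest, viable[0], viable[-1])
```

The single-objective answer sits at the edge of the search range, while the multi-objective view returns a whole band of acceptable spans, which is closer to how a construction brief behaves.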
Session 2: Form Follows Data
If form follows data, what is the role of the designer? Kasper Guldager (3XN Architects) stated that the first 3 weeks of a project are the most important. Using real site data goes beyond rules of thumb; it informs and provokes, seeding the unexpected in the mind of the designer, instigating a subconscious process in converting data to inbuilt wisdom.
The real world becomes integral to design, and increasingly so with advances in analysis and simulation software. Giovanni Betti (Foster + Partners) presented the need for analysis throughout the design process. We now have software that can critique our proposals, creating visually rich outputs which underline design rationale and improve communication with the use of graphical representation. These simulations should be used in a circular workflow with computational design to create a dialogue of negotiation, translation and evolution, whereby data is the informer.
Session 3: Performative Data
In using live information as a design tool, and as project briefs and clients’ needs fluctuate, we must build flexibility into our digital systems. Lines, arcs and dimensions in a traditional 2D workflow result in a rigid model that becomes difficult and time-consuming to modify as the scheme progresses. Parametric modelling adds intelligence to the geometry, linking the lines, arcs and dimensions into a single entity. This allows iterations to be captured more readily and provides a far more comprehensive design tool.
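The contrast can be made concrete with a minimal sketch, not tied to any particular CAD package: in a parametric model, derived quantities are recomputed from driving parameters, so changing one input updates the whole model consistently instead of requiring manual rework. The `ParametricBay` class and its dimensions are invented for illustration.

```python
# A minimal sketch (not any specific CAD API) of parametric geometry:
# derived values are recomputed from driving parameters, so one change
# ripples consistently through the model.

from dataclasses import dataclass

@dataclass
class ParametricBay:
    width: float            # driving parameter (m)
    height: float           # driving parameter (m)
    mullion_spacing: float  # driving parameter (m)

    @property
    def mullion_count(self) -> int:
        # Derived value: always consistent with the current width.
        return int(self.width // self.mullion_spacing) + 1

    @property
    def area(self) -> float:
        # Derived value: recomputed on demand.
        return self.width * self.height

bay = ParametricBay(width=12.0, height=3.5, mullion_spacing=1.5)
print(bay.mullion_count, bay.area)

bay.width = 18.0   # one edit; every derived quantity updates with it
print(bay.mullion_count, bay.area)
```

In a rigid 2D drawing the mullion count and area would each be separate entities to find and amend by hand; here they are consequences of the parameters.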
Session 3 investigates the playfulness of design, memorably portrayed by Daniel Piker (creator of the Kangaroo software). If we are able to engage with data, we explore solutions more thoroughly. Kangaroo mimics real-world forces to allow behaviour and fluidity to become essential contributors to early design decisions, blurring the distinction between the physical and virtual realms and enhancing our scope to design.
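Kangaroo itself is a physics solver for Grasshopper and is not reproduced here; the toy example below only illustrates the general idea behind such tools, assuming a simple force-based relaxation: particles move a little each step under spring forces until the system settles into an equilibrium form.

```python
# A toy force-based relaxation (illustrative only, not Kangaroo's actual
# implementation): particles on a line, joined by springs with a rest
# length, step repeatedly under the spring forces until they settle.

def relax(xs, rest=1.0, k=0.5, steps=400):
    xs = list(xs)
    for _ in range(steps):
        forces = [0.0] * len(xs)
        for i in range(len(xs) - 1):
            # Positive stretch means the spring is too long, so it pulls
            # its two endpoints towards each other.
            stretch = (xs[i + 1] - xs[i]) - rest
            forces[i] += k * stretch
            forces[i + 1] -= k * stretch
        # Endpoints are anchored; interior particles drift with the forces.
        for i in range(1, len(xs) - 1):
            xs[i] += forces[i] * 0.5
    return xs

# Four particles pinned at 0 and 3: the interior pair settles towards
# equal spacing, the equilibrium of the spring system.
print(relax([0.0, 0.2, 2.5, 3.0]))
```

However rough, this is the essence of form-finding by simulation: the designer sets up forces and constraints, and the geometry is allowed to find its own equilibrium.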
Randers Museum of Art, 3XN.
Session 4: The Data Promise
The final session of the day analyses some of the processes in transforming data into design and production. The discussion touched on the recent proliferation of Rapid Prototyping (RP), allowing iterative design to be readily documented in a comparable and tangible format. An interesting concept emerging from RP is the RepRap machine: a desktop 3D printer. As most of its components are plastic, it can in effect self-replicate. This method of design and fabrication means that the current RepRap (Version 1) can print the next-generation machine (Version 2), with the potential to allow working practice to keep up with technological advances indefinitely.
This self-replication creates a constant and seamless feedback loop of data and design and, applying this concept to the construction industry, brings real meaning to the ‘data argument’; without post-occupancy evaluations, we drive forward blind – it is this performance assessment that allows logical and informed progression, creating evolutionary architecture.
Day two’s Symposium aims to showcase how the ideologies of the Talkshop have been put to practical use successfully around the globe, and what challenges lie ahead for the discipline. The following outlines a selection of relevant presentations, all of which can be found in detail at www.smartgeometry.org.
Keynote – Ben Van Berkel
Ben Van Berkel, co-founder of UNStudio, continues to research, apply and adapt the latest computational design technologies within his practice, exploring digital forms and their potential to test possibilities. Generative design helps to find forms more fit for purpose; it allows architects to explore infinite variations and to do the previously impossible. UNStudio houses a team of six users per project to manage the parametric model, allowing updated information to be issued rapidly when a change is made. Van Berkel also introduced UNStudio’s Knowledge Platforms: four groups focussed solely on research in the field (each engaged in a different stage of design), feeding innovation into live projects.
One of the studio’s major construction projects, the Mercedes-Benz Museum in Stuttgart, utilised this workflow from an early stage through to completion to produce an extremely complex yet buildable form. Two intertwining ramps define the form of the building, creating a unique circulation that is almost impossible to describe in traditional floor plans and sections. A collaborative parametric model allowed all parties to visualise the space clearly, as well as producing thousands of drawings to co-ordinate the construction sequence.
Personalised Architecture – Usman Haque
Usman Haque provided an alternative architecture, whereby people-centred design takes precedence. He has designed installations around the world and questions our understanding of space and our interaction with the built environment around us. Haque is also the founder of Pachube, an online, open-source infrastructure for data. This allows real-time information from sensors networked to a site to be viewed and processed into easy-to-communicate formats.
Open Burble Project, Usman Haque.
One of Haque’s projects, in contrast to his large-scale interactive public spaces, was based around sustainability, using “Natural Fuses” to harness the carbon-sinking capabilities of plants. Each Natural Fuse unit consists of a plant and a power socket, which accumulates generated power until a usable amount is available. As no single plant has the ability to power even a low-power light bulb, the Natural Fuses are networked via the internet, creating a community of units collaboratively generating power that can be shared.
Each user then has a switch with three options: Off/Selfless/Selfish. Choosing ‘Selfless’ provides power only for as long as the community’s carbon footprint is not harmed. However, if a user needs power indefinitely, the dial is set to ‘Selfish’, which has the potential to kill another user’s plant. This aims to raise awareness of energy expenditure and to encourage people to share their energy use and nurture their plants, the generators.
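The community logic of the switch can be sketched as follows. This is an assumed toy model of the behaviour described above, not Haque’s actual implementation: the community can only spend what its plants collectively offset, and a ‘selfish’ draw can push the shared budget into deficit at another plant’s expense.

```python
# A toy model (assumed, not the real Natural Fuse firmware) of the
# Off/Selfless/Selfish switch: the community budget is what the plants
# collectively offset, and a 'selfish' user may overdraw it.

def community_draw(requests, offset_per_plant=1.0):
    """requests: list of (mode, amount) per unit; returns (granted, budget)."""
    budget = offset_per_plant * len(requests)  # what the plants can absorb
    granted = []
    for mode, amount in requests:
        if mode == "off":
            granted.append(0.0)
        elif mode == "selfless":
            # Draw only what the remaining shared budget can cover.
            take = min(amount, max(budget, 0.0))
            granted.append(take)
            budget -= take
        else:  # "selfish": draw regardless, potentially killing a plant
            granted.append(amount)
            budget -= amount
    return granted, budget

granted, remaining = community_draw(
    [("selfless", 2.0), ("selfish", 3.0), ("off", 1.0)]
)
print(granted, remaining)  # a negative budget means a plant pays the price
```

Running the sketch, the selfless user is capped by the shared budget while the selfish user drives it negative, which is exactly the social tension the installation is designed to make visible.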
Adaptive Engineering – Buro Happold
Craig Schwitter of Buro Happold concluded the event with a practical view of the research undertaken at the conference, portraying how emerging technologies are currently being integrated into the construction industry. High-performance buildings demand more sophisticated design, which does not necessarily mean ‘high-tech’. When questioned, Schwitter promoted the necessity of passive design from the outset; once a building is designed to fundamentally optimise for its immediate environment, we can then look to technology for added value.
The Adaptive Building Initiative was set up to tackle this issue, researching low-energy, high-performance buildings. The envelope controls 50% of a building’s energy, and so the practice engages with building fabric first and foremost. A number of adaptive facade and roof designs were showcased, including retractable sheet-metal panels, adaptive fritting and openable roofs. Control of flexible structures provides an opportunity to create self-optimising buildings: roofs that open and close to follow the sun, frits that increase opacity to control glare, and double-skin facades that retract to increase natural ventilation. If we consider a static, optimised performance to have been achieved, and then factor in the dynamic daily and annual data a building experiences, an adaptive response is the only true solution.
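The adaptive idea can be illustrated with a pair of hypothetical control rules, invented here for the sketch rather than taken from Buro Happold’s systems: the facade state is recomputed continuously from live environmental data instead of being fixed at design time.

```python
# Hypothetical control rules (illustrative only, not Buro Happold's
# actual controllers): facade state follows live environmental data.

def frit_opacity(solar_lux, glare_threshold=20000, max_lux=100000):
    """Map measured illuminance to a fritting opacity between 0 and 1."""
    if solar_lux <= glare_threshold:
        return 0.0                      # no glare risk, stay transparent
    span = max_lux - glare_threshold
    return min(1.0, (solar_lux - glare_threshold) / span)

def vents_open(indoor_c, outdoor_c, setpoint_c=21.0):
    # Open the double-skin cavity only when outside air can usefully
    # cool the space below its setpoint.
    return indoor_c > setpoint_c and outdoor_c < indoor_c

print(frit_opacity(60000))     # partial fritting on a bright day
print(vents_open(24.0, 18.0))  # natural ventilation engaged
```

The point is the loop, not the particular thresholds: sensed data in, actuated fabric out, repeated over the daily and annual cycles the static design could only average.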
Ciudad de Justicia, Madrid. Foster + Partners
Technology provides exciting opportunities to develop buildings with greater performance, helping designers to develop, analyse and simulate concepts in substantial detail. Bentley’s Generative Components and Rhino provide parametric design tools with unparalleled flexibility for design iteration, although they are still in minority use. Renowned architectural practices have been utilising the technology for a decade, and have realised benefits as it reaches fruition. UNStudio’s ‘Knowledge Platforms’ and Foster + Partners’ ‘Specialist Modelling Group’ are exemplary research groups that feed innovation into live projects to great benefit.
With the current proliferation of Building Information Modelling across the wider industry, we are in a time of transition in which practices need to invest in technology for fear of being left behind. Augmented Reality and touch-screen technology are changing the way clients interact with building models, offering a tangible product to interrogate freely; meanwhile, parametric design and analysis software is changing the way designers create building models, offering new dimensions of exploration and simulation.
The SmartGeometry conference inspires and provokes. It covered a range of forward-thinking topics, although this year one need emerged paramount: to design from first principles, with solid rationale, delivering a high level of performance without demanding an excessive price tag. Analysis tools such as Autodesk’s Ecotect help to back up these arguments through the analysis of models, can potentially protect integrated design features from Value Engineering, and should be utilised throughout the lifecycle of all projects. Architects need to make moves to emerge ahead of competitors by designing more efficiently; at a time when winning work is essential to survival, value for money is everything.
The conference questioned the role of data in design and asked:
Are we really heading towards an era where data is the designer?